Welcome to ConsortiumInfo.org
Wednesday, February 12 2014 @ 08:42 AM CST
Contributed by: Andy Updegrove
For more than a decade there has been active resistance in some quarters to the continuing custody by the U.S. of the root domain registries of the Internet. Those directories (which control the routing of Internet traffic into and out of nations) are administered by ICANN, which in turn exists under the authority of the U.S. Department of Commerce. Today, Neelie Kroes, the strong-willed European Commission Vice-President in charge of the E.C.’s Digital Agenda, has put the question of “Internet Governance” (read: control of these registries) back into the news. Specifically, Kroes announced in a press release that the Commission will pursue a “role as honest broker in future global negotiations on Internet Governance.”
Monday, October 14 2013 @ 04:36 PM CDT
Contributed by: Andy Updegrove
The unexpected disclosures of NSA activities by Edward Snowden present a splendid example of U.S. government, as well as popular, indifference to world opinion. As part of its efforts to control the political damage of the embarrassing revelations, the Obama administration repeatedly stressed that only foreign nationals had been targeted. As the breathtaking breadth of the data accessed and analyzed became clear, this rationale raised the question of how the citizens - and even leaders - of U.S. allies might feel about being considered fair game for the NSA’s attention.
The answer to that question is that they weren’t happy. Nor, as we will see, were a group of NGOs that had no reason to think they were targeted at all.
Saturday, November 11 2006 @ 05:18 PM CST
Contributed by: Andy Updegrove
Once upon a time, there was something new called "the Internet," and it was an unknown quantity. While some guessed what it could become, most did not. Famously, Marc Andreessen - of Mosaic, and later Netscape, fame - and Tim Berners-Lee did, while Bill Gates did not. Less publicly, those who helped to create something that came to be called the Internet Corporation for Assigned Names and Numbers - or ICANN - did, and the standards analogue of Bill Gates - the International Telecommunication Union, or ITU - did not.
The result was that ICANN came to control a small but vital piece of the Internet, called the root directories, while the ITU, a venerable global telecommunications standards organization existing under the aegis of the United Nations, and tracing its origins to 1865, did not, although perhaps it could have laid claim to those essential elements had it appreciated their future importance at the time.
And that road not taken, as Robert Frost once said, has made all the difference.
The almost haphazard way in which the future control of the root directories of the Internet was decided has become the stuff of legend (one of many versions may be found here). By some lights, the ITU would have been the logical home for the directories to reside, but regardless of your favorite interpretation of the actual events, that was not to be, and the ITU lost out.
Thursday, July 27 2006 @ 09:23 PM CDT
Contributed by: Andy Updegrove
A topic I've been following for about a year now is the struggle over "Internet governance," which has translated most directly during that time period into the following question: "will the US Department of Commerce give up control of the root directories of the Internet or won't it?" The debate over that question sadly monopolized the World Summit on the Information Society (WSIS) for most of the life of that initiative (to date), and promises to continue to do so.
That's a shame, because the WSIS initiative was founded to bring the benefits of information technology and Internet access to all of the peoples of the world. Appropriately, it's administered by the International Telecommunication Union (ITU) under the auspices of the United Nations, and if you're interested you can follow what's happened (and hasn't happened) over the past year by scrolling through the news stories, comments and blog entries available in this folder, or by scanning this issue of the Consortium Standards Bulletin.
As you'll see from the materials in either location, ICANN's stewardship of the root directories is up for renewal (or termination) at the end of September of this year. Comments were earlier submitted on what to do when September has run its course, and a public meeting was held two days ago on the question of whether or not to renew the ICANN Memorandum of Understanding, or to put the job out to bid.
According to The Register's Kieren McCarthy, that meeting "should go down in Internet history," as the moment in time when the U.S. government "conceded that it can no longer expect to maintain its position as the ultimate authority over the internet." But the article then goes on to say:
However, assistant commerce secretary John Kneuer, the US official in charge of such matters, also made clear that the US was still determined to keep control of the net's root zone file - at least in the medium-term.
"The historic role that we announced that we were going to preserve is fairly clearly articulated: the technical verification and authorisation of changes to the authoritative root," Kneuer explained....
Tuesday, June 27 2006 @ 09:51 PM CDT
Contributed by: Andy Updegrove
For some time I have been covering the topic of Internet Governance, both in the macro (and more meaningful) sense of ensuring that the Internet and the Web fulfill the incredible promise they hold for the advancement of all humanity everywhere, and in the micro, and more political, sense of who should control the root directories of the Internet - a question of control more symbolic than substantive.
My most detailed coverage can be found in the November 2005 issue of the Consortium Standards Bulletin, titled WSIS and the Governance of the Internet, which I wrote in the run up to the second plenary meeting, and closing event of the second phase of the World Summit on the Information Society (WSIS), an ambitious initiative launched by the United Nations and administered by the ITU to bridge the digital divide between the haves and the have-nots.
That second meeting was held in Tunis, Tunisia, and was overshadowed by the ongoing political spat over who should control the root directories of the Internet - small databases that include the two-letter national identifiers that end domain names and help direct Internet traffic to the appropriate geographical target. Currently, those domains are under the control of ICANN, which is in turn empowered to administer the directories under a Memorandum of Understanding (MOU) with the National Telecommunications and Information Administration, a branch of the United States Department of Commerce.
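To make the two-letter national identifiers concrete, here is a minimal Python sketch that extracts the top-level domain from a hostname and checks whether it is a country-code TLD of the kind delegated through the root zone. The helper names and example domains are illustrative, not part of any real registry software:

```python
# Minimal sketch: pull the last label off a dotted hostname and test
# whether it looks like a two-letter country-code TLD (ccTLD), the
# kind of identifier the root directories delegate to national registries.

def top_level_domain(hostname: str) -> str:
    """Return the last label of a dotted hostname, lowercased."""
    return hostname.rstrip(".").rsplit(".", 1)[-1].lower()

def is_cctld(hostname: str) -> bool:
    """ccTLDs are exactly two ASCII letters (e.g. 'de', 'jp', 'tn')."""
    tld = top_level_domain(hostname)
    return len(tld) == 2 and tld.isalpha()

print(top_level_domain("example.co.uk"))  # uk
print(is_cctld("example.co.uk"))          # True
print(is_cctld("example.org"))            # False
```

Real resolution is, of course, performed recursively by DNS starting from the root zone; this sketch only illustrates what the geographical identifiers in question look like.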
That subjection of a vital, if small, element of the Internet infrastructure to the control of a single nation took on growing significance as the Bush administration adopted an increasingly "go it alone" attitude in the post-9/11 world. After the Department of Commerce announced, in the summer before the Tunis summit, that it would not, as earlier promised, relinquish control of the root directories, the resulting political brouhaha built to a resounding crescendo that overshadowed, and indeed overpowered, any real progress that might otherwise have been accomplished at Tunis.
The upshot was that the opposition caved to the U.S. on the eve of the summit, taking away as a sop the formation of a new Working Group on Internet Governance, which is now in formation, leaving control of the root directories in U.S. hands.
Now, however, another time-sensitive event is looming: the expiration of the MOU itself, opening the door for debate over whether ICANN itself should remain the indirect custodian of the root domains (the domains are actually administered by the Internet Assigned Numbers Authority, or IANA), or whether the contract should be turned over to another contractor (if you'd like to know the full details of how things operate, see the Feature Article from the September CSB, titled WSIS, ICANN and the Future of the Internet).
Sunday, December 04 2005 @ 11:10 AM CST
Contributed by: updegrove
Two weeks ago, the U.S. pulled off an Internet governance coup in Tunisia. Today, ICANN's Board of Directors is meeting in Vancouver, British Columbia. In between, among other things, ICANN was hit with three new lawsuits relating to how it does its job. If it's not one thing, it's another.
It's been just over two weeks since the World Summit for the Information Society folded up its tents (literally) in Tunis. I've been following the WSIS process for two years, and accumulating blog entries and news items for the last six months here. I also dedicated this November's issue of the Consortium Standards Bulletin to the "compromise" that left the root zone of the Internet to the management of the U.S., and created a new Internet Governance Forum to accommodate the desires of the rest of the world to participate in decision making regarding the future use and impact of the Internet.
Now that everyone is back home, how is it going? Here are a few notes and reports from all over that give a sense of what's been happening.
Sunday, November 20 2005 @ 11:26 AM CST
Contributed by: updegrove
19,000 people went to Tunis to figure out how to bridge the Digital Divide between the first and the third world. How could the hundreds of press representatives there have found virtually nothing about open source worth reporting?
Friday, November 18 2005 @ 11:28 AM CST
Contributed by: updegrove
In the run up to the Tunis Summit, someone blinked in the face-off over Internet governance. The question is: who? The U.S.? The opposition? Or maybe both? For now, it's all spin.
Saturday, October 29 2005 @ 01:30 PM CDT
Contributed by: Andy Updegrove
The future of the Internet won't be decided in Tunis in a few weeks, but who will decide the future of the Internet may be. Here's how you can tell the U.S. Ambassador what you think about that.
Saturday, October 01 2005 @ 11:33 AM CDT
Contributed by: updegrove
In an action which the White House will probably call another example of "Old Europe" in action, the EU has broken ranks with the US over Internet governance.
Quote of the Day
“We are certain that the Internet of Things will only be successful if it is built on open technologies.”
- Eclipse Foundation Executive Director Mike Milinkovich
Latest News
A Data Model to Support the Publishing of Legislation as Linked Open Data
Jens Scheerlinck, EU Joinup
July 22, 2016 - Citizens, professionals in the legal domain, businesses and civil servants all need to know what legislation is in force. Legislation is often amended, repealed and codified, making it difficult to have a clear view of what text is in force at any specific point in time. In this context, the Hellenic Ministry of Interior and Administrative Reconstruction and the Italian Anti-corruption Agency contacted the ISA Programme of the European Commission to develop a pilot with the twofold objective of making legislation available in both human- and machine-readable formats and visualising the evolution of legislation over time, to enable user-friendly consultation.
In order to allow legislative information to be published as Open Data, a data model was proposed to support this publishing process. The suggested data model is based on the ELI ontology and extended with concepts from Akoma Ntoso and the Core Public Organisation Vocabulary, thereby facilitating interoperability with other EU Member States. The full pilot can be downloaded or forked from the SEMICeu GitHub repository and the documentation on the data model can be consulted on the pilot website.
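The point-in-time consultation problem the pilot addresses can be sketched with a tiny in-memory model. This is a deliberately simplified illustration, assuming hypothetical class and field names; it is not the ELI or Akoma Ntoso data model itself:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Simplified model of versioned legislation: each consolidated version
# of an act is in force from a start date until it is superseded.
# Names (Version, version_in_force) are illustrative assumptions.

@dataclass
class Version:
    text: str
    in_force_from: date
    in_force_until: Optional[date] = None  # None = still in force

def version_in_force(versions: list[Version], on: date) -> Optional[Version]:
    """Return the version of an act in force on a given date, if any."""
    for v in versions:
        if v.in_force_from <= on and (v.in_force_until is None or on < v.in_force_until):
            return v
    return None

act = [
    Version("original text", date(2010, 1, 1), date(2014, 6, 1)),
    Version("amended text", date(2014, 6, 1)),
]
print(version_in_force(act, date(2012, 3, 15)).text)  # original text
print(version_in_force(act, date(2016, 1, 1)).text)   # amended text
```

Publishing each version with stable, dereferenceable identifiers (as ELI prescribes) is what turns this kind of lookup into Linked Open Data.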
The data model was open for public deliberation by the Ministry until 15 July 2016. ...Full Story
TC260 Drafts New Standard for China's Cloud Security Review Regime
USITO.org Weekly July 21, 2016 - TC260 recently published the draft "Information Security Technology - Security Capability Evaluation Methods of Cloud Computing Services" for comments. The public comment period will end on August 11. This draft standard aims to provide guidance for third-party agencies on how to conduct cloud service capability evaluation via interviews, inspections and testing.
This standard, along with two others, covers guidelines for cloud service providers' size and operational experience, business dealings between cloud service providers and government customers, cloud computing services cybersecurity management and a range of other issues. The three standards have also been adopted as main references in the CAC's Cloud Computing Services Cybersecurity Review, which was announced on June 26, 2015 and targets services for Party and government departments. ...Full Story
IoT Security: What IoT Can Learn From Open Source Businesses are hugely concerned about IoT
Datamation July 20, 2016 - When personal computers were introduced, few manufacturers worried about security. Not until the early 1990s did the need for security become widely understood. Today, the Internet of Things (IoT) is following the same pattern -- except that the need for security is becoming obvious far more quickly, and manufacturers should have known better, especially given the overwhelming influence of open source.
The figures speak for themselves. In 2014, a study by Hewlett-Packard found that seven out of ten IoT devices tested contained serious security vulnerabilities - an average of twenty-five per device. In particular, the vulnerabilities included a lack of encryption for local and Internet transfer of data, no enforcement of secure passwords, and inadequate security for downloaded updates. The devices tested included some of the most common IoT devices currently in use, including TVs, thermostats, fire alarms and door locks.
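The password-enforcement gap flagged above is the kind of check that costs a device maker almost nothing to add. As a minimal sketch, assuming hypothetical policy rules (length, character mix, a deny-list of known defaults) rather than anything drawn from the HP study:

```python
# Sketch of a minimal password policy of the kind many IoT devices
# failed to enforce. The specific rules and the deny-list below are
# illustrative assumptions, not a published standard.

WEAK_DEFAULTS = {"admin", "password", "1234", "12345678"}

def is_acceptable(password: str) -> bool:
    """Reject known default passwords, short passwords, and
    passwords lacking both letters and digits."""
    if password.lower() in WEAK_DEFAULTS:
        return False
    if len(password) < 8:
        return False
    has_alpha = any(c.isalpha() for c in password)
    has_digit = any(c.isdigit() for c in password)
    return has_alpha and has_digit

print(is_acceptable("admin"))        # False: a known default
print(is_acceptable("s3cure-Pa55"))  # True
```

Even this trivial gate would have blocked the factory-default credentials that many of the tested devices shipped with.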
Given that Gartner predicts that 25 billion smart devices will be in use by 2020, no one needs to be a prophet to foresee a major security problem that will make even the security problems of the basic Internet seem insignificant....how have IoT manufacturers failed to be more security conscious?...
That smart devices, like OpenStack before them, are being built on the shoulders of open source is too obvious for anyone to doubt. In early 2015, VisionMobile's survey of 3,700 IoT developers indicated that 91% used open source in their work.
This figure suggests that, without open source, the development of the IoT would be much slower if it happened at all. If nothing else, the use of open source and open standards helps to reduce compatibility problems between manufacturers' devices.... ...Full Story
Ultracode Standard Introduced by AIM
AIM July 19, 2016 - AIM announced today the release of the Ultracode international standard, a significant enhancement in barcode technology for the automatic identification and data capture (AIDC) industry and for consumer use.
Ultracode is the first 2D, error-correcting color barcode that can be displayed on smartphones or printed, and read using a digital color camera or smartphone app. Its development was motivated by the ubiquitous use of color electronic displays, digital cameras and, especially, the smartphone. Using Ultracode, standard color technology can create an image that encodes the same data in less than half the area of a QR Code, minimizing the display space required.
The effort to develop Ultracode as a formal standard began more than a decade ago.... ...Full Story
ITU announces new standard for High Dynamic Range TV
ITU July 18, 2016 - ITU has announced a new standard for High Dynamic Range Television that represents a major advance in television broadcasting. High Dynamic Range Television (HDR-TV) brings an incredible feeling of realism, building further on the superior colour fidelity of ITU’s Ultra-High Definition Television (UHDTV) Recommendation BT.2020. ITU’s Radiocommunication Sector (ITU-R) has developed the standard – or Recommendation – in collaboration with experts from the television industry, broadcasting organizations and regulatory institutions in its Study Group 6.
This latest ITU-R HDR-TV Recommendation BT.2100 brings a further boost to television images, giving viewers an enhanced visual experience with added realism. The HDR-TV Recommendation allows TV programmes to take full advantage of the new and much brighter display technologies. HDR-TV can make outdoor sunlit scenes appear brighter and more natural, adding highlights and sparkle. It enhances dimly lit interior and night scenes, revealing more detail in darker areas, giving TV producers the ability to reveal texture and subtle colours that are usually lost with existing Standard Dynamic Range TV.... ...Full Story
New NERC Rules for Critical Cyber Assets Expand the Scope of U.S. Federal Regulation to New Facilities and Practices
Lexology July 15, 2016 - As a result of federal legislation enacted after the large Northeast/Midwest blackout in 2003, electric utilities and other electric market participants in the United States are subject to mandatory reliability standards developed through stakeholder processes by the North American Electric Reliability Corporation (NERC) and enforced by the Federal Energy Regulatory Commission (FERC), with substantial financial penalties of up to US$1 million per day for each standard violation.
Among the categories of mandatory electric reliability standards are Critical Infrastructure Protection (CIP) standards that were first adopted in 2008. Those standards required owners and operators of “Critical Cyber Assets” (CCA) to develop, maintain, and implement cybersecurity policies that cover, among other things, training and access restrictions for personnel with access to CCAs, procedures for managing electronic and physical security perimeters, software security, incident reporting and response planning, and recovery plans to restore CCAs following an incident.
In 2013, NERC proposed and FERC approved version 5 of the CIP standards, a wholesale revision and significant change in approach under the standards. The new standards will be phased in, starting on 1 July 2016. The most significant change in the version 5 standards is the methodology to be used, and the requirements, for identifying assets subject to the standards, as described below for standard CIP-002-5. The scope of the new standards is significantly broader than the prior version's, and owners and operators of smaller electric generation and transmission facilities and generation control centers will now be subject to the CIP standards for the first time.... ...Full Story
Automotive Grade Linux wants to help open source your next car
Tech Republic July 15, 2016 - ...The [Linux Foundation] started Automotive Grade Linux (AGL) to create open source software solutions for automotive applications. Their initial focus is on In-Vehicle-Infotainment (IVI) and their long-term goals include the addition of instrument clusters and telematics systems. Already AGL has the likes of Ford, Jaguar, Land Rover, Mazda, Mitsubishi Motors, Nissan, Subaru, and Toyota on board and that list will only continue to grow....Instead of depending on a separate device to serve as the operating system to drive the platform, AGL will be a stand-alone platform...Because AGL is open source, car manufacturers won't be dealing with a collection of proprietary code that will work for a single model, only to have to turn around and purchase another collection of proprietary code for the next model. Instead, the manufacturer downloads the source for AGL and makes it work to their exact specifications each time. Couple this with the idea that, according to Emily Olin, senior PR representative for the Linux Foundation, most auto manufacturers don't want to hand over control to the likes of Google or Apple and AGL starts to make a lot of sense.... ...Full Story
A Call for Developing—and Using—Consensus Standards to Ensure the Quality of Cell Lines
NIST July 14, 2016 - Mainstays of biomedical research, permanent lines of cloned cells are used to study the biology of health and disease and to test prospective medical therapies. Yet, all too often, these apparent pillars of bioscience and biotechnology crumble because they are crafted from faulty starting materials: misidentified or cross-contaminated cell lines.
Writing in the June 2016 issue of PLOS Biology, scientists from the National Institute of Standards and Technology (NIST) call for “community action” to assemble a “comprehensive toolkit for assuring the quality of cell lines,” employed at the start of every study.
As important, they assert, more researchers and laboratories should use the tools that already exist. The NIST authors point to the American National Standard for authentication of human cell lines, which can be implemented to detect cell-line mix-ups and contamination before embarking on studies of cancer or other research using human cells.
Unfortunately, the four-year-old standard has not been widely adopted, even though cell-line authentication is a growing priority among funders and publishers of research.
Cell lines are populations of clones: genetically uniform animal or plant cells that are bioengineered to proliferate indefinitely in culture....A “high level of confidence” in published research results requires valid underpinning data on methods and materials—cell lines, instrument performance and more, explain the researchers, who work in the Biosystems and Biomaterials Division of NIST’s Material Measurement Laboratory. “One might argue that these control data are as important as the study data themselves.”...The authors advocate using inclusive, consensus standards-setting processes—like the one used for human cell-line authentication—to address these needs as well as to seize new opportunities that are arising with the commercialization of genome-sequencing technologies.... ...Full Story
Automotive Grade Linux Releases Unified Code Base 2.0
AGL/Linux Foundation July 13, 2016 - Automotive Grade Linux (AGL), a collaborative open source project developing a Linux-based, open platform for the connected car, today announced the release of AGL Unified Code Base (UCB) 2.0. Built from the ground up through a joint effort by automakers and suppliers, the AGL UCB is an In-Vehicle-Infotainment (IVI) platform that can serve as the de facto standard for the industry.
The latest version of the Linux distribution includes new features such as audio routing and rear seat display. Ideal for deploying navigation, communications, safety, security and infotainment functionality, the AGL UCB distribution is supported by a broad community of participants with significant contributions from AGL members.... ...Full Story
Agencies push for open standards across cloud services
GCN.com July 13, 2016 - Agencies are adopting a growing range of cloud solutions, but more-robust open standards would better support hybrid clouds and integrate cross-vendor workflows [according to the International Trade Administration’s CIO Joe Paiva]....While the open standards for web services and application programming interfaces allow ITA to easily move and exchange data on the web, Paiva said, there are no standards for workflows across multiple clouds’ application programming interfaces....Additionally, Paiva said he is pushing industry for open standards for workflow metadata as well, which would eliminate the need to use each vendor’s proprietary coding and settings and allow agencies to easily modify workflows, data and the way data is presented.... ...Full Story