Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it has the type of virtuous ring to it that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline, and if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization joined in endorsing a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors, and then to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, the event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you'll notice that they all have one thing in common: you can describe them using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos), is the "noosphere." In de Chardin's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and biosphere (all forms of life), but an ever expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If, as expected, that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power to commercial and home users handled by somewhat interconnected, regional networks. We also have burgeoning greenhouse gas emissions and a growing dependence on foreign oil, both driven in part by our need to keep increasing our generating capacity to meet whatever the peak national electrical demand may be.
Quote of the Day
“We are certain that the Internet of Things will only be successful if it is built on open technologies”
-Eclipse Foundation Executive Director Mike Milinkovich
ETSI bundles standards for EU eID regulation Gijs Hillenius EU Joinup July 26, 2016 - The European Telecommunications Standards Institute has published a collection of standards for electronic signatures, electronic seals, electronic time-stamps, and for trust service providers. The publication coincides with the 1 July entry into force of the European Union's eIDAS regulation on eID and trust services for electronic transactions.
The bundle of standards was created in April by ETSI’s Technical Committee on Electronic Signatures and Infrastructures. “The set includes a total of 19 European Standards along with guidance documents and test specifications”, ETSI writes.
Some of the standards can be used to audit trust service providers and to assess their conformity with eIDAS Regulation requirements; others cover the creation and validation of digital signatures and seals.
In April, ETSI updated its ‘technical report on Electronic Signatures and Infrastructures’. This report details the standards that are involved, or could be involved, in electronic signatures.... ...Full Story
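For readers curious about what sits beneath those formats: the signature standards ETSI maintains (such as XAdES and CAdES) wrap a familiar cryptographic core with certificates, timestamps, and validation metadata. Here is a minimal sketch of that core alone, an ECDSA P-256 signature over SHA-256 using Python's cryptography library; it is illustrative only, and is not an eIDAS-conformant signature.

```python
# Illustrative only: the bare sign/verify cycle that advanced
# signature formats build upon. Not eIDAS-conformant by itself.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

private_key = ec.generate_private_key(ec.SECP256R1())
document = b"agreement text to be signed"

# Creation: sign a hash of the document with the private key.
signature = private_key.sign(document, ec.ECDSA(hashes.SHA256()))

# Validation: anyone holding the public key can check integrity and
# origin; verify() raises InvalidSignature on any mismatch.
try:
    private_key.public_key().verify(signature, document,
                                    ec.ECDSA(hashes.SHA256()))
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```

What the ETSI standards add on top of this primitive is everything needed for legal reliance: certificate chains anchored in qualified trust service providers, timestamps, and long-term validation data.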
Analog 2.0 Specification Available Now Press Release NFC Forum July 25, 2016 - The NFC Forum published the adopted Analog 2.0 Technical Specification today. Members may download the specification from the Adopted Specifications page.
The Analog 2.0 Candidate Specification was published in October 2015 and introduced Active Communication Mode for P2P data exchange and NFC-V technology in poll mode. The Analog 2.0 Technical Specification ensures full interoperability with devices conformant to ISO/IEC 14443 or ISO/IEC 18092 by harmonizing the analog parameters for contactless communication. This interoperability is important to enable the reliable usage of NFC devices with existing infrastructure using ISO compatible RF readers and/or cards (e.g. for contactless public transport applications).
...Full Story
A Data Model to Support the Publishing of Legislation as Linked Open Data Jens Scheerlinck EU Joinup July 22, 2016 - Citizens, legal professionals, businesses and civil servants need to know what legislation is in force. Legislation is often amended, repealed and codified, making it difficult to have a clear view of what text is in force at any specific point in time. In this context, the Hellenic Ministry of Interior and Administrative Reconstruction and the Italian Anti-corruption Agency contacted the ISA Programme of the European Commission to develop a pilot with the twofold objective of making legislation available in both human- and machine-readable formats and visualising the evolution of legislation over time, to enable user-friendly consultation.
In order to allow legislative information to be published as Open Data, a data model was proposed to support this publishing process. The suggested data model is based on the ELI ontology and extended with concepts from Akoma Ntoso and the Core Public Organisation Vocabulary, thereby facilitating interoperability with other EU Member States. The full pilot can be downloaded or forked from the SEMICeu GitHub repository and the documentation on the data model can be consulted on the pilot website.
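To make the approach concrete, here is a minimal sketch of describing a legal act as linked data with rdflib and the ELI namespace. The URIs are invented and the choice of properties is illustrative; it is not taken from the pilot's own model.

```python
# A minimal sketch: a hypothetical legal act described with ELI terms.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

ELI = Namespace("http://data.europa.eu/eli/ontology#")

g = Graph()
g.bind("eli", ELI)

act = URIRef("http://example.org/eli/act/2016/42")        # hypothetical
amendment = URIRef("http://example.org/eli/act/2018/7")   # hypothetical

g.add((act, RDF.type, ELI.LegalResource))
g.add((act, ELI.title, Literal("Example Act on Open Data", lang="en")))
g.add((act, ELI.date_document, Literal("2016-07-22", datatype=XSD.date)))
# Amendments over time are expressed as links between acts, which is
# what lets a consumer reconstruct what was in force at a given date.
g.add((act, ELI.changed_by, amendment))

print(g.serialize(format="turtle"))
```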
The Ministry placed the data model under public consultation until 15 July 2016. ...Full Story
TC260 Drafts New Standard for China's Cloud Security Review Regime USITO.org Weekly July 21, 2016 - Recently, TC260 has published the draft "Information Security Technology - Security Capability Evaluation Methods of Cloud Computing Services" for comments. The public comment period will end on August 11. This draft standard aims to provide guidance for third-party agencies on how to conduct cloud service capability evaluation via interviews, inspections and testing.
This standard, along with two others, covers guidelines for cloud service providers' size and operational experience, business dealings between cloud service providers and government customers, cloud computing services cybersecurity management, and a range of other issues. The three standards have also been adopted as main references in the CAC's Cloud Computing Services Cybersecurity Review, which was announced on June 26, 2015 and targets services for Party and government departments. ...Full Story
IoT Security: What IoT Can Learn From Open Source Bruce Byfield Datamation July 20, 2016 - When personal computers were introduced, few manufacturers worried about security. Not until the early 1990s did the need for security become widely understood. Today, the Internet of Things (IoT) is following the same pattern -- except that the need for security is becoming obvious far more quickly, and manufacturers should have known better, especially given the overwhelming influence of open source.
The figures speak for themselves. In 2014, a study by Hewlett-Packard found that seven out of ten IoT devices tested contained serious security vulnerabilities, an average of twenty-five per device. In particular, the vulnerabilities included a lack of encryption for local and Internet transfer of data, no enforcement of secure passwords, and unsecured downloaded updates. The devices tested included some of the most common IoT devices currently in use, including TVs, thermostats, fire alarms and door locks.
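Two of those failures, weak passwords and unprotected update channels, are cheap to avoid. The following is a minimal sketch (the update URL and the strength thresholds are hypothetical) of the kind of basic hygiene the HP study found missing:

```python
# A sketch of two basic protections: password-strength enforcement and
# firmware downloads over certificate-verified TLS. Illustrative only.
import re
import ssl
import urllib.request

def password_is_strong(pw: str) -> bool:
    # Require length plus mixed character classes, so factory defaults
    # like "1234" or "admin" fail immediately.
    return (len(pw) >= 10
            and re.search(r"[a-z]", pw) is not None
            and re.search(r"[A-Z]", pw) is not None
            and re.search(r"\d", pw) is not None)

def fetch_update(url: str) -> bytes:
    # create_default_context() verifies the server certificate and
    # hostname, so plaintext or spoofed update endpoints are rejected.
    ctx = ssl.create_default_context()
    with urllib.request.urlopen(url, context=ctx) as resp:
        return resp.read()

assert not password_is_strong("1234")
assert password_is_strong("Correct4horse!")
# firmware = fetch_update("https://updates.example.com/fw.bin")  # hypothetical endpoint
```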
Given that Gartner predicts that 25 billion smart devices will be in use by 2020, no one needs to be a prophet to foresee a major security problem that will make even the security problems of the basic Internet seem insignificant....how have IoT manufacturers failed to be more security conscious?...
That smart devices, like OpenStack before them, are being built on the shoulders of open source is too obvious for anyone to doubt. In early 2015, VisionMobile's survey of 3,700 IoT developers indicated that 91% used open source in their work.
This figure suggests that, without open source, the development of the IoT would be much slower, if it happened at all. If nothing else, the use of open source and open standards helps to reduce compatibility problems between manufacturers' devices.... ...Full Story
Ultracode Standard Introduced by AIM Press Release AIM July 19, 2016 - AIM announced today the release of the Ultracode international standard, establishing a significant enhancement in barcode technology for the automatic identification and data capture (AIDC) industry and for consumer use.
Ultracode is the first 2D, error-correcting color barcode that can either be displayed on smartphones or printed, and that can be read using a digital color camera or a smartphone app. Its development was motivated by the ubiquitous use of color electronic displays and digital cameras, and especially by the rise of the smartphone. Using Ultracode, standard color technology can create an image that encodes the same data in less than half the area of a QR Code, minimizing the display space required.
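A back-of-envelope calculation shows why color buys density, though the figures below are purely illustrative and say nothing about Ultracode's actual symbology: a module that can take one of eight colors carries three bits instead of the single bit carried by a black-or-white module.

```python
# Back-of-envelope only -- not Ultracode's actual encoding. An 8-color
# module carries log2(8) = 3 bits versus 1 bit for a monochrome QR
# module, so the same payload needs roughly a third of the modules.
import math

payload_bits = 1200                           # arbitrary example payload
bw_modules = payload_bits / 1                 # monochrome: 1 bit/module
color_modules = payload_bits / math.log2(8)   # 8 colors: 3 bits/module

print(bw_modules, color_modules)      # 1200.0 vs 400.0 modules
print(color_modules / bw_modules)     # ~0.33 of the area
```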
The effort to develop Ultracode as a formal standard began more than a decade ago.... ...Full Story
ITU announces new standard for High Dynamic Range TV Press Release ITU July 18, 2016 - ITU has announced a new standard for High Dynamic Range Television that represents a major advance in television broadcasting. High Dynamic Range Television (HDR-TV) brings an incredible feeling of realism, building further on the superior colour fidelity of ITU’s Ultra-High Definition Television (UHDTV) Recommendation BT.2020. ITU’s Radiocommunication Sector (ITU-R) has developed the standard – or Recommendation – in collaboration with experts from the television industry, broadcasting organizations and regulatory institutions in its Study Group 6.
This latest ITU-R HDR-TV Recommendation BT.2100 brings a further boost to television images, giving viewers an enhanced visual experience with added realism. The HDR-TV Recommendation allows TV programmes to take full advantage of the new and much brighter display technologies. HDR-TV can make outdoor sunlit scenes appear brighter and more natural, adding highlights and sparkle. It enhances dimly lit interior and night scenes, revealing more detail in darker areas, giving TV producers the ability to reveal texture and subtle colours that are usually lost with existing Standard Dynamic Range TV.... ...Full Story
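BT.2100 specifies two HDR transfer functions, PQ (Perceptual Quantizer) and HLG (Hybrid Log-Gamma). As a sketch of what "much brighter" means numerically, here is the PQ electro-optical transfer function (EOTF), which maps a normalized signal value to an absolute luminance of up to 10,000 cd/m² (nits); the sample input values below are merely illustrative.

```python
# The PQ EOTF from BT.2100 / SMPTE ST 2084: normalized signal E' in
# [0, 1] -> absolute display luminance in cd/m^2 (nits).

def pq_eotf(e_prime: float) -> float:
    m1 = 2610 / 16384         # 0.1593017578125
    m2 = 2523 / 4096 * 128    # 78.84375
    c1 = 3424 / 4096          # 0.8359375
    c2 = 2413 / 4096 * 32     # 18.8515625
    c3 = 2392 / 4096 * 32     # 18.6875
    x = e_prime ** (1 / m2)
    return 10000.0 * (max(x - c1, 0.0) / (c2 - c3 * x)) ** (1 / m1)

print(round(pq_eotf(1.0)))    # 10000 nits: the PQ peak, far above the
                              # roughly 100-nit reference white of SDR
print(round(pq_eotf(0.58)))   # ~202 nits: a bright mid-tone
```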
New NERC Rules for Critical Cyber Assets Expand the Scope of U.S. Federal Regulation to New Facilities and Practices Hogan Lovells Lexology July 15, 2016 - As a result of federal legislation enacted after the large Northeast/Midwest blackout in 2003, electric utilities and other electric market participants in the United States are subject to mandatory reliability standards developed through stakeholder processes by the North American Electric Reliability Corporation (NERC) and enforced by the Federal Energy Regulatory Commission (FERC), with substantial financial penalties of up to US$1 million per day for each standard violation.
Among the categories of mandatory electric reliability standards are Critical Infrastructure Protection (CIP) standards that were first adopted in 2008. Those standards required owners and operators of “Critical Cyber Assets” (CCA) to develop, maintain, and implement cybersecurity policies that cover, among other things, training and access restrictions for personnel with access to CCAs, procedures for managing electronic and physical security perimeters, software security, incident reporting and response planning, and recovery plans to restore CCAs following an incident.
In 2013, NERC proposed and FERC approved version 5 of the CIP standards, a wholesale revision and significant change in approach. The new standards will be phased in, starting on 1 July 2016. The most significant change in the version 5 standards is the methodology to be used and the requirements for identifying assets subject to the standards, as described below for standard CIP-002-5. The scope of the new standards is significantly broader than the prior version, and owners and operators of smaller electric generation and transmission facilities and generation control centers will now be subject to the CIP standards for the first time.... ...Full Story
Automotive Grade Linux wants to help open source your next car Jack Wallen Tech Republic July 15, 2016 - ...The [Linux Foundation] started Automotive Grade Linux (AGL) to create open source software solutions for automotive applications. Their initial focus is on In-Vehicle-Infotainment (IVI) and their long-term goals include the addition of instrument clusters and telematics systems. Already AGL has the likes of Ford, Jaguar, Land Rover, Mazda, Mitsubishi Motors, Nissan, Subaru, and Toyota on board and that list will only continue to grow....Instead of depending on a separate device to serve as the operating system to drive the platform, AGL will be a stand-alone platform...Because AGL is open source, car manufacturers won't be dealing with a collection of proprietary code that will work for a single model, only to have to turn around and purchase another collection of proprietary code for the next model. Instead, the manufacturer downloads the source for AGL and makes it work to their exact specifications each time. Couple this with the idea that, according to Emily Olin, senior PR representative for the Linux Foundation, most auto manufacturers don't want to hand over control to the likes of Google or Apple and AGL starts to make a lot of sense.... ...Full Story
A Call for Developing—and Using—Consensus Standards to Ensure the Quality of Cell Lines NIST July 14, 2016 - Mainstays of biomedical research, permanent lines of cloned cells are used to study the biology of health and disease and to test prospective medical therapies. Yet, all too often, these apparent pillars of bioscience and biotechnology crumble because they are crafted from faulty starting materials: misidentified or cross-contaminated cell lines.
Writing in the June 2016 issue of PLOS Biology, scientists from the National Institute of Standards and Technology (NIST) call for “community action” to assemble a “comprehensive toolkit for assuring the quality of cell lines,” employed at the start of every study.
As important, they assert, more researchers and laboratories should use the tools that already exist. The NIST authors point to the American National Standard for authentication of human cell lines, which can be implemented to detect cell-line mix-ups and contamination before embarking on studies of cancer or other research using human cells.
Unfortunately, the four-year-old standard has not been widely adopted, even though cell-line authentication is a growing priority among funders and publishers of research.
Cell lines are populations of clones: genetically uniform animal or plant cells that are bioengineered to proliferate indefinitely in culture....A “high level of confidence” in published research results requires valid underpinning data on methods and materials—cell lines, instrument performance and more, explain the researchers, who work in the Biosystems and Biomaterials Division of NIST’s Material Measurement Laboratory. “One might argue that these control data are as important as the study data themselves.”...The authors advocate using inclusive, consensus standards-setting processes—like the one used for human cell-line authentication—to address these needs as well as to seize new opportunities that are arising with the commercialization of genome-sequencing technologies.... ...Full Story
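The human cell-line authentication standard the NIST authors point to rests on comparing a sample's short tandem repeat (STR) profile against a reference profile. As a toy illustration of that kind of comparison only (the profiles and the threshold below are invented, and this is not the standard's normative procedure):

```python
# Toy sketch of an STR-profile match score of the kind cell-line
# authentication relies on. Profiles and cut-off are illustrative.

def tanabe_score(profile_a: dict, profile_b: dict) -> float:
    """Percent match: 2 x shared alleles / (alleles in A + alleles in B)."""
    shared = total_a = total_b = 0
    for locus in set(profile_a) | set(profile_b):
        a = set(profile_a.get(locus, []))
        b = set(profile_b.get(locus, []))
        shared += len(a & b)
        total_a += len(a)
        total_b += len(b)
    return 100.0 * 2 * shared / (total_a + total_b)

reference = {"TH01": [6, 9.3], "D5S818": [11, 12], "TPOX": [8, 11]}
sample    = {"TH01": [6, 9.3], "D5S818": [11, 12], "TPOX": [8, 8]}

score = tanabe_score(reference, sample)
print(f"{score:.1f}% match")   # ~90.9%; a high score flags the sample
                               # as consistent with the reference line
```

A routine check like this, run before a study begins rather than after publication, is exactly the kind of inexpensive quality control the NIST authors argue should become universal practice.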