Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it has the type of virtuous ring to it that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline, and if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization joined in endorsing a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors and then develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Subcommittee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Subcommittee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, the event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you'll notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos), is the "noosphere." In Teilhard's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and biosphere (all forms of life), but an ever expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If as expected that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power being handled by somewhat interconnected, regional networks to commercial and home users. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both a result of our need to keep increasing our generating capacity in order to meet whatever the peak national electrical demand may be.
Quote of the Day
“A difficult issue that needs to be solved”
-Ian Skerrett, VP of Marketing and Ecosystem at the Eclipse Foundation, commenting on the challenge of making the IoT secure
Audit: DOT needs to act on vehicle cybersecurity Grayson Ullman FedScoop May 4, 2016 - The Department of Transportation needs a clearer idea of what its responsibilities would be in a real-world cyberattack on connected cars or other vehicles, according to a new Government Accountability Office study.
The report, completed last month but only released on Monday, concludes that numerous interfaces standard in modern vehicles are susceptible to exploits that would allow hackers to gain control of safety-critical systems, including braking and steering.
The study surveyed 32 stakeholders in the automotive industry, including eight automakers, three vehicle cybersecurity firms and seven vehicle cybersecurity researchers. A chief concern among experts was that although the National Highway Traffic Safety Administration has established a vehicle cybersecurity program, the DOT at large has not determined a response method in the case of a catastrophic vehicle hack.... ...Full Story
Let's Encrypt Reaches 2,000,000 Certificates Seth Schoen Electronic Frontier Foundation May 3, 2016 - The Let's Encrypt certificate authority issued its two millionth certificate on Thursday, less than two months after the millionth certificate....each certificate can cover several web sites, so the certificates Let's Encrypt has issued are already protecting millions and millions of sites.
This rapid adoption has made Let's Encrypt one of the world's largest public certificate authorities by number of certificates issued, and almost all of them are protecting domains that never supported HTTPS before. The Internet needs to migrate away from the insecure HTTP protocol, and we're very pleased to be helping to make that possible....EFF co-founded the Let's Encrypt CA with Mozilla and researchers from the University of Michigan. Akamai and Cisco joined the project as founding sponsors, and many other organizations have stepped up to sponsor the project since launch.... ...Full Story
Web Storage (Second Edition) is a W3C Recommendation Press Release W3C.org May 2, 2016 - The Web Platform Working Group has published a W3C Recommendation of "Web Storage (Second Edition)." This specification defines an API for persistent data storage of key-value pair data in Web clients. It introduces two related mechanisms, similar to HTTP session cookies, for storing name-value pairs on the client side. The first mechanism is designed for scenarios where the user is carrying out a single transaction, but could be carrying out multiple transactions in different windows at the same time. The second mechanism is designed for storage that spans multiple windows, and lasts beyond the current session.... ...Full Story
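The two mechanisms the Recommendation describes are exposed in browsers as `window.sessionStorage` (scoped to a single window or tab, for the single-transaction scenario) and `window.localStorage` (shared across windows and persistent beyond the session). A minimal sketch of the key-value interface follows; `MemoryStorage` is a hypothetical in-memory stand-in used here only so the example is self-contained outside a browser, not part of the specification.

```javascript
// Hypothetical in-memory stand-in for the Web Storage interface, used only
// to illustrate the key-value API the Recommendation defines. In a browser,
// the built-in window.sessionStorage and window.localStorage objects expose
// these same methods.
class MemoryStorage {
  constructor() { this.store = new Map(); }
  get length() { return this.store.size; }
  key(n) { return [...this.store.keys()][n] ?? null; }
  getItem(key) { return this.store.has(key) ? this.store.get(key) : null; }
  setItem(key, value) { this.store.set(String(key), String(value)); }
  removeItem(key) { this.store.delete(key); }
  clear() { this.store.clear(); }
}

// First mechanism: sessionStorage is scoped to one top-level browsing
// context, so separate transactions in separate tabs never collide.
const sessionStorage = new MemoryStorage();
sessionStorage.setItem('cart-step', '2');

// Second mechanism: localStorage is shared by all same-origin windows
// and persists after the current session ends.
const localStorage = new MemoryStorage();
localStorage.setItem('preferred-language', 'en');

console.log(sessionStorage.getItem('cart-step'));        // "2"
console.log(localStorage.getItem('preferred-language')); // "en"
console.log(localStorage.getItem('missing-key'));        // null (per spec)
```

Note that, as in the real API, values are stored as strings and `getItem` returns `null` for an absent key.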
Survey Highlights Security Concern Among IoT Developers Patricio Robles Programmable Web April 29, 2016 - According to the second annual IoT Developer Survey, security is the top concern of IoT developers. The survey, which polled 528 IoT developers, was conducted by the Eclipse IoT Working Group in partnership with the IEEE IoT and the AGILE-IoT research project.
Of developers working in organizations that have deployed IoT solutions, nearly half (48.3%) identified security as their leading concern. In the same group of respondents, interoperability and performance were the second and third biggest concerns, with 31.9% and 21%, respectively....
Not only can vulnerabilities in IoT applications be the source of privacy breaches, as the IoT extends its reach to things like cars, security vulnerabilities could theoretically put lives in danger....In this year's IoT Developer Survey, nearly half (46%) of those polled indicated that their company is developing and deploying IoT solutions, and 29% indicated that their company plans to within the next 18 months, suggesting that adoption of IoT technologies is accelerating.... ...Full Story
The advantages of open source in Internet of Things design Jennifer Calhoon DesignWorldOnline April 28, 2016 - The Internet of Things is booming and with millions of devices to be connected over the coming years, many developers are focusing on the IoT opportunity....There are many commonalities between IoT solutions across different applications—the need for wireless connections, communication between devices and back-end systems, and data collection/interpretation are a few examples. But the proliferation of proprietary systems that are often in silos makes developing and building these solutions more complex and time-consuming than necessary. In a fast-moving, fragmented industry, open source technologies will play an increasingly fundamental role in mitigating these challenges and enabling seamless systems to further fuel innovation.
One way to circumvent the interoperability challenge is by establishing and using standards. Thoughtful and collaborative standardization improves choice and flexibility. As a result, developers can use devices from multiple vendors to build a solution that is innovative and meets their specific needs. We’ve outlined a few key channels that are essential to unlocking the potential of open source in IoT development.
Standards are necessary across the whole ecosystem and are being addressed by the industry in multiple ways. For example, oneM2M, a consortium of industry stakeholders, has developed technical specifications to address the need for a common M2M Service Layer that can be embedded within various hardware and software and relied on to connect a wide range of devices to M2M application servers.
Another complementary approach to standards development is the release of designs and specifications into the open source community as open hardware and interface standards for others to adopt. Examples include Arduino, Raspberry Pi, and Beaglebone, which enable quick prototyping, as well as the mangOH open hardware reference design, an open source design that is more easily scalable in commercial settings and is built specifically for IoT cellular connectivity.
Open source platforms like these enable developers that may have limited hardware, wireless or low-level software expertise to start developing IoT applications in days—rather than months. If executed properly, these can significantly reduce the time and effort to get prototypes from paper to production by ensuring that various connectors and sensors work together automatically with no additional coding required. With industrial-grade specifications, these next-generation platforms not only allow quick prototyping, but also rapid industrialization of IoT applications.
On the software side, using widely supported open source software application frameworks and development environments, such as Linux—itself an open source solution—can be extremely helpful by providing developers the head start that is required to get a product to market faster. When it comes to proprietary solutions, support for its development framework tends to rest on the original vendor, whose agenda may not align with the needs of the community. Open source solutions ensure a future-proof investment and longevity, so that resources and tools are available and continually enhanced for years to come....
To further advance the industry, we must commit to a standards-based and open-source strategy. Not only will that strategy continue to be critical to the health of the IoT ecosystem, but it will lay the groundwork for real innovation. Just as they supported many other areas of technology development—including nothing less than the Internet itself—open standards are the key to realizing the unforeseen benefits of a more connected world. ...Full Story
ANSI Energy Efficiency Standardization Coordination Collaborative (EESCC) Releases Roadmap Progress Report Press Release ANSI.org April 27, 2016 - The American National Standards Institute (ANSI) Energy Efficiency Standardization Coordination Collaborative (EESCC) announced today the publication of a Progress Report detailing the standardization community’s activity to advance recommendations outlined in the EESCC’s Standardization Roadmap: Energy Efficiency in the Built Environment. Published in June 2014 to serve as a national framework for action and coordination, the roadmap identified gaps where standards and codes were needed to improve energy and water efficiency in the built environment.
Available as a free resource, the Progress Report features updates on 71 of the 109 standards-based gaps identified in the roadmap, demonstrating significant progress within the standardization community to advance energy and water efficiency through standards-based solutions. The report also includes a summary of all of the standards-based roadmap gaps, including those for which there is no known progress at this time, so that readers may easily identify opportunities to take action on closing the gaps.... ...Full Story
Anti-innovation: EU excludes open source from new tech standards Glyn Moody Ars Technica April 27, 2016 - As part of its Digital Single Market strategy, the European Commission has unveiled "plans to help European industry, SMEs, researchers and public authorities make the most of new technologies." In order to "boost innovation," the Commission wants to accelerate the creation of new standards for five buzzconcepts: 5G, cloud computing, internet of things, data technologies, and cybersecurity.
The key document is one entitled "ICT Standardisation Priorities for the Digital Single Market," which says: "Open standards ensure ... interoperability, and foster innovation and low market entry barriers in the Digital Single Market, including for access to media, cultural and educational content." The word "open" occurs 26 times in the document, and is also frequently found in the other "communications" just released by the European Commission: on digitising European industry (9 times), and on the European Cloud Initiative (50 times).
"Open" is generally used in the documents to denote "open standards," as in the quotation above. But the European Commission is surprisingly coy about what exactly that phrase means in this context. It is only on the penultimate page of the ICT Standardisation Priorities document that we finally read the following key piece of information: "ICT standardisation requires a balanced IPR [intellectual property rights] policy, based on FRAND licensing terms."...
The problem for open source is that standard licensing can be perfectly fair, reasonable, and non-discriminatory, but would nonetheless be impossible for open source code to implement. Typically, FRAND licensing requires a per-copy payment, but for free software, which can be shared any number of times, there's no way to keep tabs on just how many copies are out there. Even if the per-copy payment is tiny, it's still a licensing requirement that open source code cannot meet....Ars has asked the European Commission for comment on its decision to use FRAND, rather than a royalty-free approach. We'll update this story when the EC responds.... ...Full Story
Open Data Barometer 2015: 5 European countries in the Top 10 Cyrille Chausson EU Joinup April 26, 2016 - Five European countries ranked in the top 10 of the 2015 Open Data Barometer, recently published by the World Wide Web Foundation.
The UK is still at the top of the barometer, but is now followed by the USA and France, both ranked second. France, which was third in 2014, received good marks in three criteria: government action, political impact, and citizens and civil rights.
Denmark ranked 5th and moved up by four positions. The Netherlands ranked 7th and Sweden 9th, with both losing ground (-1 for the former, -6 for the latter)....Other conclusions from 2015 include the fact that “Open Data is entering the mainstream”, with 55% of the 92 countries listed in the survey now having an open data initiative in place. However, almost 90% of data are still locked. Only 10% of the published data meet the open data definition, and much of what is published is of poor quality, “making it difficult for potential data users to access, process, and work with it effectively”.
Lastly, this Open Data Barometer warns about “open-washing” behavior, which is “jeopardizing progress”. “Open data initiatives cannot be effective if not supported by a culture of openness where citizens are encouraged to ask questions and engage, and are supported by a legal framework”, the report said. “Disturbingly, in this edition we saw a backslide on freedom of information, transparency, accountability, and privacy indicators in some countries.” ...Full Story
European Cloud Initiative to give Europe a global lead in the data-driven economy Press Release European Commission April 25, 2016 - Europe is the largest producer of scientific data in the world, but insufficient and fragmented infrastructure means this 'big data' is not being exploited to its full potential. By bolstering and interconnecting existing research infrastructure, the Commission plans to create a new European Open Science Cloud that will offer Europe's 1.7 million researchers and 70 million science and technology professionals a virtual environment to store, share and re-use their data across disciplines and borders. This will be underpinned by the European Data Infrastructure, deploying the high-bandwidth networks, large scale storage facilities and super-computer capacity necessary to effectively access and process large datasets stored in the cloud. This world-class infrastructure will ensure Europe participates in the global race for high performance computing in line with its economic and knowledge potential.
Focusing initially on the scientific community - in Europe and among its global partners -, the user base will over time be enlarged to the public sector and to industry. This initiative is part of a package of measures to strengthen Europe's position in data-driven innovation, to improve competitiveness and cohesion and to help create a Digital Single Market in Europe (press release)....The European Cloud Initiative will make it easier for researchers and innovators to access and re-use data, and will reduce the cost of data storage and high-performance analysis. Making research data openly available can help boost Europe's competitiveness by benefitting start-ups, SMEs and data-driven innovation, including in the fields of medicine and public health. It can even spur new industries, as demonstrated by the Human Genome Project.... ...Full Story
ANAB and ASCLD/LAB Merge Forensics Operations Press Release ANSI.org Weekly News April 25, 2016 - The ANSI-ASQ National Accreditation Board (ANAB) has signed an affiliation agreement with the American Society of Crime Laboratory Directors/Laboratory Accreditation Board (ASCLD/LAB), merging ASCLD/LAB into ANAB.
Like ANAB, ASCLD/LAB provides accreditation based on international standards for public and private sector crime laboratories. Both ANAB and ASCLD/LAB are grounded in conducting scientific and technical assessments and committed to assuring competent and credible test and inspection results. The merger with ASCLD/LAB allows ANAB to enhance its expertise in the field of forensics accreditation while providing uninterrupted service to the customers of both organizations.... ...Full Story