Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
- Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it has the type of virtuous ring to it that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline, and if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards-setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization joined in endorsing a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors and to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, the event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you'll notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos) is the "noosphere." In Teilhard's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and the biosphere (all forms of life), but an ever expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If as expected that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution handled by loosely interconnected regional networks serving commercial and home users. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both in part the result of our need to keep increasing our generating capacity to meet whatever the peak national electrical demand may be.
Quote of the Day
“Open standards are simply better for developers”
-Professor William Webb, CEO of the Weightless SIG, announcing the SIG's first standard
Microsoft bullied MPs over government switch to open source standards Carly Page The Inquirer May 29, 2015 - MICROSOFT reportedly threatened to move its research facilities out of the UK if the government went ahead with plans to promote open source standards....Cabinet Office Minister Francis Maude outlined plans at the time to shift the UK to the .odf Open Document Format and away from Microsoft's proprietary .doc and .docx formats.
Maude said: "The software we use in government is still supplied by just a few large companies. A tiny oligopoly dominates the marketplace.
"I want to see a greater range of software used...." As reported at Bloomberg, Steve Hilton, who was the prime minister's director of strategy until 2012, revealed at an event: "Microsoft phoned Conservative MPs with Microsoft R&D facilities in their constituencies and said we will close them down in your constituencies if this goes through," Hilton said. "We just resisted. You have to be brave."... ...Full Story
Standard Knowledge for Robots NIST Techbeat May 29, 2015 - What do you know? There is now a world standard for capturing and conveying the knowledge that robots possess—or, to get philosophical about it, an ontology for automatons.
Crafted by a working group of 166 experts from 23 nations, the IEEE Standard for Ontologies for Robotics and Automation (IEEE P1872) is designed to simplify programming, extend the information-processing and reasoning capabilities of robots, and enable clear robot-to-robot and human-to-robot communication....The working group’s core ontology for robotics and automation, or CORA, is an important step toward achieving this shared understanding. It establishes a formal way of representing knowledge that robots possess to perform tasks in their particular area of activity such as manufacturing plants or hospitals. This “common ground” enables efficient and reliable exchanges of information and integration of new data....With this structured base of knowledge, a manufacturing robot, for example, will know what tasks it can do, how much it can lift, whether it can work around people, and other performance-defining features. So when a new order comes, the robot will be able to assess whether it can do the required work.... ...Full Story
SAC Reveals Next Steps of Enterprise Standards Reform USITO.org Weekly May 28, 2015 - On May 19, the Standardization Administration of China (SAC) Local Standardization Department (i.e. SAC Department of Service Industry Standards) hosted the second meeting reviewing the pilot work of the enterprise product standard self-disclosure system. SAC Deputy Director Cui Gang attended the meeting and outlined four SAC key tasks for further reform work:
- Release soon the Guiding Opinions of Establishing Enterprise Product and Service Standards Self-disclosure and Supervision System
- Initiate the revision work of the Enterprise Standardization Management Measures
- Strengthen standards information services for enterprises
- Provide training of the new enterprise standards system... ...Full Story
Vatican library: open source for long-term preservation Gijs Hillenius EU Joinup May 27, 2015 - The combination of open source and open standards ensures long-term preservation of electronic records and prevents IT vendor lock-in, says Luciano Ammenti, head of the IT department at the Vatican Library (Biblioteca Apostolica Vaticana) in Vatican City.
Open standards and open source solutions are a key part of the Vatican Library’s long-term digital conservation project. The library stores tens of thousands of manuscripts and documents, including one of the oldest surviving copies of the Greek Bible, monastic collections from the medieval period, the Codex Borgianus and a fifteenth-century copy of the Mishneh Torah.... ...Full Story
How open data is transforming the business landscape Benn Rossi Information Age May 26, 2015 - Despite pledges by the G7 and G20 to boost transparency by opening up government data, fewer than 8% of countries publish data sets in open formats and under open licences on public sector budgets, spending and contracts.
Awareness of the potential of open data has grown steadily in recent years as successful examples of its use have emerged. The internet and other technological developments have led to an explosion in the amount of data collected – the case for open data is to make it publicly available for anyone to use to create social, environmental and economic impact... ...Full Story
ITU marks 150th anniversary with global celebrations Press Release ITU.org May 20, 2015 - ITU celebrated its 150th anniversary on 17 May, marking a long and illustrious history at the cutting edge of communication technologies....ITU was established on 17 May 1865 with the signing of the first International Telegraph Convention in Paris to facilitate the transmission of telegraphy across international borders. ITU was initially headquartered in Berne and moved to Geneva in 1948, soon after it became a specialized agency of the United Nations in 1947.... ...Full Story
The merger was put to a vote on GitHub by io.js developer Mikeal Rogers, who initially proposed the merger in February, and the io.js technical committee voted to approve the merger yesterday. According to Rogers, the team will continue releasing io.js versions while the convergence takes place, but after the merger is complete, the io.js working groups and technical committee will join the Node.js Foundation under renamed titles....The Node.js Foundation was established with the help of the Linux Foundation back in February, and had its important organizational structure and stewardship questions hashed out at the Node Summit soon after. ...Full Story
Biometrics Institute forms new alliance Planet Biometrics May 18, 2015 - The Biometrics Institute, Mobey Forum and Natural Security Alliance have revealed plans to cooperate on promoting the use of biometrics in digital services.
The new tripartite group will hold an inaugural meeting on biometrics for non-government services in Paris on 1 July.
The international Biometrics Institute is an independent association working to bring the industry together as a whole including end users, suppliers and academics. Mobey Forum, meanwhile, is an association empowering banks and other financial institutions to lead in the future of mobile financial services.
Natural Security Alliance has developed biometric authentication standards designed to enable user transactions while safeguarding the privacy and integrity of biometric data.... ...Full Story
Linux Foundation's SPDX Workgroup Announces New Open Compliance Standard Press Release Linux Foundation May 15, 2015 - The SPDX® workgroup, hosted by The Linux Foundation, today announced the release of version 2.0 of its Software Package Data Exchange® (SPDX) specification, which includes a three-dimensional view of license dependencies that will make the exchange of open source and license data simpler and compliance with open source licenses much easier....New features include the ability to relate SPDX documents to each other, making it more useful for a broader range of uses, including exchanging clear data about software and modules in companies' supply chains. For example, with SPDX 2.0 a device manufacturer can easily understand what open source software has been used to build the device components, what versions of that software are being used and what modules have been integrated. This allows companies to more efficiently understand the open source compliance obligations or vulnerabilities and address them before shipment.
The relationship view of license dependencies is made possible through new features that include a deeper level of description and context in files and packages, including those external to the SPDX specification. This allows managers to better understand the open source code in their products, as well as third-party open source code bases that have been integrated with the existing software. This helps to create a taxonomy for modules that can be used not only for compliance but also for identifying potential security vulnerabilities....The Software Package Data Exchange® (SPDX®) specification is a standard format for communicating the components, licenses and copyrights associated with a software package. The SPDX specification helps facilitate compliance with free and open source software licenses by providing a uniform way for license information to be shared across the software supply chain. The SPDX specification is developed by the SPDX workgroup, which is hosted by The Linux Foundation.... ...Full Story
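As a rough illustration of the tag-value form the specification describes (the document, package, and version names below are invented for the example, not taken from the announcement), a minimal SPDX fragment recording one package and its declared license might look like:

```text
SPDXVersion: SPDX-2.0
DataLicense: CC0-1.0
SPDXID: SPDXRef-DOCUMENT
DocumentName: example-device-firmware
PackageName: example-lib
SPDXID: SPDXRef-Package-example-lib
PackageVersion: 1.2.3
PackageLicenseDeclared: Apache-2.0
Relationship: SPDXRef-DOCUMENT DESCRIBES SPDXRef-Package-example-lib
```

The Relationship field is the SPDX 2.0 addition the announcement highlights: it lets one document point at packages, files, or other SPDX documents, which is what makes the supply-chain use cases described above possible.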
OGC announces standard for concise description of Earth coordinate reference systems Directions Magazine May 15, 2015 - Open Geospatial Consortium (OGC®) membership has adopted the OGC Well-known Text (WKT) Representation of Coordinate Reference Systems Encoding Standard.
Well-Known Text (WKT)...describes a compact machine- and human-readable representation of geometric objects....The text strings specified in the new standard provide a means for humans and machines to correctly and unambiguously interpret and utilise a coordinate reference system definition.... ...Full Story
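For a sense of what such a text string looks like, here is a WKT description of the familiar WGS 84 geographic coordinate reference system, along the lines of the examples published with the standard (consult the OGC document itself for the normative grammar):

```text
GEODCRS["WGS 84",
  DATUM["World Geodetic System 1984",
    ELLIPSOID["WGS 84", 6378137, 298.257223563,
      LENGTHUNIT["metre", 1]]],
  CS[ellipsoidal, 2],
    AXIS["latitude", north, ORDER[1]],
    AXIS["longitude", east, ORDER[2]],
    ANGLEUNIT["degree", 0.0174532925199433],
  ID["EPSG", 4326]]
```

Everything a software package needs to interpret coordinates in this system — the datum, the ellipsoid, the axis order, and the angular unit — is carried unambiguously in the string itself, with the ID element tying it back to the EPSG registry.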