Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it also has the kind of virtuous ring that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline? And if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards-setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and the Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization joined in endorsing a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom-up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors and then to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy-efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, the event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you'll notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos), is the "noosphere." In Teilhard de Chardin's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and the biosphere (all forms of life), but an ever-expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If, as expected, that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power to commercial and home users handled by somewhat interconnected, regional networks. We also have burgeoning greenhouse gas emissions and a growing dependence on foreign oil, both as a result of our need to keep increasing our generating capacity in order to meet whatever the peak national electrical need may be.
Quote of the Day
“[O]ccasional lulls in momentum are not uncommon”
-Marcus Lange, VP of Apache OpenOffice, on the occasion of the first AOO release in some time
Study: ‘Open source coders more aware of security’ Gijs Hillenius EU Joinup October 19, 2016 - Developers of open source software are generally more aware of code security issues than developers working for the European institutions, according to a study conducted on behalf of the European Commission and European Parliament. Developers working for the European institutions have more tools available for management and testing of code security, but using them is not yet standard practice.
Open source developers should have more testing environments, and should perform more security testing, the study recommends....To compare code security methods used by open source communities and software development projects in the European institutions, the study looks at ten segments commonly found in software development, such as project management, release management, software testing, and incident management. For each segment, the report lists conclusions and recommendations. For example: project management is more efficient at the European institutions, and the study recommends that, if possible, free software groups improve in this area.
To shore up software security, the authors suggest that the European institutions and free software groups standardise their security definitions and that both use standard authentication mechanisms.... ...Full Story
ETSI releases first SDN software stack as open source Adrian Offerman EU Joinup October 18, 2016 - [Last] week, standardisation organisation ETSI published OSM Release ONE, an open-source software stack to implement Software-Defined Networking (SDN). SDN, or network virtualisation, brings the management of computer networks to a higher level by abstracting the physical infrastructure. This allows network administrators to manage their networks in a more flexible, or even a fully automated, dynamic way.
The OSM software was developed by ETSI's Management and Orchestration (MANO) group in close alignment with the Network Functions Virtualisation (NFV) Industry Specification Group, in which industry and ETSI collaborate on standards for SDN.
The OSM community aims to deliver a production-quality open-source MANO stack that meets the requirements of commercial NFV networks. According to ETSI, the platform has been tested and documented to allow rapid installation in operator labs. The OSM group is currently building a network of remote labs connected over a virtual network to test the compatibility and interoperability of multiple types of infrastructures. ...Full Story
France to develop a toolbox for Open Government Cyrille Chausson EU Joinup October 17, 2016 - Etalab, the French government agency in charge of Open Data and Open Government, and the French authorities are currently working, in collaboration with other OGP members, on an Open Government toolkit....Etalab said that the toolbox should include Open Data portals, forums, tools to assess the implementation of commitments drafted in the Action Plan and some civic tech. A free public consultation platform will also be developed to be part of the toolbox,... ...Full Story
IBM, Microsoft, Oracle beware: Russia wants open source, sees you as security risk Liam Tung ZDNet October 14, 2016 - Russia is drafting a new law requiring Russian government agencies to give preference to open source and to block US software from computer systems, citing security concerns.
Just weeks after Moscow committed to removing Microsoft Outlook and Exchange on 600,000 systems under orders from Russian president Vladimir Putin, the nation's lower house, the State Duma, is drafting a bill to make it harder for agencies even to buy Russian software products that are based on foreign-made proprietary middleware and programming frameworks.
The bill marks Russia's latest attempt at substituting imported software with local products, but casts a wider net than existing restrictions on IT procurement by agencies and state-run enterprises.
If passed, the law will require local agencies to give preference to open-source software and justify any purchases of proprietary software. As reported by the Russian news site Kommersant, the Duma views products based on closed-source software as costly and unsafe to public IT infrastructure.... ...Full Story
The Apache OpenOffice Project Announces Apache® OpenOffice™ v4.1.3 Press Release Apache Foundation October 13, 2016 - Apache OpenOffice,...announced today Apache® OpenOffice™ v4.1.3, now available in 41 languages on Windows and OS X.... "As an Open Source project led by an all-volunteer community, occasional lulls in momentum are not uncommon," said Marcus Lange, Vice President of Apache OpenOffice. "Such was the case with OpenOffice until recently. We wanted to change this, starting with a new bugfix release." Apache OpenOffice 4.1.3 features include:
- Key security vulnerability fixes;
- Support for new language dictionaries;
- Numerous bug fixes, including installer and database support on Mac OS X; and
- Enhancements to the build tools (for developers).
"This release symbolizes a resurgence in the project," said Patricia Shanahan, Release Manager for Apache OpenOffice 4.1.3. "We are proud to continue development of one of the most visible and widely used Apache projects."... ...Full Story
NFC Forum Technical Specifications Improve RF Communication and NFC Tag Interoperability with NFC Devices Press Release NFC Forum October 12, 2016 - The NFC Forum announced today the availability of one adopted and four candidate technical specifications, following approval by the Board of Directors. The specifications are available on the NFC Forum website. Formerly a candidate specification, the Analog 2.0 Technical Specification delivers new capabilities that support improved RF communication to ensure interoperability between Near Field Communication (NFC) devices and existing RF infrastructure and cards based on the ISO/IEC 14443 and ISO/IEC 18092 standards.
The NFC Forum Type 1-4 Tag Candidate Specifications, currently open for industry comment, allow for enhanced communications between an NFC-enabled device and different existing tag hardware.... ...Full Story
21 Open Source Projects for IoT Erik Brown Linux.com September 30, 2016 - The Internet of Things market is fragmented, amorphous, and continually changing, and its very nature requires more than the usual attention to interoperability. It’s not surprising, then, that open source has done quite well here -- customers are hesitant to bet their IoT future on a proprietary platform that may fade or become difficult to customize and interconnect.
In this second entry in a four-part series about open source IoT, I have compiled a guide to major open source software projects, focusing on open source tech for home and industrial automation. Next week, I’ll cover hardware projects -- from smart home hubs to IoT-focused hacker boards -- and in the final part of the series, I’ll look at distros and the future of IoT.
The list of 21 projects below includes two major Linux Foundation hosted projects -- AllSeen (AllJoyn) and the OCF (IoTivity) -- and many more end-to-end frameworks that link IoT sensor endpoints with gateways and cloud services. I have also included a smattering of smaller projects that address particular segments of the IoT ecosystem. We could list more, but it’s increasingly difficult to determine the difference between IoT software and just plain software. From the embedded world to the cloud, more and more projects have an IoT story to tell.... ...Full Story
Thou shalt not kill: Official guidelines to keep humans safe from robots are published by standards authority Richard Gray DailyMail.com September 29, 2016 - The science fiction author Isaac Asimov first proposed the 'Three Laws of Robotics' in a short story published in 1942 as a way of ensuring the machines would not rise up to overthrow humanity.
But with robots now starting to appear in people's homes and artificial intelligence developing, a group of experts have drawn up a new list of rules to protect humanity from their creations.
The British Standards Institution, which develops technical and quality guidelines for goods sold in the UK and issues the famous Kitemark certificate, has drawn up a new standard for robots.... ...Full Story
New, stronger crypto standard lacks backward compatibility Shaun Waterman FedScoop September 28, 2016 - The Internet Engineering Task Force is on the verge of approving a new standard for encrypted internet traffic that will make the web a safer place to shop, bank and browse — but it could also break a lot of stuff for people who don't update their browsers.
Transport Layer Security, or TLS, is an encryption protocol that works with web browsers. It's the math, and the shared standards, that underlie the green padlock users see — the symbol that gives users confidence that they are connected to the right site and that the connection is private enough for sharing personal or financial data.
TLS supersedes SSL, or Secure Sockets Layer — a protocol dating back to 1995 that has proven to be thoroughly broken. But the latest TLS version was finalized in 2008 and in recent years has been the subject of many high-profile attacks and newly discovered bugs.... "There's no timeline" for the IETF working group to finish drafting the standard, task force spokesman Greg Wood told FedScoop. The 15th draft was published last month.... Crypto experts agree 1.3 will be faster and much more secure. Older versions of TLS typically require at least three exchanges between the server hosting web content and the browser viewing it before any actual traffic can move. This is known as 3-RTT, for Round Trip Time, and contributes to the latency that sometimes plagues encrypted sites.
The lower the RTT, the faster the web connection. TLS 1.3 aims for a maximum of 1-RTT, according to engineers. However, one of the ways TLS 1.3 is being made more secure is to eliminate what engineers call backwards compatibility — the ability of websites using the new standard to be viewed with outdated browsers....
Backwards compatibility is at the root of many vulnerabilities in earlier versions of TLS — like the POODLE and FREAK attacks.... ...Full Story
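To make the compatibility point concrete, here is a minimal sketch, not drawn from the article, of what refusing backward compatibility looks like in practice. It assumes Python 3.7 or later built against OpenSSL 1.1.1+, the combination that adds TLS 1.3 support to the standard ssl module; the host name is purely illustrative. A client configured this way simply fails the handshake against any server limited to TLS 1.2 or earlier, which is the same behavior that will confront outdated browsers connecting to a TLS 1.3-only site.

```python
# Minimal sketch, assuming Python 3.7+ with OpenSSL 1.1.1 or later
# (the combination that adds TLS 1.3 support to the ssl module).
# The target host is illustrative, not taken from the article.
import socket
import ssl

HOST = "example.com"  # any TLS 1.3-capable server will do

context = ssl.create_default_context()
# Refuse to negotiate anything older than TLS 1.3; a peer limited to
# TLS 1.2 or earlier will fail the handshake rather than fall back.
context.minimum_version = ssl.TLSVersion.TLSv1_3

with socket.create_connection((HOST, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        # version() reports the protocol actually negotiated, e.g. "TLSv1.3"
        print("Negotiated:", tls.version())
```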