Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it has the type of virtuous ring to it that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline, and if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization joined in endorsing a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors, and then to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Subcommittee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Subcommittee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, this event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you'll notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations range from paleontology to the meaning of the Cosmos), is the "noosphere." In Teilhard de Chardin's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and the biosphere (all forms of life), but an ever expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If, as expected, that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power handled by somewhat interconnected, regional networks serving commercial and home users. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both a result of our need to keep increasing our generating capacity in order to meet whatever the peak national electrical demand may be.
Quote of the Day
“The most absurd thing you can say”
-Document Foundation co-founder Italo Vignoli on the belief of some developers that good products don't need marketing
The Impact Of The New HDcctv AT 2.0 Standard Todd Rockoff SourceSecurity.com August 22, 2014 - Editor's Note: HDcctv Alliance has announced that Dahua has opened its patented HDCVI technology to the global video surveillance industry as the basis for HDcctv's AT 2.0 standard. For additional elaboration on what the move means to the growing market for higher-resolution CCTV, we approached Todd Rockoff, chairman and executive director of HDcctv Alliance.... SourceSecurity.com: Given that Hikvision, the number one competitor in the video market, is unveiling a different technology (i.e., HDTVI), is there any plan to “converge” the two technologies or make them compatible? What might the HDcctv Alliance’s role be to accomplish that?
TR: We are delighted that Hikvision shares our recognition of the growing importance of plug ‘n’ play (PnP) analog HD surveillance equipment....
SourceSecurity.com: Might not a proprietary non-standard technology from the market’s largest player undermine the positive impact of the standard? (i.e., set up a Beta vs. VHS type competition?)
TR: Absolutely! It compares to having to stock inventory in multiple formats (Beta/VHS or DVD/Blu-Ray/3D Blu-Ray) which inevitably multiplies the costs of running a video shop. And format confusion decreases revenues. A good example is a customer who accidentally brings a 3D Blu-Ray disc home but can't watch it on his DVD player.
Format confusion inevitably has the same kind of impact on the video surveillance market. Therefore, it is in the commercial interest of every company who has invested in HD surveillance equipment to fully support the open, global PnP standards for local-site transport of HD surveillance signals.... ...Full Story
The Connected Car, Part 3: No Shortcuts to Security Jack M. Germain TechNewsWorld August 21, 2014 - The connected car is becoming a reality, but the gadget-filled roadways it travels will be paved with several options for in-car technologies. These choices pose challenges for carmakers. Whichever technology wins the race, one of the biggest concerns for OEMs is their electronic security.
The Linux Foundation wants an open source platform in the pole position. The nonprofit consortium already has a fully functional Linux distribution, called "Automotive Grade Linux," or AGL. It is a customizable, open source automotive software stack with Linux at its core.
Google has its own plan for connecting cars to mobile devices and the Internet. Google's Android Auto is a dashboard navigation and entertainment system powered by an Android smartphone. It is very similar in concept to competing designs from Apple and Microsoft....To handle this traffic jam of data, car manufacturers are testing technologies like Broadcom's Automotive Ethernet and The Car Connectivity Consortium (CCC)'s MirrorLink among others. Similarly, QNX Software Systems has a foot or two in some vehicles with its QNX Car Platform for Infotainment.... ...Full Story
UK government backs consortium's search for IoT standard Caroline Baldwin ComputerWeekly.com August 20, 2014 - The UK government has awarded £1.6m to a consortium of 40 British companies tasked with finding a standard specification for the internet of things (IoT).
The money – awarded by the government’s innovation agency, the Technology Strategy Board – will go towards developing a publicly available universal standard for interoperability for IoT....The HyperCat consortium, comprising BT, ARM, KPMG and other UK companies including Flexeye, will focus on IoT solutions for business....The British Standards Institution will publish an independent publicly available specification (PAS) based on the consortium’s specification.... ...Full Story
6 emerging standards battling it out for the Internet of Things The Boardroom Amy-jo Crowley CBR.com August 20, 2014 - Which group will solve the interoperability problem?
1. The Thread Group
Developed by Google's Nest Labs, ARM and Samsung, Thread is designed to build a low-power mesh network as an alternative to Wi-Fi, Bluetooth and more.
Thread, which uses 2.4GHz unlicensed spectrum, is built on existing standards, such as IEEE 802.15.4, IETF IPv6 and 6LoWPAN, meaning that existing devices which use ZigBee / 6LoWPAN etc. can easily migrate to Thread.... ...Full Story
Army turns to open architecture to plot its future in robotics Jared Serbu FederalNewsRadio August 19, 2014 - The Army's emerging strategy for buying and modernizing its ground based robotics systems relies heavily on open architectures, open standards and open source software....to keep costs down and maximize flexibility, the service is employing a strategy that emphasizes open architectures, reusable, interchangeable components and common, publicly defined interfaces between individual subsystems,...One significant employment of the strategy, which Shyu formally approved last week, will be the replacement of the Army's TALON system, which is nearing the end of its service life. A full-fledged successor, increment two of the Man Transportable Robotic System (MTRS) isn't expected to be fully deployed until 2021. It will enforce a set of open standards and interfaces the Army is adopting, but a "bridging strategy" will also insist on the use of open architectures and technology reuse, while the Army fields intermediate systems in the meantime...."The architectural standard was worked in conjunction with industry. It wasn't something we just thought up ourselves and threw up on the table," she said. "And the plug-and-play interface will be provided to industry to enable competition."
The Army has been working since 2011 to build what it calls the Unmanned Ground Vehicle Interoperability Profile (IOP), a collection of both hardware and software standards that will define how individual subsystems like radios and cameras within a robotic system communicate with one another, plus the hardware specifications those components will need to meet.
The first version of the still-evolving standard set aimed to document all of the interfaces in the systems the service already owns in an effort to break them down into discrete, identifiable modules.... ...Full Story
RISC-V: An Open Standard for SoCs The case for an open ISA Krste Asanović & David Patterson EETimes August 14, 2014 - Systems-on-a-chip (SoCs), where the processors and caches are a small part of the chip, are becoming ubiquitous. Thus many more companies today are making chips that include processors than in the past. Given that the industry has been revolutionized by open standards and open-source software -- like TCP/IP and Linux -- why is one of the most important interfaces proprietary?
While instruction set architectures (ISAs) may be proprietary for historical or business reasons, there is no good technical reason for the lack of free, open ISAs....We conclude that the industry would benefit from viable, freely open ISAs just as it has benefited from freely open versions of the software stack. For example, it would enable a real, free, open market of processor designs, which patents on ISA quirks prevent. This could lead to:
-Greater innovation via free-market competition from many more designers, including open vs. proprietary implementations of the ISA.
-Shared, open core designs, which would mean shorter time to market, lower cost due to reuse, fewer errors given many more eyeballs, and transparency that would make it hard, for example, for government agencies to add secret trap doors.
-Affordable processors for more devices, which would help expand the Internet of Things, whose target cost could be only $1.... ...Full Story
Updated NIST Guide Provides Computer Security Assessment Procedures for Core Security Controls NIST Techbeat August 13, 2014 - The National Institute of Standards and Technology (NIST) has issued for public comment a draft update of its primary guide to assessing the security and privacy controls that safeguard federal information systems and networks. Public comments are due by Sept. 26, 2014.
NIST publishes two complementary publications that together provide its basic guidance and recommendations for ensuring data security and privacy protection in federal information systems and organizations, a role assigned to NIST under the Federal Information Security Management Act (FISMA). The publications are so famous they are generally known just by their numbers.... ...Full Story
LibreOffice is coming to Android Jack Wallen TechRepublic August 12, 2014 - I've been hoping to see this headline for some time now. At the first LibreOffice Conference, the Document Foundation announced its plans to migrate LibreOffice to mobile devices. The plan didn't include a total rewrite of the code, but repurposing at least 90% of the current code base. That meant the majority of the work was already done. That last remaining 10%? The user interface. The 90% already compiles on Android -- so there is a working model. Of course, what good is a working model without an interface to go along with it?
But the single most important question to ask is "why"? Why is it so important for LibreOffice to make it to the mobile platform? I can answer that with three simple words:
Open Document Format... ...Full Story
The Connected Car, Part 1: The Future Starts Now - Will Linux Drive It? Jack M. Germain LinuxInsider August 11, 2014 - Tomorrow's connected cars will go beyond infotainment apps provided by Microsoft, Google or Apple. They will combine cloud-based services that enhance automotive safety and driving convenience with a broad range of supplemental services. These connected cars will be shaking hands with all the appliances in the Internet of Things that car owners will be able to control from behind the wheel.... ...Full Story
TIA is Pioneering New Standard to Address Cybersecurity Concerns of Telecom Networks Press Release Telecom Reseller August 6, 2014 - The Telecommunications Industry Association (TIA), the leading association representing the manufacturers and suppliers of high-tech communications networks, today announced that its TR-42.1 Engineering Committee on Commercial Building Telecommunications Cabling is developing an American National Standards Institute (ANSI)-accredited standard, known as TIA-5017, to address the physical network security of information and communications technology (ICT) networks.... ...Full Story