Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it has the type of virtuous ring to it that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline, and if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this announcement was more so: each organization was joining in the endorsement of a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors and to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995 (NTTAA). With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, this event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you'll notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos), is the "noosphere." In de Chardin's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and biosphere (all forms of life), but an ever expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If as expected that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power handled by somewhat interconnected, regional networks serving commercial and home users. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both in part a result of our need to keep increasing our generating capacity in order to meet whatever the peak national electrical need may be.
Quote of the Day
“Sometimes upholding constitutional ideas just isn't enough; sometimes you have to uphold the actual Constitution”
-Excerpt from the dedication of a new "dark email" protocol to the NSA by PGP developer Ladar Levison
New NIST Tools to Help Boost Wireless Channel Frequencies and Capacity NIST Techbeat February 27, 2015 - Smartphones and tablets are everywhere, which is great for communications but a growing burden on wireless channels. Forecasted huge increases in mobile data traffic call for exponentially more channel capacity. Boosting bandwidth and capacity could speed downloads, improve service quality, and enable new applications like the Internet of Things connecting a multitude of devices. To help solve the wireless crowding conundrum and support the next generation of mobile technology—5G cellular—researchers at the National Institute of Standards and Technology (NIST) are developing measurement tools for channels that are new for mobile communications and that could offer more than 1,000 times the bandwidth of today’s cell phone systems.... ...Full Story
HTTP/2 Will Make The Web ‘Faster And Safer’ Steve McCaskill Tech Week Europe February 27, 2015 - The Internet Engineering Steering Group (IESG) has approved the final standard for the HTTP/2 protocol, which could make browsing the Internet quicker and safer.
HTTP/2 is a major update to the Hypertext Transfer Protocol (HTTP), which is the foundation of data communication for the World Wide Web. The most widely used version of the standard, HTTP/1.1, was defined in 1999.
A working group has been developing HTTP/2 since 2012 and adopted Google’s SPDY protocol as an initial blueprint, with community feedback resulting in “substantial changes” to the standard, such as the compression scheme and the format of the protocol.... ...Full Story
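The "compression scheme" change noted above concerns header compression: SPDY compressed request and response headers with a general-purpose zlib-based scheme, which HTTP/2 replaced with a purpose-built format called HPACK. As a rough illustration only (this is not the HPACK algorithm, and the header values are hypothetical), the short Python sketch below shows why headers are such an attractive compression target: they repeat almost verbatim on every request to the same site.

```python
import zlib

# Hypothetical request headers of the kind a browser resends on every
# request to the same site.
headers = (b"GET /index.html HTTP/1.1\r\n"
           b"Host: example.com\r\n"
           b"User-Agent: DemoBrowser/1.0\r\n"
           b"Accept: text/html\r\n\r\n")

# Ten requests' worth of nearly identical header blocks.
raw = headers * 10
compressed = zlib.compress(raw)

# The repeated blocks compress to a small fraction of their raw size.
print(f"raw: {len(raw)} bytes, compressed: {len(compressed)} bytes")
```

HPACK achieves a similar effect with static and dynamic tables of header fields rather than a general-purpose compressor, a design chosen in part to avoid security problems that arise when attacker-controlled and secret data are compressed together.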
NIST Releases Update of Industrial Control Systems Security Guide for Final Public Review NIST Techbeat February 26, 2015 - The National Institute of Standards and Technology (NIST) has issued proposed updates to its Guide to Industrial Control Systems (ICS) Security (NIST Special Publication 800-82) for final public review and comment....Downloaded more than 3 million times since its initial release in 2006, the ICS security guide advises on how to reduce the vulnerability of computer-controlled industrial systems to malicious attacks, equipment failures, errors, inadequate malware protection and other threats. Industrial control systems encompass the hardware and software that control equipment and the information technologies that gather and process data. They are commonly used in factories and by public utilities and other owners and operators of major infrastructure.
Most industrial control systems began as proprietary, stand-alone collections of hardware and software that were walled off from the rest of the world and isolated from most external threats. Today, widely available software applications, Internet-enabled devices and other nonproprietary IT offerings have been integrated into most such systems. This connectivity has delivered many benefits, but it also has increased the vulnerability of these systems.... ...Full Story
Big Data, Hadoop Standards Group: Who's In, Who's Missing? Joe Panettieri Information Management February 25, 2015 - All eyes in the big data world are on the Open Data Platform -- a new association that strives to promote big data technologies and open source platforms like Hadoop. While promising and backed by big names like GE and IBM, the Open Data Platform initiative also lacks some key names....
Several industry giants and startups are driving the Open Data Platform group -- including Altiscale, Capgemini, CenturyLink, EMC, GE, Hortonworks, IBM, Infosys, Pivotal, SAS, Splunk, Teradata, Verizon and VMware.
Still, some key names also are missing from the effort.... ...Full Story
Security Standard Proposed for Bitcoin Exchanges and Wallets Stan Higgins Coindesk February 25, 2015 - A group composed of developers and security professionals has proposed a set of rules aimed at standardizing security protocols used by companies that handle or store digital currencies for their clients.
The proposal, created by the Cryptocurrency Certification Consortium (C4)...aims to provide an industry-level standard by which exchanges and wallet providers can operate.
The Cryptocurrency Security Standard (CCSS) draft proposal calls for 10 standardized approaches to key and seed generation, storage and usage, proof-of-reserve and security audits, among other areas. The framework consists of three levels per section, with each grade signifying a higher degree of security based on the proposed guidelines.... ...Full Story
How can you tell when the standards process isn't working? Perhaps the best indication is when a vendor decides it has to go to the time and cost (passed through to customers) of implementing two different standardized technologies in the same product. Hopefully this approach doesn't represent the future of wireless charging.
Samsung's Solution To Wireless Charging Fragmentation: Use All The Standards Lucian Armasu Gigaom February 24, 2015 - In a recent post on one of its websites, Samsung talked about the recent history of wireless charging and how the company has been working on bringing this technology to market since the late 2000s. It finally did it in 2011 when the company brought wireless charging support to its Droid Charge smartphone....Because we're talking about a brand new type of technology, having multiple standards can hurt adoption, so Samsung, which is a member of both consortiums, has decided that it's best to just use both technologies in its upcoming devices. This way, a device such as the Galaxy S6 could be backwards compatible with both standards and all the accessories that support them. Soon, for example, Samsung's devices could be charged wirelessly either at McDonald's restaurants, which use Qi charging, or at Starbucks stores, which use PowerMat chargers.... ...Full Story
LTE standards group targeting mission-critical push-to-talk specifications for early 2016 UrgentComm February 23, 2015 - Officials for 3GPP, the standards body for LTE technology, recently said the organization plans to establish a standard for mission-critical-voice functionality over LTE early next year. That action could have significant impact on both 4G LTE initiatives and LMR plans for public-safety and critical-communications entities.
To help ensure that this aggressive timeline can be met, 3GPP has created a new working group—called SA6—specifically to tackle the challenges associated with mission-critical applications, with an initial focus on mission-critical voice, according to 3GPP officials.... ...Full Story
Call for Papers: Conference Theme: Interoperability, Intellectual Property and Standards IEEE-SIIT.org February 23, 2015 - Interoperability has never been more important than it is today. It can be achieved by design, following the market or through standardization. How does intellectual property impact interoperability? How do these factors interact with standardization? IEEE-SIIT 2015 will explore these, and other, important questions.
IEEE-SIIT conferences aim at bringing together academia, government and industry participants engaged in standardization to foster the exchange of insights and views on all issues surrounding standards, standardization, interoperability and innovation. Contributing academic disciplines include, but are not limited to: Business Studies, Computer Science, Economics, Engineering, History, Information Systems, Law, Management Studies and Sociology....[the deadline for submissions is April 3, 2015] ...Full Story
Wireless Power Consortium Achieves Key Technology Milestones for Fast Charging and Resonant Multi-Device Charging with Spatial Freedom Press Release WPC.com February 20, 2015 - The Wireless Power Consortium (WPC), the driving force and leader in the global adoption of wireless power technology, today made two draft specifications available to its members that extend the capabilities of the Qi wireless power standard.
The first extension of the Qi specification, called "Volume II: Medium Power," enables fast charging of smartphones with up to 15 Watts delivered into the battery....The second extension of the Qi specification, called "Volume III: Shared Mode," enables multi-device charging with a single inverter, a resonant technology that reduces the cost of manufacturing multi-device chargers while providing greater freedom of spatial positioning.... ...Full Story
Web standard promising faster page loads wins approval Steven Musil and Stephen Shankland CNET February 20, 2015 - A new version of the HTTP standard that promises to deliver Web pages to browsers faster has been formally approved, the Internet protocol's first revision in 16 years.
The specifications for HTTP 2.0 have been formally approved, according to a blog post by Mark Nottingham, who as chairman of the IETF HTTPBIS Working Group serves as the standard effort's leader. The specifications will go through a last formality -- the Request for Comments (RFC) documentation and editorial processes -- then be published, Nottingham wrote.
HTTP, short for Hypertext Transfer Protocol, is one of the seminal standards of the Web. It governs how a browser communicates with a Web server to load a Web page. HTTP 2.0, the protocol's first major revision since HTTP 1.1 in 1999, is designed to load Web pages faster, allowing consumers to read more pages, buy more things and perform more and faster Internet searches.
The new standard is based on SPDY, a protocol Google introduced in 2009.... ...Full Story