Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it also has the kind of virtuous ring that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial, and scientific endeavor? If so, do the rules, challenges, and rewards differ from discipline to discipline? And if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards-setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization joined in endorsing a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors and then develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information (RFI) on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, the event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you'll notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos), is the "noosphere." In Teilhard de Chardin's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and biosphere (all forms of life), but an ever expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If as expected that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power handled by somewhat interconnected regional networks serving commercial and home users. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both a result of our need to keep increasing our generating capacity to meet whatever the peak national electrical demand may be.
Quote of the Day
“Patents can promote innovation, but a patent is not a license to engage in deception.”
- Jessica L. Rich, Director of the FTC's Bureau of Consumer Protection, commenting on the agency's first settlement with a patent "troll"
Too many IoT standards, or too few? Richard Quinnell EDN Network November 20, 2014 - Interoperability and the easy exchange of data is a major concern in the buildup of the Internet of Things (IoT). To ensure those attributes, a set of commonly accepted standards will be needed. So, do we need to create those standards, or do we already have enough standards and simply need to pick and choose?...it may...be that there are enough standards already out there and what is needed is agreement on which set of standards are to be followed for the IoT. It is equally likely that a different set of standards will be in play for different use cases of the IoT, with applications such as industrial machinery using one set while telemedicine uses a different set. After all, if different types of applications have no need to share their data, then there is no reason to saddle them both with the same set of standards.... ...Full Story
State Council Pledges Support for Development of Cloud Computing USITO.org Weekly November 20, 2014 - On November 15, China's State Council pledged to accelerate efforts to develop cloud computing innovation as a means of stimulating development of China's information industry.
According to an official State Council statement...China will actively support the integrated development of cloud computing, the Internet of Things and mobile internet. China will also promote online research and design in the education and health care sectors, stimulate innovation in intelligent manufacturing based on cloud computing, and deploy pilot applications to enhance disease prevention, disaster mitigation, social security and e-government.
The statement also indicated that China would support core technological R&D necessary to enable these innovations, and allow the market to play a greater role in pricing information technology products and services. ...Full Story
Interview with OpenStand Advocate Tim Berners-Lee: The Internet Turns 25 OpenStand November 19, 2014 - From the beginning, the Internet was built on a set of open development principles that are now recognized as the OpenStand Principles. As the Internet turns 25 this year, Tim Berners-Lee, inventor of the World Wide Web, sat down to reflect on the first days of its existence. In the video below, he discusses how far web information has come, and how much more ground there is left to cover.... ...Full Story
Launching in 2015: A Certificate Authority to Encrypt the Entire Web Electronic Frontier Foundation November 18, 2014 - Today EFF is pleased to announce Let’s Encrypt, a new certificate authority (CA) initiative that we have put together with Mozilla, Cisco, Akamai, IdenTrust, and researchers at the University of Michigan that aims to clear the remaining roadblocks to transitioning the Web from HTTP to HTTPS.
Although the HTTP protocol has been hugely successful, it is inherently insecure. Whenever you use an HTTP website, you are always vulnerable to problems, including account hijacking and identity theft; surveillance and tracking by governments, companies, and both in concert; injection of malicious scripts into pages; and censorship that targets specific keywords or specific pages on sites. The HTTPS protocol, though it is not yet flawless, is a vast improvement on all of these fronts, and we need to move to a future where every website is HTTPS by default.

With a launch scheduled for summer 2015, the Let’s Encrypt CA will automatically issue and manage free certificates for any website that needs them. Switching a webserver from HTTP to HTTPS with this CA will be as easy as issuing one command, or clicking one button....The Let’s Encrypt CA will be operated by a new non-profit organization called the Internet Security Research Group (ISRG). EFF helped to put together this initiative with Mozilla and the University of Michigan, and it has been joined for launch by partners including Cisco, Akamai, and IdenTrust. ...Full Story
Experts Predict Major Cyber Attack by 2025, According to Pew The Open Standard November 18, 2014 - The Pew Research Internet Project asked, and cyber security experts answered.
The iconic think tank has collected and parsed experts’ thoughts on the possibility of a “major cyber attack” by 2025 — and 61 percent of the 1,642 professionals interviewed said one would occur.
Pew asked: “By 2025, will a major cyber attack have caused widespread harm to a nation’s security and capacity to defend itself and its people?” The think tank defined “widespread harm” as “significant loss of life or property losses/damage/theft at the levels of tens of billions of dollars.”... ...Full Story
German e-health working group reasserts focus on interoperability Gijs Hillenius EU Joinup November 18, 2014 - Interoperability of e-health solutions is getting renewed attention from Germany’s health care organisations. Trouble with exchanging information between medical systems is hindering e-health from reaching its full potential, says the Federal Ministry of Health. The ministry made interoperability a key topic at the e-health working group meeting, part of an IT Summit in Hamburg in October.
The ministry estimates that there are around 200 different healthcare IT systems in use in the country, creating interoperability barriers. In Hamburg, the e-health working group discussed the results of an e-health interoperability study, including a 2013 report describing international and national e-health interoperability initiatives and good practices.... ...Full Story
Kalorama: New Consortium Will Improve miRNA Development Press Release Kalorama November 17, 2014 - Kalorama Information believes that a new consortium will greatly enhance the use of miRNA (or microRNA). A data management organization, the RNAcentral Consortium, now offers the website RNAcentral (http://rnacentral.org) to serve as a unified resource for all types of noncoding RNA data. Kalorama says the resource was developed by pooling information from a variety of sources, includes databases and tools for browsing, contains approximately 8 million sequences, and can assist companies entering the marketplace....
miRNAs (microRNAs) are short, single-stranded RNAs that regulate mRNA expression at the post-transcriptional level. These small bits of RNA, members of a class of non-translated molecules that do not produce protein, shut off gene transcription by base pairing with their target molecules. They are now recognized as pivotal regulators of gene expression, with roles in development, proliferation, differentiation, and apoptosis, and serve widespread functions as regulatory molecules in post-transcriptional gene silencing....there is great interest currently in the use of miRNAs as biomarkers for cancer and other diseases, given their involvement in cancer initiation, progression, migration, invasion and metastasis. Large databases offer the opportunity to search out and evaluate large numbers of sequences. The detection of these sequences in the plasma of breast cancer patients may provide new biomarkers for a number of different cancers, with the potential to develop and introduce novel and non-invasive screening tests.... ...Full Story
HDcctv Alliance Announces New HDCVI 2.0 Global Standard Based On Dahua HDCVI Technology SourceSecurity.com November 17, 2014 - The HDcctv Alliance is announcing a new global standard for HD analog — HDCVI 2.0. HDCVI 2.0 is based on Dahua’s HDCVI technology. The standard aims to provide a stringent level of certification among manufacturers. Certification will ensure that all HDCVI products bearing the certification label are completely compatible with each other, giving users complete freedom of choice among security equipment from different brands.... ...Full Story
New OASIS Standard to Build Biometric Security Wonderwall FindBiometrics.com November 14, 2014 - Non-profit IT consortium OASIS is developing a server-based biometric authentication standard. Industry professionals, government officials, and academics have been invited to help develop the standard as part of the Identity-Based Attestation and Open Exchange Protocol Specification – or IBOPS – Technical Committee.
The basic idea of the system they’re working on is to organize data storage by a server-based index system which, when accessed, would link to biometric identities that are not on the server. In other words, the data itself is not stored on the server, just indexed; and that index tells you where you can get the data, but that source is protected by biometric security measures. With this method, hackers could not access sensitive data by merely breaching the server.... ...Full Story
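The index-but-don't-store design described above can be illustrated with a minimal sketch. Note that this is an illustration of the general pattern only, not the actual IBOPS specification: the class and locator names (`BiometricIndex`, `vault://...`) are hypothetical, invented here for clarity.

```python
# Hypothetical sketch of the "index, don't store" pattern: the server
# keeps only opaque locators pointing at biometric identities held
# elsewhere; the biometric data itself never touches the server.

class BiometricIndex:
    """Server-side index mapping a user ID to the *location* of that
    user's biometric identity, not to the identity itself."""

    def __init__(self):
        self._index = {}  # user_id -> opaque locator string

    def register(self, user_id, vault_locator):
        # Only a pointer is stored server-side.
        self._index[user_id] = vault_locator

    def locate(self, user_id):
        # A breach of this server yields locators, not biometric data;
        # each off-server vault still requires a live biometric match.
        return self._index.get(user_id)


index = BiometricIndex()
index.register("alice", "vault://device-7f3a/slot-2")
location = index.locate("alice")  # locator only, never the template
```

The security property is that compromising the server yields nothing usable: an attacker learns only where identities live, and each of those locations is separately protected by a biometric check.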
The Natural Security Alliance unveils new privacy rules around biometric security Stephen Mayhew Biometric Update.com November 14, 2014 - The Natural Security Alliance, an authentication standards association, has created a set of privacy rules that will help companies implement biometric security best practices and comply with data protection laws....The basis for the Privacy Rules can be attributed to the “accountability principle” established by the Article 29 Working Party, an independent advisory body established by the European Parliament to investigate concerns of personal data and privacy, as well as concepts around the application of biometrics from the EU’s National Data Protection Authorities....Additionally, the Alliance has developed two instruments: the certification and the mark, which ensure that products and organizations integrating the Natural Security Standard comply with the technical specifications. Certified products are deemed “genuine”, and able to communicate with other certified products as part of a genuine Natural Security environment. The Natural Security mark shows data subjects that the organizations that handle their data comply with the Natural Security Standard.... ...Full Story