Yesterday, the Deputy CTO of the U.S. Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
“If I have seen farther it is by standing on the shoulders of giants.”
- Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. Not only is it a popular buzz phrase, but it also has the kind of virtuous ring that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial, and scientific endeavor? If so, do the rules, challenges, and rewards differ from discipline to discipline? And if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards-setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, the World Wide Web Consortium (W3C), the Internet Architecture Board (IAB), the Internet Engineering Task Force (IETF), and the Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization joined in endorsing a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers who create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this model, society relies on the private sector to identify standards-related needs and opportunities in most sectors, and then to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications rather than "government-unique standards" wherever practical, and to participate in the development of those standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, the event was nonetheless extremely consequential.
Standards cover an awful lot of ground: how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together. The list goes on and on. But despite the enormous range of characteristics that standards define, you'll notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards, and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos), is the "noosphere." In Teilhard's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and the biosphere (all forms of life), but an ever expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If, as expected, that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power handled by somewhat interconnected, regional networks serving commercial and home users. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both in part the result of our need to keep increasing our generating capacity to meet whatever the peak national electrical demand may be.
Quote of the Day
“What the Federal Communications Commission’s local number portability rule is to telecommunications”
-TechPageOne's Dennis Smith offering an analog for interoperability standards and Cloud Computing
5.0 out of 5 stars Brilliant characters with an explosive plot line definitely worth a read!!! lizzy.b Amazon.co.uk Reader Reviews April 18, 2014 - Oh boy, I was definitely hooked. When I first started the book I had no idea where I would end up! It seems however that I have fallen hook, line, and sinker for this marvelous book of mystery and cyber-panic. One thing to note is that Updegrove really knows his stuff! I couldn’t believe the attention to detail that has been used in this book and the way that it is described and told to the reader.... ...Full Story
The Apache Software Foundation Announces 100 Million Downloads of Apache™ OpenOffice™ Press Release Apache Foundation April 18, 2014 - The Apache Software Foundation (ASF), the all-volunteer developers, stewards, and incubators of more than 170 Open Source projects and initiatives, announced today that Apache OpenOffice™ has been downloaded 100 million times.
Apache OpenOffice is the leading Open Source office document productivity suite, available in 32 languages on Windows, OS X, and Linux. OpenOffice includes a word processor ("Writer"), a spreadsheet ("Calc"), a presentation editor ("Impress"), a vector graphics editor ("Draw"), a mathematical formula editor ("Math"), and a database management program ("Base"). As Open Source software, Apache OpenOffice is available to all users free of charge; the C++ source code is readily available for anyone who wishes to enhance the applications....Official downloads at openoffice.org are hosted by SourceForge, where users can also find repositories for more than 750 extensions and over 2,800 templates for OpenOffice.... ...Full Story
6 standards that shape open-source cloud computing Dennis Smith TechPageOne April 18, 2014 - As cloud computing matures, the early stages of open cloud standards are taking shape – partly in response to IT’s concerns for increased security and prevention of vendor lock-in. Meanwhile, the discussion continues around whether open source is the answer, especially given the number of firmly entrenched closed cloud players, including Amazon, Google and HP....We’ve identified six standards areas that could influence the future of open-source cloud computing:... ...Full Story
XML Entity Definitions for Characters (2nd Edition) and Mathematical Markup Language (MathML) Version 3.0 2nd Edition are W3C Recommendations Press Release W3C.org April 17, 2014 - The Math Working Group has published two W3C Recommendations:
* XML Entity Definitions for Characters (2nd Edition). This document defines several sets of names, such that each name is assigned a Unicode character or sequence of characters. Each of these sets is expressed as a file of XML entity declarations.
* Mathematical Markup Language (MathML) Version 3.0 2nd Edition. This specification defines the Mathematical Markup Language, or MathML. MathML is a markup language for describing mathematical notation and capturing both its structure and content. The goal of MathML is to enable mathematics to be served, received, and processed on the World Wide Web, just as HTML has enabled this functionality for text.... ...Full Story
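For readers who haven't encountered these formats, here is a minimal, unofficial sketch of what each looks like (my own illustration, not taken from the Recommendations, though the name "alpha" is one of the names the character sets actually define):

    <!-- An XML entity declaration: the name "alpha" resolves to
         the Unicode character U+03B1, Greek small letter alpha. -->
    <!ENTITY alpha "&#x3B1;">

    <!-- A MathML fragment encoding "x squared" as structure,
         not merely as typeset symbols: -->
    <math xmlns="http://www.w3.org/1998/Math/MathML">
      <msup><mi>x</mi><mn>2</mn></msup>
    </math>

With declarations like the first in scope, a document can simply write &alpha; and any conforming XML processor will substitute the correct Unicode character.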
Can the semantic web revolutionize healthcare analytics? Jennifer Bresnick EHR Intelligence April 17, 2014 - There’s nothing easy about building an analytics infrastructure in the healthcare industry. With data piling up in petabytes every couple of months and few organizations currently capable of wrestling their troves of clinical and financial data into an actionable format, the analytics landscape looks hopelessly complicated and prohibitively expensive.
But what if data scientists could help healthcare organizations understand the value and deep interdependence of their data stores in an intuitive manner based on standards and natural language? Jay Shah, Executive Vice President at Octo Consulting, believes that the newly-emerging concept of the semantic web will provide a powerful boost to the problem of organizing and understanding healthcare data by creating new connections and leveraging the latest in cutting-edge data theory.... ...Full Story
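For those unfamiliar with the term, the semantic web is built on W3C standards such as RDF, which expresses data as simple subject-predicate-object statements, allowing records from different silos to be linked through shared identifiers. Here is a purely illustrative sketch; the vocabulary and identifiers below are hypothetical, not drawn from the article:

    <?xml version="1.0"?>
    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:ex="http://example.org/health#">
      <!-- A clinical fact: patient p-0001 had encounter e-42
           (hypothetical identifiers, for illustration only). -->
      <rdf:Description rdf:about="http://example.org/patient/p-0001">
        <ex:hasEncounter rdf:resource="http://example.org/encounter/e-42"/>
      </rdf:Description>
      <!-- A financial fact about the same encounter: -->
      <rdf:Description rdf:about="http://example.org/encounter/e-42">
        <ex:billedAs rdf:resource="http://example.org/claim/c-7"/>
      </rdf:Description>
    </rdf:RDF>

Because both statements share the encounter's identifier, a query engine can traverse from clinical to financial data without custom integration code for every pair of systems - the kind of "new connection" Shah is describing.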
SCTE Launches ‘Corporate Alliance Program’ Jeff Baumgartner MultiChannel News April 16, 2014 - In an effort to drive new technology training tools and education programs while also expanding its base of individual members, the Society of Cable Telecommunications Engineers (SCTE) has launched a Corporate Alliance Program, naming Comcast, Time Warner Cable and Suddenlink as the initiative’s charter members.
According to SCTE, the new program will focus on the development of training and education for emerging technologies, and offer discounts on individual employee memberships, access to online courses, and seats at the SCTE Leadership Institute programs at the Tuck School of Business at Dartmouth and the Georgia Tech Scheller College of Business.... ...Full Story
NIST Advisory Committee 2013 Annual Report Highlights Cybersecurity and Manufacturing NIST Techbeat April 16, 2014 - The Visiting Committee on Advanced Technology (VCAT) of the National Institute of Standards and Technology (NIST) has sent its 2013 annual report to Congress. The committee focused its primary attention on NIST's role and programs in two key administration priorities—advanced manufacturing and cybersecurity.
The committee report supports NIST's ongoing and planned work in cybersecurity and recognizes the level of effort and planning NIST puts into its outreach and partnership mechanisms for cybersecurity. The report applauds the success of NIST's execution of Executive Order 13636 (Improving Critical Infrastructure Cybersecurity) and other collaborative efforts. The committee recommends that NIST continue its involvement in the framework's future.... ...Full Story
Take F2: NIST’s Latest, Most Accurate Time Standard Debuts NIST Techbeat April 15, 2014 - The National Institute of Standards and Technology (NIST) has officially launched a new atomic clock, called NIST-F2, to serve as a new U.S. civilian time and frequency standard, along with the current NIST-F1 standard.
NIST-F2 would neither gain nor lose one second in about 300 million years, making it about three times as accurate as NIST-F1, which has served as the standard since 1999. Both clocks use a "fountain" of cesium atoms to determine the exact length of a second.
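A quick back-of-the-envelope check (my arithmetic, not NIST's) shows what that claim means as a fractional uncertainty. A year is roughly 3.16 × 10^7 seconds, so one second over 300 million years works out to:

    1 s / (3 × 10^8 yr × 3.16 × 10^7 s/yr) ≈ 1 / (9.5 × 10^15) ≈ 1 × 10^-16

In other words, NIST-F2 realizes the second to about one part in 10^16, which, given the three-fold figure above, would put NIST-F1 at roughly three parts in 10^16.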
NIST scientists recently reported the first official performance data for NIST-F2, which has been under development for a decade, to the International Bureau of Weights and Measures (BIPM), located near Paris, France. That agency collates data from atomic clocks around the world to produce Coordinated Universal Time (UTC), the international standard of time. According to BIPM data, NIST-F2 is now the world's most accurate time standard. ...Full Story
World's first Water Stewardship Standard is released Click Green April 15, 2014 - The first international Water Stewardship Standard, a global framework to promote sustainable freshwater use, has been released by the Alliance for Water Stewardship (AWS).
The Standard defines globally applicable, consistent criteria for sustainable management and use of the world’s limited freshwater resources.... ...Full Story