Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it has the type of virtuous ring to it that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline, and if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards-setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization was joining in the endorsement of a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors, and then to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI gave rise to not a single article in the press, this event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you'll notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations range from paleontology to the meaning of the Cosmos) is the "noosphere." In Teilhard's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and biosphere (all forms of life), but an ever expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If as expected that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power being handled by somewhat interconnected, regional networks to commercial and home users. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both as a result of our need to keep increasing our generating capacity in order to meet whatever the peak national electrical need may be.
Quote of the Day
“The new Standard Swedish shows in a slightly absurd way that there is no such thing as correct Swedish”
-Asst. Prof. Mikael Parkvall of Stockholm University’s Department of Linguistics, announcing the release of "New Standard Swedish"
Banks Build a New Standard for Cross-Border With SWIFT (And Not Blockchain) Grace Noto Bank Innovation February 24, 2017 - Instead of distributed ledger technology, banks like Citi, Wells Fargo, and BBVA, are looking to projects like SWIFT’s global payments innovation service — or gpi — which went live today, following its January launch....SWIFT gpi enables banks to offer transparent and traceable cross-border payments. Through a Tracker feature, corporate treasurers will have an end-to-end view on the status of their payments, including confirmations when payments have been credited to beneficiaries’ accounts (about time?). At the moment, there are 12 banks exchanging live on the network, including Bank of China and UniCredit, plus more than 100 member-banks, like Citi and Wells.... ...Full Story
The Linux Foundation Announces Merger of Open Source ECOMP™ and OPEN-O™ to Form New Open Network Automation Platform (ONAP™) Project Press Release Linux Foundation February 23, 2017 - The Linux Foundation...today announced the merger of open source ECOMP and Open Orchestrator Project (OPEN-O) to create the new Open Network Automation Platform (ONAP) Project. ONAP will allow end users to automate, design, orchestrate, and manage services and virtual functions.
AT&T, China Mobile and the world’s leading operators are driving ONAP with a diverse group of founding members. Founding Platinum members include Amdocs, AT&T, Bell Canada, China Mobile, China Telecom, Cisco, Ericsson, GigaSpaces, Huawei, IBM, Intel, Nokia, Orange, Tech Mahindra, VMware and ZTE. Silver members of ONAP are ARM, BOCO Inter-Telecom, Canonical, China Unicom, Cloudbase Solutions, Metaswitch and Raisecom....
The Linux Foundation will establish a governance and membership structure for ONAP to nurture a vibrant technical community. A Governing Board will guide business decisions, marketing and ensure alignment between the technical communities and members. The technical steering committee will provide leadership on the code merge and guide the technical direction of ONAP.... ...Full Story
Open source human body simulator trains future doctors Gijs Hillenius EU Joinup February 22, 2017 - SOFA, an open source human body simulator used for training medical students and for preparing medical interventions, is being used by an increasing number of research centres and companies,...A human body simulator is just one of the many uses of SOFA, says Talbot. SOFA is a framework for multi-physics simulation. “Our software aims at interactive and real-time applications, with an emphasis on medical simulation”, he says....The simulation software can combine patient data to create simulations of, for example, eye operations, neurosurgery, liver surgery, or to create anatomical models....Inria, France’s computer science institute, began developing SOFA in 2006, with initial funding from the Department of Defense in the USA. Last year, the developers founded a consortium, aiming to increase the number of researchers and attract start-ups and other companies interested in using the simulator. The growing SOFA community includes both universities and medical and robotics start-ups in France and Germany.... ...Full Story
Understand Your Distributed Apps with the OpenTracing Standard Ian Murphy Linux.com February 21, 2017 - Microservices and services-oriented architecture are here to stay, but this kind of distributed system destroys the traditional type of process monitoring. Nonetheless, companies still need to understand just what’s happening inside the flow of an application. Ben Sigelman, Co-founder of LightStep, said in his keynote at CloudNativeCon that adopting a new standard for distributed applications called OpenTracing can tell those stories without building complex instrumentation, or fundamentally changing the code of your application....OpenTracing is a vendor-neutral API standard, not something that one deploys, Sigelman said. Instead it’s something you program against, something you build into your microservices architecture. The OpenTracing API sits in the middle of the microservices process, between application logic, control-flow packages or existing instrumentation, and tracing infrastructure like LightStep, Zipkin, or Jaeger.... ...Full Story
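The idea described above — application code programs against a neutral tracing interface, and any backend (LightStep, Zipkin, Jaeger) can be plugged in behind it — can be sketched in a few lines of Python. The toy tracer below is a hypothetical stand-in written for illustration, not the actual OpenTracing API; it shows only the pattern the standard enshrines: named, timed "spans" of work that application code annotates without knowing which tracing infrastructure will consume them.

```python
import time
from contextlib import contextmanager

class ToySpan:
    """A minimal span: a named, timed unit of work carrying key/value tags."""
    def __init__(self, operation_name):
        self.operation_name = operation_name
        self.tags = {}
        self.start = time.time()
        self.duration = None

    def set_tag(self, key, value):
        self.tags[key] = value

    def finish(self):
        self.duration = time.time() - self.start

class ToyTracer:
    """A stand-in for a pluggable backend; a real deployment would swap in
    a vendor implementation without touching the application code below."""
    def __init__(self):
        self.finished_spans = []

    @contextmanager
    def start_span(self, operation_name):
        span = ToySpan(operation_name)
        try:
            yield span
        finally:
            span.finish()
            self.finished_spans.append(span)

# Application code depends only on the tracer interface, not on a vendor:
tracer = ToyTracer()
with tracer.start_span("fetch_user") as span:
    span.set_tag("user.id", 42)  # annotate the unit of work

print(tracer.finished_spans[0].operation_name)  # → fetch_user
```

In a real microservices setting the same principle lets an operator switch from, say, Zipkin to Jaeger by replacing the tracer implementation, while every service's instrumentation code stays unchanged — which is exactly the vendor neutrality the article emphasizes.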
Europe’s EV infrastructure boosted by new alliance Jonny Barstow Energy Live News February 20, 2017 - Five electric vehicle (EV) fast-charging station networks have joined forces.
The Open Fast Charging Alliance (OFCA) aims to enable seamless, long-range travel in a battery-powered car across Europe.
It hopes to achieve this by allowing members of any of the firms involved to top-up their vehicles at any of the other companies’ charge points.
This is known as bilateral roaming and will give drivers access to a total of around 500 chargers across the continent....The alliance is open to other networks as long as they share the group’s goals and are committed to providing 24/7 customer service to ensure maximum network uptime.... ...Full Story
Open Networking Foundation Unveils New Open Innovation Pipeline to Transform Open Networking Press Release Open Networking Foundation February 17, 2017 - Today the Open Networking Foundation (ONF) is announcing its new Open Innovation Pipeline made possible through the aligned operations of ONF and Open Networking Lab (ON.Lab) as these two organizations finalize their pending merger.
ON.Lab, with CORD® and ONOS®, successfully brought together operators, vendors and integrators to build solutions for carrier networks by leveraging SDN, NFV and Cloud technologies through an open source approach to solution creation. Operators have embraced the approach, and the industry is in the midst of a resulting transformation revolutionizing how solutions will be built for 5G mobile, ultra broadband and other next-generation networks.
Building on the success of CORD and ONOS, the ONF is industrializing and opening the unique process that enabled the creation of these platforms. Central to the approach are leveraging the ONF's deep relationships with operators to validate the vision, focusing on high-value use cases and solutions, and solidifying pre-established paths for taking solutions into operator PoCs (proof of concepts), trials and deployment.
Now that the SDN movement, first initiated by the ONF, has successfully set in motion the disaggregation of networking devices and control software and fostered the emergence of a broad range of open source platforms, the industry needs a unifying effort to build solutions out of the numerous disaggregated components. A trend has emerged where vendors leverage open source to build closed proprietary solutions, providing only marginal benefit to the broader ecosystem. The ONF's Open Innovation Pipeline intends to counteract this trend by offering greater returns to members who participate in the ONF's collaborative process. Through making active contributions to the Open Innovation Pipeline, vendors benefit from inclusion in CORD and ONOS solutions, thereby gaining access to operator deployments.... ...Full Story
Open Standards and Open Source in Telecom OpenStand.org February 15, 2017 - ...The development of new internet-enabled mobile devices and internet service providers has brought telecommunications to the forefront, as well as trends towards cooperation between the Open Standards and Open Source communities,...
Standards bodies must exist to continue internet innovation and functionality....In order to build and maintain successful innovations like 5G networks and IoT devices, a collaborative network must exist between SDOs and OSS. SDOs can assist with architecture, quality and interoperability of Open Source projects, as well as enhance the overall vitality of the mobile value chain.... ...Full Story
DoC’s Internet of Things Initiative to Catalog Existing Security Standards ANSI.org February 10, 2017 - This month, the U.S. Commerce Department's (DoC) National Telecommunications and Information Administration (NTIA) announced that it aims to catalog existing security standards through its Internet of Things (IoT) initiative. Ultimately, the multi-stakeholder process will serve to develop a broad, shared definition or set of definitions around security upgradability for consumer IoT, and help enhance consumer awareness and understanding related to IoT purchases and security.
The DoC supports the advancement of IoT — which it defines as a transformational evolution in global technology with the potential to benefit public safety, health care, governance, and the environment and improve the daily lives of workers and consumers — as outlined in its newly released green paper, which identifies areas to advance efforts, including promoting standards and technology advancement.
The action is a response to feedback on both the Internet of Things and cybersecurity, in which stakeholders urged the DoC and NTIA to address the security of IoT through voluntary, multi-stakeholder processes.... ...Full Story
ITU unveils new standard for high-quality voice over LTE Emmanuel Elebeke Vanguardngr.com February 9, 2017 - ITU has released a new standard that will influence end-to-end Quality of Service (QoS) for voice communications over 4G mobile networks. The ITU said the standard is expected to form the basis of future ITU standards on specific aspects of QoS for Voice over LTE (VoLTE) and Video-telephony over LTE (ViLTE). The advent of 4G mobile-wireless communications signalled the arrival of a multimedia-rich user experience. Despite 4G’s significant advances over previous generations of mobile-wireless technology, ensuring high-quality voice communications remains a significant challenge in the packet-based communications environment. The new Recommendation, ITU-T G.1028 “End-to-end QoS for voice over 4G mobile networks,” was developed by ITU’s standardization expert group for ‘performance, QoS and QoE’, ITU-T Study Group 12. ITU-T G.1028 offers guidance on the factors impacting the end-to-end performance of “managed” voice applications over LTE networks and how the impacts of these factors should be assessed. The standard describes typical end-to-end scenarios involving LTE access, including scenarios where one of the parties connects using a wired or wireless access technology other than VoLTE.... ...Full Story
Google will soon open-source Google Earth Enterprise Frederic Lardinois TechCrunch February 8, 2017 - Google Earth Enterprise, which originally launched over ten years ago, was Google’s tool for businesses that wanted to build and host private versions of Google Earth and Google Maps for their internal geospatial applications. In 2015, the company announced that it would shut the service down in March 2017 but in what is becoming a pretty standard move for deprecated products, Google this week announced that it would open source all of the core Google Earth Enterprise (GEE) tools.