Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
- Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. Not only is it a popular buzz phrase, but it also has the type of virtuous ring to it that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline, and if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards-setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization joined in endorsing a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors, and then to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI gave rise to not a single article in the press, the event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you will notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos), is the "noosphere." In Teilhard de Chardin's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and the biosphere (all forms of life), but an ever expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If, as expected, that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power handled by somewhat interconnected, regional networks serving commercial and home users. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both in part the result of our need to keep increasing our generating capacity in order to meet whatever the peak national electrical demand may be.
Quote of the Day
“Existing car safety standards only "marginally address security", and "do not protect against attacks"”
-Report of the European Union Agency for Network and Information Security (ENISA)
Connected cars should be subject to third-party cybersecurity evaluations says EU agency Out-Law.com January 17, 2017 - The European Union Agency for Network and Information Security (ENISA) said an "independent evaluation scheme" would help ensure technology developed for new 'connected cars', such as telematics, connected infotainment or intra-vehicular communication systems, is not vulnerable to hackers.
Existing car safety standards only "marginally address security", and "do not protect against attacks", ENISA said.... ...Full Story
Implementing Medical Device Cybersecurity: A Two-Stage Process James Baker Med Device Online January 14, 2017 - ...In what many experts believe was a world first, manufacturer Johnson & Johnson recently issued a warning to patients on a cyber-vulnerability in one of its medical devices. The company announced that an insulin pump it supplies had a potential connectivity vulnerability. The wireless communication link the device used contained a potential exploit that could have been used by an unauthorised third party to alter the insulin dosage delivered to the patient....
Connected device cybersecurity is best approached in two stages:
- First, security is considered and specified in a top-down process, steering system architecture design at a fundamental level, and devolving down through the development process into testable units.
- Second, the design implementation is tested and verified against the specification requirements. To further prove system integrity, penetration testing can be used, conducted by testers separate from the original developer.... ...Full Story
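The two stages above can be illustrated with a toy sketch in Python, using a dosage-control function loosely inspired by the insulin pump story. Everything here is hypothetical — the `request_dose` function, the dose limit, and the checks were invented for illustration and come from no real device specification:

```python
# Stage 1 (specification driving design): a security requirement expressed
# as a testable unit. The limit below is hypothetical, for illustration only.
MAX_UNITS_PER_DOSE = 10.0

def request_dose(units, authenticated):
    """Accept a dose request only from an authenticated source, and only
    within the specified safe range."""
    if not authenticated:
        raise PermissionError("unauthenticated dose request rejected")
    if not 0 < units <= MAX_UNITS_PER_DOSE:
        raise ValueError("dose outside specified safe range")
    return units

# Stage 2 (verification against the specification): each requirement
# becomes a test that the implementation must pass.
def verify():
    results = []
    # An unauthenticated request must be rejected (the attack scenario).
    try:
        request_dose(5.0, authenticated=False)
        results.append(False)
    except PermissionError:
        results.append(True)
    # An out-of-range request must be rejected even when authenticated.
    try:
        request_dose(50.0, authenticated=True)
        results.append(False)
    except ValueError:
        results.append(True)
    # A valid request must succeed.
    results.append(request_dose(5.0, authenticated=True) == 5.0)
    return all(results)

print(verify())  # True when the implementation satisfies every requirement
```

Penetration testing then probes for behaviors the specification never anticipated, which is why it is best performed by testers independent of the original developers.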
Top Trends to Watch in 2017 Adrian Davis Infosecurity Magazine January 13, 2017 - As we enter 2017, this will be the year in which the potential cracks in the pillars of the knowledge economy start to show....Until now, there has been very little talk of APIs in the context of cybersecurity. However, this will start to change as they become the ‘joins’ of the connected economy; enabling software and systems to interact as never before, uniting millions of businesses, products and services as they all drink together in the pool of ‘open data.’ Transport for London’s open API already powers over 500 new travel apps, while the Amazon Echo’s API could allow you to connect everything from your kettle to your car.
Yet by enabling different software to become fully interoperable, APIs will increasingly provide a potential pathway for cyber-attackers to hopscotch across every sector of the economy. Crucially, one of the potential consequences of APIs resides in the fact that all businesses, software and systems are only as secure as the weakest link in the API chain.
For example, one vulnerable API in an App Store can allow hackers to take over millions of smartphones. This means that software design and information security will increasingly come together, as business begins to realize that there must be a common standard of cybersecurity enshrined at the heart of the design process across the entire conjoined software ecosystem.... ...Full Story
W3C and OGC put more Spatial (and space-born) Data on the Web W3C.org January 12, 2017 - The Spatial Data on the Web Working Group, a collaboration between W3C and the Open Geospatial Consortium, has published 4 documents today. "QB4ST" adds extensions to the "RDF Data Cube" for spatio-temporal components. These are designed to make it easier to share and manipulate data such as Earth Observations with linkable slices through time and space. The QB4ST extensions are used in another of today’s publications, "Publishing and Using Earth Observation Data with the RDF Data Cube and the Discrete Global Grid System," which shows how SPARQL queries can be served through OGC’s developing Discrete Global Grid System for observations, coupled with a triple store for observational metadata. The approach makes use of the power of Linked Data on the Web without requiring all data points to be encoded as RDF triples....The latest Working Draft of the "Semantic Sensor Network Ontology" sets out a modular approach that allows alignment with related vocabularies. The modular architecture supports the judicious use of “just enough” semantics for diverse applications, including satellite imagery, large scale scientific monitoring, industrial and household infrastructure, citizen observers, and the Web of Things. Finally, the Working Group is pleased to publish an update to its "Spatial Data on the Web Best Practices" document that advises on best practices related to the publication and usage of spatial data on the Web; the use of Web technologies as they may be applied to location. ...Full Story
HDMI 2.1 Announced: Supports 8Kp60, Dynamic HDR, New Color Spaces, New 48G Cable Anton Shilov Anandtech January 11, 2017 - The HDMI Forum on Wednesday announced key specifications of the HDMI 2.1 standard, which will be published in the second quarter. The new standard will increase link bandwidth to 48 Gbps and will enable support for up to 10K resolutions without compression, new color spaces with up to 16 bits per component, dynamic HDR, variable refresh rates for gaming applications, as well as new audio formats.
The most important feature that the HDMI 2.1 specification brings is massively increased bandwidth over predecessors. That additional bandwidth (48 Gbps over 18 Gbps, a bit more than what a USB-C cable is rated for) will enable longer-term evolution of displays and TVs, but will require the industry to adopt the new 48G cable, which will keep using the existing connectors (Type A, C and D) and will retain backwards compatibility with existing equipment. The standard-length 48G cables (up to two meters) will use copper wires, but it remains to be seen what happens to long cables. It is noteworthy that while some of the new features that the HDMI 2.1 spec brings to the table require the new cable, others do not. As a result, some of the new features might be supported on some devices, whereas others might be not.... ...Full Story
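A back-of-envelope calculation shows why the jump from 18 Gbps to 48 Gbps matters. The sketch below assumes 24 bits per pixel (8 bits per color component) and ignores blanking intervals and link-encoding overhead, so these are rough approximations rather than the specification's own figures:

```python
def raw_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed video payload in Gbps (ignores blanking and overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 60 Hz and 8 bits per component fits within HDMI 2.0's 18 Gbps...
print(round(raw_bandwidth_gbps(3840, 2160, 60, 24), 1))  # 11.9

# ...but 8K at 60 Hz needs roughly 48 Gbps even before overhead is counted.
print(round(raw_bandwidth_gbps(7680, 4320, 60, 24), 1))  # 47.8
```

Higher bit depths and refresh rates push the raw requirement past even 48 Gbps, which is where features like compression come into play.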
Ford and Toyota Establish SmartDeviceLink Consortium to Accelerate Industry-Driven Standard for In-Vehicle Apps Press Release Ford et al. January 10, 2017 - Ford Motor Company and Toyota Motor Corporation are forming SmartDeviceLink Consortium, a nonprofit organization working to manage an open source software platform with the goal of giving consumers more choice in how they connect and control their smartphone apps on the road.
Mazda Motor Corporation, PSA Group, Fuji Heavy Industries Ltd. (FHI) and Suzuki Motor Corporation are the first automaker members of the consortium. Elektrobit, Luxoft, and Xevo join as the first supplier members. Harman, Panasonic, Pioneer and QNX have signed Letters of Intent to join.
SmartDeviceLink provides consumers easy access to smartphone apps using voice commands and in-vehicle displays. Adopting the open source platform gives automakers and suppliers a uniform standard with which to integrate apps. Developers benefit because they can focus on creating the best experience for customers by integrating one linking solution for use by all participating automakers....SmartDeviceLink enables smartphone app developers to seamlessly integrate their app functions with in-vehicle technology such as the vehicle display screen, steering wheel controls and voice recognition. With this new level of integration, drivers enjoy their favorite apps on the road in an enhanced, user-friendly way.... ...Full Story
VR Industry Forum launches with Sony Pictures, Ericsson, NAB, others on board Ben Munson FierceCable.com January 9, 2017 - The Virtual Reality Industry Forum (VRIF) is the latest nonprofit industry group assembled to help push widespread adoption of virtual reality.
Founding members include Akamai Technologies, ARRIS International, b<>com, Baylor University, CableLabs, Cinova Media, Dolby Laboratories, DTG, DTS, EBU, Ericsson, Fraunhofer, Harmonic, Huawei, Intel, Irdeto, Ittiam, MovieLabs, NABPILOT, Qualcomm Technologies, Inc., Technicolor, TNO, Sky, Sony Pictures, Vantrix, Verizon, Viaccess-Orca and Orah.
The Forum sprang up from a series of meetings over the past year in which the group has looked at ways to agree on industry standards for an “interoperable, end-to-end ecosystem presenting high-quality audio-visual VR services.”...VRIF has stated its specific goals as advocating for voluntary consensus on common VR technical standards, interoperability, best practices guidelines, and general promotion of VR services and apps.
The VRIF comes to the fore less than a month after the official launch of the Global Virtual Reality Association (GVRA), a group counting Acer Starbreeze, Google, HTC Vive, Facebook’s Oculus, Samsung and Sony Interactive Entertainment among its members.
The GVRA members are mostly headset vendors—as opposed to the VRIF membership which includes many technology vendors, industry groups and service providers—but its stated goals don’t veer too far from the VRIF’s similar sounding mission to help foster development and adoption for VR.... ...Full Story
New Standard For Smart Devices Launches At CES Chase Martin MediaPost January 9, 2017 - Connecting IoT devices from different brands may soon become seamless, thanks to a new device language from a leading IoT standards group.
The ZigBee Alliance, which comprises more than 400 global companies, just launched its dotdot language at CES.
The idea of dotdot is to establish a standardized device communication platform for IoT devices that allows them to communicate with each other, regardless of the type of network the devices operate on.
This standardized connectivity has been a function of the Alliance’s own ZigBee language, in that any device using ZigBee can communicate with another.
However, dotdot expands the compatibility to devices using other forms of connectivity, such as Wi-Fi or Bluetooth, even if they are not ZigBee devices.... ...Full Story
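The idea of an application-layer "language" that rides on top of different transports can be sketched in a few lines of Python. This is a toy illustration of the concept only — the message format and names below are invented and bear no relation to the actual dotdot wire encoding:

```python
import json

def encode_command(device_id, attribute, value):
    """Encode a device command once, at the application layer.
    (Hypothetical JSON format -- not the real dotdot encoding.)"""
    return json.dumps(
        {"device": device_id, "attr": attribute, "value": value}
    ).encode()

def decode_command(payload):
    """Recover the command regardless of which transport carried it."""
    return json.loads(payload.decode())

def send_over(transport, payload):
    # Stand-in for Wi-Fi, Bluetooth, or Zigbee delivery: each transport
    # frames the bytes differently, but the application message is identical.
    return (transport, payload)

# The same encoded command is understood no matter which network carries it.
msg = encode_command("lamp-1", "on_off", True)
for transport in ("wifi", "bluetooth", "zigbee"):
    carried = send_over(transport, msg)
    assert decode_command(carried[1]) == {
        "device": "lamp-1", "attr": "on_off", "value": True}
print("same command understood over all transports")
```

Separating the command vocabulary from the underlying network is what lets devices on different radio technologies interoperate without per-pair translation.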
Wi-Fi Expands with .11ax at CES Kevin Krewell eeTimes.com January 6, 2017 - A new revision of the standard focused on supporting greater client density should begin to roll out in 2017...
Current state-of-the-art wireless routers are based on the 802.11ac Wave-2 standard with multi-user (MU) MIMO and 4x4 antenna arrays. They enable spatial reuse to minimize channel sharing among multiple simultaneous users....
The success of Wi-Fi is leading to new problems in channel congestion and being able to fairly deliver bandwidth to all clients as the number of clients and their data demands rise. 802.11ax will address increasing congestion and will bring better bandwidth management... ...Full Story
ITU Announces New ‘Access to Information’ Policy ITU.org January 5, 2017 - ITU has started the New Year by launching a new access-to-information policy, committing to make more of the information and documents held, managed, or generated by ITU openly available online.
The decision was made by ITU’s governing Council in 2016. It aims to bring public access to information for ITU’s main conferences and meetings in line with other international organisations like the World Bank, UNDP, and UNESCO. The decision will enhance the transparency of ITU’s decision-making processes.... ...Full Story