Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it has the type of virtuous ring to it that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline, and if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards-setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization joined in endorsing a set of five principles that they assert support a "new paradigm for standards" development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors, and then to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, the event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos), is the "noosphere." In Teilhard de Chardin's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and the biosphere (all forms of life), but an ever-expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If as expected that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with that power distributed to commercial and home users over loosely interconnected regional networks. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both in part the result of our need to keep increasing generating capacity to meet whatever the peak national electrical demand may be.
Quote of the Day
“The garden is a good place for testing technology”
-Semcon project manager Anna Funke on testing the new Bluetooth Mesh standards
On cyber, Trump team needs this Dodd-Frank piece to succeed Alan D. Grody The Hill June 27, 2017 - The U.S. Treasury’s common sense regulatory initiative, "A Financial System That Creates Economic Opportunities: Banks and Credit Unions," includes a cybersecurity initiative that would have financial regulatory agencies standardize cybersecurity regulations. It also includes using a "common lexicon" to aid in that effort....The most critical data standard in financial cyberspace is the one that describes the identity of large-scale financial market participants. They, as well as corporations and other commercial users, use large payment systems to conduct business and pass value payments between themselves. A unique, unambiguous and universal identity code is critical as the first line of defense in preventing cybersecurity breaches. Hardening that identity so that it is unalterable, using encryption, public/private keys, hashing and other more advanced cryptology techniques, should follow....To this end, the Office of Financial Research’s (OFR’s) legal entity identifier (LEI) initiative is well on its way to becoming that underlying identity code....The OFR was created through the Dodd-Frank Act, which House Republicans want to replace with The Financial CHOICE Act. This act would eliminate the OFR. The rationale for eliminating the OFR focuses almost exclusively on its economic analysis function, which, it is claimed, duplicates analysis done by multiple federal agencies.
This rationale fails to recognize the OFR’s key role in driving data standards throughout the financial system, a fundamental requirement for organizing data — particularly, identity data — to prevent cybersecurity breaches.... ...Full Story
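As a purely illustrative aside (not drawn from the article): "hardening" an identifier with a cryptographic hash can be sketched in a few lines of Python. The LEI-style value below is made up, and a real deployment would involve digital signatures and key management rather than a bare digest.

```python
import hashlib

# A made-up, LEI-style 20-character identifier (illustrative only).
lei = "5493001KJTIIGC8Y1R12"

# A SHA-256 digest gives a fixed-length fingerprint of the identifier:
# any alteration to the input produces a completely different digest,
# which is what makes hashing useful for detecting tampering.
digest = hashlib.sha256(lei.encode("ascii")).hexdigest()

print(len(digest))  # a SHA-256 hex digest is always 64 characters
```

Verifying an identifier then amounts to recomputing the digest and comparing it with the stored value.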
Insurance industry making the leap to blockchain Matthew Lerner Business Insurance June 26, 2017 - Blockchain is making inroads into the insurance sector with the announcement of new initiatives aimed at expanding the use of the digital ledger technology.
Last week’s news of the initiative between American International Group Inc. and Standard Chartered Bank P.L.C. was the latest in a recent run of activity around the insurance sector’s potential use for the budding technology.
AIG and Standard Chartered, together with IBM, said they had used blockchain technology to create a multinational “smart contract” by converting a controlled master policy written in the United Kingdom, along with three local policies in the United States, Singapore and Kenya, into a format that provides a shared, real-time view of policy data and documentation. The format allows visibility into coverage and premium payment at both the local and master level, as well as automated notifications to network participants following payment events.
Third parties, such as brokers, auditors and other stakeholders, can also be included, giving them a view of the policy and payment data and documentation. The pilot solution was built by IBM and is based on Hyperledger Fabric — a blockchain framework and one of the Hyperledger projects hosted by The Linux Foundation.
Results were promising, according to observers.... ...Full Story
China Is Driving To 5G And IoT Through Global Collaboration John Fruehe Forbes June 23, 2017 - Telecoms and cloud service providers are gearing up for two of the largest functional changes in decades: the Internet of Things (IoT), which is happening now, and 5G, which is on the horizon. Both will require substantial investments in capital and operations for today’s networks to be competitive and thrive in this connected future. No single vendor can deliver the full stack, and proprietary technologies will not keep pace with these future needs. This transformation will be delivered in virtualized (not physical) technologies, open source and multivendor, relying on significant integration work across many in the industry to be successful. Chinese players like China Mobile, Huawei and ZTE are emerging as leaders in this space, through something not traditionally expected from the region: global collaboration.... ...Full Story
New Open Standard Makes Home Connection Simpler Semcon.com June 22, 2017 - The lack of joint standards makes home connection of products expensive and awkward. Semcon and Husqvarna have evaluated the new Bluetooth Mesh as part of their GRASS research project. The results show benefits in terms of range, simplicity and economy – and opportunities for broad usage.... ...Full Story
How open source is advancing the Semantic Web Don Watkins OpenSource.com June 21, 2017 - The Semantic Web, a term coined by World Wide Web (WWW) inventor Sir Tim Berners-Lee, refers to the concept that all the information in all the websites on the internet should be able to interoperate and communicate. That vision, of a web of knowledge that supplies information to anyone who wants it, is continuing to emerge and grow.
In the first generation of the WWW, Web 1.0, most people were consumers of content, and if you had a web presence it consisted of a series of static pages conveyed in HTML. Websites had guest books and HTML forms, powered by Perl and other server-side scripting languages, that people could fill out. While HTML provides structure and syntax to the web, it doesn't provide meaning; therefore Web 1.0 couldn't inject meaning into the vast resources of the WWW.
Next came Web 2.0 and the emergence of user-generated content like blogs, wikis, video sharing, social media, and so forth. Dynamically generated content created two-way interaction. Sites like Flickr and Twitter employed user-generated tags (called folksonomies) to organize content into categories. While this represented a vast improvement in both interface and interaction over Web 1.0, it's not the full level of interactivity envisioned by Berners-Lee's definition of the Semantic Web.
The urgency to realize the Semantic Web has gained steam with the rapidly expanding Internet of Things (IoT), as each of these devices forms a web of semantic data that can be queried with appropriate tools. The intersection of artificial intelligence, big data, the IoT, and connected web technologies is creating the opportunity to derive more meaning and context from the data we share in our increasingly interconnected world. As this web of data continues to grow, we need software tools and frameworks to create and read this information...How does a web page distinguish information? How can my web content literally talk to other content in a way that the receiver knows my intent? How can information in a wiki's text and multimedia files, for example, be queried to determine what active projects took place in 2016? One open source tool that enables this type of interaction is Semantic MediaWiki.... ...Full Story
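The article's closing question (how wiki content can be queried for, say, the projects active in 2016) comes down to representing facts as subject-predicate-object triples. The following is a minimal, purely illustrative sketch: the project names and predicates are invented, and the toy in-memory store stands in for what tools like Semantic MediaWiki, or an RDF store queried with SPARQL, do at scale.

```python
# Semantic data boils down to (subject, predicate, object) triples.
# All names below are invented for illustration.
triples = {
    ("ProjectAlpha", "type", "Project"),
    ("ProjectAlpha", "activeIn", "2016"),
    ("ProjectBeta", "type", "Project"),
    ("ProjectBeta", "activeIn", "2015"),
}

def query(predicate, obj):
    """Return all subjects related to obj by predicate, sorted."""
    return sorted(s for (s, p, o) in triples if p == predicate and o == obj)

# Which projects were active in 2016?
print(query("activeIn", "2016"))  # ['ProjectAlpha']
```

Because every fact has the same three-part shape, the same query mechanism works across any domain of data, which is the core interoperability idea behind the Semantic Web.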
Opening up the way to industry transformation Alan Burkitt-Gray GTB.com June 20, 2017 - There’s a deep cultural change rolling through the industry. The way things have been done for the past century and a half – with vendors and operators doing their own R&D and competing vigorously – is being replaced by a new spirit of collaboration. At the heart of this is the move to software-defined networks (SDN) and network functions virtualisation (NFV) – two abbreviations that mean, in short, using IT industry-standard hardware in the network with software to define and run the services...[Historically,] operators were usually locked in. If you had opted for Siemens switches in your network then it was a big task to introduce Ericsson or Alcatel alongside them. Now, the watchword across the industry is “open source”: software is free, developed by volunteers from the industry, and used by all who want to on standard hardware that is created by IT giants. Competition – for there will still be competition – has moved to different levels...
Full article: https://www.globaltelecomsbusiness.com/article/b13bt7b66lqc2h/opening-up-the-way-to-industry-transformation?copyrightInfo=true
...Full Story
World needs 1.8 million more cyber-security pros in the next five years Dave Neal V3 June 19, 2017 - Companies and organisations across the world will need 1.8 million more cyber-security pros to protect themselves by 2022.
That's according to market researcher Frost & Sullivan. The deficit of security pros is revealed in the 2017 Global Information Security Workforce Study, which the organisation has spent some time putting together...
Two-thirds of respondents said that they did not have enough skilled workers in-house to cope with current threats, and it's fair to assume that current threats are only going to get worse... ...Full Story
Potent malware targets electricity systems Business Standard June 16, 2017 - Hackers have developed powerful malware that can shut down electricity distribution systems and possibly other critical infrastructure, two cyber security firms announced today, with one report linking it to Russia.
Slovakia-based ESET said the malware is the most powerful threat to appear since Stuxnet, the hacking tool used to sabotage Iran's nuclear program and believed to have been developed by US and Israeli intelligence...
The company said Industroyer poses a potent threat because it exploits communication protocols designed decades ago and built into energy, transportation, water and gas systems around the world...
Making use of these poorly-secured protocols, Industroyer can take direct control of electricity substation switches and circuit breakers, giving hackers the ability to shut down power distribution and damage equipment. ...Full Story
Public sector benefits from LibreOffice bug hunting Gijs Hillenius EU Joinup June 15, 2017 - The software development community working on LibreOffice have greatly scaled up their bug-hunting efforts, using automated software test tools made available by Google. Beneficiaries include the many European public administrations that use up-to-date versions of this suite of office productivity tools.
The Internet search engine giant is sharing some of its computing capacity to help open source projects find bugs. This markedly increases the number of tests, and so turns up software problems much faster...These tests are helping to improve the upcoming next version of LibreOffice, says Michael Meeks. All users of LibreOffice, including the many European public sector organisations, can reap the benefits. “If they stay up-to-date”, he adds. “Public administrations should make sure they have support and long-term maintenance for LibreOffice.” ...Full Story
Deadline Approaching: ANSI Nominations for 2017 Leadership and Service Awards ANSI.org June 14, 2017 - Reminder: Nominations due by Friday, June 16, for the American National Standards Institute (ANSI)’s 2017 Leadership and Service Awards. The awards, which are presented in conjunction with World Standards Week (WSW) 2017, honor individuals who have made significant contributions to voluntary consensus standardization and conformity assessment programs and have consistently demonstrated a commitment to their industry, their nation, and the enhancement of the global standards system... ...Full Story