Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it has the type of virtuous ring to it that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline, and if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization was joining in the endorsement of a set of five principles that they assert support a “new paradigm for standards” development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors, and then to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI gave rise to not a single article in the press, it was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos), is the "noosphere." In Teilhard de Chardin's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and biosphere (all forms of life), but an ever expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If as expected that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with that power distributed to commercial and home users over somewhat interconnected regional networks. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both in part the result of our need to keep increasing our generating capacity in order to meet whatever the peak national electrical need may be.
Quote of the Day
“The new Standard Swedish shows in a slightly absurd way that there is no such thing as correct Swedish”
-Asst. Prof. Mikael Parkvall of Stockholm University’s Department of Linguistics, announcing the release of "New Standard Swedish"
Government Agencies to be Rated on Cybersecurity Using NIST Framework National Law Review March 22, 2017 - The Trump administration has announced that it will impose new metrics on federal agencies related to cybersecurity. Agencies and departments will be required to comply with the framework developed by the National Institute of Standards and Technology (NIST) and report back to the Department of Homeland Security (DHS), the Office of Management and Budget (OMB), and the White House....Plans to impose the NIST cybersecurity framework on federal agencies illustrate the Framework’s increasing importance as a standard for cybersecurity, not just for government agencies, but more broadly throughout the information ecosystem. With security breaches, state-sponsored cyber-attacks, and ransomware demands increasing, the Framework offers useful guidance on processes and actions designed to enhance data security for government and industry alike. ...Full Story
OGC approves new standard for geological science data Press Release OGC.org March 21, 2017 - The membership of the Open Geospatial Consortium (OGC®) has approved GeoSciML as an OGC Standard. The OGC GeoSciML Standard defines a model and encoding for geological features commonly described and portrayed in geological maps, cross sections, geological reports, and databases.
GeoSciML provides a mechanism for storage and exchange of a broad range of geologic data enabling users to generate geologic depictions (such as maps) in a consistent and repeatable fashion....This standard describes a logical model and GML/XML encoding rules for geological map data, geological time scales, boreholes, and metadata for laboratory analyses....
The GeoSciML standard includes a Lite model, used for simple map-based applications; a basic model, aligned with INSPIRE, for basic data exchange; and an extended model to address more complex scenarios. The standard also provides patterns, profiles (most notably of OGC Observations and Measurements - also ISO 19156), and best practices to deal with common geoscience use cases.... ...Full Story
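For readers who have not worked with GML encodings, the short Python sketch below shows the general shape of assembling a namespaced XML feature of the kind GeoSciML describes. The namespace URIs, element names, and values are placeholders of my own for illustration; they are not taken from the normative GeoSciML or GML schemas.

```python
# Illustrative only: assembles a small GML-style XML fragment for a geologic
# feature. The namespace URIs and element names are placeholders, not the
# normative GeoSciML/GML schema.
import xml.etree.ElementTree as ET

GSML = "urn:example:geosciml"   # placeholder namespace
GML = "urn:example:gml"         # placeholder namespace
ET.register_namespace("gsml", GSML)
ET.register_namespace("gml", GML)

unit = ET.Element(f"{{{GSML}}}GeologicUnit", {f"{{{GML}}}id": "unit-001"})
ET.SubElement(unit, f"{{{GSML}}}name").text = "Example Sandstone Formation"
ET.SubElement(unit, f"{{{GSML}}}geologicUnitType").text = "lithostratigraphic unit"
history = ET.SubElement(unit, f"{{{GSML}}}geologicHistory")
ET.SubElement(history, f"{{{GSML}}}olderNamedAge").text = "Jurassic"
ET.SubElement(history, f"{{{GSML}}}youngerNamedAge").text = "Cretaceous"

print(ET.tostring(unit, encoding="unicode"))
```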
Three challenges for the web, according to its inventor Tim Berners-Lee World Wide Web Foundation March 20, 2017 - Today is the world wide web’s 28th birthday. Here’s a message from our founder and web inventor Sir Tim Berners-Lee on how the web has evolved, and what we must do to ensure it fulfils his vision of an equalising platform that benefits all of humanity.
Today marks 28 years since I submitted my original proposal for the world wide web. I imagined the web as an open platform that would allow everyone, everywhere to share information, access opportunities and collaborate across geographic and cultural boundaries. In many ways, the web has lived up to this vision, though it has been a recurring battle to keep it open. But over the past 12 months, I’ve become increasingly worried about three new trends, which I believe we must tackle in order for the web to fulfill its true potential as a tool which serves all of humanity.
1) We’ve lost control of our personal data
The current business model for many websites offers free content in exchange for personal data. Many of us agree to this – albeit often by accepting long and confusing terms and conditions documents – but fundamentally we do not mind some information being collected in exchange for free services. But, we’re missing a trick. As our data is then held in proprietary silos, out of sight to us, we lose out on the benefits we could realise if we had direct control over this data,...
2) It’s too easy for misinformation to spread on the web
...through the use of data science and armies of bots, those with bad intentions can game the system to spread misinformation for financial or political gain.
3) Political advertising online needs transparency and understanding
Political advertising online has rapidly become a sophisticated industry. The fact that most people get their information from just a few platforms and the increasing sophistication of algorithms drawing upon rich pools of personal data, means that political campaigns are now building individual adverts targeted directly at users. One source suggests that in the 2016 US election, as many as 50,000 variations of adverts were being served every single day on Facebook, a near-impossible situation to monitor. And there are suggestions that some political adverts – in the US and around the world – are being used in unethical ways – to point voters to fake news sites, for instance, or to keep others away from the polls....
These are complex problems, and the solutions will not be simple. But a few broad paths to progress are already clear. We must work together with web companies to strike a balance that puts a fair level of data control back in the hands of people, including the development of new technology like personal “data pods” if needed and exploring alternative revenue models like subscriptions and micropayments. We must fight against government over-reach in surveillance laws, including through the courts if necessary. We must push back against misinformation by encouraging gatekeepers such as Google and Facebook to continue their efforts to combat the problem, while avoiding the creation of any central bodies to decide what is “true” or not. We need more algorithmic transparency to understand how important decisions that affect our lives are being made, and perhaps a set of common principles to be followed. We urgently need to close the “internet blind spot” in the regulation of political campaigning....
It has taken all of us to build the web we have, and now it is up to all of us to build the web we want – for everyone. If you would like to be more involved, then do join our mailing list, do contribute to us, do join or donate to any of the organisations which are working on these issues around the world. ...Full Story
A Standard for Lighting Color Preference? NIST Techbeat March 20, 2017 - One of the goals of artificial lighting is to make things look natural....To hit the “sweet spot” between too dull and too vivid, lighting manufacturers rely on an international standard that helps them determine whether their white lights will render objects “correctly” – that is, the way they might look in sunlight. This standard is based on an old system called the Color Rendering Index (CRI), which scores lamps on their color fidelity: The higher the CRI score, the more natural objects should look when illuminated. A score of 100 is considered “perfect.” Most good white light lamps get scores of 80 or higher.
But just because something looks natural does not mean that people like it....The final goal is to allow a new version of the CRI to remain as a “color fidelity” metric, but also to create a new standard for “color preference” to give companies further guidance for manufacturing LED lights. Companies could use one or both of these metrics depending on the intended applications.... ...Full Story
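The arithmetic behind the existing CRI is simple enough to sketch. Each of a set of standard test color samples receives a special index based on how far its color shifts under the test lamp relative to a reference illuminant, and the general index Ra averages the first eight. The Python sketch below assumes the classic form Ri = 100 - 4.6·ΔE; the color-difference values themselves would come from a full colorimetric calculation, which is omitted here.

```python
# Minimal sketch of the CRI general-index arithmetic. The color-difference
# values (delta_e) for the eight test samples would normally come from a full
# colorimetric pipeline (test vs. reference illuminant); here they are
# illustrative inputs.
def special_cri(delta_e: float) -> float:
    """Special color rendering index for one test sample: Ri = 100 - 4.6 * dE."""
    return 100.0 - 4.6 * delta_e

def general_cri(delta_es: list[float]) -> float:
    """General index Ra: mean of the special indices for the eight samples."""
    return sum(special_cri(de) for de in delta_es) / len(delta_es)

# Example: a lamp with modest color shifts on all eight samples
shifts = [2.1, 3.4, 1.8, 2.9, 3.0, 2.2, 2.7, 3.6]
print(f"Ra = {general_cri(shifts):.1f}")   # prints a score in the high 80s
```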
ITU Publishes Policy Recommendations on Digital Financial Services Press Release ITU March 17, 2017 - After two years of extensive consultation, the ITU Focus Group on Digital Financial Services (DFS) has concluded its work with the publication of 85 policy recommendations and 28 supporting thematic reports. The Focus Group brought together more than 60 organizations from over 30 countries to drive greater financial inclusion for the estimated 2 billion people around the world who remain unbanked.
Commenting on the success of the Focus Group, ITU Secretary-General Houlin Zhao said: “Governments around the world face many similar challenges in their efforts to deliver fully integrated digital financial services. Until now solutions have largely been developed in isolation. This is the first time an organization has sought to develop a comprehensive set of practical and integrated guidelines drawing on expertise from across the financial service and telecommunication/ICT sectors.”... ...Full Story
New alliance to promote Ethereum blockchain technology Network Asia March 16, 2017 - The world's most advanced enterprise and startup blockchain innovators have formed an alliance to build, promote, and broadly support Ethereum-based technology best practices, standards, and a reference architecture, EntEth 1.0.
The Enterprise Ethereum Alliance (EEA) seeks to augment Ethereum, enabling it to serve as an enterprise-grade technology, with research and development focused on privacy, confidentiality, scalability, and security. EEA will also investigate hybrid architectures that span both permissioned and public Ethereum networks.
The founding members of the Enterprise Ethereum Alliance rotating board include Accenture, Banco Santander, BlockApps, BNY Mellon, CME Group, ConsenSys, IC3, Intel, J.P. Morgan, Microsoft, and Nuco....
EEA will collectively develop industry standards and facilitate open source collaboration with its member base as well as the Ethereum Chief Scientist and Inventor, Vitalik Buterin, and is open to any members of the Ethereum community who wish to participate. This collaborative framework will enable mass adoption at a depth and breadth otherwise unachievable in individual corporate silos, and provide insight into the future of scalability, privacy, and confidentiality of the public Ethereum permissionless network. ...Full Story
Using INSPIRE geospatial data to create innovative added-value services Monica Lopez Potes EU Joinup March 13, 2017 - The Joint Research Centre (JRC) of the European Commission has launched two pilot projects with private sector partners from Spain and The Netherlands to demonstrate the benefits of using Linked and Open INSPIRE Data using RDF, a developer friendly W3C specification for building the Semantic Web.
The JRC, in its efforts to facilitate cross-sector interoperability and help reuse the investments of INSPIRE in other data infrastructures, including Linked Data and Open Data portals, has procured and launched the development of two pilots. These pilots aim to illustrate how INSPIRE data can help in different e-Government services as well as the feasibility and possible benefits of representing INSPIRE data in RDF.
The first pilot is developed by Guadaltel ...[and] addresses use cases in the area of the environment, more specifically related to the provisioning of hydrography RDF services based on national INSPIRE data published by CNIG (Centro Nacional de Información Geográfica). This RDF data can serve many applications; the pilot will explore its possible use within regional government and water management.
The second pilot...sets out to improve the information position of emergency responders by using Linked INSPIRE Data as a central point of reference.... ...Full Story
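To make "INSPIRE data as RDF" a little more concrete, here is a minimal sketch using the Python rdflib library to publish a hydrography feature as triples. The vocabulary namespace, property names, and feature values are placeholders I have invented for illustration; they are not the actual INSPIRE RDF vocabularies produced by the JRC pilots.

```python
# Illustrative sketch: expressing a hydrography feature as RDF triples with
# rdflib. The namespace and property URIs below are placeholders, not the
# vocabularies used by the JRC/INSPIRE pilots.
from rdflib import Graph, Literal, Namespace, RDF, URIRef

HY = Namespace("http://example.org/inspire/hydrography#")   # placeholder vocabulary
g = Graph()
g.bind("hy", HY)

river = URIRef("http://example.org/id/watercourse/guadalquivir")
g.add((river, RDF.type, HY.Watercourse))
g.add((river, HY.geographicalName, Literal("Guadalquivir", lang="es")))
g.add((river, HY.lengthKm, Literal(657)))

print(g.serialize(format="turtle"))
```

Once data is in this form, it can be queried with SPARQL alongside non-geospatial Linked Data, which is precisely the cross-sector reuse the pilots are meant to demonstrate.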
IEEE Announces Standards Project Addressing Algorithmic Bias Considerations Press Release IEEE.org March 10, 2017 - IEEE and the IEEE Standards Association (IEEE-SA), today announced the approval of IEEE P7003™—Algorithmic Bias Considerations. The new standards development project aims to provide individuals and organizations creating algorithms the certification-oriented methodologies that clearly articulate accountability and clarity around how algorithms target, assess and influence users and stakeholders of autonomous or intelligent systems....
"The rapid growth of algorithmic driven services has led to growing concerns among civil society, legislators, industry bodies and academics about potential unintended and undesirable biases within intelligent systems,” said Konstantinos Karachalios, managing director for IEEE-SA. “IEEE’s commitment to ethical alignment in intelligent and autonomous systems is further demonstrated by the approval of IEEE P7003, and follows in line with a number of initiatives and projects that are creating an all-encompassing framework to ensure end users and stakeholders are protected by prioritizing ethics in the development of new technologies.”
IEEE P7003 will allow algorithm creators to communicate to regulatory authorities and users that the most up-to-date best practices are used in the design, testing and evaluation of algorithms in order to avoid unjustified differential impact on users.... ...Full Story
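P7003 itself is still being drafted, but the kind of check such a methodology might call for is easy to picture. The sketch below computes a simple disparate-impact ratio, comparing positive-outcome rates across groups; this is a generic fairness metric offered for illustration, not the IEEE P7003 procedure.

```python
# Generic disparate-impact check: compares positive-outcome rates between
# groups. This illustrates one common bias metric; it is not the IEEE P7003
# methodology, which is still under development.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs -> selection rate per group."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, chosen in decisions:
        totals[group] += 1
        selected[group] += int(chosen)
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest to the highest selection rate (1.0 means parity)."""
    return min(rates.values()) / max(rates.values())

outcomes = [("A", True), ("A", True), ("A", False),
            ("B", True), ("B", False), ("B", False)]
rates = selection_rates(outcomes)
print(rates, disparate_impact_ratio(rates))   # A ~0.67, B ~0.33 -> ratio 0.5
```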
Consumer Reports to Grade Products on Cybersecurity DarkReading.com March 9, 2017 - The non-profit consumer ratings group Consumer Reports plans to evaluate cybersecurity and privacy when ranking products, Reuters says. It is currently working with organizations to create methodologies for doing this. An early draft of standards is available.
This decision was made following a recent increase in cyberattacks on IoT devices, many of which contain vulnerabilities easily exploited by hackers. Researchers believe these attacks are unlikely to cease because manufacturers do not want to spend on securing connected products.
The draft prepared by Consumer Reports includes an analysis of built-in software security, amount of customer details collected, and whether all user data is deleted on account termination.... ...Full Story
EU updates smartphone secure development guideline Gijs Hillenius EU Joinup March 9, 2017 - The European Union Agency for Network and Information Security (ENISA) has published an updated version of its Smartphone Secure Development Guidelines. This document details the risks faced by developers of smartphone applications, and provides ways to mitigate them. The original version of the Guidelines was published in 2011. The update was made available on 10 February. “New developments in both software and hardware have been translated into new significant threats for the mobile computing environment, highlighting the need for an update”, ENISA writes.
The guidelines detail 13 types of risk, including sensitive data, software flaws and (abuse of) biometric sensors. For each, the ENISA experts provide recommendations to reduce the risk of abuse. For example, to identify and protect sensitive data on mobile devices, ENISA recommends that software developers begin, in the design phase, by classifying data storage for passwords, personal data, location, and other sensitive records such as error logs. They can then process, store and use these data according to their classification, and validate the security of API calls.... ...Full Story
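The "classify at design time, then handle according to classification" recommendation is easy to picture in code. Below is a minimal Python sketch in that spirit; the classification levels, data items, and handling rules are assumptions of my own, not requirements taken from the ENISA guideline.

```python
# Minimal sketch of design-time data classification and classification-driven
# handling rules. The levels, fields, and storage choices are illustrative
# assumptions, not taken from the ENISA guideline itself.
from enum import Enum

class Sensitivity(Enum):
    PUBLIC = 1
    PERSONAL = 2
    SECRET = 3

# Classification decided in the design phase, per data item
DATA_CLASSIFICATION = {
    "crash_log": Sensitivity.PUBLIC,
    "location_history": Sensitivity.PERSONAL,
    "password": Sensitivity.SECRET,
}

# Handling rules keyed by classification
HANDLING = {
    Sensitivity.PUBLIC: {"encrypt_at_rest": False, "retention_days": 365},
    Sensitivity.PERSONAL: {"encrypt_at_rest": True, "retention_days": 90},
    Sensitivity.SECRET: {"encrypt_at_rest": True, "retention_days": 0},  # never retain in plain form
}

def storage_policy(item: str) -> dict:
    """Look up how a given data item must be stored, based on its classification."""
    return HANDLING[DATA_CLASSIFICATION[item]]

print(storage_policy("location_history"))   # {'encrypt_at_rest': True, 'retention_days': 90}
```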