Yesterday, the Deputy CTO of the US Office of Science and Technology Policy issued a press release highlighting the efforts (and success) of the Obama Administration in getting data compiled at public expense into the hands of the private sector for commercial repurposing. The release refers to a McKinsey & Company report that estimates that making such data publicly available “can generate more than $3 trillion a year in additional value in seven key domains of the global economy, including education, transportation, and electricity.”
If I have seen farther it is by standing on the shoulders of giants
Sir Isaac Newton, 1676
If the phrase “open innovation” has a familiar ring, that’s not surprising. It’s not only a popular buzz phrase, but it has the type of virtuous ring to it that instinctively inspires a favorable reaction. But like most simple phrases, it intrigues rather than enlightens. For example, is open innovation feasible in all areas of creative, commercial and scientific endeavor? If so, do the rules, challenges and rewards differ from discipline to discipline, and if it’s not universally feasible, why not?
The big news in the standards arena yesterday was a joint announcement by five of the standards setting organizations (SSOs) that have been most essential to the creation of the Internet and the Web: IEEE, World Wide Web Consortium (W3C), Internet Architecture Board (IAB), Internet Engineering Task Force (IETF), and Internet Society (the last three being closely affiliated entities).
Joint announcements by SSOs are rare, and the subject matter of this one was rarer still: each organization joined in endorsing a set of five principles that they assert support a "new paradigm for standards" development.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years. You can read the entire article here, and sign up for a free subscription here.
For more than 100 years, the United States has been the exemplar of the "bottom up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors, and then to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).
Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995. With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.
The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.
Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.
On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, the event was nonetheless extremely consequential.
Standards cover an awful lot of ground — how big things are; how much they weigh; how fast they go; how much power they consume; how pure they are; how they must be shaped so that they fit together — the list goes on and on. But despite the enormous range of characteristics that standards define, you'll notice that they all have one thing in common: you can describe them by using the word "how."
In short, standards relate to measurable things. Indeed, the earliest formal standards created in societies everywhere were usually those related to weights and measures. Invariably these were established when trade became more sophisticated than tribal bartering. Ever since, the history of standards has largely been one of establishing ways to define more and more measurable characteristics as they became important and as the scientific ability to test them came along.
There is, however, one exception to this rule. Curiously enough, it involves a standard that is as old as weights and measures themselves. And despite its ancient lineage, nations still can't agree for very long on what measuring stick should be used, or how it should work. This is rather remarkable, given that the standard in question is perhaps the only one that nearly everyone makes use of almost every day of their lives.
That standard, of course, is money — dollars, Euros, renminbi — each one a measure of value.
The last issue of Standards Today focused on XML - the underpinning of ODF and hundreds of other standards - and one of the most important standards ever developed. Here is the editorial from that issue.
One of the many intriguing concepts mooted by Pierre Teilhard de Chardin, a French philosopher and Jesuit priest with polymathic insights (his academic explorations ranged from paleontology to the meaning of the Cosmos), is the "noosphere." In Teilhard de Chardin's vision, the reality of the world encompassed not just the geosphere (inanimate matter) and biosphere (all forms of life), but an ever-expanding nimbus of knowledge representing the fusion of the minds and knowledge of all humans.
In a short while, an important vote will be taken in downtown Denver, Colorado. If as expected that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power to commercial and home users handled by somewhat interconnected, regional networks. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both a result of our need to keep increasing our generating capacity in order to meet whatever the peak national electrical demand may be.
Quote of the Day
“[D]o you want to [hand] a 500-page specification...to a light bulb manufacturer, or do you want source code that you can hand to that manufacturer that enables interoperability?”
- The Linux Foundation's Jim Zemlin, on why open source software is replacing open standards
Alexandria Project Review: "Mind Blowing! (5 stars)" Amazon Reader Reviews October 31, 2014 - This is an absolutely fantastic book. It's a tale of technology and cyber crime told by a seasoned writer who obviously knows his way around a keyboard. The twists and turns that lead you through the story kind of reminded me of Polanski or Hitchcock. It's certainly easy to imagine the main character being played by Harrison Ford.
Bear in mind that this is not an easy read. There's enough here to get your brain worked up into a frenzy. The author seriously knows his stuff. I can't recommend this highly enough. ...Full Story
After Broadcom imbroglio, Open Interconnect Consortium, AllSeen Alliance wrestle with IP issues in IoT Monica Alleven FierceWirelessTech October 31, 2014 - The Open Interconnect Consortium (OIC) and the AllSeen Alliance are both working to standardize the Internet of Things (IoT) space and make devices interoperable--and in doing so they pit some of the industry's biggest giants against one another. And that battle appears to be entering a new phase over intellectual property (IP) licensing.
The situation crystallized earlier this month when Broadcom, a founding member of the OIC, reportedly left the organization due to a disagreement over intellectual property. GigaOm first reported Broadcom's exodus, citing a source who said Broadcom's departure was due to IP licensing agreements that required companies that were donating code to the project to give up their right to sue over that IP. The source said that the AllSeen Alliance doesn't have as rigorous a policy when it comes to its IP licensing agreements.
The AllSeen Alliance does have an IP policy, which is available here. But leaders of the OIC say it does not include a RAND-Z provision that says companies that participate must offer a zero-rate reasonable and non-discriminatory license to their code for member organizations. The OIC does have that provision.... ...Full Story
Bluetooth Smart Improvements Appear in More Devices Molly Wood New York Times October 31, 2014 - FOR years, Bluetooth was practically synonymous with irritation....Still, Bluetooth is becoming the default system for connecting our devices wirelessly. It is now responsible for connecting phones with wearable devices like fitness trackers, door locks and even toothbrushes and light bulbs. The reason: Bluetooth has quietly evolved into a much smarter technology....But Bluetooth Smart isn’t the only connection technology available, and its strongest rival, Wi-Fi Direct, offers faster data speeds and possibly stronger security.
Wi-Fi Direct is based on Wi-Fi, but it lets two devices connect without having to go through a wireless router....In the end, we’ll probably find ourselves in a world filled with both Wi-Fi Direct and Bluetooth Smart. The more difficult question to answer is whether any other connection standards will make a dent in their dominance, like near-field communication and ZigBee, another standard that allows devices (now mostly smart-home gadgets) to talk with one another.
Those other technologies have a steep hill to climb. Bluetooth and Wi-Fi are in almost everything these days, and Bluetooth, in particular, is cheap to include and increasingly reliable.... ...Full Story
W3C Declares HTML5 Standard Complete Frederic Lardinois TechCrunch October 30, 2014 - More than four years ago, Steve Jobs declared war on Flash and heralded HTML5 as the way to go. You could be forgiven if you thought the HTML5 standard — the follow-up to 1997’s HTML 4 — has long been set in stone, given that developers, browser vendors and the press have been talking about it for years now. In reality, however, HTML5 was still in flux — until today. The W3C today published its Recommendation of HTML5 — the final version of the standard after years of adding features and making changes to it....the W3C today notes in its press release that the next version of the standard needs to focus on a number of core “application foundations” like tools for security and privacy, device interactions, application lifecycle, media and real-time communications and services around the social web, payments and annotations. All of these are meant to make it easier for developers to support the web platform.... ...Full Story
Alliance to Promote Multi-Gigabit Ethernet Technology for Enterprise Wired and Wireless Access Networks Press Release NBASE-T Alliance October 30, 2014 - Cisco, Aquantia, Freescale and Xilinx today announced that they have formed the NBASE-T Alliance, an industry-wide cooperative effort to promote the development of 2.5 and 5 Gigabit Ethernet (2.5GE and 5GE) technology for enterprise network infrastructure. The objective of the nonprofit organization is to advance multi-gigabit Ethernet technology that enables faster data rates on existing enterprise cabling originally designed for 1 Gigabit Ethernet (1GbE) technology....Early promoters Cisco, Aquantia, Freescale and Xilinx welcome interested parties to join the alliance and contribute to its objectives. More details can be found on the alliance website, at www.nbaset.org.
According to the Cisco Visual Networking Index (VNI), total mobile data traffic will surpass 30 Exabytes per month in 2018. An estimated 52 percent of that traffic will be offloaded from cellular networks to the fixed network through WiFi, adding to the vast amount of wireless data transmitted over WLAN in enterprise branch and campus networks. The 802.11ac WiFi standard was developed to deal with this massive amount of wireless data. As Wave 2 of the technology gets introduced, traffic aggregated on APs will quickly surpass multiple gigabits per second, and therefore require both the access point and the Ethernet switch ports to scale beyond the 1GbE used in most networks....In most enterprise campus networks around the world, Category 5e (Cat5e) and Category 6 (Cat6) twisted-pair copper cables are the most commonly deployed. These cables do not support 10 Gigabit Ethernet (10GbE) up to 100 meters, so the need for intermediate rates between 1 and 10 Gigabit has gained support throughout the industry. To advance the enormous potential for rates greater than 1GbE on legacy cabling, the NBASE-T Alliance founding companies teamed up to promote the development of 2.5GbE and 5GbE that will extend the life of the installed cable plant.... ...Full Story
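To make the arithmetic behind the alliance's pitch concrete, here is a minimal back-of-the-envelope sketch in Python. The Wave 2 PHY rate and Wi-Fi efficiency figures below are assumptions for illustration, not vendor specifications; the point is simply that a well-loaded 802.11ac Wave 2 access point can exceed what a single 1GbE uplink carries, while 2.5GbE or 5GbE over existing Cat5e/Cat6 covers the gap without a jump to 10GbE.

```python
# Illustrative only: assumed figures, not measured or vendor-published rates.

PHY_RATE_WAVE2_GBPS = 2.34   # assumed multi-stream, 80 MHz Wave 2 PHY rate
WIFI_EFFICIENCY = 0.65       # assumed real-world MAC/airtime efficiency

uplink_rates_gbps = [1.0, 2.5, 5.0, 10.0]  # 1GbE, NBASE-T 2.5/5GbE, 10GbE

# Rough usable throughput the AP could push onto its wired uplink
ap_throughput = PHY_RATE_WAVE2_GBPS * WIFI_EFFICIENCY
print(f"Estimated AP throughput: {ap_throughput:.2f} Gbps")

for rate in uplink_rates_gbps:
    verdict = "sufficient" if rate >= ap_throughput else "bottleneck"
    print(f"  {rate:>4} GbE uplink: {verdict}")
```

Under these assumptions the 1GbE uplink is the choke point, while 2.5GbE already clears it, which is exactly the niche the NBASE-T Alliance is targeting on installed cabling.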
NIF observatory: interoperability platforms boost data exchange, eServices and eSignature EC Joinup October 29, 2014 - The National Interoperability Framework Observatory (NIFO) community is making available an updated series of NIFO factsheets. The updates track interoperability initiatives in European countries.
Recently published on the Joinup platform, the updated NIFO factsheets provide new information on interoperability for over half of the countries. The update replaces factsheets from May this year. The observatory identified new interoperability platforms in many fields, including data exchange, eServices and eSignature.... ...Full Story
Take Control With Open Source Hardware Carla Schroder Linux.com October 29, 2014 - Free and open source software is no good without open hardware. If we can't install our software on a piece of hardware, it's not good for anything. Truly open hardware is fully-programmable and replicable. So what is open hardware, exactly? OSHWA, the Open Source Hardware Association, defines it as:
"Open source hardware is hardware whose design is made publicly available so that anyone can study, modify, distribute, make, and sell the design or hardware based on that design. The hardware's source, the design from which it is made, is available in the preferred format for making modifications to it. Ideally, open source hardware uses readily-available components and materials, standard processes, open infrastructure, unrestricted content, and open-source design tools to maximize the ability of individuals to make and use hardware. Open source hardware gives people the freedom to control their technology while sharing knowledge and encouraging commerce through the open exchange of designs."... ...Full Story
NISO Launches Open Discovery Initiative (ODI) Standing Committee NISO October 28, 2014 - The National Information Standards Organization (NISO) is pleased to announce the next phase for the Open Discovery Initiative, a project that explores community interactions in the realm of indexed discovery services. Following the working group’s recommendation to create an ongoing standing committee as outlined in the published recommended practice, Open Discovery Initiative: Promoting Transparency in Discovery (NISO RP-19-2014), NISO has formed a new standing committee reflecting a balance of stakeholders, with member representation from content providers, discovery providers, and libraries. The ODI Standing Committee will promote education about adoption of the ODI Recommended Practice, provide support for content providers and discovery providers during adoption, conduct a forum for ongoing discussion related to all aspects of discovery platforms for all stakeholders, and determine timing for additional actions that were outlined in the recommended practice.... ...Full Story
How a USB key drive could remove the hassles from two-factor authentication Tony Bradley PC World October 28, 2014 - We've had enough malware campaigns and data breaches to confirm the need for better data protection online. The Universal 2nd Factor (U2F) standard is a step in the right direction, and the first compatible devices are coming out now.
U2F is an open authentication standard. It was initially developed by Google, but it's now managed by the FIDO (Fast Identity Online) Alliance....Two-factor, or multi-factor, authentication has long been promoted as a more effective security mechanism, but it's a hassle, requiring you to juggle passwords with a second factor such as a texted code or an authentication app. U2F proposes to streamline the process using a U2F-enabled USB or NFC key fob, card, or mobile device alongside traditional authentication methods.... ...Full Story
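For readers curious what "a U2F-enabled key fob alongside traditional authentication" amounts to under the hood, the sketch below illustrates the core challenge-response idea in Python using the cryptography package's ECDSA primitives. It is a simplified illustration under my own assumptions, not the actual U2F wire protocol: real U2F adds key handles, origin checking, signature counters, and device attestation on top of this flow.

```python
# Minimal sketch of the challenge/response idea behind U2F, using plain ECDSA
# from the "cryptography" package. NOT the real U2F protocol (no key handles,
# counters, or attestation) -- just "server sends a challenge, the key fob
# signs it, the server verifies with the registered public key."
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Registration: the token generates a key pair; the server stores the public key.
token_private_key = ec.generate_private_key(ec.SECP256R1())
server_stored_public_key = token_private_key.public_key()

# Authentication: the server issues a random challenge tied to its origin.
app_id = b"https://example.com"          # hypothetical relying party
challenge = os.urandom(32)

# The token signs the origin plus challenge with its private key (the second factor).
signature = token_private_key.sign(app_id + challenge, ec.ECDSA(hashes.SHA256()))

# The server verifies with the registered public key; a bad signature raises an exception.
server_stored_public_key.verify(signature, app_id + challenge, ec.ECDSA(hashes.SHA256()))
print("Second factor verified")
```

Because the private key never leaves the fob and the signed data is bound to the origin, a phished password alone is not enough to log in, which is the streamlining the article describes.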
The Future of the Internet - 20 Years Ago: The birth of Netscape and its browser Glyn Moody ComputerWorld.uk October 27, 2014 -
Last week, the following tweet appeared:
Netscape Navigator was released 20 years ago [last week]...The fall of Netscape was not entirely down to Microsoft's aggressive moves. Netscape made a number of serious missteps, and the quality of the Netscape Navigator code started deteriorating. Eventually, that led to most of the Netscape program being released as open source, and the creation of the Mozilla project - something I wrote about in detail in an Open Enterprise column published seven years ago.
But here, I'd like to dwell on that moment in October 1994 when the first beta version of Netscape Navigator was released, and many of us sensed that this was the start of a new era in computing. Below is a column I wrote at that time, exactly as it first appeared; I hope it conveys a little of the atmosphere of those heady times.... ...Full Story