The Standards Blog

Wednesday, January 12th, 2011 @ 05:33 AM
Contributed by: Andy Updegrove
Views: 3,312

The following is the introduction to the Feature Article in the most recent issue of Standards Today, the free "eJournal of News, Ideas and Analysis" that I have been writing for the last seven years.  You can read the entire article here, and sign up for a free subscription here.

For more than 100 years, the United States has been the exemplar of the "bottom-up" model of standards development. Under this methodology, society relies on the private sector to identify standards-related needs and opportunities in most sectors and then to develop responsive specifications. Government, for its part, retains ultimate control over domains such as health, safety, and environmental protection, but preferentially uses private sector standards in procurement, and also references private sector standards into law when appropriate (e.g., as building codes).

Until recently, government agencies in the United States commonly developed their own standards for procurement purposes. This era of separate but equal standards creation officially came to an end with the passage of the National Technology Transfer and Advancement Act of 1995 (NTTAA). With this legislation, Congress directed government agencies to use "voluntary consensus standards" (VCSs) and other private sector specifications wherever practical rather than "government unique standards," and to participate in the development of these standards as well. In 1998, Office of Management and Budget Circular A-119 was amended to provide additional guidance to the Federal agencies on complying with the NTTAA.

Wednesday, January 5th, 2011 @ 05:44 PM
Contributed by: Andy Updegrove
Views: 21

Over the last few months, I've frequently pointed out the vulnerability of important open source projects that are supported and controlled by corporate sponsors, rather than hosted by independent foundations funded by corporate sponsors.  One of the examples I've given is SUSE Linux, which has been hosted and primarily supported by Novell since that company acquired SuSE Linux AG in 2003.  Novell, as you know, is expected to be acquired by a company called Attachmate a few weeks from now, assuming approval of the transaction by the Novell stockholders and by German competition regulators.

Recently, the future of the SUSE Linux Project (as compared to the Novell commercial Linux distribution based on the work of that project) has become rather murky, as reported by Pamela Jones at Groklaw. Apparently, Novell is facilitating some sort of spin-out of the Project, which is good but peculiar news.

Wednesday, December 29th, 2010 @ 05:09 PM
Contributed by: Andy Updegrove
Views: 20

The pace of technology is wondrous indeed. No corner of our lives seems safe from digital invasion, from picture frames to pasta makers. For years now, we have been threatened with Internet-enabled refrigerators, and perhaps 2011 will see it so.

Nor is the process likely to stop there. Soon, we are told, our homes will become infested by "mesh networks" of sensors, each one whispering information surreptitiously to its neighbor, in order to render our lives more energy efficient. But in so doing, they will observe our every move and report it to heaven knows whom.

Tuesday, December 21st, 2010 @ 08:49 AM
Contributed by: Andy Updegrove
Views: 9,596

 Have you discovered The Alexandria Project?

Last Thursday the European Commission took a major step forward on the “openness” scale. The occasion was the release of a new version of the European Interoperability Framework (EIF), which definitively endorsed the use of open source-friendly standards when providing “public services” within the EU. This result was rightly hailed by open source advocates like Open Forum Europe. But the EC took two steps backward in every other way as it revised its definition of "open standards," presumably reflecting IT industry efforts (e.g., by the Business Software Alliance) to preserve the value of software patents.

In this blog entry, I’ll review the seven-year-long process under which the EIF first set a global high water mark for liberalizing the definition of open standards, and then retreated from that position.
Monday, December 13th, 2010 @ 07:00 AM
Contributed by: Andy Updegrove
Views: 5,081

 

On December 8, the U.S. National Institute of Standards and Technology (NIST) issued a public Request for Information on behalf of the recently formed Sub-Committee on Standards of the National Science and Technology Council. The titular goal of the RFI is to assist the Sub-Committee in assessing the “Effectiveness of Federal Agency Participation in Standardization in Select Technology Sectors.” Although the publication of the RFI did not give rise to a single article in the press, this event was nonetheless extremely consequential.
Thursday, December 9th, 2010 @ 05:53 AM
Contributed by: Andy Updegrove
Views: 24

The following is a position statement I contributed to an on-line forum that will launch at 11 EST today focusing on policy reform within the Chinese standardization system. You can join in that discussion here.

A variety of constituencies from the West have taken it upon themselves to reach out to China to "educate" the Chinese about the existing global standards development infrastructure, and to urge them to take part in that infrastructure in the same way as do other countries.  Clearly, having China, with a single national vote, participate in ISO, IEC and ITU would be best for the status quo players that have become skillful in participating in those organizations through decades of effort.  It's interesting to ask, however, whether that course of action, without more, would truly be best for China and its people.

If I were a policy maker in China, the most obvious question that I would be asking would be what strategy Chinese industry should follow as regards consortia, as well as the "Big I's."  To date, China has participated primarily in the latter, and in only a few of the former (e.g., OASIS and the W3C).  But China has launched a number of domestic consortia, open either largely or only to domestic companies, to develop "home-grown" standards.  And that seems backwards to me.

Monday, November 29th, 2010 @ 07:07 AM
Contributed by: Andy Updegrove
Views: 8,536

 Have you discovered The Alexandria Project?

Ever since the proposed acquisition of Novell by Attachmate Corporation was announced, there has been much curiosity, but almost no information, relating to the other major piece of the deal: the acquisition of 882 patents by a consortium led by Microsoft for $450 million. There are three main areas of undisclosed information that are piquing people’s interest, and in this blog entry I’ll go through each of them.
Wednesday, November 24th, 2010 @ 10:32 AM
Contributed by: Andy Updegrove
Views: 18,877

Have you discovered The Alexandria Project?

Two days have now passed since Novell announced the high-level terms of its proposed sale, and so far the press has not been able to prize any additional details out of the parties involved. As a result, speculation is rife on several key points, and especially with respect to the 882 patents that Novell proposes to sell to a consortium of companies, only one of which has been disclosed: Microsoft.   Given this dearth of information and the fact that I’ve been a transactional lawyer for over 30 years, I’ll use this blog entry to lay out those things that can be known, those that can’t (yet) be known, and when we can expect additional disclosures.  (This is a long blog post, so if you have a short attention span and only care about Linux, that bit is at the end.)

 

Friday, November 12th, 2010 @ 08:35 AM
Contributed by: Andy Updegrove
Views: 4,275

Handshake, by Tobias Wolker, multiple licenses at http://commons.wikimedia.org/wiki/File:Handshake_%28Workshop_Cologne_%2706%29.jpeg

Abstract: The last twenty-five years have been marked by an explosion of consortia formed to develop, promote and/or otherwise support standards enabling information and communications technology. The reasons for forming a new consortium, as compared to adding to the work program of an existing body, include the absence in such organizations of appropriate technical expertise, interest, and/or supporting programs, as well as the benefits to be gained from directing all of the resources and efforts of a new consortium to the achievement of a set of specific objectives. This article reviews the benefits to be obtained from launching a new consortium, the criteria that should be used to determine whether doing so is appropriate, the programs and functionalities available for achieving specific goals, and the stages of institutional maturity at which each function should be added in order to accomplish a new organization's mission.

Tuesday, November 9th, 2010 @ 12:01 AM
Contributed by: Andy Updegrove
Views: 8,076

Have you discovered The Alexandria Project?

After sixteen years of working in parallel to the traditional standards infrastructure, the World Wide Web Consortium has taken an interesting decision: to begin submitting selected W3C Recommendations to that same system for endorsement. In doing so, it joins the small handful of consortia (seven, to be exact), out of the hundreds currently active in information and communications technology (ICT), that have applied for that option.

If this process sounds vaguely familiar, that’s likely because this is the same process that OASIS used to gain global endorsement of its OpenDocument Format (ODF).  Microsoft took a similar, but procedurally distinct, route with OOXML, its competing document format, when it offered it to ECMA, which enjoys a special “Fast Track” relationship with JTC1.  What won't sound familiar are the conditions that the W3C has successfully included in its application to make submissions, on which more below.