Consortium Standards Bulletin – February 2007


February 2007
Vol VI, No. 2


Today, a building in the United States with public access must by law be accessible to those with disabilities, but a computer program need not be, even if the inability to use it might be a bar to employment. As modern society becomes ever more dependent on information and communications technology, both the private and public sectors need to become more mindful of accessibility needs that can be favorably addressed through standards.


Governments interact with standards as developers (when they draft laws), adopters (when they reference standards in regulations), influencers (when they join SSOs), and as end-users. To date, government involvement with ICT standards has been light. But as essential services continue to redeploy across the Internet, the workplace becomes ever more IT dependent, and paper public records give way to exclusively digitized data, it's time for that role to be re-evaluated.
Last August, the Commissioners of the Federal Trade Commission voted unanimously to find semiconductor designer Rambus, Inc. guilty of abusing the standards system, but deferred assigning a penalty pending further testimony and deliberations. In February they issued their verdict, and capped the royalties that Rambus could earn to license its SDRAM patents to implement the JEDEC standard at issue. It could have been worse for Rambus – and almost was, with two out of five Commissioners filing an opinion advocating a harsher penalty.
Legislators in three states (thus far) this year have introduced coordinated bills addressing a key standards-related policy issue: how can governments best protect public records?
What could be more boring than a standard listing arbitrarily assigned three-letter codes to identify languages? You might be surprised.






Over the past several years, I have found myself returning repeatedly to the role (or, more properly, the many roles) that governments play in standardization.  Recent issues with government themes include Government and SSOs: Optimizing the System (August 2005), Massachusetts and OpenDocument: the Commonwealth Leads the Way (September 2005), Standards for a Small Planet (October 2005), WSIS and the Governance of the Internet (November 2005), and Standards and Human Rights (September 2006).  Stories in other issues along the way have explored related themes.

That's not surprising, because private-sector standard-setting is a quasi-governmental process in its own right.  Moreover, there are many interdependencies between the private and public sectors when it comes to standards: governments set standards (in laws and regulations); they adopt private sector standards (by referencing them in laws and regulations); they participate directly in standard setting when they join standards organizations as members; and they influence standards through procurement, to highlight only the more obvious examples.

This public-private partnership has been extremely productive for all concerned, allowing standards to be set by those parties that have a direct interest in their specifics (thus saving government from the burden of creating the standards themselves), while still allowing governments to vet and utilize the results when they wish.  But in the United States, which supports a "bottom up" standard setting philosophy, the relationship between government and the private sector is much looser and more ad hoc than is the case in many other countries.  The result is that the U.S. federal government maintains a more detached, and less informed, relationship to standards than is often the case abroad.

In this issue, I focus on an area in which I believe governments should take renewed interest: the role of information and communications technology (ICT) standards in modern society.  With our increased reliance on the Internet and the Web and the digitization of public records, the need for a citizen to have full access to ICT at home and in the workplace has become fundamental.  In consequence, I believe it is incumbent upon governments to reevaluate the roles they play in relation to standard setting in ICT domains.

I begin this exploration in my Editorial, focusing on "accessibility standards," broadly construed – highlighting the role of standards domestically in areas such as IT accessibility for those with disabilities, and internationally at the level of domain names and broadband access.

I continue this theme in this month's Feature Article, in which I review the various ways that governments interact with standards and standard setting, examine how various roles might best apply in the case of accessibility standards, and finally make recommendations on how governments could best evolve their relationship to ICT standards going forward.

Next up is an Update on the long-running prosecution of semiconductor design firm Rambus, Inc. by the Federal Trade Commission.  The FTC's decision to sanction Rambus highlights the traditional and ongoing role of government as the ultimate (and perhaps too occasional) guarantor against abuse of the standard setting process.

This month I've decided to include two related entries from the Standards Blog rather than one, as each relates to an accelerating trend among governments to mandate the use of "open document formats" to preserve public records.  Four U.S. states now either require (in the case of Massachusetts) that their employees create and save documents in such standards-based formats, or have legislation in process that, if adopted, would impose such a requirement (Minnesota, Texas and California). 

Finally, I turn to a different type of accessibility standard in my Consider This piece for this month.  That standard assigns a simple three-letter code to each language in existence (and to many that are now extinct).  This humble and little-noticed standard, like Unicode, helps ensure that all peoples of the world will be able to access the Internet.  You may be surprised by how the standard came into being, and by the nature of the organization that has taken responsibility for keeping it current.

I'm certain that this won't be the last issue that I dedicate to the relationship between government and standards, and I look forward to continuing to share my thoughts with you in the future on this important topic.  Perhaps you will find the time to share a few of your thoughts with me as well.

As always, I hope you enjoy this issue. 
Andrew Updegrove
Editor and Publisher
2005 ANSI President’s
Award for Journalism
The complete series of Consortium Standards Bulletins is available on-line at  It can also be found in libraries around the world as part of the EBSCO Publishing bibliographic and research databases.




Andrew Updegrove


Citizens of modern societies lead highly regulated lives.  Whether as individuals we agree or disagree with the degree to which governments control our existence, we nevertheless benefit from a myriad of laws and regulations that seek to ensure our safety and welfare.  The range of regulation is breathtaking, encompassing the purity of air and water, the quality of food, the sanitation of towns and cities, the safety of transportation systems, and the delivery of utilities and other essential services, to name just a few.

To date, however, the provision and usage of information and communications technologies (ICT) are largely unregulated at the technical level, despite the increasingly profound impact that ICT has on our lives.  True, the communications side of the equation continues to be subject to significant government control.  Radio, television, and a rapidly increasing range of wireless frequencies are the subject of treaties internationally, while the allocation, sale and usage of the bandwidth thus defined remains the province of national regulation.  In the United States, Congress occasionally passes a law to accomplish a particular data-related purpose, such as preventing the unauthorized sale of consumer information.  But most aspects of the modern networked world are controlled primarily by commercial forces, and to the extent that they are regulated on a de facto basis, it is through the adoption and use of consensus (and sometimes proprietary) standards. 

This relative lack of regulation is attributable in part to the decreasing intervals between technology revolutions that typify the world of ICT today.  Regulation is a ponderous process that usually begins with perceived pain at the voter or industry level, which is then made known to legislators through the ballot box and the lobbyist.  Only with time do those that make the laws and regulations become engaged, educated, and eventually active, since government is far more likely to be reactive than proactive.  Similarly, regulators must perceive a problem and investigate it before they can act in the individual case – at which point the lengthy process of litigation begins.  Small wonder, then, that legislators have little appetite to regulate what may have ceased to be relevant by the time the regulations are complete.

As a result, allowing ICT deployment to romp ahead of regulation in an unconstrained fashion may in fact be the right as well as the inevitable course of (in)action.  But a creeping sense of unease can arise when one considers that more and more of the reality of modern life plays out across the Internet and the Web.  Is security adequate to protect privacy and prevent identity theft?  Could the cyberinfrastructure withstand a determined assault, not by hackers, but by an enemy government?  Has adequate attention been given to whether those with disabilities can exist in a Web-based economy?

These are hardly trivial questions, as more and more critical activities redeploy to the Internet. To give but a few examples, national and global financial infrastructures, first responder networks, national defense telecommunications, government services, and the healthcare system are all becoming ever more dependent upon ICT platforms forming national, and often international networks.  These platforms are evolving rapidly and organically largely in response to market forces, without guidance from, and subject to few constraints imposed by, governments in the free world.

As our dependency on ICT increases exponentially, it therefore makes sense to ask whether the continuation of this laissez-faire atmosphere will remain (even) on balance a good thing.

At the highest level, that question might be answered in one of three ways:  

The first would be to conclude that we should apply the theory and practice of historical regulation to evolving ICT realities in as consistent a manner as possible.  But the old wine often pours poorly into these new bottles.  Established notions regarding copyrights, for example, are being challenged by consumers who want to link, mash up and share Web-based content that is effortlessly accessed and difficult to copy-protect.  Intriguingly, many major companies in the equally established (and challenged) industries that produce and own this content are scrambling to think up ways to make money on freely distributed content. This process has already muted the calls for reform, and may ultimately moot the need for new laws and regulations entirely.  On the other hand, a court has recently concluded that a major retailer (Target Corporation) should be required to make its Website as accessible to its customers as its stores.

Another option would be to decide that we should go to the opposite extreme, and declare the Internet, the Web, and everything that relies upon them to be unique, demanding new approaches and novel solutions.  Or, less radically, modern ICT could be viewed as a fresh slate upon which new regulatory formulae, as fresh as the opportunities that these technologies offer, could be written, free of the obligation to apply old rules in rote fashion. 

But success has been mixed with this approach as well, as demonstrated by the history to date of the Internet Corporation for Assigned Names and Numbers, more often referred to simply as ICANN.  ICANN was created as a brand new quasi-public entity in 1998 to maintain the root directories of the Internet, although it is common knowledge that the International Telecommunication Union (ITU) views itself as the natural and rightful custodian of these basic resources.  The bypassing of the ITU was accomplished through the efforts of those who were loath to entrust the root directories to the venerable (and bureaucratic) ITU, despite the fact that it is a treaty organization in which governments work together under the aegis of the United Nations.  But despite this fresh approach, ICANN has been regularly criticized on a number of fronts, and remains controversial today, due in large part to the continuing influence that the United States (as the original developer of the Internet) has reserved for itself over ICANN.

The third high-level option would be for government to continue to largely stand aside, allowing commercial interests to play the greatest role in defining our ICT future.  But already, as the recent controversy over so-called "Internet Neutrality" has demonstrated, government may find itself dragged into the debate over ICT whether it wishes to stand aside or not.  And at other times where similar intervention may be sorely needed, less organized stakeholders, such as consumers, and less powerful polities, such as third world populations, will likely have a hard time making their voices heard.

Thus it seems that there is no clear answer to the question of what role governments – and, for that matter, what role which governments – should play in regulating ICT.  During the pioneer period of the Internet and the Web, one could make a strong case that government has played the most important role it could by simply staying out of the way.

But now that the paradigm has shifted, and everything that was based upon tangible media has become virtual, it may be time to reexamine the balance between unregulated innovation and the provision of essential services.   Perhaps, before things reach a crisis point that leads to overreaction and over-regulation, someone needs to consider those big issues that are of a type that government has addressed in the past. 

Many of those issues in ICT can conveniently be grouped under the concept of accessibility.  Will those with disabilities have equal access to Internet based information and services?  Will public records be accessible to all over the long term, regardless of changes in proprietary technology?  Will individual patient records be accessible, on an appropriate, privacy-protected, basis to those doctors and others that need to review them, regardless of where they work?  And will access to identity information only be granted to those who have a valid reason to gain it?

It may not be appropriate to argue that it is urgent at this time to reconsider the role of government in the regulation of ICT, although it is unquestionably timely.   Regardless of timing, one thing can be assumed with certainty:  if private industry does not satisfactorily address the issues identified above, then sooner or later, voters and special interest groups will demand that Congress do the job instead of industry.  When that happens, it is unlikely that the legislative approach will be fresh or innovative.  In many cases, this may serve the public weal quite well.  But in others, more nimble and flexible solutions, devised and deployed by those most intimately involved at the market and the technical levels, might be of far better benefit to all.

That's something that everyone involved in developing and deploying technology might take the time to consider. It would be especially worthwhile for those involved in the consensus-based ICT standards organizations that have the power to place us on prudent paths of self-regulation as we move forward.


Comments? Email:

Copyright 2007 Andrew Updegrove





Andrew Updegrove

Abstract: Governments interact with standards in many ways: as developers, when they draft regulations; as adopters, when they reference private sector standards in laws and regulations; as influencers, when they exercise their vast procurement powers in the marketplace, and when they send representatives to participate in private sector standard setting organizations; and as end-users, when they utilize standards-based products. The role that a given government decides to play varies with the subject matter of the standard, among governments, and over time. To date, governments have not acted as developers or adopters in the area of information and communications technologies (ICT) as often as they have in traditional areas of interest, such as public health and safety. However, with the redeployment of a vast range of essential services (including government services) over the Internet, the digitization of public records, and the increased use of information technology in the workplace, it is incumbent upon governments to reevaluate their relationship to ICT standards, and decide what roles they wish to play in ensuring that standards development and uptake best serves the public interest. In this article, I seek to facilitate that process, by reviewing the various roles that government can play, using accessibility standards (broadly construed) as an example.


Introduction: Over the last hundred years, a fairly predictable allocation of responsibility for standards has emerged between government and industry. Typically, government assumes primary responsibility in areas such as health, public safety and ensuring the equality of access to basic rights and opportunities. When a government acts, it provides laws and regulations defining minimum standards, specifies consequences for failing to meet those standards, and enforces those standards through the courts. The common theme among these standards is that government generally concerns itself with those areas where the potential harm to the citizen from non-compliance can be highest.

Private industry tends to be most active in what might be considered to be elective standards, or specifications that are used where the non-commercial stakes are much lower.  For example, the physical dimensions of a light bulb socket in a lamp, and the gauge of the electrical wiring in the wall that supplies the current to that lamp, are both physical standards developed by accredited standards development organizations (SDOs).  Most aspects of the light socket standard are intended solely to ensure that a light bulb purchased from one manufacturer can be used in a lamp fabricated by another (an interoperability standard), and vendors comply with the standard not because they must, but because it makes good commercial sense for them to do so.  In contrast, the gauge standards for the wiring in a wall have been created (again by SDOs) in order to provide reliable reference points for determining the load that can be placed on that wiring without generating dangerous amounts of heat.  In the United States, gauge standards achieve the force of law when they are incorporated by reference into local building codes from coast to coast.

Government and industry thus often work hand in hand, with governments taking advantage of the efforts of SDOs and unaccredited organizations (consortia) to develop useful and appropriate standards, and adopting those standards as the reference points for regulation.  As a result, both accredited and unaccredited standard setting organizations (collectively, "SSOs") fulfill a kind of quasi-public role in supporting government.  This occurs not only in areas such as health and safety, where governments adopt the standards that SSOs create, but also in more elective situations, where society benefits from the development and broad adoption of interoperability and performance standards for use in less critical areas, and government plays no part at all.

But where should the lines be drawn between these three layers of standard setting?  How should governments determine when they should preempt the field entirely (e.g., in setting automotive safety standards), when they should adopt industry standards (e.g., by incorporating SSO standards into building codes), and when they should leave everything to the SSOs themselves (as is the case with almost all information technology standards)?  And how should governments decide when the boundary lines between those three layers should change, as technology, societal practices, and other variables evolve over time?

Moreover, are there limits that governments should apply to their ability to impose standards?  Conversely, are there situations where governments should use standards, not to protect the populace, but to advance social agendas, or to encourage certain types of behavior?

One area where questions such as these are being asked with increasing frequency may be loosely referred to as "accessibility," a domain that involves a broad array of practices that until very recently have always involved face-to-face communications and paper-based records.  Now, these same communications are increasingly being transacted via the Internet, and the resulting records are archived in electronic media.  This transformation creates enormous opportunities and efficiencies, but at the same time gives rise to significant new concerns, and can exacerbate traditional risks as well. 

For example, when a doctor created a typical paper-based record a decade ago, that record existed only in that physician's file cabinets, except to the extent that she sent a photocopy to a known recipient, that in turn maintained physical custody of the record, or destroyed it.  Today that same record, which might contain not only highly confidential medical information, but also the name, birth date and social security number of the patient, will likely be created in electronic form on the physician's server (hopefully protected by a robust firewall), and then rapidly propagate to the servers of a hospital, a radiologist, an oncologist, an insurance company – and perhaps others.  Moreover, if the patient needs urgent medical attention while on a skiing vacation, a doctor in Vail, Colorado may need immediate, secure access to that same record.  Fifty years later, the same data may be wanted for diagnostic purposes by a descendant of the same patient.

In this article, I will review the direct and indirect roles that governments have historically played in relation to standard setting, and how these roles may relate to the differing challenges that ICT presents on an ever-evolving basis today.  I will then use the example of accessibility standards to explore what types of roles government might choose to play in connection with ICT, both from the traditional perspective of protecting health and safety, as well as in new or more elective roles related to advancing social agendas.  Finally, I will recommend specific actions that government might take to maximize its effectiveness in these roles.

I.  Direct Government Involvement

Governments can influence the development and uptake of standards in a variety of ways.  The following are the most important modes of engagement that government has traditionally employed.

Procurement:  The most obvious ways that governments can influence standard setting are, first, through their enormous procurement power and, second, through direct participation in consensus based standards organizations.  In the United States until 1995, the former was by far the dominant role at the federal level, with the latter being far less important.  Most United States federal government agencies fulfilled many of their purchasing requirements by requiring that bidders build to "government unique" specifications.  However, these specifications did not necessarily become adopted outside of the government domain, due to factors such as differing levels of cost consciousness between the public and private sectors, design features that are unique to government needs, and more rigorous performance requirements that were not always of equal appeal in the general marketplace.

Over time, the inefficiencies of procurement based on government-unique specifications became obvious, in comparison to buying products in the open market that were based upon consensus standards.  In response, Congress decided both to improve the government's own purchasing practices and to bolster the private development of standards through the passage of the National Technology Transfer and Advancement Act of 1995 ("NTTAA"), which explicitly promotes voluntary consensus standards for regulation and procurement by the U.S. government. 1  

Following the passage of the NTTAA, Federal agencies were required to use non-government unique standards whenever possible, and also to actively participate in the development of those standards. As a result, the long-standing roles of the federal government came to be reversed.  Thereafter, its specification drafting influence dramatically diminished, while its level of participation in the consensus standard setting process increased just as dramatically. In 1998, the Office of Management and Budget (OMB) updated its existing Circular A-119 to provide additional guidance to the Federal agencies on implementing the NTTAA. 2

Whether or not the overall impact of the federal government increased, decreased or stayed the same as a result of the passage of the NTTAA would be difficult to determine without an empirical study.  On the one hand, procurement of products built to government-unique specifications dramatically decreased.  On the other hand, government procurement of consensus standards based products increased by a roughly proportionate amount, and its direct participation in SSOs increased by orders of magnitude. 3

Despite the fact that the United States federal government now sets standards with far less frequency, its direct participation and buying power can nevertheless have a significant impact on the success or failure of a given standard.  For example, when a government agency adopts a new standard, it can convey additional credibility to that specification, resulting in broader and more rapid uptake of the standard than might otherwise occur. 4

Legislation:  The most decisive role that a government can play in the standards space is to intervene by imposing its own standard (i.e., a law or regulation).  One would expect that this is most likely to result when new technologies replace old ones in areas of traditional government concern.  In the case of ICT, this has already occurred as legislatures struggle with issues such as spam, where the analogies to junk mail, junk faxes and telemarketing are obvious and the legislative precedents are comparatively clear.  Similarly, responding to identity theft via the Internet can rely to a significant extent on earlier decisions based on the theft of credit card and other data by more traditional means.  But setting laws to control behavior (e.g., by setting criminal penalties for phishing) is different from intervening to influence the technical standards that can lower crime by augmenting security (i.e., by making phishing more difficult). 

Some areas of actual communications technology (such as radio and television frequencies) have been the subject of domestic regulation (in the United States, via the Federal Communications Commission, FCC).  Regulation of the multiple new types of wireless standards that are now regularly emerging therefore arises naturally within existing regimes and systems.

International treaties:  Most international collaboration on ICT standards occurs through non-governmental entities, such as the Joint Technical Committee 1 (JTC 1) of ISO and the IEC.  But there are exceptions, such as the International Telecommunication Union (ITU), a venerable organization with a more than century long history that now operates as a treaty organization under the auspices of the United Nations.  Participation in the ITU is by institutional representatives appointed by national governments. 

Treaties can also utilize, bolster or regulate activity in relation to consensus-based standards.  The most powerful example of this activity is the Agreement on Technical Barriers to Trade (ATBT), enacted under the World Trade Organization in 1995.  Under that treaty, governments are prohibited (among other restrictions) from using unreasonable conformance testing or unnecessary national standards as tactics to impede or prevent the import and sale of foreign products. 5

II.  Indirect Government Involvement

Standard setting is time consuming, and requires expert input.  Some nations (such as China) pursue a so-called "top down" approach and maintain extensive standard setting infrastructures at the national level.  Others, such as the United States, rely heavily on the private sector to do the work of prioritizing projects and designing standards, through a "bottom up" process utilizing SSOs to achieve their goals.  Among the nations that follow this approach, greater influence is given to commercial players, resulting (proponents of this approach believe) in more timely, responsive and effective standards.  An added benefit of this approach is the fact that tax dollars are saved, because the costs of standards production are shifted to industry. 6

Areas of standards concern:  When a government decides that it will not set the standards in certain domains that are nonetheless of significance to policy or traditional governmental roles (I will call them "areas of standards concern"), many questions arise.  From a practical perspective, how should government staff and maintain expertise and lines of communication in areas in which it does not have direct involvement?  Also, how can government still influence the standards that are set in those sectors?  For nations that prefer the top down approach, should exceptions to that approach be made, and if so, how will decisions to do so be made, and by whom? 

For example, if the private sector is to predominate in areas that are not sufficiently crucial to warrant regulation, but are nonetheless significant enough to warrant government concern, should government intervene if needed standards do not materialize within a reasonable period of time?  If government does intervene, should it do so as a catalyst to nudge the private sector into action, or should it preempt the field entirely, developing the needed standards itself, and then enacting them into law? 

Examples of areas of standards concern are not hard to find. Three current standards domains in which government has decided to become more actively involved are first responder standards, data sharing standards, and electronic healthcare standards.

          First responder standards:  Five and a half years ago, the 9/11 attacks exposed the tragic lack of effective standards to permit first responders to communicate with each other.  Multiple efforts were launched within many SSOs to address this gap.  The American National Standards Institute (ANSI) created the Homeland Security Standards Panel in February of 2003 to help coordinate standards activities in this and related areas with the efforts of the newly formed Department of Homeland Security.  On May 19, 2004, Secretary of Homeland Security Tom Ridge highlighted the lack of appropriate standards in his testimony to the 9/11 Commission. 7

But despite these activities, enormous gaps remain in the ability of police, fire and other emergency forces in adjacent communities to communicate effectively when coordination is essential.

          Data sharing standards:  The failure of the FBI and the CIA to share key data has been identified as one failure that helped enable the events of 9/11.  In an effort to avoid future lapses in the integration of intelligence and action among relevant agencies, a new cabinet-level agency, the Department of Homeland Security, was created to oversee, and improve coordination between, agencies able to assist in defending homeland security.

But inter-agency rivalries were not the only cause of poor data sharing.  In fact, the federal government maintains information in a very large number of IT "silos" that have been created through non-coordinated IT purchasing.  The result is great difficulty in data sharing among, and even within, individual agencies.  The challenges that result from the historical creation of such disparate systems can be appreciated from the following elementary data formatting example:

The task of bridging hundreds of formerly stove-piped systems is enormous. Every system has its own way of formatting data and defining the meaning of database terms. For instance, one system may use the term "FirstName" and another the term "FName" to specify a person's first name in a database. In other cases, different systems may use an identical label to represent different data. "CNum" may mean "case number" in one system and "catalog number" in another. 8
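The field-name mismatches described above can be sketched in a few lines of code.  The following is a hypothetical illustration (the system names, field names, and mapping table are all invented for the example) of how a shared mapping could translate each silo's labels into a common vocabulary, so that "FirstName" and "FName" converge while the two meanings of "CNum" stay distinct:

```python
# Hypothetical sketch: reconciling field names across two legacy "silo"
# systems via a shared mapping table.  All names here are invented.
FIELD_MAP = {
    "personnel_db": {"FirstName": "first_name", "CNum": "case_number"},
    "evidence_db":  {"FName": "first_name", "CNum": "catalog_number"},
}

def normalize(system, record):
    """Translate a record's system-specific field names into shared ones."""
    mapping = FIELD_MAP[system]
    return {mapping.get(field, field): value for field, value in record.items()}

a = normalize("personnel_db", {"FirstName": "Ada", "CNum": "1234"})
b = normalize("evidence_db", {"FName": "Ada", "CNum": "1234"})
# The same raw label "CNum" now carries two distinct, unambiguous meanings.
```

Multiplied across hundreds of systems and thousands of fields, building and maintaining such mappings is exactly the "enormous" bridging task the quoted passage describes.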

          eHealth standards:  The enormous costs of medical care have focused attention on deriving ways to gain efficiencies and cut costs.  In December 2003, Congress passed the Medicare Prescription Drug, Improvement, and Modernization Act, 9 under which the Commission on Systemic Interoperability was created, with the charge of developing a strategy to make healthcare information instantly accessible at all times, by consumers and their healthcare providers. 10   One mechanism adopted to further that goal was the creation in late 2005 of the Healthcare Information Technology Standards Panel (HITSP), which operates under a contract administered by the Office of the National Coordinator for Health Information Technology (ONC).  HITSP is administered by ANSI in cooperation with strategic partners including the Healthcare Information and Management Systems Society (HIMSS), the Advanced Technology Institute (ATI) and Booz Allen Hamilton. 11

As the examples above demonstrate, government has not hesitated to engage with the standard setting infrastructure through a variety of means when it became convinced that achieving an ICT standards solution was important.  Regrettably, the examples also demonstrate that the progress of such efforts can, as often as not, be slow indeed.

III         ICT Standards and Social Agendas

Just as governance occurs at multiple levels (local, state, federal, and to a limited but growing extent, internationally), the ability to pursue social agendas arises at the same levels as well.  Many of these agendas could embrace or utilize ICT standards, but thus far usually have not, although the opportunities for doing so are increasing.  And as our dependency on the Internet and digitized information continues to increase, it may prove difficult or impossible to advance such agendas at all without using standards as tools.

As noted above, governments have begun supporting and influencing the development of ICT standards in new ways.  To date, most such action has been oriented towards achieving generic technical goals rather than specific social ones, but the initiatives directed towards eHealth standards noted above are an example of a turn in this direction.

Where standards can be shown to be relevant to achieving identified social goals, how great a departure would it be for governments to employ them to this purpose?  After all, from one perspective, there is a great deal of vacant space between attending an arcane technical meeting and (for instance) increasing the value of distance learning for K-12 students.  Why should governments consider digging so deep?

In truth, this is not as strange or unusual a concept as it may at first seem.  Consider the following examples of government action in one area that is intended to achieve results in another:

  • Tax credits, deductions, and other special provisions:  Federal and state tax codes have been used to influence more types of social goals than could be summarized in a paper many times the length of this article, from the development of alternative energy, to supporting tax exempt entities of all kinds, to promoting home ownership.  More locally, municipalities frequently give tax abatements to provide incentives to build job-creating facilities in their areas.
  • Government contracting:  Federal and state regulations have been used to preferentially create opportunities for minority owned, women-owned, and other types of entities.
  • Loan programs:  Federal and state loan, and loan guarantee, programs help provide financing to the same types of businesses, as well as to support small businesses generally.
  • Employment:  Affirmative action hiring policies can be found at many levels of government.
  • Research and development:  Governmental agencies provide billions of dollars of support for basic research in universities and other venues that is not directed at achieving immediate, identified government goals.

In contrast, direct economic support by government in the United States for standard setting activities is almost non-existent.  Indeed, participation by government representatives in consortia is almost invariably at discounted rates, as compared with the fees required for participation by private industry.  As a result, government participation in consortia is in fact subsidized by the other members of these SSOs, rather than the reverse.

Given the above examples, why should government not consider utilizing ICT standards to achieve social goals as well, and dedicate an appropriate amount of resources to that result?   Assuming that the questions of "why" and "whether" have been addressed adequately, attention should then logically turn to the questions of "when" and "how."

IV           Accessibility Standards

One area where government interest in ICT standards would be particularly apt falls under the category of "accessibility," broadly defined.  Increasingly, the ability to use all of the functions of whatever computer system (operating system, browser and applications) an individual citizen may own, or have access to, in order to utilize the Internet is becoming essential to virtually all aspects of modern life.  This is true not only with respect to managing one's personal affairs, but also with respect to obtaining information from, and interacting with, government.  Similarly, gaining access to the job market on an equal basis with others increasingly requires the ability to utilize ICT.

In short, a person living in a modern first-world country is severely disadvantaged if she does not have access to, and the physical ability to use, ICT of many types.  For someone in the third world, the inability to access or use such technology may foreclose the only available hope of benefiting from the broader opportunities of the wider world.

To date, many governments have been largely uninvolved in such questions, even where there are clear analogs to traditional government action.  Consider from this perspective the following examples of the differing treatment of ICT practices and other types of conduct.

Access for those with disabilities:  In the United States, buildings with public access are subject to federal and state rules intended to ensure that those with disabilities can enter, pass between floors, and use essential services such as rest rooms, all without undue difficulty. 12   In contrast, there has thus far been little legislation or court action directed at ensuring that those with disabilities will not be the subject of discrimination and unequal opportunity when it comes to utilizing ICT to access essential information, services and opportunities.  Indeed, governments themselves have sometimes been slow to assign a high priority to accessibility in their procurement of ICT for use by their own employees.  The following are examples of recent events that highlight the evolution of thought in this area:

          Procurement:  In Massachusetts, the most significant argument leveled against adoption of a rule requiring the procurement of products supporting the OpenDocument Format (ODF) was that such products did not provide the same degree of accessibility for those with disabilities as Microsoft Office.  This charge was acknowledged by the Information Technology Division (ITD), and a variety of efforts were mounted in cooperation with leaders of the community of the disabled to address the issue.  As a result of these actions and the efforts of private sector developers, Massachusetts was able to meet its originally planned conversion date (January 1, 2007) and begin deploying ODF-compliant software, using "plug-ins" to permit those with disabilities to continue to use Office until ODF-compliant products had closed the accessibility gap with their Microsoft counterparts.13

          Private industry:  In February of 2006, the National Federation of the Blind brought suit in California against retail giant Target Corporation, charging that Target's Web pages did not support available accessibility standards.  As a result, the blind could not successfully make purchases at the Target Website, because (for example) images could not be identified and described using screen reader software, nor could a purchase be completed without the use of a mouse.  The suit was brought under the California Unruh Civil Rights Act, the California Disabled Persons Act and the federal Americans with Disabilities Act, and immediately commanded nationwide attention.  Not long after, a Federal District Court judge denied Target's motion to dismiss, holding that such a suit could be properly brought under existing law. 14

The prospect of widespread liability for inaccessibility of Websites may seem novel in the abstract, but in fact the World Wide Web Consortium (W3C) has already created a variety of standards that can be implemented to bring Websites to a greater level of accessibility – either voluntarily, or perhaps with time, as a result of government regulation. 15
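One concrete example of what such standards require: the W3C's Web Content Accessibility Guidelines call for text alternatives for images, so that screen reader software has something to announce.  The short sketch below, our own illustration using only Python's standard library, flags image tags that lack such alternatives; the markup fed to it is invented for the example:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect the sources of img tags that have no (or empty) alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            if not attr_dict.get("alt"):
                self.missing.append(attr_dict.get("src", "(unknown)"))

checker = AltTextChecker()
# Hypothetical markup: one accessible image, one inaccessible one.
checker.feed('<img src="logo.png" alt="Company logo"><img src="buy.png">')
# checker.missing now lists images a screen reader could not describe.
```

A real conformance tool checks far more than this one rule, of course, but the principle is the same: accessibility standards turn a vague aspiration into conditions that software can verify.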

Will specific regulations be required to ensure appropriate Web accessibility?  Using the United States as an example, the Target case is of special interest, as the plaintiffs brought suit under existing anti-discrimination laws, and the judge in the Federal District Court in which the case is being tried has, as a preliminary finding, concluded that this is appropriate.  Accordingly, existing laws may prove adequate to the task.

On the other hand, reaching ultimate certainty in that regard can take years, if different courts reach different conclusions.  If that occurs, it would require action by the Supreme Court to ultimately resolve such inconsistencies – which it could decline to do for some time.  Even if the Supreme Court were to intervene, it may choose to limit its holdings to the specific facts and issues presented, which might or might not prove to be dispositive on broader questions.

As a result, action by Congress could serve to shortcut the process by conclusively amending existing laws to provide specific guidance, or by passing new laws extending current protections to relevant ICT-based settings.

Technical Accessibility:  Prior to the ICT revolution, government facilities and information were available to all, either by visiting or calling a government office, or by writing to the government and requesting information.  In each case, the means to do so (whether they be public transportation, the mails or the telephone) were available to all.  But with government enthusiastically embracing the Web, new questions arise, such as what obligation governments should have to make their information and services available to all, regardless of the technical platform and applications that citizens may choose to use?

The most common questions that arise in this area involve browsers and text applications.  Government Web sites that provide information that can only be read easily (or at all) in Internet Explorer (and not, for example, Firefox), or opened in Word (and not Apple applications or ODF-compliant software), provoke angry responses from users of the excluded products.  When the information in question is vital and needed on an urgent basis – as occurred following the Katrina disaster on the Gulf Coast of the United States – this question transcends convenience, and goes to the core of a government's responsibilities to its citizens.

Document Accessibility:  The decision of the Massachusetts ITD to adopt ODF (and not the Microsoft Office Open XML format, also known as OOXML) in August of 2005 immediately brought the question of long-term accessibility of documents before the public eye.  At the heart of the matter was the responsibility of government to ensure that public records should be available not only in the near term, but over long periods of time, regardless of whether the proprietary products in which they were originally created maintain backwards compatibility, or remain in existence at all.

Following the ITD's decision, a variety of state, local and federal governments around the world took up the issue, and an increasing number of governments and others are either investigating, or have already moved to purchase or require ODF-compliant products. 16   The level of government interest in ODF has been augmented in part by the formation of an organization called the ODF Alliance, which was formed for the specific purpose of educating policy makers about ODF, and promoting its uptake. 17   The Alliance was announced on March 3, 2006, with 36 founding members.  As of the date of this writing, its membership has swelled to several hundred members around the world, including government representatives at all levels, non-profits, open source software organizations, corporations, and others.

          International equality of access:  The ability to gain access to the Internet and the Web (and with the ever-expanding data density of interactive Web content, broadband access) is becoming increasingly significant on a national as well as an individual basis.  In part, such access is a matter of national investment and priority setting, but it is also a matter of international action (or inaction) as well: large sections of Africa, for example, cannot yet gain broadband access, because the international fiber-optic cables needed to provide it have not yet been laid. 18

Accessibility issues are not uniquely the province of national governments, however.  Globally, an entirely different set of issues comes into play, some of which are infrastructural, and some of which are uniquely standards-dependent.  The following are standards-based examples of ways in which international commercial and government collaboration are needed in order to ensure provision of equal access to all citizens of the world:

          Domain name support:  The allocation of domain names and the control of the root directories of the Internet have been topics of heated discussion almost from the creation of the Internet Corporation for Assigned Names and Numbers (ICANN).  While the root directories are insignificant in size, they offer the technical ability to literally turn off a country's access to the Internet.  Similarly, ICANN plays a role in deciding how many unique Website addresses are available to any given nation.19

          Character support:  Information that is entered in one character set needs to be convertible into other character sets in order for that data to be exchanged internationally without limitation.  At the most basic level, all currently utilized written languages must be supported by the infrastructure of the Internet in order to guarantee equal access to all.  This need is particularly acute at present, given that the great majority of data currently on-line has been input in first-world languages, and in particular in English.  In order to permit full academic and historical collaboration, the character sets of archaic languages (e.g., Babylonian) must be digitized as well.  One of the great unsung teams of heroes of the Internet labors for the relatively unknown Unicode Consortium, which has worked for years to accomplish the Herculean task of digitizing all existing and historical character sets. 20
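The practical effect of a universal character set can be seen in a few lines of code.  The sketch below, our own illustration using Python's built-in codecs, shows that Unicode's UTF-8 encoding round-trips text written in several scripts, while a single-region legacy encoding cannot represent them at all:

```python
# UTF-8 (a Unicode encoding) can represent text in any supported script,
# so every sample survives an encode/decode round trip intact.
samples = ["hello", "héllo", "こんにちは", "مرحبا"]
roundtrip_ok = all(s.encode("utf-8").decode("utf-8") == s for s in samples)

# A legacy single-region encoding (Latin-1 covers only Western European
# characters) simply cannot express Japanese text at all.
try:
    "こんにちは".encode("latin-1")
    legacy_failed = False
except UnicodeEncodeError:
    legacy_failed = True
```

Before Unicode, international data exchange required juggling dozens of such mutually incompatible regional encodings; a single universal character set removes that barrier for every language it covers.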

          Language support:  Like countries (which are identified by three-letter codes under a standard maintained by ISO), languages need identifiers as well.  And like character sets, codes are needed for lost languages as well as for those that are spoken today.  Or, as described by the registrar of ISO 639-3, Codes for the representation of names of languages, "ISO 639-3 attempts to provide as complete an enumeration of languages as possible, including living, extinct, ancient, and constructed languages, whether major or minor, written or unwritten." 21 These codes are used, among other purposes, for specifying languages at Websites, identifying interpreter needs, and for research cross checking.  The latest revision of ISO 639-3 (released in February of 2007) increased support from just 478 languages to 7,546 – although this number is still short of the ultimate goal.
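A minimal sketch of how such codes are used in practice follows.  The lookup table holds only a handful of illustrative entries: the code/name pairs reflect the published ISO 639-3 tables, but the dictionary itself is our own invention for this example, not an official interface:

```python
# Illustrative subset of ISO 639-3 codes; the full standard enumerates
# thousands of living, extinct, ancient, and constructed languages.
ISO_639_3 = {
    "eng": "English",      # living
    "fra": "French",       # living
    "akk": "Akkadian",     # ancient
    "epo": "Esperanto",    # constructed
}

def language_name(code):
    """Resolve a three-letter language code, e.g. from a Web page's metadata."""
    return ISO_639_3.get(code, "unknown code")
```

A Web page tagged with such a code lets software (a search engine, a screen reader choosing a pronunciation voice, a library catalog) identify the content's language unambiguously, which is exactly the interoperability the standard exists to provide.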

V        The Future

Representative governments, being the servants rather than the masters of their citizens, tend to be reactive rather than proactive by nature.  It is therefore hardly surprising that governments have not yet given great attention to the new realities and challenges of ICT at the level of the standards that in part enable these technologies.  But as the importance of ICT increases and public interest groups take ever-greater notice of this reality, government will need to be increasingly knowledgeable about the role of ICT standards, and the mechanisms by which they are created.

How should government react to this anticipated reality?  And should its reaction be different when using standards to advance social policy, as compared to technical, goals?

Areas of standards concern:  Using the examples already explored in this article, here are some of the situations that government will need to consider:

          Document formats:  There are billions of electronic documents in existence today, some of which are no longer easily accessible due to abandonment of the applications in which they were created. What guarantee will there be that documents created today will be accessible tomorrow, and what role should governments play in achieving that goal?  More specifically:

  • Procurement:  Should government limit its own procurement to applications that support open standards in order to provide greater incentives for such standards to be created? If so, should this policy be applied in some, or all situations? 
  • Social policy:  Should governments concern themselves only with their own information needs, or should they take an interest in the long-term accessibility of information generally, and therefore seek to influence the behavior of information users generally? 
  • Public accessibility:  What document formats should governments make available to their citizens?  Does supporting a proprietary format represent an endorsement of that format and its vendor-owner?  Does the failure by a government to support a format fail that portion of the citizenry that utilizes that format?  How can such concerns be balanced with economic and other resource issues?

          Accessibility for those with disabilities:  Clearly, ICT accessibility is of crucial and increasing importance to full participation in society.  But how should governments respond to this emerging reality?  Consider the following questions:

  • Analogs:  Are ICT accessibility issues different in meaningful, as compared to simply mechanical, ways from their analogs in the physical world?  If they are not, then existing laws may serve to meet the need.  If they are in fact different, then amendments to existing laws, and/or new laws, will be needed.
  • Cost/benefit ratios:  Are the costs associated with accommodating the needs of those with disabilities in the virtual, as compared to the physical world, significant (in either direction)?  If so, should this indicate the need for any differences in laws or penalties?
  • Coverage:  Should ICT-related standards requirements be broader or narrower in their coverage than existing laws?
  • Responsibilities:  Who should create the standards upon which ICT-related regulations rely?  Should they be regulators, SDOs, or any SSO with domain expertise?

Issues of standards concern:  In addressing areas of standards concern, subsidiary issues will also need to be considered, including the following:

Definitions: If governments are to take greater interest in standards, what type of standards should they find to be acceptable?  This gives rise to additional questions:

  • "Open standards:"  Current government procurement law in the United States, as codified in OMB Circular A-119, favors the use of widely adopted standards, but does not state a preference between those adopted by stringent consensus processes and those specifications that become de facto standards purely as a result of wide usage.  Should only standards that meet certain threshold requirements of process, lack of proprietary control, or other criteria be used for certain purposes, and if so, for which purposes?  And what should those requirements be, given that opinions vary widely on what an "open standard" should mean?
  • Uniformity:  If every state and national government were to adopt its own definition of an open standard, it would defeat the purpose of having standards at all, since vendors would simply not bother to meet so many disparate requirements, other than in the non-elective case of procurement – with the result that governments would simply be reverting to the use of "government unique specifications." 22

Standards infrastructure:  If governments conclude that ICT standards are of increasing significance to the public interest, how can they ensure that an adequate and timely supply of such standards is available, without undertaking to develop such standards themselves?  And under what criteria should SSOs be differentiated, if at all?

The traditional standards infrastructure evolved around national standards bodies that created standards that were in turn adopted by quasi-governmental, global standards organizations.  However, in information technology, most standards are set by non-accredited rather than accredited organizations, and in communications technology the number is increasing as well. 

Historically, many governments around the world have preferred, or required, the use of standards approved by global standards bodies, such as ISO and IEC.  Happily, ISO/IEC JTC 1 offers the Publicly Available Specification (PAS) process as an avenue for consortium-based standards to advance to approval as ISO/IEC standards.  But given that consortia are usually global in membership to begin with, is there a need for a second layer of global approval that adds time and effort, but may or may not add additional quality control?

          Representation:  The accredited standards world espouses equality of access to all stakeholders in the standards process.  However, that ideal is often hard to achieve in practice, due to the time and resources needed to participate in the standard setting process.  Lack of consumer interest can provide a challenge to broad representation of stakeholders as well.  If standards are to be used to further social agendas, should some greater level of attention be given to achieving broader participation in SSOs?  Or perhaps some other form of safeguard is needed to vet standards at the legislative level?

          Certification:  If ICT standards become more legally significant, a greater need will arise for effective certification of compliant products.  However, as a generalization, compliance with information technology standards is less frequently, and less stringently, tested than compliance with standards for other types of products where (for example) safety is a concern.  This is in part due to lack of demand, and in part due to the high costs of creating industrial strength compliance tests relative to the funds that vendors are willing to invest in standards-related collaborative activities. 23  If compliance becomes a legal necessity, then testing and certification will become more important.  The costs of such added activities will presumably impact product prices.

          Funding:  Standards development, conformity assessment and participation costs in the United States are currently borne almost entirely by SSO participants and vendors.  If governments that endorse the bottom-up process want non-profits and the public sector to continue to provide needed standards, then such governments should consider whether some measure of public funding should be supplied to help support this process.

VI           Summary and Recommendations

Summary:  Ensuring that appropriate ICT standards are available to meet the needs of modern society, and that those standards are in fact adopted, is a legitimate (and perhaps overdue) area of concern for governments to consider.  While private industry may in many, or even most, respects voluntarily act to provide the standards needed, it does not necessarily follow that this will occur in all cases, or as quickly as may be needed, or in as effective a fashion as may be needed to achieve appropriate social goals.

ICT "areas of standards concern" are many, including eHealth, homeland security, accessibility, public records, emergency response, and much more.  Moreover, standards-related work and regulations that have already been accomplished in the physical world must now be replicated in the virtual world.  This task needs to be informed by, but not limited to, the existing body of laws, regulations and standards and the decisions underlying them.  In some cases, amending old regulations, passing new laws, or launching new programs will likely be necessary.

Recommendations:  As a result, federal, state, and local governments should reevaluate what their roles and responsibilities in relation to ICT standardization should be.  The following recommendations are intended to assist in embarking upon such a study.

          Recognition:  Governments should affirmatively recognize the importance of ICT standards in many areas, including ensuring full participation in, and access to, government, and achieving equality of opportunity in education and employment.

          Public records:  Governments should ensure that all public records are created, available, and archived in formats that ensure, to the greatest extent possible, practical long-term preservation and access.

          Competence:  Legislators, regulators and Congressional and agency staff need to be aware of ICT technologies, realities and needs.  Existing government resources may not be adequate to this task.  Increased funding for the National Institute of Standards and Technology (NIST), relevant Congressional subcommittees and personnel in some agencies may be needed in order to properly support decision makers and administrators.

          Infrastructural support:  Virtually no money trickles from the top down to support the bottom-up standards development process upon which US society and commerce depends today.  A small amount of funding could have very significant impact, if directed at discrete projects, such as funding testbeds upon which standards can be based, the creation of best practices, and other inexpensive projects.

          Underwriting certification testing:  Conducting certification tests can provide revenues to SSOs, but only after the test suites upon which the tests are based have been funded and created.  A modest "evergreen" fund (perhaps $40 million) could provide low-interest loans to SSOs that could be used to create test suites.  The loans could then be repaid, using profits derived from certification testing using the test suites so created.

          Standards availability:  Consortia charge high membership fees and make their standards available for free, while SDOs charge less to participate, and sell their standards.  Where the input of consumers and/or the availability of standards without charge is important to the public interest, government should consider a process whereby an SSO could apply for offsetting funds, in order to allow consumers and others to participate in consortia without charge, and to make SDO standards available without cost.

          Open standards definition:  A common definition of open standards should be developed that would ideally be used by all governments, both federal and state, for procurement and social agenda purposes.  This would enable vendors to efficiently create and support standards processes that meet the common definition.  Such a definition should be high-level and not detailed, to avoid gratuitous and harmful restriction on flexibility where flexibility is needed, and should be agnostic as to the nature of the SSO (i.e., accredited or non-accredited), so as not to exclude the consortia that set the majority of ICT standards.

The list of recommendations is hardly exhaustive of the actions that governments should consider.  But it should be sufficient to facilitate the commencement of a dialogue on that topic.  Hopefully, such a discussion among interested parties may be launched in the not too distant future.


Comments? Email:

Copyright 2007 Andrew Updegrove

End Notes

1. National Technology Transfer and Advancement Act of 1995, 15 U.S.C. § 3701 (1995).

2. OMB, Federal Participation in the Development and Use of Voluntary Consensus Standards and in Conformity Assessment Activities, Circular A-119, Revised (Feb. 10, 1998), available at <>.

3. In that regard, it is worth noting that while government participation in consortia and SDOs has increased, government agencies rarely if ever join a consortium at the higher and more influential levels of membership that usually control strategy.  Even when government representatives do take board seats, they frequently abstain from voting.

4. An excellent current example is provided by the near-contemporaneous announcement by both Wal-Mart and the United States Department of Defense that each would require certain of its suppliers to adopt and deploy RFID technology in their deliveries.  For more on the recent evolution of United States government standards policy, see Updegrove, Andrew, A Work in Progress: Government Support for Standard Setting in the United States: 1980 – 2004, Consortium Standards Bulletin, Vol. IV, No. 1, January 2005, at <>, and sources cited therein.

5. The text of the ATBT can be found at < > (accessed March 3, 2007).  For more on the WTO's activities to prevent technical barriers to trade, see its activity page on that topic, and the many links and resources that can be accessed there, at <> (accessed March 3, 2007).

6. Where, of course, they are passed on to consumers and other end-users.  The net result is therefore that the costs are not avoided by taxpayers, but simply reallocated through a different formula.  The top-down vs. bottom-up debate is long-standing and energetic.  For a discussion of the two philosophies, as manifested by the United States and China, see Updegrove, Andrew, Top Down, or Bottom Up?  A Tale of Two Standards Systems, Consortium Standards Bulletin, Vol. IV, No. 4, April 2005, at <>.  A recent example of the approach in action with respect to a single product area is provided by the paths taken to develop and deploy cellular phone technology.  In Europe, a "top down" approach led to the early adoption of a single standard that came to be adopted almost everywhere in the world (except in the United States), providing seamless geographic services to those using phones compliant with the resulting GSM standard.  In the United States, a market driven approach led to competing standards and near-term incompatibilities between carriers, but also to the eventual adoption of a standard that some believe to be technically superior to the GSM standard.  The cellular phone example has been widely examined from this perspective.  For an alternative view of the same history – contending that results are more likely to be based upon openness of architecture – see Rice, John and Galvin, Peter, The Development of Standards in the Mobile Telephone Industry and Their Effect on Regional Industry Growth, 2000, at <>.

7. American National Standards Institute, 9-11 Commission Hearing Calls for Standards in Areas of Emergency Response, May 19, 2004, at <>.

8. Kurlander, Neil, Out of Step - NIEM and N-DEx:  Two national data-sharing initiatives face major challenges, January 25, 2007, at <> (accessed February 27, 2007).  Efforts are ongoing to bridge the gaps between these islands of data.  Two initiatives discussed in the same article are the National Information Exchange Model (NIEM) and the National Data Exchange (N-DEx).

9. Law No. 108-173, 117 Stat. 2066.  The full text of the Act can be found at <> (accessed March 3, 2007).

10. On October 28, 2005, the Commission announced the release of a report, making many standards and certification based recommendations, titled Ending the Document Game: Connecting and Transforming Your Healthcare Through Information Technology.  These recommendations may be found at <>, and the complete report at <>.

11. On January 23, 2007, U.S. Department of Health and Human Services (HHS) Secretary Michael O. Leavitt announced his acceptance of thirty (30) consensus standards recommended by HITSP.  See: HHS Secretary Leavitt Accepts Recommendations from Healthcare Information Technology Standards Panel (HITSP), January 25, 2007, at <> (accessed February 27, 2007).

12. Building owners were not as a rule required to upgrade their facilities upon the passage of these laws.  However, modifying such a "grandfathered" structure subjected the building to the upgrading requirement, and all new buildings subject to these laws are required to comply with accessibility regulations at the time of completion.

13. While ODF-compliant products were justifiably criticized by members of the disabled community when the ITD announced its initial support of ODF 1.0 in 2005, that situation had changed dramatically by February 2007, due to actions taken by OASIS working groups, the ITD, and the developers of the plugin software, as reported in the February 13, 2007 OASIS press release announcing the adoption of ODF version 1.1.  See OpenDocument Version 1.1 an OASIS Standard, at <>.

14. National Federation for the Blind v. Target Corporation, Northern District of California Case No. C 06-01802 MHP.  The order can be found at <>.  An overview, the complaint, a press release, and other information can be found at the NFB's Website at this page:  <>.

15. The W3C Accessibility Initiative Web page can be found at: <>.

16. In response to the growing interest in ODF by governments, particularly in Europe, Microsoft announced in the fall of 2005 that it would contribute the OOXML specification to Ecma, a European standards organization.  ODF (originally developed by OASIS, a consortium) was preliminarily approved by ISO/IEC in May of 2006, and became an official standard later the same year.  Meanwhile, OOXML was approved by Ecma in early December, becoming Ecma-376, and weighing in at over 6,000 pages (in contrast to ODF's c. 800 pages).  As this article is being written, Ecma-376 is under consideration by ISO/IEC JTC 1 under the "Fast Track" process.  For much more on the competition between ODF and OOXML, see the September 2005 issue of the Consortium Standards Bulletin, titled Massachusetts and OpenDocument:  The Commonwealth Leads the Way (Updegrove, Andrew, Vol. IV, No. 9), at <>, and the many blog entries that can be found in the OpenDocument file at Updegrove, Andrew, The Standards Blog, September 17, 2005 to date, at <>.  Links to virtually all primary resources can be found in these blog entries.

17. The mission of the ODF Alliance is more specifically stated in part as follows:  "…To enable the public sector to have greater control over and direct management of their own records, information and documents, the ODF Alliance seeks to promote and advance the use of OpenDocument Format (ODF) as the primary document format for governments.  The alliance works globally to educate policymakers, IT administrators and the public on the benefits and opportunities of the OpenDocument Format, to help ensure that government information, records and documents are fully and natively accessible across platforms and applications, even as technologies change."  Mission Statement of the ODF Alliance, at <> (accessed March 3, 2007).

18. Perhaps the starkest current example of disparities in access among countries due to the decisions of local governments can be found by comparing conditions in the two Koreas.  While South Korea currently has one of the highest per-capita broadband access rates (ranking number 4 globally, with 26.4% of all inhabitants enjoying broadband access as of June 2006), North Korea has one of the lowest.  A list of most-connected countries and related data can be found at the Organization for Economic Co-operation and Development (OECD) at this Web page: <,2340,en_2825_495656_37529673_1_1_1_1,00.html>.

19. The topic of address allocation has some sensitivity.  In the days before address needs began to grow explosively (and before ICANN was created), IBM was assigned 33 million addresses, Stanford was awarded 17 million, and the entire People's Republic of China was magnanimously awarded just 9 million.  The situation was later rectified by ICANN, but the continuing indirect control of ICANN by the United States through the oversight power reserved to the US Department of Commerce continues to generate concern among some nations, and annoyance among others.  For a fuller discussion of the ICANN controversy, see Updegrove, Andrew, WSIS, ICANN and the Future of the Internet, Consortium Standards Bulletin, Vol. IV, No. 11, November 2005, at <>, and sources cited therein.

20. The Unicode Consortium Website can be found at <>.  Two appreciations of the Unicode Consortium and its mission that I have written previously are Savoring the Unicode, Consider This, October 29, 2003, at <> and The Unicode Standard 5.0: An Appreciation, The Standards Blog, October 17, 2006, at <>.

21. SIL International Website, at <>, accessed March 4, 2007.  For more about ISO 639-3 and SIL International, see Updegrove, Andrew, Language Codes and a "Philosophy of Three-Part Service," Consider This #46, at <>.

22. In the case of open document formats, several American states are evidently collaborating on a common definition of an open standard.  Bills have been introduced in the current legislative sessions in Minnesota, Texas and California that would require that documents be created and archived only in applications that support XML-based formats that are:
(1) Interoperable among diverse internal and external platforms and applications.
(2) Fully published and available royalty-free.
(3) Implemented by multiple vendors.
(4) Controlled by an open industry organization with a well-defined inclusive process for evolution of the standard.
For more on these bills and to find links to their texts, see Updegrove, Andrew, And California Makes Four, The Standards Blog, February 28, 2007, at <>.

23. For a more detailed explanation of compliance, certification and branding in the ICT industry, see the Certification Testing and Branding section of the Essential Guide to Standard Setting Organizations and Standards, at <>.






The Federal Trade Commission announced on February 5, 2007 that it had at last delivered a penalty ruling in its long-running prosecution of memory technology company Rambus, Inc.  The full Board of Commissioners had earlier found, on appeal from a holding in favor of Rambus by an FTC Administrative Law Judge in 2004, that Rambus had illegally created a monopoly in certain standards-reliant technology by abusing the JEDEC standard setting process in the early 1990s.

That opinion was handed down in August of 2006, at which time the Commissioners announced that they would hold further hearings with FTC Complaint Counsel and Rambus, and would welcome industry input regarding what penalties would be appropriate to levy against Rambus.  Several industry groups and I filed amicus briefs in response.  My own brief urged the FTC to include a punitive element, in order to emphasize that abuse of the standard setting process would have dire consequences.  Most obviously, such an element could bar Rambus from charging any royalties at all to those that wished to implement the standard at issue.

The stakes were high for Rambus while awaiting the Commissioners' decision, since that decision could influence the outcome and damages assessed in the multiple private cases that are ongoing between Rambus and the various semiconductor companies that have refused to pay royalties to Rambus.  These royalties relate to patents that the FTC had already held were illegally hidden by Rambus from the members of the JEDEC working group that created the SDRAM standard at issue.  At least one judge had already delayed further action in one of these cases, in order to learn what penalty the FTC might conclude would be appropriate under the circumstances.

For the standards community, the FTC's anticipated judgment would also be significant, because if Rambus were to be let off lightly, gaming the standards system could appear to offer better business returns than playing by the rules.  Such a perception could tempt other companies to try the same gambit, and also lessen participation in standard setting overall. Clearly, if there is more to lose than to gain by helping create, and then adopt, a standard, then standards development could become a process in which only the unaware – and those that wished to prey upon them – would participate.

In fact, the FTC had substantial latitude in arriving at its decision.  As noted by the Commission in the majority opinion announced on February 5, the Supreme Court has previously:

…emphasized the Commission’s wide discretion in its choice of remedy, and stated the expectation that the Commission would ‘exercise a special competence in formulating remedies to deal with problems in the general sphere of competitive practices’….[Thus, the Commission] enjoys ‘wide latitude for judgment’ in fashioning a remedial order, subject to the constraint that the requirements of the order bear a reasonable relationship to the unlawful practices that the Commission has found.

In a somewhat similar case, Dell Computer acquiesced in 1996 to an FTC consent decree, under which it surrendered any right to require payment of royalties by implementers of a VESA standard.  Like Rambus, Dell had been accused of failing to disclose a patent during a VESA standards development process, and then later asserting that patent against implementers of the same standard.  Dell was also required to subject itself to oversight in its standards-related activities for a period of ten years.  Most obviously, then, the FTC could decide to limit the royalties that Rambus could charge to implement the SDRAM standard, or to bar Rambus from charging any royalties at all. 

In the event, the FTC opted to require Rambus to license its essential patent claims (something that Rambus was already doing), to limit the amount of royalties that Rambus could charge, and to bar it from charging any royalties at all after three years.

In order to determine the amount of royalties Rambus would be permitted to charge, the Commissioners attempted to determine what licensing terms would have prevailed had Rambus disclosed its patents in timely fashion.  Or, as stated in the FTC press release, quoting the majority opinion:

“Having found liability, we want a remedy strong enough to restore ongoing competition and thereby to inspire confidence in the standard-setting process. At the same time, we do not want to impose an unnecessarily restrictive remedy that could undermine the attainment of pro-competitive goals,” it says.

“[T]he Commission has previously declared, and we agree, that ‘where the circumstances justify such relief, the Commission has the authority to require royalty-free licensing.’ . . . We conclude, however, that Complaint Counsel have not satisfied their burden of demonstrating that a royalty-free remedy is necessary to restore the competition that would have existed in the ‘but for’ world – i.e., that absent Rambus’s deception, JEDEC would not have standardized Rambus technologies, thus leaving Rambus with no royalties. . . .We have examined the record for the proof that the courts have found necessary to impose royalty-free licensing, but do not find it. ”

But how, exactly, does a court determine how to take the marketplace back in time, and predict how it would have behaved, based only upon assumptions derived from the evidence provided by each side?  The press release continues as follows:

“We therefore are left with the task of determining the maximum reasonable royalty rate that Rambus may charge those practicing the SDRAM and DDR-SDRAM standards. Royalty rates unquestionably are better set in the marketplace, but Rambus’s deceptive conduct has made that impossible. Although we do not relish imposing a compulsory licensing remedy, the facts presented make that relief appropriate and indeed necessary to restore competition,” the opinion states.

“[W]e find that a maximum royalty rate of .5% for DDR SDRAM, for three years from the date the Commission’s Order is issued and then going to zero, is reasonable and appropriate. We also find that a corresponding .25% maximum rate for SDRAM is appropriate. Halving the DDR SDRAM rate reflects the fact that SDRAM utilizes only two of the relevant Rambus technologies, whereas DDR SDRAM uses four.”
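The arithmetic of the capped rates is simple enough to sketch.  The two rates below come from the order itself; the helper function and the invoice amounts are hypothetical, for illustration only.

```python
# Royalty caps from the Commission's order: a 0.5% maximum rate for DDR SDRAM
# and half that (0.25%) for SDRAM, since SDRAM uses two of the four relevant
# Rambus technologies.  The helper and invoice figures are made up for
# illustration; the order itself speaks in rates, not dollar examples.
RATE_CAPS = {"DDR SDRAM": 0.005, "SDRAM": 0.0025}

def max_royalty(product: str, invoice: float, years_since_order: int) -> float:
    """Ceiling on the royalty Rambus could charge under the order."""
    if years_since_order >= 3:
        return 0.0  # the capped rates drop to zero after three years
    return invoice * RATE_CAPS[product]

print(max_royalty("DDR SDRAM", 1_000_000, 0))  # 5000.0
print(max_royalty("SDRAM", 1_000_000, 0))      # 2500.0
print(max_royalty("DDR SDRAM", 1_000_000, 3))  # 0.0
```

On a hypothetical $1 million DDR SDRAM invoice, the cap thus works out to at most $5,000 in royalties during the three-year window, and nothing thereafter.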

Significantly, the order also prohibits Rambus from charging more than these rates on any outstanding license agreement.  This element of the order would seem to automatically diminish the damages that Rambus could be awarded in any ongoing litigation based upon alleged infringement of the patents in question.

The Commissioners also imposed an oversight condition that echoes the intent, but does not replicate the specifics, of the Dell consent decree.  Under this section of the Commission's order, Rambus must in the future not only make complete disclosure of all relevant patents when and as required by the rules of any standard setting organization in which it participates, but it must "employ a Commission-approved compliance officer to ensure disclosure of intellectual property rights to standard-setting organizations and to verify the accuracy of Rambus’s periodic reports to the Commission," as well as keep records of its participation that can be audited by the Commission to monitor Rambus' compliance with the order.

Unlike last summer's decision, which was unanimous, the new opinion was supported by only three of the five Commissioners (Commissioners Pamela Jones Harbour and J. Thomas Rosch concurred in part and dissented in part; each also issued a separate opinion).

Those looking to future holdings by the FTC should note well that both of the dissenting Commissioners would have imposed more stringent penalties than the majority.  In his dissent, Commissioner Rosch stated his belief that Rambus should have been barred from receiving any royalties at all.  To do otherwise, he stated, would enable Rambus to “continue to reap the fruits of its ongoing violation of Section 2.”   Commissioner Harbour reached the same result, observing "I also dissent [because] I do not believe the remedy adopted by the majority goes far enough to restore competition."

What happens next?  Rambus has announced that it will appeal the Commission's order, which otherwise would take effect in 60 days' time.

Independent of any appeal by Rambus, Monday's decision is not the final word in the broader context of Rambus' liability.  Judges in the ongoing litigation between Rambus and the memory manufacturers may find the dissenting opinions of Commissioners Rosch and Harbour to be more persuasive.  And future standard setting participants might find that odds of 3-2 do not appear attractive at all, especially after the substantial costs of litigation already incurred by Rambus are taken into account.

The FTC press release can be found here.  Copies of the Commission’s opinion and order, the Commissioners’ separate statements and the other legal documents associated with this case are available from the FTC Website and also from the FTC’s Consumer Response Center, Room 130, 600 Pennsylvania Avenue, N.W., Washington, DC 20580.

The press release issued by Rambus can be found here.


Copyright 2007 Andrew Updegrove



Editor's note:  This month, I'm departing from tradition by including two blog entries instead of one, each of which relates to the theme of this month's issue.  Together, they better represent the degree of activity that is currently playing out in the public sector on a key standards-related policy issue: how can governments best protect public records?



February 06, 2007

Most of the attention this week relating to open document standards will focus on the responses ISO/IEC JTC 1 received regarding the Ecma 376/Microsoft OOXML submission during the "contradictions" phase of its Fast Track process, which ended on February 5.  I just posted this entry on that topic, reporting that a total of twenty national bodies have filed contradictions or other comments as part of this phase of the process.

But while this global drama has been playing out, I've learned that a third US state will consider requiring use of open document formats by government agencies (Massachusetts and Minnesota are the other two to date).  That state is Texas, where a bill has been introduced to require that only "open document formats" should be utilized by government agencies.  The bill has been designated as SB 446, and was filed on February 5 (the full text is reproduced at the end of this blog entry).

How does the Texas bill define an open document format?  As stated in the bill, such a format would need to be based upon the Extensible Markup Language (more commonly referred to as XML), would need to have been already adopted as a standard, and would be required to meet the following additional criteria as well:

(1)  interoperable among diverse internal and external platforms and applications;
(2)  published without restrictions or royalties;
(3)  fully and independently implemented by multiple software providers on multiple  platforms without any intellectual property reservations for necessary technology; and
(4)  controlled by an open industry organization with a well-defined inclusive process for evolution of the standard.

The language quoted is problematic in some ways (how many platforms and applications does it take to achieve diversity?  Does "without restriction" mean without even those restrictions that are deemed to be consistent with the most "open" standards in use today, such as a license term providing for defensive suspension upon assertion of infringement by a licensee?).  That aside, the excerpt quoted above clearly states the intention of the bill's proponents to exclude proprietary or overly limited specifications.

How would OOXML and ODF fare in Texas if the bill is adopted as written?  ODF would appear to meet the test today, while OOXML would have difficulty with the third requirement, at a minimum for some time – and perhaps forever, depending upon whether "fully and independently implemented" means in an office productivity suite in addition to Office 2007.  It could also be debated whether Ecma maintains an "inclusive process," given the relatively small size of its membership, and the fact that members at the highest level must be approved before they can join.
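The practical appeal of the bill's XML requirement is worth making concrete: an XML-based document can be read by any conforming parser, with no dependence on a single vendor's software.  The snippet below uses a made-up, minimal XML document (not actual ODF or OOXML markup) and Python's standard-library parser, purely to illustrate that point.

```python
import xml.etree.ElementTree as ET

# A hypothetical, minimal XML document -- not real ODF or OOXML markup --
# standing in for an "open, XML-based file format" of the kind the bill requires.
doc = """<document>
  <paragraph>Public records remain readable by any conforming parser.</paragraph>
</document>"""

# Any standards-conforming XML parser can extract the content, regardless
# of which application originally produced the file.
root = ET.fromstring(doc)
for para in root.findall("paragraph"):
    print(para.text)
```

The same property is what allows archived public records to outlive the application that created them, which is the central concern of these bills.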

The broad application of the proposed legislation is also significant.  If adopted in its current form, all "state agencies" would be affected – cutting a very wide swathe indeed.  As defined, a state agency would include not only the state offices that would be immediately apparent, but also all attorneys that are admitted to the State Bar, as well as all state colleges and universities.  Local lawyers and law firms may not take kindly to a bill that requires them to upgrade their IT infrastructure, and it will be interesting to see whether the Texas bar association takes a position on the bill, and perhaps lobbies against at least this part of it.

Those affected by the bill in its final form (assuming it passes) would be required to create and save, as well as be able to receive, documents in approved open formats after the bill's effective date – December 1, 2007.  Another section of the bill would bar those affected from converting any open document into a format "used by a single vendor."  The conversion of existing documents into approved formats would not initially be mandated, but the Texas Department of Information Resources (DIR) would be required to draft guidelines by December 1, 2008 for performing such conversions, taking into account considerations such as cost, document life, and need for public access.

It will be very interesting indeed to see how this bill fares.  On the plus side, the Texas DIR will be spared the wrenching experience that the IT managers of Massachusetts suffered when they sought to put such a policy in place on their own.  Too, debate over the bill will occur in public.  But on the negative side, the legislators of Texas may be surprised at the magnitude of effort that lobbyists on both sides of the issue may expend "educating" them on the issues at hand.

It will also be interesting to see if legislators in other states opt to file similar bills.  One would assume that the greater the number of states in which similar initiatives are launched, the wider will be the public dialogue that will follow.  Hopefully, this will lead to a progressively more informed debate, and an evolving consensus over the duties of government as regards public records.

That's an important topic, and as a result, I applaud the sponsors of this new bill, and look forward to the debates that will follow.

And what about the Minnesota bill?  As you may recall, a Minnesota legislator filed a bill during the previous legislative session that also included a bill-specific definition of open standards that gave me some concern, because that definition was not only quite detailed but also in many ways not in line with traditional definitions of "open standards."  Why was I concerned?  Because if every state legislates its own definition of what constitutes an "open standard," then there will be no "standard" definition of an open standard.  If that occurs, then how can vendors offer uniform licensing terms for products intended to be sold on a national basis, and how can standards organizations create intellectual property rights policies intended to meet the needs of the marketplace?

Obviously, last year's Minnesota bill highlighted the need for a consensus definition for an open standard.  This time around, the Minnesota proponents of open formats have filed a new and shorter bill (on January 17 of this year), with a more concise definition of an open standard.  I am told that the bill enjoys broader support.  I've included the text of that draft bill after the Texas text, and you can find it online at the Minnesota government Web site as well.

You can follow the progress of Texas S.B. No. 446 here.

You can follow the progress of Minnesota H.F. No. 176 here.

For further blog entries on ODF, click here.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Full text of the Texas bill as of today's date:





SECTION 1.  Subchapter F, Chapter 2054, Government Code, is amended by adding Section 2054.124 to read as follows:

Sec. 2054.124.  OPEN DOCUMENT FORMAT REQUIRED.  (a)  In this section, "state agency" means:

(1)  a board, commission, council, department, office, authority, or other agency in the executive branch of state government created under the constitution or a statute of the state, including an institution of higher education as defined by Section 61.003, Education Code;
(2)  the legislature or a legislative agency; or
(3)  an appellate court or an agency in the judicial branch of state government, including the State Bar of Texas.

(b)  Each electronic document created, exchanged, or maintained by a state agency must be created, exchanged, or maintained in an open, Extensible Markup Language based file format, specified by the department, that is:

(1)  interoperable among diverse internal and external platforms and applications;
(2)  published without restrictions or royalties;
(3)  fully and independently implemented by multiple software providers on multiple platforms without any intellectual property reservations for necessary technology; and
(4)  controlled by an open industry organization with a well-defined inclusive process for evolution of the standard.

(c)  Each state agency must be able to receive electronic documents in an open, Extensible Markup Language based file format for office applications and may not change documents to a file format used by only one vendor.

(d)  The department shall develop guidelines for state agencies to follow in determining whether existing electronic documents must be converted to an open, Extensible Markup Language based file format.  In developing guidelines under this subsection, the department shall consider:

(1)  the cost of converting electronic documents;
(2)  the need for public access to the documents; and
(3)  the expected storage life of the documents.

SECTION 2.  Not later than September 1, 2008, the Department of Information Resources shall develop the guidelines required by Section 2054.124(d), Government Code, as added by this Act.

SECTION 3.  (a) Except as provided by Subsection (b) of this section, Section 2054.124, Government Code, as added by this Act, applies only to electronic documents created on or after the effective date of this Act.

(b)  Section 2054.124, Government Code, as added by this Act, applies to electronic documents created, exchanged, or maintained before the effective date of this Act only to the extent required by the guidelines developed by the Department of Information Resources under Section 2054.124(d), Government Code, as added by this Act.

SECTION 4.  This Act takes effect December 1, 2007

* * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Full text of the Minnesota bill as of today's date:

A bill for an act relating to state government; establishing Preservation of State Documents Act; proposing coding for new law in Minnesota Statutes, chapter 16E.



Effective July 1, 2008, all documents including text, spreadsheets, and presentations of the state of Minnesota shall be created, exchanged, maintained, and preserved in an open, XML-based file format, as specified by the chief information office of the state, that is:

(1) interoperable among diverse internal and external platforms and applications;
(2) fully published and available royalty-free;
(3) implemented by multiple vendors; and
(4) controlled by an open industry organization with a well-defined inclusive process for evolution of the standard.

By that date, the state of Minnesota shall be able to accept all documents received in open document format for office applications and shall not migrate to a file format currently used by only one organization.



February 28, 2007

The big news of the day is that a legislator in California has decided that it is time to convince his colleagues that the Golden State should become the latest U.S. State to get on the open formats bandwagon. The California initiative represents the third piece of legislation to the same purpose to be filed in recent weeks (the others were filed in Texas and Minnesota). A link to the California bill is here, and the full text appears at the end of this blog entry. 

As defined in the draft legislation, the bill would require that "all documents, including, but not limited to, text, spreadsheets, and presentations, produced by any state agency shall be created, exchanged, and preserved in an open extensible markup language-based, XML-based file format, as specified by the department."  Significantly, the bill continues:

When deciding how to implement this section, the department in its evaluation of open, XML-based file formats shall consider all of the following features:

(1) Interoperable among diverse internal and external platforms and applications.
(2) Fully published and available royalty-free.
(3) Implemented by multiple vendors.
(4) Controlled by an open industry organization with a well-defined inclusive process for evolution of the standard.

Happily for all concerned, this definition is very close, and in many cases identical, to the open standards definitions used in both the Texas and Minnesota bills. As I observed at the time that the original Minnesota legislation was introduced (which included a rather eclectic definition of an open standard), it would do more harm than good for every state to enact its own definition of an "open standard." Were this to happen, vendors would have neither the incentive, nor perhaps even the ability, to meet the multiple procurement requirements that were legislated, dooming the process to failure.

Like the Texas (but not the Minnesota) bill, the California legislation calls for the appropriate state IT agency (in this case, the California Department of Technology Services) to create guidelines for use by state agencies to decide whether a given product is based on "open, XML-based formats," which guidelines should take into account considerations such as cost, the need for public accessibility and the expected storage life of the documents in question.  This language is identical to that included in the Texas bill in its current form.

It was 18 months ago that Massachusetts launched this trend, when its Information Technology Division revised the Enterprise Technical Reference Model (ETRM) upon which its IT procurement is based.  That revision not only required open standards and welcomed open source in its procurement, but also blessed a (then) relatively unknown open document format standard called Open Document Format, or ODF.  Since then, government procurement based on open standards in general, and the virtues of ODF in particular, have been very much in the spotlight.

2006 saw the first filing of an open standards bill as well (in Minnesota), after ODF had been in the news for some time.  That bill was not voted on before the legislative session ended.  But earlier this year, the sponsor of the original bill reintroduced a similar (and in my view, much improved) bill in Minnesota.  Another bill was introduced by a State Senator in Texas, using an identical definition of open standards. I am told that a State Representative has now agreed to join the State Senator as a co-sponsor of the Texas bill, allowing it to progress to the next step of consideration.

The California initiative was introduced by Democratic Assembly Member Mark Leno as AB 1668, and like the Texas bill, would (if enacted) go into force on January 1, 2008.

For further blog entries on ODF, click here

* * * * * * * * * * * * * * * * * * * 


INTRODUCED BY   Assembly Member Leno 

FEBRUARY 23, 2007

An act to add Section 11541.1 to the Government Code, relating to information technology.


AB 1668, as introduced, Leno. Information technology: open-document software

Existing law sets forth the requirements for the acquisition of information technology goods and services, and establishes the duties and responsibilities of the Department of Technology Services. This bill would require all state agencies, beginning on or after January 1, 2008, to create, exchange, and preserve all documents, as specified, in an open extensible markup language-based, XML-based file format, and to start to become equipped to receive any document in an open, XML-based file format, as specified. The bill also would require the Department of Technology Services to evaluate, as specified, all open, XML-based file formats and to develop guidelines, as specified, for state agencies in using open, XML-based file formats.

Vote: majority.

Appropriation: no.

Fiscal committee: yes.

State-mandated local program: no.


      SECTION 1. Section 11541.1 is added to the Government Code, to read:

      11541.1. (a) Beginning on or after January 1, 2008, all documents, including, but not limited to, text, spreadsheets, and presentations, produced by any state agency shall be created, exchanged, and preserved in an open extensible markup language-based, XML-based file format, as specified by the department. When deciding how to implement this section, the department in its evaluation of open, XML-based file formats shall consider all of the following features:

 (1) Interoperable among diverse internal and external platforms and applications.
 (2) Fully published and available royalty-free.
 (3) Implemented by multiple vendors.
 (4) Controlled by an open industry organization with a well-defined inclusive process for evolution of the standard.

      (b) Beginning on or after January 1, 2008, state agencies shall start to become equipped to accept all documents in an open, XML-based file format for office applications, and shall not adopt a file format used by only one entity.

      (c) The department shall develop guidelines for state agencies to follow in determining whether existing electronic documents need to be converted to an open, XML-based file format. The department shall
consider all of the following: 

(1) The cost of converting electronic documents.
(2) The need for the documents to be publicly accessible.
(3) The expected storage life of the documents.


Copyright 2007 Andrew Updegrove




March 6, 2007

#46 Language Codes and a "Philosophy of Three-Part Service"

The process of creating, maintaining and studying technical standards, as everyone knows (except those who frequent this site), must be dry as dust, deadly boring, and the very stuff of which tedium is made.  Granted, for those versed in the craft, the creation of standards describing complex interoperability interfaces must demand an attention to detail that might capture the interest of those involved.  But to others, standards must be totally opaque and meaningless – mere arcane squigglings at best.

How much more deadly, then, must be the creation and maintenance of those elementary and arbitrary, albeit utilitarian, codes used to designate items within a single category of data that need shorter and more uniform names in order to be efficiently processed – time codes, country codes and such?  Truly, being one of the bit-stained wretches tasked with responsibility for such binary trivia must be ultimately and terminally boring.

Right?  Well, perhaps.  But perhaps you should defer judgment until you've Considered This:

One of the amazing things that Google has brought to the world is the Google Alert – the modern analog to the traditional fee-based clipping services that would scan newspapers and journals for articles on a given subject on demand.   But unlike a clipping service, which used to be quite pricey and delivered its results only in periodic batches, Google Alerts arrive instantaneously and for free.  How cool (and distracting) is that?

Not long ago, one of my standing Google Alerts delivered up a press release announcing that a new ISO language code standard had been released.  What are language codes?  In simplest terms, they are more or less randomly assigned three-letter combinations that can be used as universally recognized surrogates for the names of languages.  In that form, they can be easily used as encodings to specify (for example) what language a Web site uses. 
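To make the encoding idea concrete, here is a minimal sketch in Python. The code-to-name pairs are drawn from examples mentioned later in this article; the lookup function itself is purely illustrative, and not part of any official ISO 639-3 tooling:

```python
# A few ISO 639-3 three-letter codes and the languages they designate.
# (Pairs taken from examples cited in this article.)
ISO_639_3 = {
    "aab": "Alumu-Tesu",
    "aax": "Mandobo Atas",
    "abg": "Abaga",
    "abj": "Aka-Bea",
}

def language_name(code):
    """Return the language a three-letter code stands for, if known."""
    return ISO_639_3.get(code.lower(), "unknown code")

# A Web page, for instance, can declare its language with such a code,
# e.g. <html lang="abg"> — software then resolves the code to a name:
print(language_name("abg"))   # Abaga
```

Because every code is exactly three letters and universally agreed upon, a program never has to guess whether "Abaga" and "abaga" (or a translated name) refer to the same language.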

Just how many languages are included in the standard?  From the press release, I learned that the new version of ISO 639-3 categorizes a grand total of 7,546 languages, up dramatically from the 478 languages included in the previous version.

But the press release hinted at some unanticipated information as well, which piqued my interest and led me to dig a bit more deeply.  The following are some of the things that caught my fancy as I learned more.

First, I found the scope of the effort to be intriguing, as the standard includes five categories of languages, and not just currently spoken ("living") languages.  The others are "historical" (i.e., archaic versions of still-spoken languages, such as Middle English), "extinct" (languages that have passed from usage within the last millennium), and "ancient" (those that are no longer extant, and that went extinct more than 1,000 years ago).  The final category?  That's "constructed," which applies to artificial languages intended to be used by humans, as compared to computers.  Apparently, many more such languages have been created than just Esperanto – one Web site states that there are more than 300 (most used only by their creators), including Klingon, a constructed language that ISO has not as yet seen fit to include in ISO 639-3.

Next, browsing through the languages of the world is a fascinating pursuit in its own right. 

If you love language for language's sake, you will appreciate the richness of the names of the languages themselves. Consider, for example, these selections, taken from just those languages that have been given codes beginning with the letter "a."

For example, there are the lyrical African languages Alumu-Tesu, Mandobo Atas, Abaga and Aka-Bea.  Sadly, reduced to three-letter sequences, they become simply aab, aax, abg and, just as abjectly, abj.

But never mind that for now.  Instead, enjoy more originals – languages with percussive names, such as the Meso-American tongues Cubulco Achi and Aguacateco.  Or those with more mysterious sounds, such as Acheron and Galo Adi.

Many are simply fun to roll off the tongue – languages with names like Obojuitai, Akawaio and Anakalangu.  Or how about another Amerindian language: Amahuaca?

Some names are funky rather than lyrical.  Try Achterhoeks, Dzodinka, Pudtol Atta, and my personal favorite, Akhvakh, which seems to call for a !Kung people-style introductory exclamation mark to be fully appreciated.  Or perhaps you can't help thinking of those insurance commercials with the duck (AFLAC!).

Other languages provide multicultural history lessons, such as Saint Lucian Creole French, or Judeo-Tunisian Arabic (what's the deal with that one?) or offer ethno-geographic snapshots, like Mescalero-Chiricahua Apache.

Some language names present puzzles – where exactly does Alabama come from, if not…?  And who is it that speaks War on an everyday basis?  Others are simply unfortunate, at least to an English speaker: Bozoum is not too bad, but how about Anal and Anus?  In this case, the three-letter code combinations anm and auq, respectively, are Occidental improvements.

Some language names can't fail to raise a smile of a different sort, particularly when grouped together, such as Mia Brat, Omie, Abom, and ultimately, Amuzgo, so Solong. 

And if Molmo One were a jet, would it be an Awak?

Or perhaps you enjoy parlor games.  In that case, say this one five times quickly:  Adynyamathanha.

The one of a kind prize in the A series, however, must unquestionably go to =/Kx'aul//'ein.  If you're curious where that one is spoken, as I was, you can Google it and find out (it's often spoken, but presumably seldom spelled, in Namibia).

All of which makes for good fun.  But as a matter of fact, there's also a more serious back story.  ISO 639-3, you see, is not maintained by a typical standards organization.  Instead, the Registrar appointed by ISO to be its custodian is called SIL International, an organization with a mission that is as interesting as its history.

SIL, you see, derives from the organization's original name, which was the Summer Institute of Linguistics, an organization founded by William Cameron Townsend (1896 – 1982), who departed for Guatemala in 1919, accompanied by crates filled with the Bible in Spanish to distribute to the indigenous peoples of Central America.  When he arrived, however, he found to his chagrin that the native Guatemalans did not in fact speak Spanish.  But in that realization, he found his life-long calling as well. 

Instead of completing his original plan, Townsend moved up country, settling in with the Mayan Cakchiquel.  There, he learned to speak their language, developed an alphabet to record it, and with the help of assistants, translated the New Testament into Cakchiquel.

Moreover, unlike the stereotypical Ugly American, he began to develop a personal "Philosophy of Three Part Service," summarized at the SIL International Website as follows:
Townsend…insisted that members of SIL should be ready to serve others scientifically, materially, and spiritually. From early in his career Townsend was personally committed to each of these three areas of involvement. It is not sufficient, he argued, that a person should be interested in serving people unless he has that scientific preparation which will make his contribution relevant and effective. Service based on a foundation of scientific investigation, he held, is more likely to have a permanent impact than service motivated by high ideals but without a thorough understanding of the people being served.

Of special importance, he maintained, is a careful study of a people's language and, by means of that language, an acquired insight into their aspirations and goals. But a scientific study in which the investigator is interested merely in amassing data about the people studied and not in helping them reach worthy goals may have some value to the scientific world, but it will have ignored human values. Townsend affirmed that scientific knowledge should be used as a means for offering developing people the resource of choice for bettering their daily lives. Additionally, he taught that unless a minority people can adjust to their place in the changing world and, with economic assistance, learn something of the acquired wisdom of humankind, these people may sink into apathy or despair.

Crucial to a well-rounded program for minority-language groups, Townsend believed, is the spiritual component. Natural religion, defined as man's seeking for an integrating explanation of his life and world, indicates that all people have deep, unfulfilled spiritual needs. An adequate effort to serve minority-language communities, he believed, must take cognizance of this spiritual dimension. It may not be convenient for some individuals or for a government to be involved in such matters, but for a private organization it is appropriate. It can devote itself to the tasks of scientific investigation and at the same time to practical service and to spiritual orientation. This three-phased objective molded Townsend's career.

Townsend founded a number of affiliated non-profits to further his service ideals, one of which was SIL International.  A quarter-century after Townsend's death, that organization describes itself as follows:

Founded over 70 years ago, SIL International is a faith-based organization that studies, documents, and assists in developing the world’s lesser-known languages. SIL’s staff shares a Christian commitment to service, academic excellence, and professional engagement through literacy, linguistics, translation, and other academic disciplines. SIL makes its services available to all without regard to religious belief, political ideology, gender, race, or ethnic background.

SIL…has grown from a small summer linguistics training program with two students in 1934 to a staff of over 5,000 coming from over 60 countries. SIL’s linguistic investigation exceeds 1,800 languages spoken by over 1.2 billion people in more than 70 countries.

If you read further at the SIL Web site, you'll learn that it has been granted formal consultative status by UNESCO, and that it focuses on unwritten languages, because, "[p]eople who speak these languages often live in geographic, social, and economic isolation. Studying these languages results in practical help for local people and contributes to the broader knowledge of linguistics, anthropology, and ethnomusicology."   You'll also see that SIL researches and develops software and speech analysis tools, among many other activities, and that, "most SIL workers develop individual funding resources for particular projects and personal support."

Of course, it is also the Registration Authority for ISO 639-3, which is based in large part on its own Ethnologue (15th edition), subtitled An encyclopedic reference work cataloging all of the world’s 6,912 known living languages.

All of which goes to show that creating and maintaining even something as mundane as a three letter language code standard may not be quite as boring an enterprise as it might at first seem to be.  And, perhaps also, that even in the case of standards, the Lord does indeed work in mysterious ways.


Copyright 2007 Andrew Updegrove


OpenDocument Workshop:
Adoption, Accessibility, Programmability and the Future

April 18, 2007
9:00 AM – 12:00 noon
San Diego Marriott Mission Valley
San Diego, California

The members of the OASIS ODF Adoption TC will be hosting a half-day workshop dedicated to topics important to understanding the ODF standard and its technology.  The workshop is open to OASIS members and non-members alike.  Further information may be found here, and registration information is here.

Due to the emergence of multiple ODF-supporting applications, interoperability of content and presentation is a user requirement.  Ongoing efforts within the Formula and Metadata subcommittees are addressing these issues.  In the area of programmability, several toolkit projects have emerged that promise to simplify the development of new, innovative ODF-supporting applications.  OpenDocument Format solutions and applications must meet and serve the needs of Persons with Disabilities.  The new ODF v1.1 specification paves the way for Assistive Technology Vendors (ATVs) to implement support for ODF implementations.  And lastly, public policy change in major governments is accelerating the adoption of ODF.

Agenda items include:

  • Keynote: The Past, Present, and Future of ODF
    Presenter(s): Michael Brauer, Technical Architect Software Engineering, Sun Microsystems
  • ODF Accessibility
    Presenter(s):  Pete Brunet, Accessibility Software Engineer, IBM
  • ODF Programmability
    Presenter(s):  Rob Weir, Software Architect, IBM and Michael Brauer, Technical Architect Software Engineering, Sun Microsystems
  • Interoperability
    Presenter(s):  Rob Weir, Software Architect, IBM

The workshop will end with a closing session addressing the Adoption of ODF.  Attendance at this workshop is free with Symposium registration and also available separately.