Consortium Standards Bulletin - April 2004


APRIL 2004
Vol III, No. 4


In the age of bricks and mortar, one organization might have been able to provide every standard that a given industry needed. But with ICT convergence, it takes a "village" of organizations to do the job. Of course, not everyone in a village always gets along.

What do you call an accredited SDO that gives its standards away and creates standards for everything from stable paper for perpetual archiving to client/server service and protocol standards for information retrieval? Oh, and it also butts heads with the IETF over namespace identifiers. You call it "NISO".

FTC Complaint Counsel has filed a brief appealing the Administrative Law Judge's decision In the Matter of Rambus that was almost as thick as the ALJ's own exhaustive opinion. Weighty briefs were filed by three groups of "friends of the court" as well. The Commissioners have some reading to do.

Standards take many forms. One is the verbal standard, often providing a historical reference point. Many people are arguing today over whether Iraq is "another Vietnam". What exactly does "another Vietnam" mean, and does the Iraq situation meet that standard?

News Shorts: A new GRID consortium (finally) shows its face; a startup company says it will insure Linux vendors and users against SCO; a Munich court enforces the GPL; ISO looks to consortia for new standards; major vendors continue to announce "private" specifications; PKI gets new respect; Sir Tim receives a new award; and much much more.




Andrew Updegrove


Imagine a hypothetical organization that sets standards for pipes and pipe fittings. There are some interfaces with other building materials that must be taken into account (e.g., pipes have to match up with faucets and taps, pass through walls, hang from structural supports, and so on). And while the standards relating to materials, tensile strength, and so on could become complex, those standards would not need to acknowledge, or be acknowledged by, other standard setting organizations. After all, a pipe is fundamentally just a tube, and its interoperability aspects start to run out after gauge and thread characteristics are agreed upon.

Now think of what we refer to when we use the single word "Web" (the one with the capital "W"). The name is rather apt from the standards perspective, when one envisions the skein of layers and protocols that enable it, let alone the myriad standards that sit on top of it and make it more useful (think of the myriad XML-based specifications alone). And yet the sources of the standards upon which the Internet and the Web are based are a variety of consortia, and not just one single, coordinated body. While each consortium has a clear view of the boundaries of its own allotted domain, its peers in the standard setting infrastructure would not necessarily agree on where those boundaries lie. Reasonable overlaps in competence not only can and do exist, but the organizations themselves are fundamentally different in their membership, philosophy, style, rules and approach. A brief look at the websites of the W3C, the IETF and OASIS makes that point abundantly clear. These differences can be exploited productively (and otherwise) when companies decide which organization to approach with a proposal for a new initiative.

Most standards, after all, emerge from the unregulated world of consensus-based standard setting. And while everyone agrees on the utility of the results, and thousands of companies and individuals participate in the production of the standards we all use, there is no arbiter that is acknowledged to be entitled to settle boundary disputes. The situation is not unlike lobster fishing off the coast of Maine, where generally recognized but unwritten laws roughly control where someone can fish, and the rights to exploit a given territory evolve incrementally over time. If anyone seeks to push the boundary too suddenly, or to enter a new territory unannounced, the reactions start with severed pot buoys, and rapidly escalate to scuttled boats and even gunfire.

Happily, standard setting never leads to physical violence, although the commercial tactics can become pretty hardball. And to be sure, the ease with which new consortia are launched results in a rich offering of standards, and a Darwinian struggle of competing solutions. When this system works best, the results are more robust, and everyone benefits. When it works poorly, there is inefficiency and contention, and sometimes the best solution does not predominate.

This situation is destined to become more problematic rather than less so, as convergence intensifies and the potential benefit that participants can derive from influencing widely adopted standards rises. It will be interesting to see how the market reacts to this reality. Will the status quo continue pretty much as it exists today? Will members seek to merge organizations, in order to maximize efficiency and coherence? Will government be invited to the table, or perhaps even demand a seat (beyond mere membership by its agencies) of its own volition? Already, ISO is recognizing that global equal opportunity is becoming inevitably tied to access to the Internet -- and that such access also needs to include technical accommodation of cultural, linguistic and economic differences as well. It cannot be too long before national governments take note of the fact that society is becoming dependent for its very survival on the Internet to the same extent as it is on any regulated utility. Indeed, are we not already there?

We don't know today what a mature standard setting infrastructure might look like. Given the dynamism of the subject matter that technical standard setting addresses, perhaps we never will. But whether or not we think that it should take a "village" of standard setting organizations working together to make things work, that is what we have today. Like any village, not everyone gets along with everyone else, or has the same opinion of the importance or cooperativeness of their neighbors. But everyone does need to get along.

In this issue, we look at standard setting in the global village. In our lead article, we profile NISO - an accredited standards developer that acts more like a consortium and, despite its roots in library science, has the temerity to tweak the IETF lion's tail by setting namespace identifier standards. In our IPR update, we describe efforts by the standards community to support the Federal Trade Commission in its effort to enforce good faith conduct obligations in standard setting. And finally, in this month's selection from the Standards Blog, we recognize that standard setting occurs in a world that must pay attention to standards of conduct and accountability, as well as to interfaces and protocols.

Comments? Email:

Copyright 2004 Andrew Updegrove



Andrew Updegrove

Introduction: It is human nature to pigeonhole things. The world being the diverse and contradictory place that it is, categorizing data in this fashion helps us to organize our knowledge and work with it more comfortably (albeit often at a cost). Applying the same approach to the categorization of standards organizations generally works quite well. But now and again one runs into a body like the National Information Standards Organization (NISO), and all bets are off.

Consider this: NISO is an ANSI accredited standards development organization ("SDO") -- and it also makes its standards available for free. It has its roots in library science -- but creates standards applicable to any organization that maintains large stores of information. It creates standards for paper that can remain stable for hundreds of years -- as well as client/server service and protocol standards for information retrieval. It includes libraries as voting members -- as well as Lucent Technologies and the U.S. Department of Defense. And it not only sets standards for the management of information -- but also butts heads with the Internet Engineering Task Force (IETF) over the development of namespace identifiers for use on the Web.

All in all, not your average accredited standards development organization. How did an organization founded by the library and publishing community in 1939 get from there to here? And where is it heading next?

Nature and Nurture: The original goal of NISO's founders was to "standardize" serial publications. Libraries were struggling with the demands of cataloging, collecting and providing access to a growing body of serials and journal literature that had no consistent rules for addressing issues such as pagination and formatting. NISO solved those problems when it released its first standard, Z39.1. The organization received formal ANSI accreditation in 1941.

Over the years, some things changed, while others remained constant. On the one hand, data increasingly became stored, accessed and displayed digitally. With these innovations, librarians and archivists needed to solve some of the same types of issues that they had addressed in a paper-based world all over again.

But on the other hand, with the advent of the Web and the feasibility of making local content accessible on a global basis, new challenges arose. How can one search diverse libraries and archives that are not set up identically? And since content is content, and the same technology should be usable to access any type of content, what about the divergent needs of the owners of different types of content? Should these new interest groups (e.g., those who maintain corporate data archives, public data and other masses of information) be welcomed into the organization, and if so, how can their ideas and needs be assimilated and addressed?

In the words of Patricia Harris, the Executive Director of NISO and a 20-year NISO employee:

NISO's mission expanded with the onset and explosion of digital information exchange. Commercial forces began influencing our timetables and agendas, as it became clear to vendors and others that “library standards” could do more than make libraries more valuable and efficient -- they could enable and improve all kinds of information exchange, and create important commercial opportunities, as well.

The result has been a significant change in the NISO membership base. While thirty years ago its members were mostly corporate libraries and associations, its members today also include information dependent businesses of all types, such as publishers, content aggregators, and the companies that provide the software and technology that enable publishing and content distribution.

Board Chair Jan Peterson of Infotrieve (publisher relations and licensing content) puts it this way:

NISO is, and must be, responsive to the changing business models of publishers and other content providers, as well as the growing community of web services providers.  The business models that worked when print on paper was the sole method of distribution are becoming obsolete.  The emerging digital business models require identifiers that work at a very granular level, such as articles and the references at the end of an article, as well as defining access rights.  The value of information is increasingly defined by its usage, and standards make usage definable.   

The way in which NISO adapted to this morphing of its core constituency explains much about the unique road that it has traveled in recent years. While it has embraced new commercial challenges and members, it has remained true to its early (and some would say academic) roots, continuing to act on values that have more in common with the open source community than the world of traditional SDOs.

Bringing customers as well as vendors together in the same organization has had some other interesting benefits. Harris notes: "Too often, consumers don't know what the constraints of business are. We have found that the beneficial side of bringing together consumers and providers is that they all end up having their eyes opened."

NISO Today: Described in a traditional sense, NISO's credentials read as follows: it is an accredited standards development organization that is formally associated with the ISO and is the Secretariat for TC 46 Subcommittee 4 (Technical Interoperability). NISO prides itself, however, on several non-traditional aspects of its approach to standard setting.

The most obvious distinction is that NISO makes its specifications (both completed and in draft form) available to the world for free. NISO is the only ANSI-accredited SDO that has taken this approach -- the balance continue to sell their standards, and are anxiously monitoring the after-effects of the so-called "Veeck Case". In that case, a federal appeals court held that a privately developed building code, once adopted into law, could be freely copied by those bound to comply with it. Since SDOs have traditionally relied on the sale of standards to defray a substantial portion of their operating costs, the case sent shock waves through the SDO system.

NISO views the issue from a different perspective. Libraries, after all, are about making information available, and academics depend upon publishing to not only spread their ideas, but advance their careers as well. This focus on access and sharing rather than selling ultimately led NISO to take the consortium approach of free access to its work product when the Web made it possible to share that work product at no added cost per recipient. In the words of Harris, "NISO standards grew out of an open source spirit long before open source became a buzzword."

NISO Board member and Standards Development Committee Chair Pat Stevens, of member OCLC, explains the NISO view this way:

While there is competition in some sense between universities for research dollars and students, they have a strong tendency to work together to solve shared problems. Libraries are particularly known for their level of cooperation and sharing. Also, the academic world has a deeply held belief that the greater the access to information and knowledge, the more rapid the growth of that information and knowledge. This spirit is of great benefit to NISO, as it provides an environment that encourages and rewards collaboration even among those who provide services for a fee. For those who provide services, NISO standards have created opportunities for creating innovative solutions that take advantage of the standards and the open environment.

NISO has strong feelings regarding the practices of other SDOs in this regard as well. When ISO mooted the possibility of charging for the use of the ubiquitous (and elementary) country codes that are called upon by all manner of IT applications, NISO joined the hue and cry against the possibility. NISO's Harris puts it bluntly: "Standards are so critical to the NISO community, that we want no barriers to implementation. This is not a viewpoint that ISO shares. TC46, for which NISO is the US TAG, has repeatedly taken resolutions back to ISO to help us make the country codes freely available, but to no avail."

Not surprisingly, NISO also takes an advanced position with respect to including royalty-bearing intellectual property rights ("IPR") in its standards. Under the current ANSI patent policy, NISO is not permitted to adopt a strictly royalty-free IPR policy. With ANSI reconsidering that policy, NISO is looking forward to the time when it expects to be able to take this step.

A reach that exceeds its grasp: NISO's efforts have wide impact beyond its core constituencies, and several of its current initiatives illustrate the broad relevance of its work:

  • Metasearch Initiative: This initiative seeks to enable a "Google like" search capability across multiple sources of licensed material, employing search software that is more sophisticated than a bot. Ideally, the technology will enable responses to queries by content providers in real time, and eliminate duplicative responses in the process. Currently, this is not a practical option for content providers. NISO expects that its work in Metasearch will improve capabilities in the area of e-learning, where there is a need to provide for the exchange of supplementary learning "objects" (e.g., a PowerPoint presentation), and to access multiple learning objects.
  • Identifiers: NISO is already well known for its ISBN (books) and ISSN (periodicals) identifiers, and for the Digital Object Identifier (DOI) that is transforming access to digital content. In January, a NISO group debuted the new INFO URI scheme, which will provide a consistent and reliable way to represent and reference such standard identifiers as Dewey Decimal Classifications on the Web. The new scheme permits existing identifier systems (such as the identifiers assigned to records in the PubMed database maintained by the National Center for Biotechnology Information (NCBI) of the National Library of Medicine) to be expressed as URIs. PubMed identifiers pre-date the Web, and the Web only recognizes URIs as a means to identify information resources. The INFO URI scheme allowed the NCBI to register the PubMed identifier namespace in the INFO Registry. The result: the record known by PubMed identifier "12376099" can now be referenced in URI terms as info:pmid/12376099.
  • Networked Reference Standard: While search engines can access content, they cannot answer actual questions. The NRS standard is intended to support actual questions and answers between users and expert services (e.g., a virtual reference service offered by a library that would allow the user to query a reference librarian from her home, dorm or office). The NRS standard would support both real-time chat and asynchronous e-mail, as well as extended referrals among services. Like the existing NISO Z39.50 standard, the new service is intended to enable new services and businesses, as a result of permitting both client and expert to employ different technology platforms.
  • OpenURL Standard: Although this standard was originally targeted at the electronic delivery of scholarly journal articles, it is expected to enjoy a much wider uptake. The standard enables a user searching for an information resource citation to obtain immediate access to the most "appropriate" copy of the full resource through the implementation of extended linking services. "Appropriateness" can take into account the user's preferences relating to attributes such as location, cost, and contractual or license agreements already in place with information suppliers.
  • RFID Standards: While RFID technology in retail settings has achieved the greatest current attention (and concern, on privacy grounds), this technology plays a less controversial role in the library setting. Typically, a library tag carries a "dumb-number" item identifier, readable only from inches away. A tag on merchandise in a store might contain diverse kinds of identification information, readable from a much greater distance.
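The identifier mapping described in the INFO URI bullet above is mechanical enough to sketch in a few lines of Python. The helper name and the percent-encoding choice are illustrative assumptions for this sketch, not details taken from the NISO specification:

```python
from urllib.parse import quote


def make_info_uri(namespace: str, identifier: str) -> str:
    """Form an 'info' URI from a registered namespace and a local identifier.

    Characters that are unsafe in a URI path segment (including '/')
    are percent-encoded so the identifier survives the round trip.
    """
    return f"info:{namespace}/{quote(identifier, safe='')}"


# The PubMed record cited in the article:
print(make_info_uri("pmid", "12376099"))  # info:pmid/12376099
```

Because the identifier is percent-encoded, legacy identifiers containing slashes or spaces can be carried in the URI without ambiguity.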

Still, old labels die hard. When asked what popular misconception about NISO Harris would most like to correct, the answer was emphatic: "That NISO does just 'library standards'! NISO's standards are robust examples of information solutions. NISO solves problems of information retrieval, management, storage, and publishing that people in other communities have to solve."

So many organizations, so little turf: Externally, life for NISO is sometimes complex. As is the case with any other accredited or non-accredited standard setting organization (SSO), it cannot create standards in isolation. Increasingly, standard setting addresses multi-dimensional needs: the same challenges often arise in diverse settings, especially since the advent of the Internet and the increasing convergence of information technology and communications. Often, each commercial domain has its own SSO, with its own agenda and its own ideas about what solution will suit its members best. Too often, there is unavoidable overlap of effort, and the loose ad-hoc system of liaison relationships maintained between SSOs is sometimes not sufficient to resolve all differences.

Not surprisingly, NISO’s ability to execute on its mission is affected by changing technology, intense competition in the field of information services among diverse commercial interests, and an occasional lack of effective coordination among those standards organizations whose efforts overlap those of NISO. The standards of other SSOs do not necessarily complement those of NISO, and, in the view of Harris, the efforts of other SSOs sometimes even undermine those of NISO. With no "uber SSO" that coordinates the efforts of the hundreds of SSOs active in the ICT space today, there is no formal way to resolve differences of approach, opinion, and priority.

The result can be conflict, as occurred when NISO launched its INFO URI identifier scheme (discussed above) in January of this year. This work builds on earlier consultations with representatives from the World Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF). Nevertheless, the relationship with the IETF on the issue of namespace is, in the words of Harris, "not exactly harmonious." Harris reports that NISO's introduction of the INFO URI scheme is considered by some to be an unwelcome invasion of the IETF's historic turf.

For its part, NISO believes that it is making a valuable contribution to Web users at large, and not its members alone. At the same time, Harris is under no illusions over the prospect for NISO displacing the W3C or the IETF. She says:

What’s happening in our world is being driven by the Internet. The world needs content with integrity, not just endless links to websites. Because the NISO community of publishers and content aggregators provides the content that has integrity, our organization should not be dismissed. NISO isn’t asking to direct the big-picture agenda, to drive the car, so to speak. But we do want to have our pinky on the steering wheel!

Leslie Daigle, an individual technologist who's been involved in the IETF URI work for some time, views the situation somewhat differently (speaking on her own behalf, and not as an official representative of the IETF):

I don't understand this as NISO "moving into IETF space". I see it as NISO wanting to use our output. Where there has been some tension in registering URI schemes (and this generalizes beyond the discussions with the NISO folks) is that people are in fact less focused on understanding the URI standard in its entirety (including its applicability in protocols beyond HTTP and web applications) than they are focused on getting "something that I can use in XML or web". When those people enter the IETF URI registration process as the last step in their efforts (i.e., products have shipped, another standards organization's specifications have been published), they are understandably less than perfectly receptive to IETF requests or suggestions for change to the scheme registration. They just want their scheme registered. Tension ensues. And a lot of unregistered URI schemes fly around the Internet.

NISO Tomorrow: As we have frequently noted in the past (see, for example, Past, Present and Future: The Accelerating Rate of Change), SSOs today cannot afford to rest on their laurels, or to assume that what they did yesterday to serve their members' needs will be sufficient to meet the challenges of tomorrow. To address this reality, NISO's Board of Directors this month launched a year-long strategic planning initiative, funded by a grant from the Mellon Foundation.

The approach that the Board has adopted is self-critical, and will seek to evaluate NISO’s past progress, present challenges, and future directions. The review will involve not only Board retreats and member surveys, but also a formal external evaluation. The external review will be conducted by a panel of thought leaders in the communities that NISO impacts, as well as those that it serves. Further details on these activities will be featured on the NISO website beginning in May 2004.

What gives rise to this type of introspection at this point in time? As described by Harris, the answer is opportunity, rather than stress: "NISO is at a juncture. More and more interest groups -- technology vendors, information services, publishers, content aggregators, and libraries -- are drawn to NISO. Our members are positioning NISO as the international leader in its field and a necessary partner to complementary standards development organizations. The challenge is to turn that potential into a fait accompli."

What are the forces that are likely to shape the future for NISO? In the words of Harris:

The standards world will continue to reflect the changes impacting the business community and society at large. For example, the proliferation of e-learning has re-shaped the business model for publishers and NISO’s agenda does, and must, reflect such dramatic shifts in how members stay competitive. In general, in the information community the physical is giving way to the virtual; even digital concerns no longer have the significance they once held. At the heart of the matter now are service, delivery, and performance. For example, in the metasearch environment, each user session involves services from many providers. The standards must work in this context so they must be developed in the context of the entire information exchange. NISO standards will continue to focus on those objectives.

Summary: Standard setting as such may not be a dynamic activity, but the IT and commercial context in which it occurs is becoming ever more so. Successful SSOs have recognized that reality, and have adapted to new challenges and opportunities to remain relevant and useful to their members (and beyond). In the SDO world in particular, there has too often been a "circle the wagons" reaction to challenges such as the rise of consortia and the economic threat of the Veeck case. In contrast, NISO presents an image of an organization that is happy to push the SDO envelope to realize the goals of its members, and to extend its manifest destiny aggressively into the future.

Not bad for "a bunch of librarians".

Comments? Email:

Copyright 2004 Andrew Updegrove

NISO at a glance:

Date of formation: Founded in 1939; incorporated as a not-for-profit education association in 1983; assumed its current name the following year

Number of current members: 85 voting members. For a full list, see:

Number of classes of membership: One; individual libraries may also join the lower-cost, affiliated Library Standards Alliance

Membership fees: $1,260 (for organizations with less than $500,000 in revenues) up to $9,450 (for those with revenues greater than $15 million)

Number of issued standards or specifications: 35 - See list at

Significant relationships: NISO is formally associated with the ISO and is the Secretariat for TC 46 Subcommittee 4 (Technical Interoperability)

Number of current initiatives:

Other types of work product: White papers, technical reports, meeting reports

Other activities: Workshops, programs at professional meetings, conferences

Website address:

Companies currently represented on the Board of Directors: EBSCO Publishing, John Wiley and Sons, H. W. Wilson, VTLS, Infotrieve, Davandy

Executive Director: Patricia R. Harris

Total staff: 8 (2 employees and 6 contractors)

Annual budget: c. $500,000


Some of NISO's more significant standards:

ANSI/NISO Z39.2 -1994 (R2001)
Information Interchange Format
Equivalent international standard: ISO 2709
Abstract: The basis for the MARC (Machine-Readable Catalog) record, this standard specifies the requirements for a generalized interchange format that can be used for the communication of records in any media. This standard was first released in 1971.

ANSI/NISO Z39.48 -1992(R2002)
Permanence of Paper for Publications and Documents in Libraries and Archives
Equivalent international standard: ISO 9706
Abstract: Sets the basic criteria for coated and uncoated papers that will last several hundred years under normal use. It covers pH value, tear resistance, alkaline reserve and lignin threshold. Recycled papers will meet the criteria specified.

ANSI/NISO Z39.50 -2003
Information Retrieval: Application Service Definition & Protocol Specification
Abstract: Defines a client/server based service and protocol for Information Retrieval. It specifies procedures and formats for a client to search a database provided by a server, retrieve database records, and perform related information retrieval functions. The protocol addresses communication between information retrieval applications at the client and server; it does not address interaction between the client and the end-user.

ANSI/NISO Z39.9 -1992 (R2001)
International Standard Serial Numbering (ISSN)
Equivalent international standard: ISO 3297 (SEE NOTE BELOW)
Abstract: Well-known as the ISSN, this standard defines the structure and presentation of a code to uniquely identify serial publications in print and non-print formats. This standard sets forth the format and characteristics of the ISSN and designates a central authority for code administration.

US leadership on ISBN and the revision of ISBN
The International Standard Book Number (ISBN) is based on an ISO International Standard that was first published in 1972 as ISO 2108. ISO 2108 specifies the basic structure of an ISBN, the rules for its allocation, and the administration of the ISBN system. ISO 2108 is currently under revision—it will go from a 10-digit to a 13-digit number—to deal with changes to the ISBN system.






Andrew Updegrove

While it may seem that the already decade-long tale of the JEDEC-based Rambus disputes may never end, the saga does have its more important chapters. One of those chapters is entering its final pages, and a significant event in that chapter occurred on April 16. Regular readers will know that the FTC brought an action against Rambus based on the conduct of Rambus within the JEDEC standard setting process; that the action was heard before an Administrative Law Judge (ALJ) last summer; and that the ALJ roundly rejected all important elements of the government case in a decision released in February of this year (see FTC Loses First Round to Rambus).

The FTC's rules permit its attorneys (referred to as "Complaint Counsel") to appeal a decision by an ALJ to the Commissioners themselves. The Complaint Counsel team, led by Geoffrey Oliver, has done so, filing an exhaustive 125-page rebuttal of the ALJ's findings of fact and legal conclusions on April 16. At the same time, several "friend of the court" briefs were filed. One was submitted by JEDEC itself. A second was filed by an industry group comprising three companies (Micron Technology, Hynix Semiconductor, and Infineon Technologies), each independently sued by Rambus for royalties based on the same SDRAM standard at issue in the FTC action. These briefs, of course, represent the views of interested parties, and that fact will be taken into account by the Commissioners when they review the points made in those briefs.

In order to emphasize the importance of punishing the conduct of Rambus to support the integrity of consensus-based standard setting, Gesmer Updegrove LLP (the sponsor of this site) filed a pro bono brief written by this author on behalf of 12 standard setting organizations (including both consortia and accredited organizations). The membership of those organizations totals over 8,600 companies, government agencies and universities, and encompasses a broad range of technologies. The central thesis of the brief is that standard setting is vital to the national interest and society itself. Absent the enforcement of a good faith obligation on those that participate in the development of standards, the standard setting process itself is in danger of collapsing.

Rambus will now have an opportunity to file an answer to Complaint Counsel's brief. A partial day of oral arguments will follow, likely in late summer. The Commissioners will then need to digest what they have heard and the hundreds of pages of decisions, appeals, and answers, as well as the voluminous trial record itself. A final decision is not likely to issue sooner than six to eight months from the date of oral argument.

In the meantime, the separate private suits between Rambus and the three private company defendants will continue. The Federal Circuit Court that ruled against Infineon Technologies returned portions of the case to the trial court, which must now address those elements of the case. Defense Counsel for Hynix and Micron, for their part, are free to bring different arguments before different judges and juries than did Infineon. And, of course, any or all of the private litigants can settle with Rambus at any time. At the moment, this appears unlikely.

In the absence of closure, standard setting organizations are taking what action they can. Most consortia have now revamped their intellectual property policies to tighten them up, and to specifically address the key issues raised in the Rambus litigation.

To read a copy of the Gesmer Updegrove amicus brief, see

Comments? Email:

Copyright 2004 Andrew Updegrove




April 28, 2004

#15 Is Iraq "Another Vietnam?"   As I have previously observed, the earliest "standards" were undoubtedly verbal reference points -- shorthand descriptions capable of conjuring up a picture or concept in the listener's mind. Often, such a concept may involve complex nuances and symbolic connotations. In a simple example, when someone says "It's as clear as the nose on your face", we understand that the speaker is not talking about the size of someone's olfactory equipment or the clarity of their complexion, but that the speaker believes that something is incontrovertibly obvious. Similarly, we understand a woman who says "He's as hunky as Ben Affleck" to mean one thing, and "He has as much sex appeal as Bill Gates" to mean something quite different, even though each gentleman is both rich and famous.

So what does someone mean when they say that the situation in Iraq has become "another Vietnam" for America? And what, precisely, is Donald Rumsfeld denying when he emphatically states that it has not? Finally, with tomorrow [April 30] being the 29th anniversary of the fall of Saigon, have we remembered anything of what we learned from our experience in Vietnam?

Required Elements: Those who are active in standard setting are familiar with the concept of "required" or "essential" elements. One does not have a "compliant implementation" if all of those "required elements" are not found in the implementation. The same applies, in a somewhat less exact fashion, to a verbal standard. What, we must therefore ask, are the "required elements" of "another Vietnam?"

When we say that something is "another Vietnam" we are conveying that the situation is even more dire than a "quagmire" (another verbal standard much in use today). While a "quagmire" implies a subset of the characteristics of "another Vietnam", I believe that it leaves out several important attributes of "another Vietnam". A "quagmire" does not necessarily imply pain and suffering, for example, or military action (although it often does).

I would posit that in order for Iraq to qualify as "another Vietnam", it must display the following required elements of the Vietnam experience. Let us see whether it does.

Perceived Deception: While the factors that gave rise to America's direct involvement in the Vietnam War were complex, the proximate event cited by the Johnson administration to gain the permission of Congress to engage aggressively with the enemy was the so-called "Gulf of Tonkin" incident of 1964. In the naval skirmish to which that name applies, North Vietnamese gunboats were alleged to have twice attacked a U.S. vessel without provocation. To this day, some of the crucial facts of that incident remain in dispute. But it is commonly accepted that (at least) the specifics of the skirmish were misrepresented by the administration in order to gain authority to react decisively, and that in fact North Vietnam may not have been the aggressor at all.

It is also commonly accepted that the Johnson administration was already internally committed to military intervention, but did not yet have popular support to embark upon that path. The affirmative vote of Congress in response to the Gulf of Tonkin incident provided the authorization for the actions that committed the country to a course of conduct from which it could not later easily withdraw.

In later years, the Nixon administration maintained significant military activities in Laos and Cambodia, including a punishing bombing campaign, while consistently denying to the American people that it had any combatants outside of Vietnam. Before long, the administration was no longer trusted by the youth of America, and many of their elders as well.

Does Iraq meet this test? Yes. I believe that the crucial aspect of the Gulf of Tonkin incident is not in fact whether the facts were accurately presented to Congress, but that many Americans came to believe that they had not been. Certainly, the current administration's pre-war emphasis on the existence of Iraqi Weapons of Mass Destruction (WMDs) to Congress to gain authorization to make war is reminiscent of the role of the Gulf of Tonkin incident. And just as certainly, the continuing failure to find WMDs after the fall of the Hussein regime has inspired profound doubt and suspicion among many Americans. As was the case with the Vietnam War, such doubts lead many Americans to question whether new statements, purporting to be factual, can be relied upon.

Why are we there? One of the disturbing aspects of the Vietnam War was its ambiguous moral justification, independent of the Tonkin incident. Principally, the national interest at stake was the threat posed by the so-called "domino effect", a theory first propounded under the Eisenhower administration, that posited that the leaders of the communist bloc would take over one nation after another until Fortress America stood alone against a sea of red. While it is true that communist cant actually did embrace world domination as a goal, and that communist insurgents were supported on a global basis by the U.S.S.R, many Americans could not help feeling uneasy about the justification of destroying Vietnam in order to save it. Even if the domino effect did pose a true threat to America, did we have the right to visit millions of casualties on the Vietnamese people in order to protect ourselves?

Does Iraq meet this test? Yes. Certainly, and especially with the apparent non-existence of WMDs, it is difficult to state why America needed to incur hundreds of billions of dollars of expense and an as-yet unknown number of American lives to bring down Saddam Hussein. The rest of the world was unconvinced that he presented a clear and present danger, and even if he had possessed the alleged WMDs, he lacked the means to deliver them to American shores. In contrast, there have been ample numbers of dictators around the world that have visited as much, and often more, death and exploitation on their own peoples. With rare exceptions (e.g., Kosovo), we have not intervened.

Damage to America's Reputation Abroad: The Vietnam war was the first major American military engagement of the 20th century that was not strongly supported by America's historical allies. While it enjoyed military support from a few allies early on, America soon found itself fighting alone. Some western nations first distanced themselves from America, and some later became staunch detractors. A few (notably Sweden and Canada) willingly acted as havens for Americans of draft age seeking to avoid military service.

America also lost its moral luster in the eyes of the people of many countries around the world. The legitimacy of John Winthrop's vision of America as a model society for the world, "as a City upon a Hill", was dimmed, and the moral justification of American positions on other issues therefore became more subject to doubt and suspicion.

Does Iraq meet this test? Yes. America has not only lost the support of some of its most staunch allies in recent weeks, it did not have the support of many of its traditional allies even before invading. The failure of forces in the field to find any meaningful indications of WMDs has also seriously damaged the credibility of the Bush administration, at best, and America itself, at worst. Who will follow us the next time we seek to convince the world of a clear and present danger, and should we blame them if they do not?

Is this a War we can Win? Ultimately, what turned many pragmatic conservatives against the Vietnam War was the blunt reality that America did not have the will, and perhaps not the appropriate means, to defeat the North Vietnamese. It is a commonplace that generals are often the most reluctant advisors to support a President wishing to wage war, not only because they know of the horrors of the battlefield, but also because they understand the difficulty of pacifying invaded peoples. America ultimately withdrew from Vietnam not because it had defeated the enemy, but because it knew that it could not.

Does Iraq meet this test? Yes. The disturbing reality is that America cannot win the peace in Iraq by force of arms or television broadcasts. Military might and exhortations will have no more effect in winning the hearts and minds of the Iraqi people than they had in converting the Vietnamese to American democratic values. Colin Powell, a Vietnam veteran, knew this well. Whether the violence that is building in Iraq today can be reined in and a united country created depends not on American arms, but on the wavering will of the disparate ethnic groups that comprise geographic Iraq. Occupation Administrator L. Paul Bremer stated this clearly in a sober broadcast to the Iraqi people four days ago.

A Living Room War: The Vietnam war was famous for being America's first "living room war". Previously, American conflicts since the Civil War had been fought at a safe distance, with military censors filtering all information and images that reached the American people from the front lines. In the 1960s, for the first time, the brave words and speeches of the government appeared in stark, real time contrast to the nightly television images of body bags, firefights and grieving families.

Since the Vietnam War, our military conflicts have been briefer, our casualties far lower, and the access of the press to the front lines relatively restricted. In the first Gulf War there were no "embedded" correspondents; the war ended in a matter of days, and American casualties were in the dozens rather than the hundreds. Much of that war was fought from the air, with thousands of sorties delivering by far the greatest part of the damage to the enemy. Detailed pictures of the horrible carnage suffered by the retreating Iraqi army on the "highway of death" did not reach the American public until only recently.

Does Iraq meet this test? Yes. To the current administration and the Pentagon's credit, press access during the early stages of the war was extensive. True, embedded correspondents could still only report on what they saw, but their access to the action was significant. Whether or not the press is now everywhere that it should be in order to provide a complete picture, ample images of the death, destruction, pain and suffering being endured by American military personnel -- and now contractors as well -- greet Americans on a daily basis.

What have we done? What are we doing? The Vietnam War was also the first war that caused a significant number of Americans to ask themselves whether the means being wielded in their names justified the ends that they were told must be achieved. The nightly news presented searing images of curtains of napalm enveloping the jungle, incinerating whatever lay beneath, and of "pattern bombing" by B-52s raining down on North Vietnamese cities (and, secretly, on Cambodia and Laos as well). A number of horrible magazine photographs and episodes are still etched in the memory of all who were alive at that time: the naked, screaming young girl fleeing unknown horrors on a dirt road; the point-blank execution of a fettered, wailing man by the Chief of the Saigon Police; the talk of "fragging gooks" and "torching hootches"; and finally, the disclosure of the horrors of the My Lai massacre. All of these images forcefully brought home that we could no longer automatically think of ourselves as white knights, sent out to protect the world from evil, with a God-given right and duty to remake it in our own image.

Does Iraq meet this test? Yes and No. Happily, there have been no reports yet of atrocities committed by American service personnel, although there have been some accounts of mistreatment of prisoners that was promptly punished. But the devastation visited on the Iraqi people has nonetheless been very great, as has the ongoing post-war misery born of joblessness, lawlessness, bad water, insufficient power, lack of sanitary services, and, increasingly, susceptibility to terrorist attacks.

While modern technology has allowed us to target our munitions with far greater accuracy, America's understandable reluctance to risk casualties in its own ranks has led to accepting greater casualties among Iraqis. Rather than confront suspected enemies on the ground, they are taken out from the air -- sometimes after false identification. Perhaps more disturbingly, the administration has exhibited a chilling indifference to human rights, sequestering captured combatants (some of whom were children) indefinitely in Guantanamo Bay, without legal charges being brought, without access to legal assistance, and without visitors or even an indication of how long they will remain incarcerated.

How can this be? Because the administration claims that, unique in the world, no laws apply on this patch of land. It would appear that this administration believes that there is no such thing as innate human rights, only legally bestowed rights. Take away the law, and the individual stands naked and defenseless before the power of the state.

Divisiveness? A hallmark of the Vietnam era was the deep divisions that the conflict brought to American society, between generations, between family members, and between America and its historic allies. Not only were these divisions deep, but they were also traumatic. Protestors were reviled, returning veterans were sometimes jeered, and a handful of demonstrators at Kent State even died before the guns of the National Guard (spawning another indelible image -- this time of a college student bending with horror over a fallen classmate).

The political system came to revolve around the war as well. Presidential elections hinged on a candidate's position on the conflict, and Lyndon Johnson gave up his own hopes for a second elected term largely in reaction to the Vietnam situation.

Although to my knowledge it has never been reported publicly, the day of the Kent State shootings, a number of Senators stood up in Congress to say that the Ohio National Guard should receive medals for their valor, and that more demonstrators should have been shot. I know. I was there that day and heard it. The networks that covered Nixon's second inaugural also, I am told, did not show the army sharpshooters that stood at the ready along the tops of every building with a view of the portico of the Capitol. I saw that, too. Nixon ran for a second term on a promise to "Bring us Together", but that was an impossible task, due to the ongoing reality of the war.

Does Iraq meet this test? Not Yet. To date, protests against the Iraqi war, and the earlier Afghanistan campaign, have been far more limited, by Vietnam War standards. Even if the current situation lasts as long as the Vietnam conflict (a horror to be avoided at all costs), it cannot yet be known if the degree of divisiveness that the Iraqi situation engenders will ever be as deep. Perhaps that does not speak very well for America as a people, if the suffering of the Iraqi people is very great. But already we are rolling towards what is likely to be the most hotly contested election in a generation, with greater, and equally heated, conviction being shown by the supporters of each candidate. If the flag-draped caskets continue to fan out across the nation as the months pass, the divisiveness among Americans spawned by the war is likely to increase.

And finally, is Iraq a "Quagmire"? Perhaps the most significant aspect of what "another Vietnam" represents is its unquestioned status as the epitome of a "Quagmire". What that verbal standard implies has subtleties as well as overt connotations. Most obviously, a quagmire implies a tar baby-like situation that, once embraced, cannot be escaped for years to come. Once America entered into a military protector relationship with its proxy government in South Vietnam, it became very difficult to abandon our weak ally to the certain fate of defeat. Thus, a quagmire has at least two clear elements: it is inescapable, and it is obvious even while it is ongoing that there is no known, acceptable escape route.

More subtly, though, a "quagmire" in the Vietnam sense is also a situation that has the potential to expose America as a "helpless giant", or "an impotent superpower". These are two more verbal standards that were much in vogue during the Vietnam years. Strangely enough, they were employed to justify further military effort, rather than to justify a prudent disengagement. To many, the quagmire aspects of Vietnam raised the prospect of an eventual, ignominious defeat, resulting in Richard Nixon vowing that he would not be the first American president to preside over a lost war. Peace alone was not deemed by many Americans to be an adequate goal, but "Peace with Honor" was, regardless of whether our stated goals were reliably secured. Even at the time it was signed, the final Armistice entered into with North Vietnam was clearly a cosmetic agreement that permitted a face-saving exit. So long as a sufficient time elapsed before Vietnam fell, our military withdrawal would be politically viable.

Does Iraq meet this test? Not yet - but the situation does not look good. There are still reasons to hope that the pain and suffering of Iraq will gradually lessen. But as I write this, with a decision over whether to stage a final assault on Fallujah under debate and a seemingly endless supply of avid suicide bombers infiltrating Iraq from the surrounding Islamic countries, the situation is at best bleak. One can hope that most Iraqis will put aside their animosity against the United States in an effort to rebuild their own country. But Mid Eastern peoples have of late shown an unfortunate propensity for self-immolation over religious and ethnic issues, and the historical divisions between Kurds, Sunnis and Shiites lie just below the surface.

Final Conclusion: Is Iraq "another Vietnam?" Based on the analysis above, this writer on standards must conclude that the answer is "yes". While some aspects of the two situations naturally differ, verbal standards are not precise instruments. Some aspects of the Vietnam experience are lacking (only time will tell whether the Iraq situation drags on as interminably as did the Vietnam tragedy). And one must hope that the death toll in Iraq will never approach the levels of carnage experienced in Vietnam. But on the other hand, serious concerns over government deception have arisen much earlier in this war, and new and more serious impositions on human rights have been visited on the enemy by the current administration than ever occurred under either the Democratic or the Republican administrations that presided over the Vietnam conflict.

I believe that any reasonable review of the situation must therefore find that the Iraqi situation demonstrates the "required elements" of the verbal standard, "another Vietnam".

In the end, perhaps the best test for whether a given situation measures up to a verbal standard is the one offered by Justice Potter Stewart. In 1964, the Supreme Court was tasked with deriving a legal definition of pornography. Stewart famously opined: "I know it when I see it".

I believe that anyone who lived through the Vietnam era would conclude that the current Iraq situation meets Justice Stewart's test. Iraq is indeed "another Vietnam", with its final duration, cost and suffering being the sole attributes remaining to be defined.

We can only hope that the situation in Iraq does not set a new standard for needless, endless, fruitless tragedy, born under suspicions of deception, perpetuated out of blind conviction in our own right to remake the world, and tolerated domestically out of complacency and blind deference to authority. We have been here before. It is more than tragic that we find ourselves here again.

Comments? Email:

Copyright 2004 Andrew Updegrove

The opinions expressed in the Standards Blog are those of the author alone, and not necessarily those of Gesmer Updegrove LLP

# # #

Useful Links and Information:

For a detailed timeline of Vietnam, from the May 7, 1954 victory of Vietnamese forces against the French at the battle of Dien Bien Phu, to the fall of Saigon on April 30, 1975, see:

For a comparison of the Gulf of Tonkin Incident to Congressional approval of the Iraq War, see:
Gar Alperovitz, "Remember the Gulf of Tonkin", Washington Post, September 22, 2002

For an overview of the secret bombing of Laos and Cambodia, see:
WGBH Boston: VIETNAM: a Television History - Laos and Cambodia

For a brief overview of the "Domino Theory" and whether or not it is applicable to the current situation in the Mid East, see:
Wikipedia entry, "Domino Theory":

Postings are made to the Standards Blog on a regular basis. Bookmark:


Every day, we scan the web for all of the news and press releases that relate to standards, and aggregate that content in the News Section of this site. For up-to-date information, bookmark our News page, or take advantage of our RSS feed; updates are usually posted on Mondays and Wednesdays. The following is a selection of the many stories from the past month that you can find digested there.

New Consortia

There you are! People have been talking about a new GRID consortium being planned for quite some time. This month it finally left stealth mode and announced itself. Companies represented on the Board include EMC, Fujitsu-Siemens, HP, Intel, NEC, Network Appliance, Oracle, and Sun.

Technology Companies Form Enterprise Grid Alliance (EGA) Consortium.

The Cover Pages, April 20, 2004 --A new Enterprise Grid Alliance (EGA) consortium has been formed to develop enterprise grid computing specifications and grid interoperability solutions. EGA working groups will assemble, profile, and create new specifications as needed to encourage and accelerate movement to an open grid environment. EGA members preparing specifications, test cases, or reference implementations will agree to license their essential patents under royalty-free terms.

New Initiatives

Laboring in the Vineyards: The Unicode Consortium doesn't get a lot of press...but it should. The low-profile organization is dealing with the lowest common denominator building blocks that allow human beings to communicate with each other: text characters. Now, it is going a bit farther, and addressing "locale data" issues that will further the goal of allowing everyone to communicate with everyone, and access everything, even if they come from a remote part of the world and don't speak a widely known language.

Unicode Consortium Hosts the Common Locale Data Repository (CLDR)

Unicode Press Release, Mountain View, CA, April 21, 2004 -- The Unicode Consortium announced sponsorship for the CLDR Project and its Locale Data Markup Language (LDML), designed to facilitate standardized methods for software globalization. The Common Locale Data Repository (CLDR) provides a general XML format for the exchange of locale information for use in application and system software development, combined with a public repository of XML-encoded locale and cultural data (e.g., date, time, currency, collation, text translation and transliteration).

Open Source

Dear Darl: In case there was any doubt about the robustness of the capitalist system, a start-up has just announced that it will underwrite the risk that Darl McBride is right about Linux. Given that the new company (a) says that it has examined the Linux code, and (b) has no stake in whether SCO wins or loses, this is more interesting news than whether large vendors already committed to Linux offer to indemnify their customers.

Insurance group: Linux free of copyright violations

InfoWorld, April 19, 2004 -- A start-up company looking to provide legal insurance against copyright claims against open-source software has declared the Linux kernel free of copyright infringement. Open Source Risk Management LLC (OSRM) on Monday announced that it cannot find any copyright violations in the 2.4 and 2.6 Linux kernels, counter to claims from The SCO Group Inc. SCO is suing IBM Corp. and other Linux users, saying the Linux operating system violates its Unix copyrights. "We are saying that SCO has no copyright claim," said Daniel Egger, founder and chairman of OSRM. "We think they will lose." OSRM also announced it will offer indemnification on legal costs for open-source software, priced at about 3 percent of the desired coverage, for example, $1 million of legal protection for $30,000 a year.

I am GNU, Hear me Roar: While the ongoing SCO offensive has been garnering most of the Linux legal headlines, a little-noticed decision in the Munich district court represents a milestone for the open source movement. For the first time, a court has enforced the obligations to share source code and pass on license terms that are imposed by the GNU General Public License.

Munich Court Grants Preliminary Injunction for Infringing Use of GPL Licensed Software, Berlin, Germany, April 14, 2004 -- The Munich district court granted a preliminary injunction against Sitecom Germany GmbH. This injunctive relief was applied for by the netfilter/iptables project. Sitecom is offering a wireless access router product (WL-122) based on software licensed under the GNU General Public License (GPL), developed by the netfilter/iptables project...According to the court order, Sitecom did not fulfill the obligations imposed by the GNU General Public License covering the netfilter/iptables software. In particular, Sitecom did not make any source code offering or include the GPL license terms with their products...."To my knowledge, this is the first case in which a judicial decision has been decreed on the applicability and the validity of the GNU GPL", says Dr. Till Jaeger, partner of the Berlin and Munich based law firm JBB Rechtsanwaelte, which represented the netfilter/iptables project in the litigation.

Never say "never": In another indication that no company is an island, Microsoft published the source code of one of its tools this month. Granted, an installer is hardly an operating system, but it is an acknowledgement that it takes a lot less energy to go with the flow than to constantly swim upstream.

Microsoft Airs Tools' Source Code Online

CNET, April 6, 2004 -- Microsoft published the code for one of its products on an open-source software development Web site late Monday, departing from its hard-line stance against making the underlying components of its technology available to the general public. Microsoft revealed the code for its Windows Installer XML (WiX) software, a set of tools used to build installation packages for the company's Windows products from XML source code. According to the information posted on the SourceForge site, a resource for open-source collaboration projects, the actual code Microsoft published supports an environment that software developers can use for creating Windows setup packages.

New Standards

Partners in Standards: The following announcement has an obvious, and a less obvious, news aspect to it. The obvious element is that OpenGIS is advancing some of its important work through cooperation with ISO, the global standards body. The less obvious element involves the details surrounding how the two will handle the housekeeping aspects of the standard: going forward, the specification will track towards adoption as an ISO standard, while OGC's own ongoing work will be called a "Recommendation Paper". Different standards bodies (e.g., ISO, IEEE, ITU, etc.) all have different ways of addressing the work of other standards organizations, and there is less uniformity in how standards are "shared" than one would expect. This is particularly true when a standard has been generated through a consortium that wishes to maintain an ongoing role in the maintenance of the standard.

Geography Markup Language (GML) Version 3.1 Public Release from Open GIS Consortium

The Cover Pages, March 26, 2004 -- The Open GIS Consortium (OGC) has approved the release of the "OpenGIS Geography Markup Language (GML) Implementation Specification" Version 3.1.0 as a publicly available Open GIS Recommendation Paper, tracked for dual release as ISO 19136. GML defines XML encoding for the transport and storage of geographic information, including both the geometry and properties of geographic features. The release contains a 601-page prose document and 33 XML Schema files.

Accreditation and value propositions: It used to be the case that a major reason to seek accreditation by the nationally recognized body of your host country was the ability to have your standards referred for global adoption. Here is a second important standard being adopted by ISO that was developed not in an accredited SDO, but in a consortium.

ISO Approves ebXML OASIS Standards, Geneva, Switzerland and Boston, MA, March 29, 2004 -- The International Standards Organization (ISO) has approved a suite of four ebXML OASIS Standards that enable enterprises in any industry, of any size, anywhere in the world to conduct business over the Internet. The submissions from OASIS will be published as ISO technical specifications, ISO/TS 15000. The new ISO 15000 designation, under the general title Electronic business eXtensible markup language, includes four parts, each corresponding to one of ebXML's modular suite of standards: ISO 15000-1, ebXML Collaborative Partner Profile Agreement; ISO 15000-2, ebXML Messaging Service Specification; ISO 15000-3, ebXML Registry Information Model; and ISO 15000-4, ebXML Registry Services Specification. Until now, the technology available for most businesses to exchange data was electronic data interchange (EDI), which made significant contributions to productivity and inventory control. Many companies, however, find EDI expensive and difficult to implement. The ebXML initiative, using the economies of scale presented by the Internet, breaks through these obstacles.

Other New Work Product

RFID in a Nutshell: If you've been reading our ongoing coverage of RFID technology, standards and uptake, but don't know much about what RFID is all about, here's a chance to catch up. AIM has put together a CD compendium of everything you need to know to get with the program.

AIM North America Releases the RFID Knowledge Base, Warrendale, Pennsylvania, April 7, 2004 -- AIM North America announces the publication of the "RFID Knowledge Base", an interactive CD containing key educational information about RFID technologies and solutions designed to address informational needs of systems integrators and VARs. The "RFID Knowledge Base" is a compilation of material published in AIM's "RFID Connections" e-newsletter, material contributed by AIM North America member companies, and material from independent sources. The content includes an Introduction to RFID, Case Studies, RFID Basics, Justifying and Implementing RFID, RFID Issues and Insights, and Additional Resources and Related Links. "The Knowledge Base is a comprehensive resource for anyone in need of understanding RFID technology and its application", stated Dan Mullen, President of AIM. "The incredible demand for accurate information about RFID in North America is apparent and AIM is a natural resource for this information."

A welcome addition: Out of all the verbiage that exists on standards, there is surprisingly little dedicated to the topic of how to actually create them. Happily, ETSI (the European Telecommunications Standards Institute) has decided to make its "Making Better Standards" book available on-line, thus providing a ready reference to all on how to create useful standards. Also happily, the on-line book starts with defining what the market wants and needs, which is a refreshingly pragmatic starting point that is sometimes given less attention than it deserves. While the site is oriented towards writing communications standards (and particularly ETSI standards), it will still provide a useful primer for anyone new to the process side of standards setting.

ETSI puts on record its commitment to making better standards, Sophia-Antipolis, France, 2nd April, 2004 -- Technical standards play a vital role in ensuring the success of modern communications systems, but can only do so if those standards are 'fit for the purpose'. A new website, launched today by the European Telecommunications Standards Institute (ETSI), provides an easy-to-use guide to anyone involved in writing (or reading) communications standards to ensure the highest possible quality of their efforts.... ETSI's Technical Committee on Methods for Testing and Specification (TC-MTS) has produced the web site as a complete overhaul of their immensely successful paper book version of “Making Better Standards” that was first published in 1996....The guide answers questions about what the market is for standards and the need for standards that are both relevant and timely. It also addresses the structure of the standard and differing methods of validation, from Walkthrough to Prototyping. Some of the topics covered by the web site: Market Expectations; Planning for Standardization; Protocol Standards; Regulatory Environment; Specification Languages; Test Specifications; Validating Standards; What Makes A Standard "Better"; The standard setting site may be found at:

Standards and Society

Where there's residue, there may be detectable arson: Standards are not only all around us, but they take diverse forms. Paint chips, in their own way, are a standards tool: specific colors identified by numbers. NIST has just helped create a physical standards tool of its own: a "Standard Reference Material". The liquid will help calibrate law enforcement instrumentation to better enable it to detect residues left behind by 15 common arson accelerants. The hope: to raise the dismal 2% national conviction rate for that crime.

Standard Helps ID Fuels Used in Arson

NIST Tech Beat, March 25, 2004 -- Faced with a growing number of ignitable chemicals with similar characteristics, arson investigators have their hands full trying to tell residues of insecticide, for example, from those of gasoline. But identifying fuels used to set fires will be easier now, thanks to some help from the National Institute of Standards and Technology (NIST).

Story Updates

Small pieces, loosely joined: In our last issue of the Consortium Standards Bulletin, we noted the increasing fragmentation of standard setting methods and results. See: The Balkanization of Standard Setting. OASIS, the multifaceted standard setting organization that the world looks to for advancing the state of the art and broadening the reach of XML, has just held a symposium to be sure that there are no gaps left behind as multiple consortia and unofficial coalitions of individual companies continue to rush pell-mell into the future, each adding one piece to the chaotic puzzle at a time.

OASIS to Host Open Symposium on Reliable Infrastructures for XML, Boston, MA, 8 April, 2004 -- OASIS, the international standards consortium, announced plans to host the Symposium on Reliable Infrastructures for XML, 26-27 April 2004, in New Orleans. The event, which will be open to the public, will offer a forum for the international community to exchange ideas and present results of standards work-in-progress. Attendees will identify unaddressed topics in need of standards development and areas where coordination between efforts would promote interoperability. "Today, many different (and partially interchangeable) technologies are available that propose to increase the reliability of XML-based messaging and networking infrastructure....The OASIS Symposium will focus on exploring the current state of these technologies and identifying gaps where open standards are needed." For more information see:

More "Private Specifications": In the March issue of the Consortium Standards Bulletin we reviewed the fragmentation of standardization efforts, including the burgeoning trend of groups of companies that create draft specifications, and then submit them to standards bodies for formal adoption -- and indeed sometimes skip that last step entirely. (See the editorial, The Balkanization of Standard Setting and following story) The following article in eWeek reports on the announcement of the latest such specifications, and the market's reaction to the technique.

Microsoft, Others Publish Metadata-Exchange Standard

eWeek, March 31, 2004 -- Microsoft, IBM, BEA Systems Inc. and SAP AG have announced the publication of the WS-MetadataExchange specification. In addition, Microsoft, IBM and BEA announced an update to the WS-Addressing specification. Both are part of an existing Web services architecture laid out by Microsoft. Eric Newcomer (IONA Technologies) said: "We are very interested in seeing these private specifications progress toward standardization in open forums, as the specification owners have promised. And we are happy to participate in the feedback sessions they hold toward that end. However, we find ourselves in a very difficult situation with regard to implementing specifications such as these in our Artix product line, since we cannot be sure what direction they will take in the future."

What a Long, Strange Trip it's Been: PKI -- "Public Key Infrastructure," a standards-based way to establish secure email communications using digital certificates -- has been around for a long time. Originally, an independent consortium was created to foster the approach; it later merged into OASIS. After languishing on the sidelines for some time, the PKI TC in OASIS is suggesting that it's time to move for widespread adoption of PKI technology, and is also laying out a roadmap for how that can be done. At the same time, it is getting a welcome boost from the Department of Defense, which has a looming April 1 deadline for over 350,000 defense contractors to sign up for PKI digital certificates if they want to keep securing lucrative defense contracts. As with RFID tags, when the DOD talks, industry listens. The resulting mandatory implementations can give these contractors incentives to make more use of their investment in this technology in their communications with third parties.

Action Plan Developed for PKI Adoption

ComputerWorld, March 25, 2004 -- An e-business standards watchdog last month unveiled a comprehensive action plan aimed at kickstarting the adoption of Public Key Infrastructure (PKI) technology.... Security vendors have long touted PKI technology (which uses digital certificates to authenticate e-mail, individual and enterprise transactions) as the answer to most network computer problems. But it has been hampered by cumbersome implementation, differing and incompatible standards along with issues of legacy system integration. PKI has evolved and so too should the industry's understanding of the technology and its ability to drive Web services and e-business,... The PKI Action Plan addresses some of the primary obstacles to widespread PKI adoption; these adoption barriers include: poor or missing support in software applications, high costs, poor understanding of PKI among senior managers and end users, interoperability problems and lack of focus on business needs.
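The digital-certificate mechanism described above rests on ordinary public-key signing and verification. As a rough illustration only (not from the article -- it uses the third-party Python "cryptography" package, and the message text is invented), here is how a private key signs data and the matching public key, which a certificate authority would bind to an identity in an X.509 certificate, verifies it:

```python
# Illustrative sketch of the public-key signing/verification underlying
# PKI digital certificates. Requires the third-party "cryptography" package.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# A contractor generates a key pair; the public half is what a certificate
# authority binds to the contractor's identity in an X.509 certificate.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"Unclassified correspondence with the department"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

# Sign with the private key...
signature = private_key.sign(message, pss, hashes.SHA256())

# ...and anyone holding the certified public key can check authenticity.
# verify() raises InvalidSignature if the message or signature was altered.
public_key.verify(signature, message, pss, hashes.SHA256())
print("signature verified")
```

What a full PKI adds on top of this sketch is the certificate authority itself: a trusted third party that vouches for the binding between a public key and a real-world identity, which is the service the DOD's approved certificate authorities provide in the story that follows.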

DOD to Vendors: Join PKI System or Take a Hike

Government Computer News, March 22, 2004 -- If vendors don't register by April 1, 2004 for encryption certificates to do business with the Defense Department, DOD intends to severely limit their ability to work on contracts. DOD plans to enforce a requirement that DOD contractors participate in the Interim External Certification Authority program. IECA requires DOD contractors to have one-year encrypted digital certificates to ensure the security of vendor communications with the department. Roughly 350,000 contractors that are doing business with the department need certificates,...The directive requires the "exchange of unclassified information with vendors and contractors" be conducted using public-key infrastructure certificates obtained from approved certificate authorities.

What's Up/What's Down

It couldn't happen to a nicer guy: What do Marc Andreessen and Tim Berners-Lee have in common? Well, it used to be that both were principally known for the profound impact that each had made on how we use the Internet. But Marc got rich through his efforts, while Tim, well, at least Tim was still at the helm of a venture that continues to have a profound impact on the world. Now Tim and Marc have something new in common: Tim is now rich as well, as a result of his technical wizardry and vision being recognized with a substantial prize. And he's still got that job, too.

Web Inventor Berners-Lee Wins Technology Award, Helsinki, Finland, April 15, 2004 -- World Wide Web inventor Tim Berners-Lee won one million euros ($1.23 million) on Thursday, the largest single amount of money he has made from an invention that has made many others very rich. Berners-Lee, 48, was named the first winner of the world's largest technology award -- the Millennium Technology Prize -- by the Finnish Technology Award Foundation at a ceremony in the Finnish city of Espoo. Berners-Lee launched the World Wide Web in 1991 and gave the world easy access to information, revolutionizing the way it worked and communicated.

Who's Doing What to Whom

But we still need to talk about that "hairball" remark: What, you may ask, does the new rapprochement between historic enemies Sun and Microsoft mean for standards? While Sun CEO Scott McNealy may no longer refer to the Windows OS as a "welded shut hairball", he has proclaimed Sun's continuing dedication to open standards. In the following article, Stephen Shankland tries to peel a few layers off the standards onion, quoting analysts and players on specific technology areas and the impact that the Sun/MS settlement may have on the related standards.

Sun says Microsoft pact not a blow to standards

CNET, April 6, 2004 -- Sun has been one of the most vocal advocates of open standards, arguing that customers should be able to choose from technology from multiple suppliers and shouldn't have to fear getting locked in to any one company's technology. The rhetoric has been designed to undermine Microsoft, whose software has long been derided by Sun Chief Executive Scott McNealy as a "welded-shut hair ball."