Consortium Standards Bulletin - March 2004


MARCH 2004
Vol III, No. 3


Standard setting has historically benefited from the evolution of new processes and structures. Throughout that evolution, the commitment to openness has been a constant. Lately, that commitment seems to be weakening.

Coteries of companies develop specifications and shop them to consortia; Microsoft wants the industry to adopt (and license) its Caller ID anti-spam specifications; open source projects are everywhere (and variously structured); and Bloggers are flaming each other over competing flavors of content syndication. Is this any way to develop standards?

Administrative Law Judge Stephen McGuire gives round one to Rambus. And there’s only one round left.

There once was a time when running a standard setting organization was a pretty sedate and predictable way to make a living. Now you need to put out press releases to deny rumors that combustible RFID tags are contaminating the money supply.

NEW AT CONSORTIUMINFO.ORG: Print single CSB articles or entire issues in PDF form, search the archive, and find over 50 new consortium listings.

News Shorts:   New Consortia are Back; Voice Comes to the Web; USPTO Gives Eolas the Thumbs Down; Security Standards Continue to Proliferate; Open Source Marches On in Governments; Intel Calls for Worldwide Digital Management; and much more.





Andrew Updegrove

A recurring question that underlies the legitimacy of modern standard setting methods is this: “What is a standard?”

Almost immediately the definitional process becomes complex, and even contentious. Is the Microsoft Windows operating system a standard, or not? In other words, should we judge a specification only by the process that created it, regardless of its uptake (or the lack thereof), or is success the more important criterion? Most people resolve this conundrum by adding a results-oriented modifier, calling widely-adopted commonalities like Windows a “de facto standard”.

But what about process-based “standards”? At the traditionalist end of the spectrum, there are those that advocate the use of the word “standard” only in relation to the work product of an accredited standards development organization, relegating the work product of even the most well-respected consortia to the second-class status of mere “specifications”. At the other end are those that find consortium-produced specifications, and even open source project derived work product, to be perfectly entitled to be called standards.

In our view, the essential element of a standard (sans the words “de facto”) is that it has been produced through an open process. Admittedly, this simply shifts the discussion to what the attributes of an “open process” are. But this shift is the most vital step along a line that begins with proprietary control and ends with standards that can be safely implemented without fear that some individual company, or group of companies, can exercise undue control over the future development and permitted uses of the standard.

Of course, all this becomes only so much dancing on the head of an etymological pin if the market ceases to care about process at all. Increasingly, some companies seem to be promoting this concept.

In this issue of the CSB, we review the burgeoning number of situations where a company, or group of companies, creates a specification outside an SSO. In some cases, the resulting work is shopped around to existing SSOs and offered for adoption and maintenance. This may be a useful and productive exercise, where an SSO might not have wished to allocate resources to the internal development of the specification in question, but might be quite happy to accept it for ongoing maintenance. The practice can be suspect, however, if the contributors of the specification have undue influence over the SSO that accepts the grant. And even when such is not the case, something may be lost where too few points of view are brought to bear in creating a specification during its initial conception.

Increasingly, SSOs are being bypassed entirely, with the developers of a new specification encouraging direct adoption by the industry without any assurances of continuity or other protections at all.

Another disturbing trend is the development of specifications in too-casual and chaotic a setting, even if that setting is arguably “open” in some respects, as seems to be occurring in the case of RSS standards. While there may be no proprietary interests at work, there are risks that specifications that are conceived in such a haphazard fashion may later prove to infringe third-party patent claims, to the chagrin of all those that have already implemented. The thousands of open source projects that are ongoing today run similar risks.

While we have always supported experimentation and evolution in the standard setting process, we believe that this trend of casualness is one that is unhealthy for the industry. Without process there is no protection from proprietary self-interest, or even from abandonment or tactical withdrawal of a specification by its owner. And while some material being offered to the marketplace may be purely functional rather than fundamental, it is easy to become complacent in accepting such offerings and incorporating them into products, perhaps failing to see the importance that these Trojan horses may later play.

Pushing the envelope is how productive evolution takes place, in the virtual as well as the natural world. But not every mutation is healthy, or deserves to propagate. Those who build to a “standard” would be well advised to be sure that the process that created that standard met minimum procedural requirements. And the industry at large would do well to discourage, rather than encourage, companies to spend their time developing specifications outside of an open process.


Copyright 2004 Andrew Updegrove



Andrew Updegrove

The creation of standards is a real-world enterprise. The creation of ICT standards is an especially real-time exercise as well. In consequence, process must be the servant of the need, as well as the guaranty of the openness of the result.

As reviewed in our last issue of the CSB in an article entitled “Past, Present and Future: the Accelerating Pace of Change”, the process of standard setting has evolved to meet the needs of those that require standards to conduct business. And while some of the results of this evolutionary process have had their detractors among traditionalists, the utility of the standards being created today is manifest.

As also noted in last month’s feature article, the pace of change in standard setting is continuing to accelerate, in response to the ever-quickening rate of technological evolution. As in any other real-world situation, this creates tension between expediency and quality. Over the past year, we have seen a number of developments that lead us to believe that the balance between expediency and quality may be tipping in the wrong direction. This article will examine a few of these examples, and suggest that the time has come to reexamine process in order to rebalance the equation.

I. The Dangers of Subcontracting

Consider the following recent announcement, as reported in InfoWorld on February 17, 2004:

Update: Microsoft heralds Web services for devices
BEA, Intel, Canon also part of WS-Discovery specification

Not that long ago, such an announcement might have described a joint venture among a group of companies to create a technology for their exclusive use. Today, it suggests a far different story.

What this announcement, and an increasing number of press releases of similar tenor, represent is a growing trend for a small and self-selected group of companies to join together to create a specification that serves their unique interests, and then either offer it to the market as a de facto standard, or market it to existing standards bodies for adoption, perhaps going through a “public comment” period before making the hand off. Not infrequently, one or more working groups are already active in existing SSOs trying to solve the same problem. The goal of the independent group of companies in this case is to head the formal process off at the pass, invariably by offering the alternative solution to another standard setting organization.

Not surprisingly, this sort of activity is greeted with hostility by those that suspect that the companies involved are trying to secure unique advantages that they could not gain, had they played by the traditional rules and worked with the entire industry to solve the problem within an existing standards organization.

Exactly this sort of suspicion was voiced in the case of this new Web services specification noted above. Sun Microsystems immediately denounced the announcement as yet another attempt to lead Web services down the track desired by its competitors. Microsoft replied that it would accept industry feedback and then offer the specification to an “as-yet-unnamed industry standards organization”, presumably either OASIS or the W3C.

If all of this sounds familiar, it’s because it has happened so often before. On August 9, 2002, Microsoft, IBM and BEA announced the publication of a suite of specifications to “collectively describe how to reliably define, create and connect multiple business processes in a Web services environment, and help organizations coordinate business processes and transactions within the enterprise and with partners and customers across heterogeneous systems.” Included in the specifications was a new language to describe business processes – “Business Process Execution Language for Web Services”, or BPEL4WS. BPEL4WS was eventually offered to OASIS, which accepted it after securing appropriate guarantees from the technology owners that the underlying intellectual property rights would be made available on royalty-free, RAND terms.

And it has already happened again, with the announcement on March 5 (this time by Microsoft, IBM, BEA and SAP) of yet another new specification: Web Services Metadata Exchange for Service Endpoints (WS-MetadataExchange). The latest specification continues the rapid release of the deliverables described in the “roadmap” for Web services previously conceived by Microsoft and IBM.

The question naturally arises whether the proponents or the detractors of this process have it right. To Microsoft, IBM and their partners, there is an urgent need for Web services standards that is not being met by the existing standards bodies. And it is certainly true that independent organizations like OASIS and the W3C will not accept an offered specification unless they believe that it is robust and appropriate, and will be available to implementers on appropriate license terms. Finally, there is the fact that there is no single standards body creating all Web services standards, and a challenge therefore arises in coordinating both development and results (see the May 2003 issue of the CSB, “Who Should Set the Standards for Web Services?”).

But the nagging question persists: can it be a good thing for a small group of companies, self-selected and understandably motivated by proprietary goals, to become de facto subcontractors to the standard setting process? Certainly, it is expedient. And for the majority of a standards organization’s members that is made up of non-competitors of the developer group, the practice may even seem benign. But ultimately standards are based on trust, and it is primarily process that offers the pragmatic assurances upon which trust is based.

II. Into the Lion’s Den?

If allowing companies to pre-bake the cake before handing it off to the baker for sale is risky, what if a company wants to skip the baker entirely?

On February 24, 2004, Microsoft Chairman and Chief Software Architect Bill Gates gave the keynote address at this year’s RSA Conference. In that address, he announced (in the words of the Microsoft press release):

[A] detailed vision and proposals on how technology can be used to help put an end to spam, including outlining the company's Coordinated Spam Reduction Initiative (CSRI) and technical specifications for the establishment of Caller ID for E-Mail…. Microsoft believes some relatively simple but systemwide changes to the e-mail infrastructure are needed to provide greater certainty about the origin of an e-mail message and to enable legitimate senders to more clearly distinguish themselves from spammers. (emphasis added)

While Gates stated that public comments on CSRI would be welcome, he did not state that Microsoft intended to offer CSRI to a standards body, either now or in the future. He went on to state that Microsoft had certain unnamed patent claims underlying elements of CSRI, which would be made available under royalty-free license terms. The FAQ sheet describing “Microsoft’s Anti-Spam Roadmap” that appears at the Microsoft online press room does not address these licensing terms. However, the patent license under which CSRI can be implemented may be found at the portion of the site at which the CSRI specification is posted for comment.

At first blush, the terms of the patent license are quite reassuring, and have great similarity to the type of licensing commitment that a standards body would typically require. For example, the license grant language reads as follows: “Microsoft and its Affiliates hereby grant you (“Licensee”) a royalty free, fully-paid, non-exclusive, worldwide license…”, in exchange for a cross license of any claims of the Licensee that would be infringed by an implementation of the specification. But when one looks at what isn’t in the license, the advisability of implementing CSRI becomes less obvious. For example:

  • The license is not expressly irrevocable
  • The reciprocal license of patent claims required from Licensees benefits Microsoft, but not the owners of other patent claims that might be infringed by an implementation of CSRI (except to the extent that Microsoft wishes to assert its license rights – which it has no obligation to do)
  • The licenses are not transferable, which means that every company in the distribution chain must obtain a license directly from Microsoft – which thereby collects more and more patent cross licenses, which again primarily benefits Microsoft
  • Microsoft has not disclosed what its patent claims are, and as a result, implementers are not able to design around those claims (as an SSO work group might), in order to avoid the patent claims entirely
  • Implementers can ask for changes or new features, but they can't demand their inclusion. Only Microsoft can decide which way CSRI evolves in the future
  • Microsoft can refuse to grant a license to a specific company if it so chooses, since it has not made a pledge to any third party (like a standards body) that it will license on a non-discriminatory basis
  • The definition of the specification in the license does not state whether it applies to future versions of CSRI. Hence, Microsoft has not expressly stated that it would license future versions of CSRI that might include new patent claims owned by it
  • If Microsoft wanted to move on to another solution in the future, it could discontinue supporting CSRI, even if existing implementers wished to continue using it
  • Microsoft could simply discontinue granting further licenses to new implementers at any time
  • And finally, if it wished, Microsoft at some future date could decide to bundle CSRI with Windows at no extra cost, thereby economically undercutting any of the companies that had incorporated CSRI into their own products

In short, the structure that Microsoft has offered to the industry is most akin to a users group, dressed up with a quasi-standards based patent license.

The Microsoft proposal is hardly a new concept. Vendors have tried to straddle the “owned but open” fence for many years, most notably personified by Sun’s abortive effort to have Java accepted as a standard through the ISO PAS process, followed by its long-term maintenance through the Java Community Process. But while some would find the Java process to be an acceptable compromise, that is hardly a rousing endorsement for allowing such processes – especially in watered-down form – to proliferate.

III. Applying Chaos Theory to Standard Setting

Perhaps it should not be a surprise that the wild and woolly world of blogging is not pursuing a staid path to standards development. But given the importance of RSS feeds for purposes that extend beyond the bloggers’ “art”, it may be regrettable. RSS, by the way, stands for either "Really Simple Syndication" or "Rich Site Summary" -- division over this standard begins with the fundamentals.

As a result, an ongoing dispute in this area was widely reported in on-line news outlets as mainstream as CNET, and as (how to say) alternative as The Temple of the Screaming-Penguin. At the center of the dispute is one Dave Winer, who until last summer was the gatekeeper of RSS, which had originally been created by Netscape and was by then owned by Winer’s UserLand. The most intense part of the tempest may have been sparked by a posting at (of course) a Blog.

That entry, innocently titled “I Like Pie”, was written by Tim Bray, a member of the W3C Technical Architecture Group. Bray was advocating consolidating competing RSS flavors into one specification, and bringing that version under the aegis of a standards organization. In what attempted to be a balanced review of Winer, Bray included the remark: “I observe that there are many people and organizations who seem unable to maintain a good working relationship with Dave.” With that, the online flamethrowers came out, and battle was joined.

At the same time, a competing standard (Atom) was being developed with the support of Google, IBM and various Blog tool vendors, offering further opportunities for division. On July 15, 2003, it was announced that the RSS specification had been conveyed by UserLand to the Berkman Center for Internet & Society at Harvard Law School. The specification became available under a Creative Commons license, as well as subject to the supervision of an Advisory Board. Not exactly a standards organization, but perhaps an improvement over the sole authority of a single individual.

Earlier this month, Winer suggested (yes, in a Blog post) a rapprochement with the supporters of Atom, proposing that the two specifications be merged into a backwards-compatible new version, which would be placed under the supervision of an Internet Engineering Task Force (IETF) working group. The post begins:

I'd like to make a constructive offer to the people who are working on Atom. And before stating the offer, let me say that I am open to counter-offers.

Certainly, negotiation by Blog is a novel way to build consensus around standards. And inevitably, such examples point out the fact that standards are too important to be produced by Brownian motion, with individual personalities jostling each other in chaotic fashion, and achieving useful, but fragile, results more by accident than design.

IV. The Cult of Personality

The RSS example provides an apt segue into another disturbing trend in the development of modern commonalities, particularly in the world of open source: the idea that salvation can be found through the strong leadership of the Great Leader. While the most famous example of the Technical Visionary as Benevolent Despot is Linus Torvalds, the genealogy of this approach extends back (at least) as far as Robert Scheifler, who for many years in the late 1980s and early 1990s was the director of the X Windows Consortium. While that consortium had a board of directors and a large members’ plenary, there was no question who was the decision maker when it came to technical matters.

As with the evolution of Linux, the X Windows software was well designed and widely implemented. In fact, the license agreement used to make it commercially available (created by Scheifler and this author) has been cited by Carl Cargill, the Director of Standards of Sun Microsystems, as the progenitor of the modern open source license.

But not all efforts led by an individual will be so effective. To state only the most obvious point, there are inevitable issues that relate to excessive dependency on a single individual, and not all individuals will be similarly effective even while they remain fully committed. Who knows what efforts, currently in process in some obscure corner of the Web, might achieve greatness if they were playing out on a wider stage, supported by the right cast and resources? There needs to be a way to have the best of both worlds – the creative ferment of open source, the supportive structure of a standard setting organization – and an easy and natural way for the best projects to progress from one platform to the other.

V. A Call for Rational Consolidation

The late evolutionary theorist Stephen Jay Gould is most famous for being the co-inventor of the concept of “punctuated equilibrium”, which postulates that species evolution occurs in comparatively rapid bursts, interrupted by longer periods of stasis. Certainly, the world of standard setting seems to be in the middle of such an explosion of creative change today. The challenge for tomorrow is how to harness the creativity and promise of the various trends described above, and wrap them in an adequate envelope of process that will make these new experiments worthwhile additions to the standard setting toolkit.

Is it possible to domesticate each of the activities described above? Perhaps not. But there are some logical routes to consider before we abandon the quest:

  • Subcontracting: Certainly, there could be a discussion of the parameters within which subcontracting could appropriately occur. Perhaps criteria could be derived that would set preconditions for such activities; failing those, SSOs would not accept the results.
  • Avoiding the Lion’s Den: Obviously, the Nancy Reagan approach is the most effective (“Just say no”). More realistically, asserting greater collective pressure to (at least) include minimum licensing requirements before adopters sign up would be an improvement. Best of all, of course, would be for vendors to avoid, whenever possible, implementing any specification with the potential to become a de facto standard, unless it has been turned over to an SSO.
  • Escaping Chaos: It may be that there is a transient place for chaos in standards development, particularly in very new and creative areas. In such situations, it may be necessary to allow the eddies to swirl for a while before things settle sufficiently to see goals clearly. But there is danger in becoming dependent on the product of such a process, due to the myriad risks that attend it. The industry would be better served by generating an alternative solution through a trusted process than to casually adopt a specification that has too unreliable a future.
  • Fighting Fascism: The world learned conclusively in the 1940s that succumbing to the temptation to blindly follow supreme leaders in the 1930s had been ill-considered. There is a place for great technical visionaries in standard setting, to be sure. That place is as the executive or technical director of a proper process that is designed to deliver dependable results, and to be able to survive the loss of the leader.

Ultimately, standard setting must be about process. It need not be about only a single process, but whatever process is employed must meet the same minimum standards of openness that have become well recognized through experience. It’s time for the standard setting world to begin using a bit more self-discipline in how it goes about the business of setting standards.


Copyright 2004 Andrew Updegrove





Andrew Updegrove

Long-time readers of the CSB will be well aware that we have followed the saga of chip technology licensing company Rambus closely. Since our last issue, Rambus has won (again) in its ongoing, ten-year-old battle to assert patent claims against those that implement certain SDRAM standards. This time, Rambus defeated the Federal Trade Commission (FTC) in the first round of the government agency’s effort to sanction the licensing company for its behavior in the JEDEC committee that created those standards. The action began on June 19, 2002, when the FTC filed a complaint against Rambus, charging the technology company with “deliberately engaging in a pattern of anticompetitive acts and practices that served to deceive an industry-wide standard-setting organization, resulting in adverse effects on competition and consumers.”

First-hand observers of the trial held last summer before Administrative Law Judge (ALJ) Stephen J. McGuire were not optimistic about the outcome of the trial, and with good reason: on February 17, McGuire dismissed the government’s complaint. And, in a 348-page opinion released a week later, the judge resoundingly rejected the FTC’s case on every count.

In the decision, the ALJ summarized the questions at issue as follows:

  • whether Rambus engaged in a pattern of deceptive, exclusionary conduct by subverting an open standards process
  • whether Rambus used that conduct to capture a monopoly in technology-related markets
  • whether Rambus' conduct violated antitrust law
  • whether Rambus' conduct resulted in anticompetitive injury

Judge McGuire then answered each question with a definitive “no”. Perhaps the holding that dismayed observers the most was the ALJ’s conclusion that the JEDEC policy merely encouraged, rather than required, early disclosure by participants in the standard setting process. Given that threshold determination, much of the rest of the decision became inevitable.

FTC Complaint Counsel Geoffrey Oliver filed a motion on March 1 to appeal the ALJ’s decision to the FTC Commissioners. The initial due date for briefs supporting the appeal was March 26, but on March 19 the FTC granted an extension until April 16. Once those briefs are filed, it is likely to be a long wait before the Commissioners announce a decision. If their verdict is the same as McGuire’s, it will be bad news for the defendants in the various lawsuits still outstanding between Rambus and those chip vendors that still refuse to pay royalties.

And it will be worse news for the integrity of the standard setting process, where it will appear to some that there is more to be gained by gaming, than by abiding by the rules.

Note: The host of this site, Gesmer Updegrove LLP, filed pro bono “friend of the court” briefs in the earlier case of Infineon v. Rambus with both the Federal Circuit and the United States Supreme Court. The latter brief was supported by five accredited standard setting organizations and six consortia, representing over 8,600 companies, universities and government agencies. We will be filing a new brief, based on the Supreme Court brief, in support of the FTC. The Supreme Court brief, and the full list of SSOs that supported it, may be viewed at this site.

If your organization or company is interested in supporting this brief, please contact us. There is no charge to participate, as Gesmer Updegrove will be filing the brief on a pro bono basis.


Copyright 2004 Andrew Updegrove





March 20, 2004

#14 Dan Mullen, Andrew Jackson and the Dark Side of the Web

There once was a time when running a standard setting organization was a pretty sedate and predictable way to make a living. You called meetings, you worked on keeping in touch with the members, and you made sure that the bills got paid. Sure, there were internal politics to deal with, and a bad economy would make you struggle to keep membership up, but those were small problems to face in exchange for a decent job and a pretty secure future.

But that, as the current saying goes, was then, and this is now. "Now" brings new demands on the director of a standards organization, like squelching Internet rumors about combustible RFID tags contaminating the money supply.

Come again?

Our story begins, allegedly, at a site billing itself on its home page as "The Home of Henry Makow Ph.D." Dr. Makow's site also states that it is dedicated to "Exposing Feminism and the New World Order". Articles listed in the "Best of Henry Makow" side bar have rather bewildering titles, such as "Communism: Wall Street's Utopian Hoax". Today, the lead article is entitled "Americans are Rothschild Proxies in Iraq". Obviously, such a site represents the type of reliable news portal to which one would give great credence.

Dr. Makow also has certain concerns regarding privacy, as in maintaining one's privacy from the watchful eyes of Big Brother. All of which quite naturally brings us to David and Denise.

D&D supposedly sent a letter to Dr. Makow (Ph.D.) indicating that Dave's wallet had set off the shoplifting monitor in the local truck stop. The problem seems to have arisen from the fact that Dave's wallet was stuffed with new $20 bills. Ever on the alert for the perfidious actions of Big Government, D&D immediately suspected... RFID tags!

Yes, RFID tags -- those insidious New Age tools for tracking honest citizens. D&D knew that they had to get to the bottom of the situation. As Denise (who is fond of single quotation marks) reported ("sics" omitted, so as not to interrupt the flow of Denise's prose):

Dave and I have brainstormed the fact that most items can be 'microwaved' to fry the 'rfid' chip, thus elimination of tracking by our government. So we chose to 'microwave' our cash...

Sure enough, as indisputably proven by the pictures supplied by D&D, the right eye of Andrew Jackson on every new twenty ‘exploded’. Denise reported that:

Now we have to take all of our bills to the bank and have them replaced, cause they are now all 'burnt'.

D&D have wisely decided to wrap all of their larger bills in aluminum foil from now on, to thwart future government efforts to track them.

Leaving aside for the time being whether tracking D&D (assuming they actually exist) might not be such a terrible idea, we now move to Alex Jones' site, which "adapted" the D&D letter, reproduced the pictures, and included the following helpful note:

This article has been linked all over the Internet. We want to make it clear that $20 bills will only 'pop' or 'explode' in certain microwaves. We've had E mails saying they do, they don't, 'you're all kooks' etc etc. What is confirmed is the public policy to embed US and European money with high tech tracking devices as part of the hulking surveillance society.

It is an interesting coincidence that Alex also finds single quotes to be a useful device to attractively set off common words.

A brief Google search does indeed confirm that the article -- Alex's, anyway -- has been linked all over the Internet. It also can lead one to interesting observations, such as the following, which appear under a posting unambiguously titled "$20 RFID story is horseshit...":

JC Suez: The same thing happens if you take a stack of copy paper and microwave it. A central point in the stack heats and eventually ignites and burns up and down the stack from that point.

Alex sez: Also of that they say they are messing with the NEW twenties, but in fact those are the old ones. you can tell because the portrait of Jackson has the circle around it, which is absent in the new twenties.

Indeed. Now, one might assume that the nascent RFID industry would not be too concerned about the invidious news revealed by Dr. Makow (or Alex Jones, as the case may be) through the good offices of alert citizens Dave and Denise. But one would be wrong.

Simply ignoring vicious rumors about RFID tags is not what you would expect from the alert director of a standards organization that is promoting RFID technology. Why? Because you're not paid the big bucks to just sit around and let this type of thing get out of hand.

Of course, what you would do is put out a press release. After all, news organizations invariably give front page coverage to standard setting organization news. So that is exactly what AIM President Dan Mullen decided to do, after first confirming his facts. Thus it is that Mr. Mullen did what every other good 'Netizen had already done: he pulled out his wallet and tossed the contents into his microwave.

From the resulting press release (Mullen prefers double quotation marks):

Do $20 bills explode when placed in a microwave oven? Do they contain hidden radio frequency identification (RFID) tags? Can the government track you through your cash?... In order to determine whether or not there was a security feature in the $20 bill that would cause this phenomenon, AIM North America tested a new $20 bill in a microwave oven. After 1 minute on high, the bill was barely warm. Next, an RFID tag of the type used by commercial laundries was placed adjacent to Andrew Jackson’s portrait on a new $20 bill and again placed in the microwave oven. After only 2 seconds, the antenna and chip on the RFID tag began to “fry.” After 20 seconds, the destruction of the RFID tag did set the bill on fire. The area around Andrew Jackson’s right eye, where the “covert” tag is supposed to be hidden, was entirely unaffected. The attached scans are of the $20 bill that was tested. The burned area to the right of the portrait is what remains of the laundry tag.

All this sounds well and good. However, on closer review of the press release, we noted that no "scans" were actually attached to it!

We knew what Dr. Makow would do, so we immediately dialed up Mr. Mullen. Tellingly, no one answered the phone! Apologists will, of course, quibble by noting that today is a Saturday. I expect that Henry Makow, Ph.D. would know better. So we checked the AIM website, to find a link to the press release that we had earlier received, addressed to "". It wasn't there!

Unnerved by this strange descent into the dark side of the Web, conspiracy theories and phantom press releases, we closed our browser down.

And there the matter stands, at least for now. Having no $20 bills in my pocket, this journalist is unable to confirm or deny the real or imagined presence of RFID tags in $20 bills (either new or old). And so, in the modern Fox News fashion, I can only conclude this entry with the simple question:

RFID Tags in $20 bills: Big Brother Plot or Horseshit?

... You Decide!

Comments? Email:

Copyright 2004 Andrew Updegrove

# # #

Useful Links and Information

The strange world of Dr. Henry Makow, Ph.D.

The equally strange world of Alex Jones:

PrisonPlanet article on exploding Jacksons: PrisonPlanet is the site of Lucifer Media, which also produces other sites with names like "Church of Virus", "Modern Fetish" and "Comatose Rose" ("an online and paper zine serving the Canadian goth-industrial-metal-experimental community").

Full text of AIM mystery press release:

Date: 16 March 2004
For more information, contact: Dan Mullen
AIM North America

The Myth of the Amazing Exploding $20 Bill
Do $20 bills explode when placed in a microwave oven? Do they contain hidden radio frequency identification (RFID) tags? Can the government track you through your cash? One website recently published an “expose” about the supposed presence of an RFID chip placed behind Andrew Jackson’s right eye on the new $20 bill. This site claims that microwaving bills will cause the RFID tag to explode and burn the bill, thus exposing the “hidden” tag. Microwaving an RFID tag will cause it to create some spectacular sparks and will cause it to “pop.” As a result, some people have begun wrapping their cash in aluminum foil to “foil” reading of the RFID tag.

In fact, placing a $20 bill in a microwave oven does not cause it to explode, burn or affect it in any way. In order to determine whether or not there was a security feature in the $20 bill that would cause this phenomenon, AIM North America tested a new $20 bill in a microwave oven. After 1 minute on high, the bill was barely warm. Next, an RFID tag of the type used by commercial laundries was placed adjacent to Andrew Jackson’s portrait on a new $20 bill and again placed in the microwave oven. After only 2 seconds, the antenna and chip on the RFID tag began to “fry.” After 20 seconds, the destruction of the RFID tag did set the bill on fire. The area around Andrew Jackson’s right eye, where the “covert” tag is supposed to be hidden, was entirely unaffected. The attached scans are of the $20 bill that was tested. The burned area to the right of the portrait is what remains of the laundry tag.

Due to the power of the Internet, the source of the rumor has been quoted internationally (at least one article based on this was published in Flemish on a Belgian web site). Even a casual examination of a $20 bill will reveal that there is no hidden tag. Holding the bill up to a bright light would expose any RFID chip or antenna.

For reliable information on RFID, please visit

AIM’s mission is to stimulate the understanding, adoption and use of technology by providing timely, unbiased and commercial-free news and information. For information about upcoming educational events, visit the AIM calendar at

Postings are made to the Standards Blog on a regular basis. Bookmark:


With this issue, you will notice some of the new features that we’ve recently added to the Consortium Standards Bulletin: you can now print a single article from the CSB or the entire issue in a properly formatted Word/PDF version. Over the next month, we will be converting each article in the preceding 13 issues of the CSB to the same format and print availability – more than 50 articles in all.

We have also created a convenient title and abstract index of all articles to date, with links that will take you to entire issues, or individual articles. You can find the new index here.

We are also in the process of expanding the Consortium and Standards List, which is already the most complete index on the Internet of consortia, accredited standards bodies and related resources, with over 300 organizations listed. For each listing, we provide an overview of the SSO and its mission, as well as links to its site, and, if a password is not required, its specifications and intellectual property rights policy as well. You can find the complete list at

Please know that we always welcome contributions of articles, press releases and links to other appropriate sites. Here’s how you can help:

  • If you are not already sending your press releases to us for inclusion in the News Section of our site and in the CSB, please add us to your press release distribution list

Finally, we welcome you to link your site to ours. If you have other suggestions on how we can work together, please contact us.



Every day, we scan the web for all of the news and press releases that relate to standards, and aggregate that content at the News Section of ConsortiumInfo.org. For up-to-date information, bookmark our News page, or take advantage of our RSS feed. Updates are usually posted on Mondays and Wednesdays. The following is a selection of the many stories from the past month that you can find digested there.

New Consortia

New consortia are back in style: In yet another sign of better economic times...consortia are back. Ever since the bursting of the Internet bubble, funding new consortia has been near the bottom of corporate budget priorities. But now, new organizations seem to be popping up like crocuses in spring. The last several weeks witnessed the announcement of a spate of new organizations, particularly in the areas of security and anti-spam technology, for which the annual RSA convention provided an appropriate platform to publicize a debut. Gesmer Updegrove LLP, which sponsors the CSB, helped form the two new consortia noted below. The first, the Cyber Security Industry Alliance, was one of the new organizations announced in San Francisco at this year's RSA conference, and marks a new direction for the technology industry: banding together to influence legislation - a practice heretofore largely the province of non-technology companies. The second new organization, the Near Field Communications Forum, announced its launch at CEBIT in Hanover, Germany. The NFC Forum addresses a new use for wireless communications: the "touch" space, where two devices (one of which might be a camera, cell phone or smart card) need only touch, or be placed next to, another device in order to communicate with it.


CSIA Press Center, San Francisco, CA, February 25, 2004 -- The Cyber Security Industry Alliance (CSIA) was launched today at the RSA® Conference 2004 by a group of 12 innovative security software, hardware and service vendors. The CSIA is a non-profit corporate-membership organization, whose mission is to improve cyber security through public policy initiatives, public sector partnerships, corporate outreach, academic programs, alignment behind emerging industry technology standards and public education. The CSIA also announced the appointment of Paul Kurtz as its executive director. Most recently, Kurtz served as Special Assistant to the President and Senior Director for Critical Infrastructure Protection on the White House's Homeland Security Council (HSC). He was responsible for developing the White House's strategy and policy for protecting the nation's critical infrastructure.

Nokia, Philips and Sony Establish the Near Field Communication, NFC, Forum

NFC-Forum News, Hanover, Germany, March 18, 2004 -- Nokia Corporation, Royal Philips Electronics (NYSE: PHG, AEX: PHI) and Sony Corporation today announced that they will establish the Near Field Communication (NFC) Forum to enable the use of touch-based interactions in consumer electronics, mobile devices, PCs, smart objects and for payment purposes. Touch-based interactions will allow users to access content and services in an intuitive way by touching smart objects and connecting devices just by holding them next to each other. The new forum will promote implementation and standardization of NFC technology to ensure interoperability between devices and services.

New Standards/Specifications

Convergence is everywhere: Just as physics is seeking a unified theory of everything, it seems as if the ICT world's goal is to connect everything with everything. It's a rather breathtaking goal, and some of the component pieces are impressive as well. As reported in the following press release, the W3C is taking steps to "give voice to the Web" with the granting of full Recommendation status to two new specifications. Utilizing such standards, telephones will be able to add another two billion nodes to the Web.

World Wide Web Consortium Issues VoiceXML 2.0 and Speech Recognition Grammar as W3C Recommendations, March 16, 2004 -- Giving voice to the Web, the World Wide Web Consortium (W3C) has published VoiceXML 2.0 and Speech Recognition Grammar Specification (SRGS) as W3C Recommendations. The goal of VoiceXML 2.0 is to bring the advantages of Web-based development and content delivery to interactive voice response applications. SRGS is key to VoiceXML's support for speech recognition, and is used by developers to describe end-users' responses to spoken prompts. Today's announcement marks the advancement to Recommendation status of the first two specifications in W3C's Speech Interface Framework. Aimed at the world's estimated two billion fixed line and mobile phones, W3C's Speech Interface Framework will allow an unprecedented number of people to use any telephone to interact with appropriately designed Web-based services via key pads, spoken commands, listening to pre-recorded speech, synthetic speech and music.

Intellectual Property Issues

Over the past year, there have been a number of cases where patents have been asserted against important technologies as well as less significant, but still pervasive, technical features. This has raised concerns relating not only to the disruption that would occur as a result of the patents asserted, but also over the system that allowed the patents in question to be granted at all. On the "left" of the IT thought spectrum, there are those who think that certain enabling areas of technology ought not be subject to private ownership at all, even if they are otherwise valid and eligible for patent protection. More conservative participants would settle for greater consistency and care by the USPTO. We covered this situation in detail in the November issue of the Consortium Standards Bulletin, entitled "Do IT Patents Work?" The stories continue to issue briskly in this area, as evidenced by those that we have selected from this month's news, and included below.

Not over yet, but a step in the right direction: Microsoft in particular, and webmasters in general, got a much needed boost when the U.S. Patent and Trademark Office (USPTO) made a preliminary judgment against certain patent claims of Eolas Technologies and the University of California. Last year, the two patent owners had won a $521 million judgment that Microsoft said would lead it to make changes to its Explorer Web browser. Following an appeal by Tim Berners-Lee of the W3C to Undersecretary of Commerce for Intellectual Property James E. Rogan to initiate a reexamination of the Eolas patent, the USPTO agreed to review the patent claims in question. The process is not over yet, but the first inning goes to Microsoft...and grants a reprieve (at least) to Webmasters everywhere.

Eolas Patent Invalid
By Matt Hicks

eWeek, March 5, 2004 -- A federal patent examiner's initial review has found the Web browser patent at the center of a major verdict against Microsoft Corp. to be invalid. Eolas Technologies Inc. won a $521 million jury verdict last year in its patent-infringement lawsuit against Microsoft, prompting an outcry from the Web's major standards body. The World Wide Web Consortium (W3C) asked the U.S. Patent and Trademark Office to re-examine the patent, which it did starting in November.

This Can't be Good: As noted in the story above, a great hue and cry was raised in the Web world as a result of the patent infringement claims brought by Eolas Technologies against Microsoft. Now it's Microsoft that is playing the heavy, as noted in the following story: Microsoft has been granted a patent covering certain aspects of using XML, one of the core technologies of the Web. The immediate questions arise: how will Microsoft wield this patent, and what will be the reaction of the IT community?

Microsoft Locks Up XML Patent, February 12, 2004 -- The speculation as to whether Microsoft intends to patent XML technology is over. Microsoft has been granted United States patent 6,687,897 for "XML script automation." The patent, awarded by the U.S. Patent and Trademark Office on February 3, appears to deal with basic XML functionality. Specifically, it describes a method for unpacking multiple scripts contained within a single XML file. According to the application filed by Microsoft, the patent involves "systems, methods and data structures for encompassing scripts written in one or more scripting languages in a single file." "The scripts of a computer system are organized into a single file using Extensible Language Markup (XML)," Microsoft's patent document continues. The document explained that each script is delimited by a file element and the script's instructions are delimited by a code element within each file element. When a script is executed, the file is analyzed to create a list of script names or functional descriptions of the scripts.
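Read literally, the claimed structure is simple enough to sketch. The fragment below shows one way such a multi-script XML container might be unpacked; note that the element and attribute names ("scripts", "file", "name", "code") are our own illustrative assumptions, not drawn from the patent text itself.

```python
import xml.etree.ElementTree as ET

# A hypothetical XML container holding two scripts, following the structure
# the patent describes: each script is delimited by a "file" element, and
# the script's instructions by a "code" element within that file element.
DOCUMENT = """\
<scripts>
  <file name="greet">
    <code>print('hello')</code>
  </file>
  <file name="farewell">
    <code>print('goodbye')</code>
  </file>
</scripts>
"""

def list_scripts(xml_text):
    """Analyze the file to build a list of (script name, instructions) pairs."""
    root = ET.fromstring(xml_text)
    return [(f.get("name"), f.findtext("code")) for f in root.findall("file")]

for name, code in list_scripts(DOCUMENT):
    print(name, "->", code)
```

Whether packaging scripts this way is novel enough to deserve a patent is, of course, exactly the question the IT community is now asking.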

Towards a more perfect patent policy: Over the years, a number of investigators have endeavored to contrast the intellectual property rights policies of standard setting organizations for various purposes. Recently, GTW Associates, an international standards and trade policy consultancy with expertise in the strategic role of standards in competitiveness of businesses, organizations and countries in the global marketplace, was commissioned by the Japan New Energy and Industrial Development Organization and the Japan Strategic Program for the Creation, Protection and Exploitation of Intellectual Property to create a set of criteria for the "Evaluation of a Patent Policy for a Standard Setting Organization". The first draft of those criteria has been posted at the GTW website. Comments are invited through April 15, 2004.

DRAFT Criteria (in development) for the Evaluation of a patent policy for a Standards Setting Organization -- This First public DRAFT Dated Monday, March 15, 2004 is intended only for stimulation of discussion and comments...[and is] intended to be helpful to compare and contrast the patent policies of standards developing organizations in a consistent and logical manner....The preparation of a list of criteria for comparison of IPR policies hopefully will represent a step towards such clarity and transparency. GTW Associates requests comments and feedback on the list by April 15, 2004 in order to refine and improve it before applying the criteria to evaluate the patent policies of several standards developing organizations. Comments may be sent to or may be posted to an online discussion web. GTW welcomes additional criteria that may be helpful and editorial suggestions or revisions of the preliminary criteria to improve their wording.

Standards and Society

Regrettable but necessary: As we have reported on a regular basis, while the war on terror plays out in vivid detail across television screens and in newspaper headlines, a large number of standards bodies are working quietly behind the scenes to set diverse standards that help lower risk, and facilitate response to emergencies when they occur. The following three stories cover the adoption of nine different standards by the U.S. Department of Homeland Security in the areas of radiological and nuclear detection, an OASIS first-responder equipment specification, and a NIST security standard for government computer systems.

U.S. Department of Homeland Security Adopts First Responder Equipment American National Standards

ANSI News and Publications, Washington, DC, February 26, 2004 -- U.S. Department of Homeland Security Under Secretary Charles McQueary and Under Secretary Mike Brown announced today the DHS adoption of five American National Standards for personal protective equipment for first responders. The standards were developed by the National Fire Protection Association (NFPA) and subsequently approved by ANSI.

CAP v1.0 Approved by OASIS Emergency Management TC

The Cover Pages, February 26, 2004 -- Members of the OASIS Emergency Management Technical Committee have approved a Committee Draft specification for the Common Alerting Protocol Version 1.0 and have recommended its advancement for approval by OASIS as an OASIS standard. The Common Alerting Protocol (CAP) is "a simple but general format for exchanging all-hazard emergency alerts and public warnings over all kinds of networks. CAP allows a consistent warning message to be disseminated simultaneously over many different warning systems, thus increasing warning effectiveness while simplifying the warning task."
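To give a flavor of what "a simple but general format" means in practice, the sketch below assembles a minimal CAP-style alert. The element names follow the CAP 1.0 draft as commonly described, but this is an abbreviated illustration, not a schema-valid message (among other things, CAP requires an XML namespace and additional elements omitted here); consult the OASIS specification for the normative schema.

```python
import xml.etree.ElementTree as ET

def build_alert(identifier, sender, sent, event, headline):
    """Assemble a minimal CAP-style alert message (illustrative only)."""
    alert = ET.Element("alert")
    # Top-level routing and status fields.
    for tag, text in [("identifier", identifier), ("sender", sender),
                      ("sent", sent), ("status", "Actual"),
                      ("msgType", "Alert"), ("scope", "Public")]:
        ET.SubElement(alert, tag).text = text
    # The "info" block carries the human-readable description of the hazard.
    info = ET.SubElement(alert, "info")
    ET.SubElement(info, "event").text = event
    ET.SubElement(info, "headline").text = headline
    return alert

msg = build_alert("KSTO1055887203", "example@example.org",
                  "2004-02-26T16:49:00-07:00", "SEVERE THUNDERSTORM",
                  "Severe thunderstorm warning")
print(ET.tostring(msg, encoding="unicode"))
```

Because the same tree can be serialized once and fed to many different warning systems, a single authoring step can drive sirens, broadcast interruptions and web feeds alike -- which is precisely the interoperability argument behind CAP.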

Federal Standard Issued For Improving IT Security, February 13, 2004 -- Computer security specialists at the National Institute of Standards and Technology (NIST) have developed a new standard to help federal agencies better protect their computer networks. The standard provides a new way to categorize government information and information systems...The standard was developed following passage of the Federal Information Security Management Act (FISMA) of 2002. Federal Information Processing Standard (FIPS) 199, Standards for the Security Categorization of Federal Information and Information Systems, introduces significant changes in how the federal government protects information and the computerized networks that store information by giving detailed guidance on how to categorize systems....The mandatory standard will be a critical component of an agency’s risk management program. As required by FISMA, NIST is also developing a companion standard that will specify minimum security requirements for all federal systems. A copy of the standard is available at

Story Updates

Convergence is not for wimps: Three of the themes that we have been reporting on over the past year come together in the following story: the rapid evolution and deployment of wireless standards; the convergence of IT and communications standards; and the need for new standard setting methodologies capable of making all this happen in the rapid fashion that modern technological innovation and product deployment demand. The following article suggests that the reach of product introductions may be exceeding traditional standard setting's grasp.

Wi-Fi Interoperability Problem on Rise, Hanover, Germany, March 18, 2004 -- Increasing complexity and stronger security is making it harder for new wireless computer networking products to hook up with each other, an industry group promoting the technology said Thursday at the CeBIT tech fair. The Wi-Fi Alliance said that 22 percent of the devices - such as wireless networking cards for computers, access ports and printer servers - submitted for testing at its four partner laboratories failed to work on a network on the first try.

On the other hand, what about the Patriot Act? Legitimate concerns as well as off-the-wall rumors (see Dan Mullen, Andrew Jackson and the Dark Side of the Web) have swirled around the topic of RFID tags for some time now. In this article, security expert Jay Cline suggests that the industry would do well to nip this PR issue in the bud, before privacy concerns become so embedded in the public consciousness that a promising standards-based technology becomes the "fluoride in the water supply" issue of the decade.

The RFID Privacy Scare Is Overblown
By: Jay Cline

ComputerWorld, March 15, 2004 -- The privacy scare surrounding radio frequency identification tags is greatly overblown. No company or government agency will be secretly scanning your house to find out what products you've purchased, because there's no feasible way to do so. But if RFID chipmakers don't soon allay these fears, the escalating public emotion about this issue may effectively ban the most valuable implementations of this remarkable technology.

Pieces of the PDS: In the February, 2004, issue of the Consortium Standards Bulletin, we envisioned the "Personal DataSphere." In the following article, Intel CEO Craig Barrett comes out in favor of one key element of the PDS: worldwide agreement on digital rights management. While it's comforting to see Intel embrace one important feature of the PDS, it's disappointing to see the PDS, as feared, continue to evolve in piecemeal fashion.

Intel's Barrett Calls for More Flexible DRM System
By: Martyn Williams, February 24, 2004 -- Intel CEO Craig Barrett says consumers should be allowed to manipulate the content they own. Speaking in Tokyo to promote Intel's vision of the future digital home, Barrett called for the adoption of a worldwide DRM (digital rights management) system that allows consumers the flexibility to manipulate the content they own in ways of their choosing.... At the heart of the company's envisaged digital home is a set of standards that allow computer and consumer electronics devices to interconnect and communicate with each other.

WS-I turns two: In the May, 2003 issue of the Consortium Standards Bulletin, we profiled the Web Services Interoperability Organization ("New Wine - Old Bottles: WS-I Brings a New Dimension to the Art of Making Standards Succeed"). With its base set of specifications now complete, the WS-I is turning its attention to prioritizing second level specification objectives. We will continue to cover this intriguing organization as it embarks on phase two of its development program.

WS-I Casts Eye on New Profiles
By: Darryl K. Taft, February 16, 2004 -- As the Web Services Interoperability Organization turns 2 this month, the group is looking forward to releasing new drafts and tools. Since establishing a base set of specifications for creating interoperable Web services in the past year, the group is now looking at how it will handle attachments and security.

What to do about "quagmires": Over the past year, we have followed the evolution of Web services standards -- and the multiple standard setting organizations involved in this area -- very closely (see, for example, "Who Should Set the Standards for Web Services?"). In this article, the IBM representative who is perhaps most in the middle of what he refers to as the "quagmire" of Web services standards describes the tunnel that he finds the IT world in -- and also talks about the light he sees at its end.

Q&A: Tom Glover, IBM and WS-I Web Services Exec, February 13, 2004 -- There's a lot of interest in Web services and the processes to create them. By most estimates, the market for applications that talk to one another to render services is a multi-billion-dollar affair. But it's also rife with uncertainty because of a quagmire of standards developed to solve barriers like interoperability, security and manageability. Tom Glover wears two Web services hats: senior program manager, Web services standards for IBM; and IBM director, president, and chairman of the board within the Web Services-Interoperability (WS-I) Organization.

Open Source

Marching to an open source drummer: Governments have always had their own way of looking at technology (after all, who else still gets to use WordPerfect besides U.S. agency staff?). That's something for the technology establishment to keep in mind, as a tipping point may be approaching in the open source area. The following article in InfoWorld surveys government open source adoption and attitudes, and concludes that 2004 may be a critical year for open source in the public sector.

U.S. Government Seeks the Open Road
By: Grant Gross and Ed Scannell

InfoWorld, March 12, 2004 -- 2004 may be the year for open source software to catch on in a big way in government agencies. Government agencies have implemented open source solutions that range from Linux-running, data-collection computers on Naval Oceanographic Office survey ships to a Web-based tool that allows the U.S. Agency for International Development (USAID) to quickly process the visas of foreign workers scheduled to train in the United States.

Other New Work Product

All kinds of output: Collaborations can result in all types of useful results besides standards: best practices, white papers, studies, awards, and so on. The following press release summarizes a report commissioned by International SEMATECH that looks into the multiple sources upon which semiconductors are based - which are often exotic, sometimes have single geographical sources, and often can be found only in volatile parts of the globe. The results of any study of this supply chain are often sobering, especially in light of the increasing dependency of just about everything on chips.

Techcet Group Releases Reports on Critical Semiconductor Materials, Genoa, NV, and Austin, TX, March 11, 2004 -- Techcet Group, LLC has released its 2003/2004 Critical Materials Reports highlighting resource and supply issues for raw materials vital to the semiconductor industry. The reports were commissioned by International SEMATECH and subsequently have been made available to other customers.


Chaos (?) and the art of standard setting: In the feature article of this issue, we review various centrifugal forces that are jeopardizing the stability of standard setting. One of those forces is the rise of informal, ad hoc standards, evolved through...blog posts (and sometimes flames). The following is representative of the news emanating from the realm of content syndication. For more, see "A Retreat from Process Quality."

Google Spurns RSS for Rising Blog Format: Atom
By: Paul Festa

CNET, February 11, 2004 -- Google's Blogger service is bypassing Really Simple Syndication in favor of an alternative technology, a move that has sparked more discord in a bitter dispute over Web log syndication formats. The search giant, which acquired the service last year, began allowing its million-plus members to syndicate their online diaries to other Web sites last month. To implement the feature, it chose the new Atom format instead of the widely used, older RSS.