Consortium Standards Bulletin - August 2004


August 2004
Vol III, No. 8

Open Source - Coming of Age

Open source is at a crossroads: nurtured on the Internet by engineers, it's now ready for prime time. But is the process that has brought us this far ready to meet the demands of commercial customers?
More and more end users are becoming interested in open source products, but some have doubts about the security, support, risk of infringement and completeness of open source software. What are the stumbling blocks between here and broad adoption, and how can they be cleared away? Three industry experts give their views.
All standards risk running afoul of patent rights. Open source carries the added risk of copyright infringement, as well as a reputation for a more distributed development process. Recent events illustrate how addressing infringement fears is essential to convincing end users that investing in open source software is a safe, as well as a smart, bet.
We’re used to thinking about standards that specify all manner of measurable attributes - size, wavelength, voltage, and so on. But what happens when a standard must take time into account, especially when it’s a whole lot of time?
An interactive conference to address barriers to open source implementation.
Micro Fuel Cells Ready to Go; Wireless Standards Proliferate (Again); The SEC and Navy Embrace XML; New Report Assesses Chinese Standards Strategy; Spammers Outgun Good Guys (Again); Implantable Human RFID Tags Seek FDA Approval; DVD Standards Battles (Again); Jericho Forum Flexes End User Muscles; and (as always) much more
STANDARDS ARE SERIOUS (AREN'T THEY?): Not all standards are gold.




Andrew Updegrove

This month, we dedicate our issue to the increasingly important topic of open source. More specifically, we focus on the inevitable transition of the open source development model from a labor of engineering love to a more commercially influenced development process.

Our choice of open source as the theme of this month’s issue is occasioned not only by the great strides that open source has made in the last year, but also by an upcoming conference which we are proud to co-sponsor. Titled "Open Source - Open Standards: Maximizing Utility While Managing Exposure," the conference will seek to interactively identify and analyze the remaining hurdles that may stand in the way of ubiquitous adoption of open source-based products.

As a sort of “prequel” to the conference, this issue will address the same concerns. In particular, we are pleased to present an interview with three well-known experts in the open source world (two of whom will be presenters at the conference). Each of these contributors shares his thoughts on where the open source and business communities need to go in pursuit of a broader and more robust world of open source software.

We hope you enjoy this issue. If you do, consider attending “Open Source – Open Standards” as well.

    Best Regards,
    Andrew Updegrove
    Editor and Publisher



Andrew Updegrove

Adolescence is a heady period of life, as well as a turbulent and confusing one. All at the same time, there is boundless opportunity, impatience with external authority, the freedom to experiment, and the potential to fall flat on one’s face. It is also a time of transition, when old freedoms give way to new rules and responsibilities.

Open source has entered its adolescence. Most famously, Linux is gathering momentum, as evidenced by an ever-growing number of major customers, vendors and service providers that have committed to the cause. In fact, some less visible software has been even more widely adopted: the Apache Web server, for example, is estimated to enjoy a 64% market share.

But while open source is clearly flexing its muscles, those who develop and use open source software will have to grapple with the challenges, as well as the opportunities, that broader markets entail.

The most obvious example of such growing pains is the assertion by SCO that Linux infringes upon its proprietary rights. Whether or not one gives credence to SCO's allegations, the very public campaign conducted in the press by SCO has made potential open source customers focus closely on the open source process. Can a global, virtual web of individual developers create a complex software product without risk of accidental -- or deliberate -- infringement of intellectual property rights (IPR)? If so, are there changes that need to be made to the process to ensure that result?

IPR concerns are not the only issues that open source proponents need to address. In order for the open source model to continue to make inroads into the corporate and government markets, a high degree of trust will be needed in the reliability, security, longevity, completeness, ease of use, service availability and rationality of open source products. Can such concerns be adequately addressed within an open source process? If so, again, what changes are needed to achieve the necessary results?

To date, the open source community has charted its own course from concept to code with a refreshing and creative variety of approaches. Existing standard setting organizations might well learn from these new models. Indeed, some consortia (such as the W3C) have already incorporated open source projects into their work program. Still, open source projects are most often run by individuals, and not companies. Indeed, in the case of the Apache Web server, it is the individual developers themselves that indirectly own the software, through membership in the Apache Software Foundation. Will the largest IT companies continue to be willing to commit their strategic direction to platforms over which they can have so little influence?

While vendor control can, of course, have its bad side, the participation of support providers is a precondition to broad implementation of many types of open source software. Commercial vendors and service providers are also more in touch with customer needs, and more interested in satisfying their demands. Before open source software can supplant installed, proprietary solutions, potential customers need to be convinced that open source-based products will rapidly evolve to meet real-world functional needs and concerns, rather than simply be “look alikes” of yesterday’s proprietary software.

In order for trust in open source software to be earned, we believe that the process by which open source software is created will need to become further institutionalized. In other words, some of the rebelliousness and free form approach of the open source movement may need to be tamed. While thousands of projects can (and should) continue to be launched in an impromptu fashion, more ambitious, business-critical projects will sometimes require a carefully conceived, funded and staffed structure in which to be conducted. At the same time, the creativity and the passion that has distinguished the emergence of the open source model to date must be preserved.

Already, this evolution is in evidence. While SourceForge continues to host thousands of projects that have only as much discipline as the participants desire, more formal efforts have been in operation for years, such as those underlying Linux, Apache, Eclipse, Mozilla, and OpenOffice, each of which has produced useful, respected and, in some cases, broadly implemented software. Other models have also been tested, such as the well-funded Open Source Development Labs.

In the years ahead, introspection and cooperation will be needed in order for all constituencies to create similarly useful software that is broadly adopted. Sometimes difficult compromises may be needed, as time to market concerns require new discipline, and those that provide funding attach expectations to those funds that may be new to the development process.

Successfully surviving adolescence brings depth, wisdom, greater perspective, and expanded potential – but it also requires the patience of all concerned. The challenge today is to guide the open source process through its teenage years with as little angst, and as much fulfilled promise, as possible. Pragmatism, mutual respect and cooperation on all sides will help ensure a successful result.


Copyright 2004 Andrew Updegrove



Andrew Updegrove

Introduction: The evolution of open source from a concept to a broadly implemented development and licensing model has been impressive, and in many ways unprecedented. Rather than a top-down process launched by vendors, it has been very much a bottom-up movement launched by the employees of vendors, working in their own free time.

Today, the interests of vendors and individual engineers are converging. Significantly, customers are also taking an active interest in open source, rather than simply buying what vendors choose to offer them. Governments at all levels in particular have made public commitments to the open source licensing model, commanding further attention from the vendors that would like to provide the products and services that would accompany a migration to open source platforms by these substantial customers.

While this transition offers great promise, it also brings the challenge of meeting higher user expectations as the market emerges from its early adopter phase. Similarly, with success comes competition, as well as assertions of intellectual property rights. Playing on a broader court brings contact with more elbows, and the game can only pick up speed as the stakes are increased.

Three Experts: In order to dig deeper into the challenges that lie between early and broad adoption of open source, and to investigate how those challenges may best be addressed, we interviewed three leaders in the open source field. Each is a true believer in open source, but also a pragmatist who is committed to making open source a widely adopted reality. They are:

John Terpstra: John’s business card title is CEO of PrimaStasys, Inc., but his additional credits include helping form the Desktop Linux Consortium, being a long-term member of the Samba development team, and authoring several books. His reputation is as a visionary in the open source community, with particular expertise on the adoption of open source software in key business applications.

John Weathersby: John is the founder and executive director of the Open Source Software Institute (OSSI), a non-profit organization promoting the development and implementation of open source software for governmental and academic users. Before founding OSSI, he was a co-founder of SAIR Linux and GNU Certification, one of the industry's early leaders in Linux training and certification, which went from concept to becoming part of a publicly-traded entity in the space of 14 months.

Jim McQuillan: Jim is the founder of the Linux Terminal Server Project (LTSP), which has received worldwide recognition as the standard method of deploying thin clients in a GNU/Linux environment. He is also an owner of DisklessWorkstations.Com, a supplier of thin client hardware and services to the Linux market, and, as a consultant, deploys Linux-based solutions in medical offices.

Questions and Answers: Our interviews were conducted by telephone and email on August 13 – 15. Here’s what our experts had to say:

CSB: First, a few questions for context: Why are you interested in helping open source become ubiquitous as a business reality and a licensing model?

JT: Our freedom to think and act, to share knowledge and information, and to deploy concepts to the maximum benefit of society, is vital for my children and yours. I believe that someone who works hard should be generously rewarded also. Open source software forces a change of business model from being software and intellectual property focused to one where satisfying the service needs of customers must lie at the heart of business activity (it always did anyhow). What open source promotes is nothing new, but it does attract much attention because it is more difficult to provide sustainable service than it is to “sell a license to use software”. Open source software as a business factor necessitates customer needs satisfaction through service.

JW: OSSI's mission is to promote open source within the public sector. The philosophical argument behind this is simple: we believe that the adoption and use of open source solutions within the government represents a very wise use of public funds. As taxpayers, we have the right to expect that our dollars will be used in the most fiscally responsible, as well as technically efficient, manner possible.

From a business perspective, OSSI was formed to help promote and facilitate the adoption of open source solutions within the government market segment. But achieving that goal will have a broader impact as well. The government is one of the largest purchasers and users of IT products and services in the world. Acceptance of open source as a viable technical and business solution within the public sector will drive the entire IT industry's business strategy to include open source as a part of all of their offerings.

JM: I think it is important that consumers and business are not locked into proprietary solutions -- especially when we talk about the standards used for communications and interoperability. No one company should be allowed to own the standards that have become so important to our everyday lives.

CSB: What specific open source projects are you personally involved in?

JT: I helped to form the Samba Team. I’ve been active in answering user requests on the Samba mailing lists, have managed the bug tracking system, and have written most of the project documentation over a 9 year period.

I was also a co-founder of the United Linux initiative, and am currently authoring a series of 5 to 7 books for the Bruce Perens' Open Source series from Prentice Hall, demonstrating “by Example” how open source software can be deployed.

JW: Currently OSSI is working with a branch of the DoD to help secure the NIST/FIPS 140-2 certification for OpenSSL. We are also involved with the U.S. Navy on several projects that include technical and business case studies with regards to their present and future usage of open source. We've also begun working on a Homeland Security project that will generate several open source applications for law enforcement agencies. These will be submitted to a public repository so that public entities throughout the nation can use and benefit from these efforts.

JM: I’m directly involved in LTSP, and in the deployment of Linux in general. I’ve been developing Unix and networking solutions since 1984, and have been working with Linux since 1995.

CSB: Now let’s turn to your views on the challenges that open source is facing today. What do you think are the biggest roadblocks standing in the way of broader adoption of the open source model?

JT: Linux and BSD UNIX companies are targeting proprietary UNIX business. This is a much smaller market than the Microsoft market. The vertical business solution market that is the stronghold of traditional UNIX operations also has a repurchase cycle that is much longer than that of the infrastructure computing market that Microsoft services. Having a long sales cycle and a smaller market can make open source less appealing for a product or service supplier. Also, I believe that open source oriented businesses must focus more extensively on how to meet customer needs. Much open source software is still too difficult for use by non-technical consumers.

JW: There are several obstacles that seem to challenge most open source adoption strategies, especially for public entities. These include: policy, support, the economics/business model, and general management's resistance to change.

We are working with entities now that want to more broadly adopt open source, but they have to do it in a way that is logical to their existing policy code and structure. Simply getting your arms around the concept of open source can be challenging if you approach the topic cold. It takes time, patience and persistence.

The question of support is a common concern. It's great that open source software costs less, but who does the user call if it breaks? That question is very easily answered, but it's one that always comes up and we, as the open source community and industry, must respect the concerns of the client/customer.

The support issue leads into concerns over economic and business models. The customer needs to have confidence that you, or someone else, will be there to fix it if it breaks, or blinks, or needs updating or tweaking, and will not be out of business. The very public commitments by the largest IT providers have gone a long way toward calming nerves by saying that open source is good enough for them to sell and service, so it must be good enough to buy. It is, again, a matter of the buyer's confidence.

And finally, resistance to change. To some, open source is a brand new concept. To others, it's been around longer than almost all proprietary solutions. But the only opinion that counts is that of the person or people that are making a specific purchasing or strategic IT decision as to how viable open source would be within their system. ... Selling open source takes patience, persistence and respecting the concerns of those who are considering adopting open source as part of their system.

JM: I don't see roadblocks in the way of broader adoption. They are more like speed bumps. There are many things that are just slowing down the adoption of open source technologies. For example, through the LTSP, I'm involved in deploying Linux in schools. Many people in the school systems are still resistant to teaching anything other than the Microsoft applications, citing that they want their students to be prepared for the real world. And somehow, teaching them only MS Word is going to do that. I think that's just wrong. They should be teaching the kids “word processing”, not “MS Word.” When kids take driver's education, they don't learn to drive “Chevy cars”, do they?

CSB: And now a similar question, but with a different twist: What are the biggest business challenges standing in the way of broad adoption of the open source model?

JT: First and foremost, few software vendors have identified where the sustainable short-term, medium-term and long-term business opportunities lie. There are 2.5 million UNIX systems and 16 million Microsoft Windows servers. The Windows users want a choice of alternatives and have little to choose from. UNIX users have greater choice. Which market would you rather sell to?

Second, open source oriented businesses need to better analyze the market and adopt channel strategies that will enable all viable market segments to be reached in a profitable and cost effective manner. Today most vendors work largely through the large equipment vendors and have ignored the small to medium reseller channel.

JW: A primary challenge is simply the amount of time it takes for any new model to become adopted and accepted as truly mainstream. We are seeing that acceptance now. Open source has crossed the chasm and is now being considered and adopted as a stable, secure, reliable solution.

But the most substantial hurdle that still remains in my opinion is the continued refinement of sustainable business models for those that offer open source products and services.

JM: Two things. First, the opinion that open source software is not supported by vendors. People are afraid to use a product if they can't call the company and get help. The second is ease of use. Open source software still has some work to do in this area, but fortunately, it is being done.

CSB: What specific technical areas do you see that must be addressed?

JT: We need open public standards for all software. Open standards help to level the playing field. An additional area for improvement is ease of initial configuration to provide services equivalent to those provided by Microsoft Small Business Server. Administration needs to meet the same standards as well.

JM: Open source has already provided a much more robust operating system than we've seen from the commercial software companies. Certainly, we need to keep going in that direction, continuing to improve the reliability and security of the OS.

CSB: Do you see a need for standards in support of open source, and if so, in what areas?

JT: The vast majority of software that is used today lacks standards for file formats and for protocols. The dominant office productivity package (MS Office) uses proprietary file formats that change regularly. That change is often a catalyst for software updates and gives the vendor effective customer lock-in. Windows networking protocols are constantly being updated and yet remain insecure as evidenced by constant virus and worm threats. We need open public standards to help create more secure networking practices.

JW: The adoption of common standards is beneficial to all sections of the software marketplace and especially to the end-users.

JM: I don't think it is a matter of “standards supporting open source.” I think that ALL standards just need to be open. There are standards that are recognized by the standards bodies (ISO, ANSI, IETF, etc.), and then there are standards that people have simply adopted. The official standards are free to use, but the de facto standards, many times, are closed and controlled by individual corporations who stand to benefit greatly by keeping those standards closed. There should be real standards for document and data interchange, so that all software is free to use them. The fact that most people think MS-Word “.doc” and “.xls” formats are OK to be used for sending documents around via email is absurd. If people are going to be sharing documents, the standard format of those documents must be open for all to use.

CSB: What clear milestones do you see that must be passed or hurdles surmounted in order for open source to become more prevalent?

JT: The first challenge is that open source vendors need to become more focused on customer needs. Rather than just offer a work-alike alternative, we need to gain a better understanding of information needs and how people use information, and then deliver the next generation of software products – all of which freely inter-operate through unified interfaces.

JW: Open source clearly has a stronghold within the backend system elements. I find it very interesting to watch how the challenge for desktop market share is unfolding. I don't know if Microsoft will ever be completely unseated from the desktop, but I think that they realize that Linux poses a serious challenge to their current position. At least they should. It will be very interesting to see how this plays out over the next few years.

JM: I see the open source world as a freight train moving down the tracks at 50 miles per hour. It's going to keep moving no matter what. Picking up boxcars along the way, giving it more mass and momentum and not stopping for anything. As more companies start using open source software, and announcing the fact that they are doing that, the speed of the freight train will increase.

CSB: Currently, the open source process spectrum ranges from SourceForge to the more ordered Linux community to the more consortium-like Eclipse model, to projects hosted within standards consortia like the W3C, to corporate-funded development shops such as OSDL. We’d like to ask some questions now that focus on how this diverse process landscape is likely to evolve in the years to come.

Do you believe that all of the current open source development models will perpetuate? If not, which ones will best serve business, government and other large enterprise users? And will individual engineers still play a vital role?

JT: Human nature is to coalesce around challenges. Similar problems attract like-minded people. The opportunity today lies in better communication across project groups, and in greater recognition of the vital role of standards. Standards facilitate competition.

JW: One of the true strengths of open source is its non-exclusive development nature. This is obviously what makes all of these development models possible. But most projects are supported, or directly funded, by a corporate sponsor who has some vested interest in the particular program or solution. This is even true within the various communities.

I think these various development models represent a very healthy community and I hope that they all continue to some degree. But I also hope that open source never loses that “labor of love” element that makes it special. How could any project, strictly developed by business intentions, compare with the likes of a Debian [a free, Linux-based operating system] or so many other projects that were created and maintained by a small group of volunteers because it was “their thing to do?”

That's what makes us a “community,” and not just an industry.

JM: I think there's room for all of those places to continue to provide viable open source code. Good ideas are coming from everywhere. Sites like SourceForge allow anyone to jump into the game and contribute what they can. The barrier to entry is completely missing in the world of open source. Anyone with an idea can contribute.

CSB: Can a Linux-like process be tightened sufficiently to provide the same degree of assurance against infringement that a consortium or SDO process can (not that those are close to foolproof, either)?

JT: They ultimately must. That is part of the paradigm shift in the wake of recent IP litigation. The answer is Yes! Already, the code version control system used by all major projects means it is easy to trace code origins. The benefit is that contributors expose their contributions in a way that increases public accountability.

JM: There are 2 issues here: Copyrights and Patents. With open source, we need to be certain that proprietary code doesn't end up mixed in with open source code. This requires careful watching of the code contributions. I think a more formal process of accepting code from contributors will eventually be worked out. The bigger problem is with software patents. This can stop a project dead in its tracks.

CSB: If changes are needed to a Linux-like process, what are they?

JT: Open source developers are learning to be more disciplined in code development and in the process of patch acceptance.

JW: To be seen...

JM: I wish I knew.

CSB: Is the age of process innovation over? Do you believe that current development models will go into a consolidation phase leading to less flexibility, or will process experimentation continue?

JT: Complex issue. I believe there will be a division into a purely research/experimental community and a more commercially oriented community.

JW: As any development process matures, the public face of innovation seems to slow, but I think we're just getting started. This market is so dynamic and there are so many really smart people involved and empowered with the freedom to create that I believe that the open source process will only continue to grow. It will evolve, of course, but that too is inevitable.

JM: It will always continue. People are always looking for a way to build a better mousetrap.

CSB: Same question regarding licensing models: is it time to quit experimenting and settle on a smaller number of well-understood licensing models, or is that not necessary?

JT: Licensing terms are emotive issues. I believe we have quite some ways to go before there will be a significant coalescence. He who writes the code gets to choose the license.

JW: At Linux World in August, Martin Fink of HP said that he deals with numerous new open source projects each week and he thinks there are more than enough existing licenses to deal with almost every circumstance that arises. I tend to agree with him on that.

Eric Raymond said that a lot of the various licenses are simply “vanity licenses,” which also makes sense. With that in mind, I believe that we will see a refining of the license process. The market will dictate that there is only a need for a handful of relevant licenses that cover the majority of issues.

JM: Maybe some consolidation is necessary. There are many choices of license to use in an open source project, and more are being invented every day.

CSB: As Open Source becomes more mission critical, who will pay the bills to support the process, and what will they expect to receive in exchange?

JT: As big companies become more committed to open source, they will want to hire the best people. Logically, they will want to hire out of the open source community. That will allow the same companies to have more influence over open source projects, as these employees will be more successful in getting the types of changes they would most like to see.

JW: I think that you'll see more corporate entities “adopt” programs and try to brand them as their own. This process will alienate some developers and you'll see forks and spin-offs, but that's part of the process.

Along these lines, I hope that we'll see more positive corporate adoption of programs like JBoss with Apache or Sun's sponsorship of OpenOffice through their StarOffice product. I think these are good examples of open source and corporate interests existing in a positive, symbiotic relationship among community, industry and end-user.

JM: As more commercial users are involved, I expect to see more corporate sponsorship of projects.

CSB: One last question to wrap things up: five years from now, what do you believe a typical Open Source project will look like (if there is such a thing)?

JT: Five years ago at Samba we put together our first bug-tracking system; today, we have a much more effective and efficient one, with the equivalent of a near full-time person who determines what is and what isn’t a bug, and triages what must be done, and when, in response. In short, today we have more discipline and structure. The challenge one encounters in this type of evolution is balancing process against productivity, and keeping the flame alive. This process (and challenge) will continue. As a result, some projects will scatter, others will go back into enclaves, and a significant number will scale into something new. The open source model is demonstrating greater scalability than the closed source proprietary model.

JW: From a government perspective I think you'll see more public sponsorship of open source projects, and this will benefit everyone. When public dollars go to develop, or enhance, an open source project, then that project becomes a true public asset.

I do not buy the argument that this creates a “competitive” situation between government and industry. I don't believe that we should have to pay time and again for the same program when it has become a commodity.

As for industry, I think we'll see more of the type of service plays we see growing now within HP, IBM, Computer Associates, and the other industry service leaders. From a product development side, we'll see more industry players leveraging open source not only to provide cost savings and efficiency advantages to their customers, but also to increase their own profit margins as more open source solutions mature and become part of the common resource that all can take from and contribute back to. In short, I see this as a very healthy model that will grow and prosper.

JM: In five years, I expect that there will still be many of the same issues. Microsoft will still be pushing their proprietary solutions, and the open source groups will still be here. But, I think the pie will be sliced up in very different proportions. Five years from now, I expect that open source will be much more prevalent than it is now. Much like it has changed from 5 years ago.

Open source is important. Extremely important. Open source software powers the Internet, and it will power the future as well.


Copyright 2004 Andrew Updegrove



Andrew Updegrove

Ever since SCO CEO Darl McBride launched his ongoing assault against Linux, the risk of infringement claims against the popular open source operating system has seemed reminiscent of United States Homeland Security concerns. In the last several weeks, the Linux patent claim danger alert (if there was one) would doubtless have been raised from yellow to orange, due to a number of revelations and reports in the press.

Most of the fear, uncertainty and doubt related to concerns that Microsoft intended, or at least someday might be able, to assert patents against its open source rival.

As with the recent major terror alert involving U.S. financial institutions (which stemmed from the discovery of years-old information on a laptop), the infringement scare began with another piece of ancient intelligence – a two-year-old HP memo. The memo circulated around the Internet for a while before being posted publicly on July 19, following confirmation of its authenticity by HP.

In that memo, then-vice president of strategic architecture Gary Campbell stated: "Basically, Microsoft is going to use the legal system to shut down open-source software. Microsoft could attack open-source software for patent infringements against (computer makers), Linux distributors, and, least likely, open-source developers." The memo informed HP executives that Microsoft was also "specifically upset about" Samba, Apache and Sendmail, each of which is a widely distributed open source program (used to share files, host Web sites and route email, respectively).

It didn’t ease concerns when Bill Gates announced ten days later that Microsoft intended to file at least 3,000 patent applications in 2004 – up from only about 2,000 filings in 2003, and closing rapidly on perennial patent champ IBM, which was awarded 3,415 patents in 2003.

The icing on the paranoia cake was the release on August 1st of a report that found that Linux potentially infringed an impressive 238 patents. The analysis had been commissioned by Open Source Risk Management, which plans to sell infringement insurance to the Linux community. While over a third of the patents identified are owned by companies commonly thought to be Linux allies, that would still leave many patents in potentially hostile hands, or in the hands of those who might be acquired by companies that do not have kind feelings for Linux.

True, the study also noted that none of the patents in question had yet been tested in court, leaving open the possibility that many claims could be successfully challenged as having been anticipated by “prior art.” Still, 238 is a very large number, and even if half of them could be invalidated, a great deal of work could be required either to gain non-assertion promises from the various remaining patent owners, or to try to design around the patent claims in question.

The open source community and its allies quickly circled the wagons in response. IBM promptly announced through a representative at the LinuxWorld Conference and Expo that "IBM has no intention of asserting its patent portfolio against the Linux kernel, unless of course we are forced to defend ourselves.”

Similarly, a spokeswoman for HP stated that the two-year-old HP memo is “not relevant today,” and that HP has no knowledge of any specific patents that could be asserted by Microsoft against any of the open source software referred to in the memo.

Those who thought about the Microsoft announcement could also take comfort in the fact that only a few years ago, Microsoft was filing far fewer patent applications: on the order of 1,000 a year. And, of course, patents take a number of years to issue, and cannot be asserted against another implementer until such time (if ever) as the Patent and Trademark Office agrees to grant the claims applied for.

As a result, an ultimately granted patent applied for today could not be wielded against an implementer until some indefinite point in the future. Given that reality, it will be years before Microsoft would begin to rack up the same issued patent score as IBM, which has actually been granted over 3,000 patents per year for each of the last three years.

Eventually things settled down. If Tom Ridge were to be tracking the situation, he would probably have lowered the infringement danger gauge back to yellow. But the episode underscores the vulnerability of open source to the kind of FUD attacks that can undermine market penetration, with or without the filing of actual infringement suits.

Happily, it is within the power of the open source community to tighten intellectual property security far more easily than the Department of Homeland Security can secure the nation. The Linux development community, while large, does not have thousands of miles of unprotected physical frontiers to protect.

The lesson to be learned is that end-user trust can be earned by tightening the development process in such a way that infringement becomes difficult, rather than easy, to imagine. The revelation that a Linux user might run afoul of 238 patents is hardly conducive to achieving that kind of trust.

It would be wise for not just the Linux community, but the open software community as a whole, to take whatever measures may be necessary to ensure that infringement claims in the future will be met with doubt about the veracity of the accuser, rather than fear, uncertainty and doubt over the possible reality of a threat.


Copyright 2004 Andrew Updegrove


#19 A Standard for the Ages

We’re used to thinking about standards that specify all manner of measurable attributes - size, wavelength, voltage, and so on. But what happens when a standard must take time into account, especially when it’s a whole lot of time?

Recently, the United States Court of Appeals for the District of Columbia Circuit determined that even 10,000 years is not enough time, when it comes to setting standards for the safe storage of atomic waste.

The standard and the legal case at hand involve the proposed national nuclear waste storage site in Yucca Mountain, Nevada, in pursuit of which the Department of Energy (DOE) has spent more than $9 billion during the better part of two decades. The site is intended to provide permanent storage for the atomic wastes created over the past 60 odd years by the U.S. military and private sectors. Some of these wastes have half-lives of over one million years.

It is easy to agree that a permanent storage site should be designed to avoid, to the greatest extent possible, the escape of deadly radioactive wastes. It is also easy to agree that the design of such a facility must be durable, and that the site should be located where any accidental release from a containment vessel would be unlikely to reach groundwater or the atmosphere. But how safe is safe enough, when the time dimension shades into what (in human terms) is effectively eternity?

In the most recent Yucca Mountain legal decision, a three-judge panel of the Appeals Court did not rule on how close to eternity atomic waste must be contained in order to be completely safe, but it did conclude that 10,000 years is not close enough.

One of the factors that the judges found to be significant was a National Academy of Sciences (NAS) report that determined that the leakage paths of escaped wastes could be predicted for as much as one million years. Given that the NAS believed it possible to predict leakage for much more than 10,000 years, the court held that this relevant information must be taken into account in site design, especially since a 1992 federal law relating to the burial of atomic waste specifically obligated the DOE to heed the Academy’s recommendations.

But if 10,000 years is not enough, what is?

The NAS thinks that 300,000 years is the right number, given that it estimates that radiation leakage at Yucca Mountain is likely to peak in about 270,000 years. At that time, someone standing outside the public boundary of the site would absorb about 60 times the amount of radiation deemed to be safe.

Of course, such a determination raises the question of whether any standard can purport to be meaningful over such a vast period of time, given the vagaries not only of geologic processes, but the unpredictable behavior of people as well. Serious thought has been given to whether human society can avoid a serious breakdown during even the next few thousand years, and therefore whether any danger sign placed at the site could be understood by our descendants after such a collapse.

Ultimately, the atomic waste storage question has more to do with standards of responsibility than with setting requirements for geologists and engineers. However far a court may decide to push a standard, it is impossible to know whether humanity will be capable of knowing what lies under the mountain over such a span of time, if indeed humanity manages to continue to exist for so long a period at all.

In the end, those setting an atomic waste storage “standard” must accept that at best such a set of requirements can mitigate, but not eliminate, the danger to human and animal life in the distant future. Were the goal to be to effectively manage risk for the full duration of the existence of atomic waste, then standards would address production as well as storage. Only by limiting the creation of atomic by-products to those that have half-lives that are realistic in human terms can any standard purport to provide a potentially complete solution.

Presumably, such a result is not now technically achievable. As a result, what the courts must adjudicate is the standard of our society’s conscience – how long is “long enough,” knowing that the standard ultimately approved will never be truly sufficient? And is such a standard truly a safety standard at all, or rather a measure of the limits to our sense of responsibility for life on earth?


Copyright 2004 Andrew Updegrove

# # #

Useful Links and Information:

United States Court of Appeals Decision on Yucca Mountain:

DOE Statement on U.S. Court of Appeals Decision Regarding Yucca Mountain:

Office of Civilian Radioactive Waste Management public website for Yucca Mountain:

National Academy of Sciences, “Technical Bases for Yucca Mountain Standards”:

EPA Yucca Mountain website FAQs: Radiation Protection Standards

Postings are made to the Standards Blog on a regular basis.



Maximizing Utility While Managing Exposure

We are pleased to co-sponsor Open Source – Open Standards, an interactive conference designed to address the types of issues discussed in this issue of the CSB. The conference will be held on September 12 – 14, in Scottsdale, Arizona, and attendance is free to qualified attendees (hotel accommodations are not included). Keynote speakers include Glenn Otis Brown (Executive Director, Creative Commons), Larry Rosen (General Counsel, Open Source Initiative) and Bruce Perens (Senior Research Scientist for Open Source, George Washington University), and panelists will represent a wide variety of open source groups, media, consortia, academia, major corporations and other participants in the open source community.

The conference will focus on four key areas:

  • Business Risk and Exposure in Open Source Utilization
  • The Open Standards Deficit in Open Source: Problems in IP Management, Stability, and Market Growth
  • Implications for Open Source Adoption
  • Strengthening Open Source: Consideration of Alternative Solutions

For more information, visit:

To register, visit:




For up-to-date news every day, bookmark the Standards News Section

Or take advantage of our RSS Feed

New Initiatives

But what about the Energizer® Bunny? Most of the popular press attention in the United States relating to fuel cells has focused on their problematic potential to power automobiles. Far less notice has been given to the ability of the same technology to be employed in miniaturized form to provide more operating time for portable electronic devices, such as cell phones and PDAs. This technology has now matured to the point where products are nearing market launch, and standards efforts are therefore being launched to ensure that fuel cells are as “swappable” as their more-familiar electric battery competitors.

Micro fuel cell standard body created
IDG News Services, August 2, 2004 --
The International Electrotechnical Commission (IEC) has formed a working group to draw up standards to ensure compatibility between micro fuel cells. The fuel cells could be launched as early as next year as an alternative power source for handheld electronic devices such as music players and digital cameras. Formation of the new Working Group 10 (WG10) of the IEC's Technical Committee 105 (TC105) was approved in a vote on 30 July, Toshiba said. It will be chaired by Dr Fumio Ueno, who works for Toshiba's display devices and components control centre. The group itself was proposed by Toshiba along with other Japanese companies. WG10 is tasked with setting an international standard to ensure compatibility between the fuel cells and their fuel cartridges - especially important since the cells need to be regularly recharged with methanol. ...Full Story

Thinking big while thinking small: New areas of standard setting, like the sun, are always emerging from just beyond the horizon. When they do, appropriate structures need to be put in place. The following item reports on ANSI’s efforts to get nanotechnology standard setting off to an orderly start.

ANSI Establishes Nanotechnology Standards Panel
ANSI News and Publications, New York, NY, August 5, 2004 -- The American National Standards Institute announced today the formation of the Nanotechnology Standards Panel (ANSI-NSP), a new coordinating body for the development of standards in the area of nanotechnology. Nanotechnology refers to the manufacturing or manipulating of matter at the atomic and molecular level, or nanoscale. The panel will convene September 29-30, 2004, in Gaithersburg, MD, to focus its initial work on nomenclature and terminology. ANSI was approached by the Office of Science and Technology Policy (OSTP) in the Executive Office of the President to address this area of standardization in support of academics, various industries, the investment community and government agencies that utilize nanotechnology. ...Full Story

Looking Ahead

If this is Wednesday, it must be Ultrawideband: The pace of wireless data standards development continues unabated, with each new standard extending attributes such as range, speed and security. This month’s news includes an update on a promising new technology that has been referred to as “Bluetooth on steroids”, and that promises transfer rates roughly two orders of magnitude higher. But even as the new technology was being announced, so was another, which some commentators said could “leapfrog” it. Our first item below focuses on the supposedly steroidal Ultrawideband technology, while the second describes one of the two (so far) entrants contending to be incorporated into IEEE 802.11n – a next generation standard intended to transmit data at over 100 Mbit/sec. That technology is being proposed by a new group calling itself the WWiSE (for World Wide Spectrum Efficiency) Consortium.

Ultrawideband: A Better Bluetooth
By: Matt Hamblen
ComputerWorld, August 3, 2004 -- The wireless personal-area network technology is more than 100 times faster than Bluetooth, but business applications are still a long way off. Ultrawideband wireless technology has been called "Bluetooth on steroids." Like Bluetooth, its personal-area network (PAN) cousin, UWB is designed to replace cables with short-range, wireless connections, but it offers the much higher bandwidth needed to support multimedia data streams at very low power levels. And because UWB can communicate both relative distance and position, it can be used for tracking equipment, containers or other objects. In a recent technology demonstration, Freescale Semiconductor showed a UWB device that transmitted at a data rate of 110Mbit/sec at a range of up to 10 meters. That bandwidth is 100 times faster than Bluetooth and twice the capacity of the fastest Wi-Fi networks. It is enough to pump three concurrent video streams over a single UWB connection. ...Full Story

Industry Coalition Floats Proposal for 802.11n
By: Mark Hachman
eWeek, August 12, 2004 -- A second group is floating its technical proposal to replace Wi-Fi, in advance of a meeting next month to begin resolving the issue. The so-called WWiSE consortium comprising Airgo Networks, Bermai, Broadcom, Conexant, STMicroelectronics and Texas Instruments held a conference call Thursday morning to introduce its new proposal for the 802.11n standard. "WWiSE" stands for "World Wide Spectrum Efficiency," a characteristic of the new proposed standard, the companies said. By 2006 or 2007, the then-completed 802.11n standard will replace the current mix of Wi-Fi technologies, or so the industry hopes. ...Full Story

What’s next under the Big Top: OASIS-watchers are well aware that this interesting organization maintains a “big tent” under which many different types of initiatives are welcome. In the following interview, OASIS CEO and president Patrick Gannon discusses what challenges his organization is interested in tackling next.

OASIS Expects User Input on Web Services, Looking to Grid
Computer Business Review Online, August 3, 2004 --
Web services standards for vertical industries and grid computing are on the agenda at OASIS. Patrick Gannon, OASIS chief executive and president, told ComputerWire he expects more input from end-users during the next year to help shape web services, while work on standards around grid and autonomic computing will come to life through a number of new committees. The fire, though, seems to have gone from at least one previously hot topic -- the existence of rival standards for the way web services talk to each other, standards that have been stewarded by OASIS and the World Wide Web Consortium (W3C). ...Full Story

Standards and Society

Wi-LAN isn’t just about business WANs: While most of the press on the new Wi-LAN standard has focused on its potential to enable business use in urban areas, the technology also has the potential to expand the reach of Internet-based services in the third world as well. The following press release reports on the deployment of this new technology in Bangladesh, under a Sustainable Development Networking Program funded by the United Nations.

Wi-LAN and Norban Deploy Broadband Wireless Products for United Nations Development Program, Calgary, Alberta, July 28, 2004 -- Wi-LAN Inc. (TSX:WIN), a global provider of broadband wireless communications products and technologies, and Norban Communications Limited (Norban), a provider of network solutions and services including data, voice and video convergence, today announced Wi-LAN's broadband wireless access products and accessories have been delivered as part of the Sustainable Development Networking Program (SDNP), an initiative under the Ministry of Environment and Forest of Bangladesh funded by the United Nations Development Program. "The objective of the Sustainable Development Networking Program is to facilitate the establishment of Internet exchanges at selected Ministry of Environment and Forest offices. These Internet exchanges connect the various Dhaka-based internet service providers to optimize the flow of email traffic within the country," said Dr. Hakikur Rahman, Project Coordinator, SDNP. ...Full Story

Standards across the water: Even with the end of the Cold War over a decade ago, the United States and Russia haven’t been known for maintaining close working relationships in the military arena. Indeed, when the Kursk sank in August 2000, Russia delayed asking for any international assistance until it was too late to help the Russian sailors trapped below. Now, it appears that this may be changing. The new standardized escape mechanisms being cooperatively developed by nations around the world for naval use include individual ejector seat/survival suits akin, in concept, to those used by air force crews.

Collaborative Sub Rescue Efforts Focus of Working Group in Russia
Michael Foutch
Navy Newsstand, Washington, D.C., July 23, 2004 -- U.S. Navy representatives attended the annual NATO Submarine Escape and Rescue Working Group in St. Petersburg, Russia, from June 25-July 3. Representatives from 26 nations attended the summit on a humanitarian basis to standardize rescue procedures and equipment. According to Capt. Chris Murray, deputy director, Deep Submergence Systems (N773B) on the staff of the Submarine Warfare Division, this was the first NATO working group of its kind to meet outside a NATO country. "We're working on standardizing submarine escape and rescue equipment and procedures, so that any country can rescue or help another should the situation arise," Murray said. ...Full Story

Standards and your Business

Now you know what the “X” in “SarbOx” stands for: With the adoption of Sarbanes-Oxley, securities compliance is now a much more time consuming and expensive chore for public companies. The XBRL Consortium has been hard at work to make such tasks easier, and the United States Securities and Exchange Commission has announced that it will work with that Consortium to explore the benefits of permitting reporting companies to make use of the Consortium’s work product. Meanwhile, in other XML news, the Navy is saying “yes” to more existing commercial products (by endorsing XML standards), while saying “no dice” to vendors that like to add proprietary extensions to otherwise open standards-based products.

US Securities and Exchange Commission Evaluates XBRL for SEC Financial
Cover Pages, August 3, 2004 --
Announcements from the US Securities and Exchange Commission (SEC) and XBRL-US describe an initiative to assess benefits of XML-tagged data and to consider accepting voluntary supplemental filings of financial data using the XML-based Extensible Business Reporting Language (XBRL). XBRL is a royalty-free, open software specification that uses XML data tags to describe financial information analyzed and exchanged by accounting firms and government agencies. ...Full Story
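To make the idea of "XML data tags" concrete, the fragment below is a simplified, purely illustrative sketch of what a single tagged line item might look like. The element name, period and figure are invented for this example, and a real XBRL instance document would also carry formal namespace declarations plus context and unit definitions:

    <xbrl>
      <!-- hypothetical tagged revenue figure for fiscal year 2003, stated in US dollars -->
      <us-gaap:Revenues contextRef="FY2003" unitRef="USD" decimals="0">
        1250000000
      </us-gaap:Revenues>
    </xbrl>

The point is that a figure such as annual revenue travels with machine-readable labels identifying what it is, the period it covers and the currency it is stated in, rather than being buried in the formatting of a printed filing.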

XML Standards Battle is Brewing Over Navy's Data-Sharing Plans
Joab Jackson
Government Computer News, August 2, 2004 -- Proposed Navy rules for Extensible Markup Language use are forcing other agencies to take a stand on how they share and reuse data. The Navy plans to adopt international interoperability standards that would eliminate document type definitions, or DTDs, which many agencies now use for sharing documents...The new rulebook endorses the W3C XML component recommendations, as well as others from ISO, OASIS, UBL, and UN/CEFACT. By adhering to worldwide standards, the Navy intends to base its future applications more heavily on commercial products, Green said, in accordance with Office of Management and Budget Circular A-119, which encourages voluntary consensus standards. ...Full Story

Story Updates

What has 1.3 billion people and wants its own standards? The answer, of course, is China. In our May 2004 issue (Standards as Trade Barriers) we used the recent fencing between the United States and China over Wi-Fi standards to highlight how standards can be impacted by national economic agendas. Now, Deloitte Touche Tohmatsu, the global accounting firm, has released a detailed report on China’s standards agenda, making it essential reading for anyone who needs to know how the intentions of this closely managed, vast purchaser and vendor are likely to affect the global economy in the years ahead.

Changing China: Will China's technology standards reshape your industry?, August 4, 2004 -- Everyone recognizes China as a low-cost manufacturer and a huge potential market. But most do not realize China is emerging as a key player in shaping global technology standards. This new Deloitte Research study looks at how China's growing influence on standards could define global competition in the technology, media and telecommunications sector for years to come. From operating systems and software applications to storage media, wireless communications and satellite positioning, Chinese government agencies and companies are looking to break the hold of developed economies on standards and working to shape new technology standards for economic advantage. ...Full Story

Download the full report here.

Good news on spam? C’mon! In our June 2004 issue, we focused on the ever-rising challenges of security in an Internet-linked, email-dependent world (see Standards and Security). The following selections highlight several aspects of the continuing security battle. The first reports that regulatory efforts, in the form of the Can-Spam law, are failing to do the job (even compliance by legitimate businesses appears to be sagging), while the second reports on progress on a standards effort in the IETF that may bring some incremental relief. The next item, however, falls into the category of “if it’s not one damn thing, it’s certainly going to be another,” as a patent holder is now contending that some features of the very same standard infringe on its rights. The last item reports that Moore’s Law is about to consign a hitherto useful encryption standard to the ash heap of security history. Final score for the month: one step forward, three steps back.

Can-Spam Isn't Doing The Job
By: Gregg Keizer
InformationWeek, August 5, 2004 -- Compliance with the Can-Spam Act has fallen to a new low, according to recent data collected by MX Logic. In July, compliance fell for the first time to less than 1%--dropping to a measly 0.54% of all unsolicited commercial mail the company sampled during the month. MX Logic has been tracking compliance with Can-Spam since the federal law went into effect in January. Through April, MX Logic's numbers remained stable, with about 3% of spam messages complying with the law's requirements, which range from verifiable return addresses to measures consumers and businesses can use to opt out of mailing lists. In May and June, however, the number slipped to 1%. ...Full Story

IETF Prepares To Forward Sender ID
Jim Wagner, August 4, 2004 -- The Internet Engineering Task Force (IETF) is set to nominate Sender ID -- a consolidated e-mail address anti-spoofing technology -- as an Internet standard during its working group meeting Wednesday. Sender ID is the consolidation of Microsoft's Caller ID for E-mail and Meng Weng Wong's Sender Policy Framework (SPF). SPF is essentially a list of computers or servers (every Internet-connected machine has its own IP address) that are verified to send e-mail from a particular IP address. ...Full Story
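
For the curious, the mechanism is easier to see than to describe: a domain owner publishes the addresses of the mail servers entitled to send on its behalf, and a receiving server compares the connecting machine's IP address against that list before trusting the claimed sender domain. The Python sketch below is a deliberately simplified illustration of that check; the record, domain and addresses are invented, and a real implementation would fetch the record from DNS and honor the full SPF syntax (includes, redirects, CIDR ranges and so on).

```python
# A toy illustration of the SPF idea: a domain publishes the addresses
# allowed to send its mail, and the receiver checks the connecting IP.
# The record, domain and addresses below are invented for illustration;
# real SPF records live in DNS TXT records and have a much richer syntax.
PUBLISHED_RECORDS = {
    "example.com": "v=spf1 ip4:192.0.2.10 ip4:192.0.2.11 -all",
}

def spf_permits(domain: str, connecting_ip: str) -> bool:
    """Return True if the (simplified) SPF record lists the connecting IP."""
    record = PUBLISHED_RECORDS.get(domain, "")
    allowed = {
        term.split(":", 1)[1]
        for term in record.split()
        if term.startswith("ip4:")
    }
    return connecting_ip in allowed

print(spf_permits("example.com", "192.0.2.10"))    # True  -> accept the mail
print(spf_permits("example.com", "203.0.113.99"))  # False -> suspect spoofing
```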

Microsoft Faces Lawsuit Over Caller ID for E-Mail
Jim Wagner, August 11, 2004 -- A new patent battle is brewing -- this time over Microsoft's claim over Caller ID for E-Mail. F. Scott Deaver, owner of Failsafe Designs, says Microsoft is guilty of the "outright theft" of his product name and intellectual property (IP), and will seek legal and financial redress from the Redmond, Wash., software giant and anyone else that uses his technology that verifies e-mail is coming from the domain it claims. If that happens, it's sure to put a monkey wrench in the worldwide deployment of the Sender ID specification, which combines Microsoft's Caller ID for E-Mail and the Sender Policy Framework (SPF) to combat one of the avenues used by spammers. The technology used in Caller ID for E-Mail is part of a specification called Sender ID currently under review by an Internet Engineering Task Force (IETF) working group as a proposed Internet standard to eliminate the use of spoofed e-mail addresses found in many of today's spam messages. ...Full Story

NIST says DES encryption 'inadequate'
Paul Roberts
InfoWorld, July 29, 2004 -- The National Institute of Standards and Technology (NIST) is proposing that the Data Encryption Standard (DES), a popular encryption algorithm, lose its certification for use in software products sold to the government. The advent of massively parallel computing has rendered DES inadequate to protect federal government information, NIST said. The institute, part of the U.S. Department of Commerce, is proposing that the government withdraw Federal Information Processing Standard (FIPS) certification for DES, a move that could have ripple effects throughout the technology sector and force a wide range of legacy systems into early retirement, according to one cryptography expert. ...Full Story
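
The arithmetic behind NIST's conclusion is easy to reproduce. DES keys are 56 bits long, so a brute-force attacker has at most 2^56 keys to try, and with enough machines working in parallel that number is no longer daunting. The back-of-the-envelope Python sketch below assumes a purely hypothetical search rate; the only point is that the size of the key space, not any subtlety in the attack, is what dooms the standard.

```python
# Back-of-the-envelope estimate of a brute-force search of the DES key space.
# The 56-bit key size is a fact of the standard; the search rate below is a
# purely hypothetical assumption chosen only to show the order of magnitude.
KEY_SPACE = 2 ** 56                    # number of possible DES keys
KEYS_PER_SECOND = 1_000_000_000 * 100  # assume 100 machines at 10^9 keys/sec each

worst_case_seconds = KEY_SPACE / KEYS_PER_SECOND
print(f"Keys to try:   {KEY_SPACE:,}")
print(f"Worst case:    {worst_case_seconds / 86_400:.1f} days")
print(f"Average case:  {worst_case_seconds / (2 * 86_400):.1f} days")
```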

What if you gave a standard, and nobody came? In our March 2004 issue, we noted the increasing number of “standards” efforts that have been launched outside of formal consortia and SDO settings (see Maintaining Process Quality), and expressed concern over this new dynamic. In contrast to that dour view, the “half full” side of the same glass is noted in a new report issued by the Yankee Group, which credits IBM and Microsoft for their aggressive efforts and manifest commitment to “build out” the Web services standards structure necessary to move products and services into a Web services-enabled world. Absent the degree of control that these two companies (and their technical allies) have enjoyed as a result of launching many of these standards efforts outside of a formal process, their commitment to Web services might have been more ambivalent, leading to the possible delay or failure of the new model to gain traction in the marketplace. Meanwhile, as reported in the second item, the machine-gun fire of “pre-baked” standards continues, this time with the submission of the WS-Addressing Web services specification by the same companies to the W3C for consideration as a standard. And, following in the wake of the surprising Concordat recently entered into between former arch-enemies Microsoft and Sun, the latter (along with SAP) has joined in supporting the new specification, notwithstanding Sun and SAP’s earlier promotion of a rival specification intended to serve the same purpose.

Yankee: Web Services Gaining Momentum
By: Clint Boulton, August 2, 2004 -- To give their sluggish supply chains a boost, companies are increasingly deploying Web services in applications at the edge of their networks, according to new research from The Yankee Group….The research firm credits Microsoft and IBM with helping to drive the adoption of Web services standards. Without those top companies lending their support, businesses would be loath to back the standards. Without the standards, customers would be reticent to try the technologies. Standards are important, but Yankee recognizes that core technologies based on them are also responsible for spreading the influence of Web services and SOAs. To that end, many organizations are realizing the business benefits of defining, recognizing and sharing products and services in a massive global directory, which makes UDDI a hugely popular tool to help drive B2B commerce. Systinet, Novell and SAP are among those that make UDDI products....Full Story

WS-Addressing Specification Submitted to W3C
Joris Evers
InfoWorld, August 11, 2004 -- BEA Systems Inc, IBM Corp, Microsoft Corp, SAP AG, and Sun Microsystems Inc. have submitted the WS-Addressing Web services specification to the World Wide Web Consortium (W3C) for consideration as a standard. Sun and SAP are new in the list of WS-Addressing backers. Sun, along with Oracle Corp., Nokia Corp. and several other companies in April 2004 submitted a competing specification called WS-MessageDelivery to handle Web services addressing. After the peace agreement with Microsoft earlier this year, Sun now endorses WS-Addressing. Sun plans to support WS-Addressing in products that are part of its Java Enterprise System, Julson said. During the W3C process, Sun and the other WS-MessageDelivery creators will offer comments on the WS-Addressing specification to come to a single addressing standard. SAP, for its part, intends to support WS-Addressing in a future version of its NetWeaver product, said Marc Goodner, a technology architect at SAP. "We see this specification as being important to help reduce complexity for our customers. ...Full Story

One world, one script (not): In our July 2004 issue (Standards and Social Responsibility) we reported at length on the challenges of making the opportunities of the Internet available to all of the peoples of the world, and on the fact that the United Nations has taken it upon itself to become involved in the "governance" of the Internet. The following article, from a Malaysian news site, provides an excellent overview of the technical issues involved in accommodating the multiple character sets used around the world.

Finding a way to make the Net truly global
The [Malaysian] Star Online, July 27, 2004 -- THE Internet Corporation for Assigned Names and Numbers' (ICANN's) meetings in Kuala Lumpur last week were the first ever to incorporate discussions on Internationalized Domain Names. Friendly contention on related issues was apparent between parties with different ideas on how to approach the issue. Since its inception about 30 years ago, the Internet has primarily been based on the Latin (or Roman) alphabet, Arabic numerals and punctuation marks, all encoded according to the 8-bit American Standard Code for Information Interchange (ASCII) which serves well in the English-literate world or in countries like Malaysia where the national language is written in the Latin script. ...Full Story
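
The technical trick behind Internationalized Domain Names, for those keeping score at home, is to leave the DNS itself ASCII-only and instead encode non-Latin labels into an ASCII-compatible form (the IETF's IDNA/Punycode scheme, standardized in 2003) that software translates back for display. Python happens to ship a codec for exactly this transformation, so the idea fits in a few lines; the domain name below is a made-up example.

```python
# Internationalized Domain Names keep the DNS ASCII-only by encoding
# non-Latin labels with IDNA/Punycode. Python's built-in "idna" codec
# demonstrates the transformation; the domain name is a made-up example.
unicode_name = "bücher.example"

ascii_name = unicode_name.encode("idna")   # what actually goes into the DNS
print(ascii_name)                          # b'xn--bcher-kva.example'

print(ascii_name.decode("idna"))           # back to 'bücher.example' for display
```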

Just have a pina colada and chill, Big Bro: In a March 20, 2004 Blog entry (Andrew Jackson, Dan Mullen and the Dark Side of the Internet), we reported on the degree of paranoia that certain Americans have exhibited over the advent of RFID tags. Well, they won't be happy to learn that the FDA has moved one step closer to granting approval for the implantation of the same sorts of tags into the good citizens of the United States of America. Whether you believe that this is a good or a bad thing will depend on how you feel about your government. Given that the chips have been used in pets for 15 years, the FDA will focus primarily on privacy issues. But in the "don't worry, be happy (and wireless)" department, the following article reports that the Baja Beach Club in Spain is already using the tiny, easily injected chips to enable free 'n easy drink ordering at the beach bar.

Under-the-Skin ID Chips Move Toward U.S. Hospitals
By: Michael Kanellos
CNET, July 27, 2004 -- VeriChip, the company that makes radio frequency identification (RFID) tags for humans, has moved one step closer to getting its technology into hospitals. The Food and Drug Administration issued a ruling Tuesday that essentially begins a final review process that will determine whether hospitals can use RFID systems from the Palm Beach, Fla.-based company to identify patients and/or permit relevant hospital staff to access medical records…For 15 years, Digital Angel, a sister company under the Applied corporate umbrella, has sold thousands of tags for identifying animals. ...Full Story

Still no armistice: As World War II was to World War I, so also is the next-generation DVD format battle to the combat waged in the earlier VHS/Betamax wars. As was the case with the Second World War, the companies involved in the DVD fracas should know better – many are the same ones involved the last time around. If you go in for such things, you can follow this conflict battle by battle as it continues to unfold. The results of the latest skirmishes are recounted in the following articles, one of which appropriately adopts military terminology to tout the approval of H.264, a new codec that’s “not afraid of anything… H.264 says, `Bring it [on].'”

Toshiba alliance heats up DVD standard battle
By: Michiyo Nakamoto, July 29, 2004 -- Toshiba and its partners on Monday raised the tempo in the battle to determine the next-generation DVD format, saying they are on track to launch next year a DVD recorder capable of storing more than eight hours of high-resolution content on one DVD disc. At the same time, Pony Canyon, Japan's largest distributor of pre-recorded DVD titles, announced it would launch movies in the Toshiba alliance's HD DVD format "at an early stage next year." Microsoft's Japanese unit, meanwhile, said the company's next Windows-based operating system, called Longhorn, would be compatible with HD DVD. ...Full Story

Sony PS3 will use the Blu-Ray disc format, August 4, 2004 --
Sony has revealed that its next-generation games console will use the Blu-Ray disc format, which is an evolution of the current DVD system and is designed to hold five times as much data on a similarly sized disc. A meeting in Tokyo yesterday saw the creation of a working group to draw up final specifications of a read-only Blu-Ray system, including representatives from many of the world's largest electronics companies such as Sony, Dell and Matsushita (Panasonic). Sony officials confirmed that the next generation of the PlayStation home console will be equipped with a Blu-Ray drive, allowing it to play back high definition movies as well as providing more space for game developers on the discs. ...Full Story

DVD Forum Ratifies H.264 Codec for Inclusion in HD-DVD Format
Geoff Daily, August 2, 2004 -- While we're still a year away from seeing any products on the shelves, the DVD Forum has laid another cornerstone for its next-generation blue-laser HD-DVD format by selecting the H.264 Advanced Video Codec (AVC) for encoding video that will be delivered on the new discs. Developed two years ago by the Moving Picture Experts Group--also responsible for DVD's MPEG-2 codec--and the International Telecommunication Union, H.264 "delivers at the same data rate as MPEG-2 high-definition instead of standard," says Frank Casanova, senior director of product marketing for Apple's interactive media group. When H.264 first entered the scene, many MPEG-4 proponents worried that it would compete with the nascent open standard. Instead, it has gone on to become an integral part of MPEG-4. "It's a codec that's not afraid of anything," says Casanova. "Slow or high-motion, fire or smoke, all of the things that scare other codecs, H.264 says, `Bring it.' " ...Full Story

Everything's coming up proprietary: In our May 2004 issue (Standards as Trade Barriers) we reported at length on the erection of standards-based trade barriers by nations in general and China in particular. This month brings news from South Korea on another home-grown Internet standard, and on its commercial consequences. And we have also seen a report (the second item below) that China, Wal-Mart's largest supplier country, is forming a new RFID standards group with the goal of avoiding foreign patent royalty payments otherwise payable if it implements existing RFID standards. However, this time China contends that its own standard will be compatible with international standards. In the last item (from the People's Daily Online), a senior Chinese government official has denied that China is working on an “Asia version” of Linux. But the story indicates that the real situation may be less black and white.

Seize the Portable Internet; Competition is Heating up
Telecoms Korea, July 26, 2004 --
The whole Korean IT industry is closely watching the move of the Ministry of Information and Communication, waiting for the MIC's announcement on the timetable for the appointment of the high-speed portable internet service ('Wi-bro') operator and the tech standard policy slated for later this month. As Wi-bro, or Wireless Broadband, Korea's home-grown mobile Internet standard, is expected to attract 9 million subscribers in 5 years, creating a market worth over 2.6 billion dollars, telecom companies are desperate to seize this once-in-a-lifetime opportunity. There are four parties competing: KT and KTF on one side; Dacom, Powercomm and LG Telecom under the LG Consortium; and SK Telecom and Hanaro Telecom each on their own. ...Full Story [registration required]

China to develop its own RFID standard
By: Jim Carbone, July 23, 2004 -- China is developing its own standard for radio frequency identification (RFID) applications, according to market researcher iSuppli. The Chinese National Standards Management Committee has established a working group to develop the new standard. The purpose for China's own standard is to avoid paying royalty fees to non-Chinese owners of the main RFID specification. Chinese design houses and chip manufacturers will benefit if the Chinese standard is used. The Chinese RFID standard will be compatible with the international standard, but will have its own intellectual property. The major differences are likely to concern radio frequency and other technical factors. ...Full Story

China denies 'Asian standard' for Linux
People's Daily Online, August 10, 2004 -- China has never intended to develop an isolated Asian standard for Linux, which some Japanese industry executives fear, said a senior Chinese Government official last week. "We have never promoted a so-called 'Asian standard' in Linux development," Ding Wenwu, director of the Ministry of Information Industry (MII)'s electronics and information products management department, said last week. The Chinese, Japanese and South Korean governments last September formed a strategic alliance to promote Linux development and applications in Asia. ...Full Story

IT users rule UK! Regular readers of the CSB know that we have reported at length on the effectiveness of major customers like Wal-Mart and the United States Department of Defense in pulling new RFID technology into the marketplace. Until these large customers announced that they would expect their major suppliers to get on board, the new standards-based technology had been expected to languish for many years. Now the customer power movement seems to have spread to Great Britain, where the Jericho Forum, a UK-based consortium of large IT customers, is issuing a report telling IT vendors what its members want – now!

Businesses to turn up the heat on suppliers
By: Daniel Thomas, August 11, 2004 -- Two weeks from today, the Jericho Forum, a group of 40 multinational companies, will start work on a roadmap detailing how the industry can best develop open, standard IT security systems. The intended outcome will be a report outlining the future needs of chief security officers at some of the world's top IT users, which will be used to influence the development of security products and standards. With members including BP, ICI, HSBC and Royal Mail, and a collective IT spend running into billions of pounds, will the Jericho Forum be the first significant manifestation of IT user buying power? ...Full Story


ET, call home: One could reasonably argue that language is the oldest human IT standard: after all, the purpose of speech is to pass information, requests and responses between two “users”, and agreement on a common language is necessary to achieve that end. Similarly, the existence of multiple languages results in a failure of interoperability between users, the creation of information “islands”, and translation issues. The following article describes a successful effort, now more than 20 years old, to ensure that ICT standards exist that allow communications to occur effectively, not only in this world, but out of it as well, among all space-faring peoples.

CCSDS: International Space Communications Standards Organization Surpasses 300th Mission
CCSDS Secretariat, WASHINGTON, August 11 -- Today the Consultative Committee for Space Data Systems (CCSDS) announced that more than 300 missions have elected to use the committee's internationally developed standards and protocols to enable reliable communications in space. And the number continues to rise. Supported by NASA, the CCSDS is dedicated to furthering interoperability in the international space community through the development of standardized techniques for handling space data. Founded in 1982 by the world's 10 major space agencies, the CCSDS was originally created as a forum to discuss common problems occurring in space communications. Since then, the organization has grown into an international working collaborative that includes the ten original member agencies, 22 observer agencies and over 100 industry associates worldwide. Industrial associates develop compatible products to meet the requirements of CCSDS-enabled missions and ground-support complexes….Full Story

Bringing home the standards bacon: One difference between consortia and accredited standards development organizations (SDOs) is that many of the latter have been around longer, have larger staffs, and frequently take a more holistic view of what they can do in support of an industry than does the typical consortium. That’s because many SDOs started out as trade associations rather than standard setting bodies, and think of standard setting as being only one of their goals in life. One result is that while consortia are rarely politically active, SDOs often are. And, like any other interest group, they like to carve off a slice of bacon when they can, as reported in the following press release.

IPC Secures Congressional Funding for Embedded Passives R&D
IPC Press Release, Northbrook, IL, August 11, 2004 -- Association Connecting Electronics Industries announces its success in lobbying for the U.S. printed circuit board (PCB) industry, resulting in over one million dollars in research funding. Prior to adjourning for summer recess, the House and Senate finalized the fiscal year 2005 Department of Defense Appropriations Act, H.R. 4613, that includes $1.5 million for the Emerging/Critical Interconnection Technology (E/CIT) program's embedded passives research and development test bed project. The bill has already been signed by President Bush, and the funding will become available at the start of the federal fiscal year in October. This is the third year in a row that IPC has successfully lobbied Congress to fund E/CIT program R&D projects for the PCB industry. The E/CIT program is an industry-government partnership established in 2002 to strengthen the abilities of both the Department of Defense and the U.S. PCB industry to support the military's unique PCB requirements through an integrated program of research, education, and industrial extension. ...Full Story


The medium is sometimes not the message: History is replete with exalted physical bases for standards: the monetary Gold Standard that existed into the second half of the twentieth century, the carefully preserved platinum-iridium bars that represented the recognized length of the meter, and so on. To this august assemblage of official reference points, NIST has now added…five bottles of frozen, homogenized trout. Enjoy!

Something’s Fishy About New NIST Food Standard
NIST Tech Beat, July 30, 2004 --
Accurately measuring exactly what's in the food we eat, before we eat it, is a surprisingly difficult job. The latest effort by NIST to make the process both easier and more accurate is Standard Reference Material (SRM) 1946, which is a set of five bottles of frozen, homogenized trout from Lake Superior. With carefully measured values for about 100 chemical constituents, the SRM will help food industry and environmental researchers assure that measurements of both healthful ingredients and contaminants in fish and similar foods are accurate. Laboratories can validate their analytical methods and instrument performance by using them to analyze the SRM and comparing their results to the NIST values. ...Full Story