Vol III, No. 10
Standard Setting and Diplomacy
ABOUT THIS ISSUE

WHY DOES STANDARD SETTING WORK?
As Americans rush towards a closely contested election, it's worth asking why a voluntary consensus process with no enforcement power is so successful at reconciling opposing interests, and at gaining global support for its output.

STANDARD SETTING AND DIPLOMACY
The United Nations has a multi-billion dollar budget and too often fails to create consensus around the most vital issues of the day, while the global standard setting infrastructure operates on a shoestring, and maintains hundreds of thousands of widely adopted standards. Perhaps there's something that the diplomats can learn from the engineers.

THE TECHNOLOGY STANDARDS BOOKSTORE
For the past two years, ConsortiumInfo.org and the Consortium Standards Bulletin have brought a wealth of information, news, ideas and analysis to visitors from around the world – all for free. Help us continue to expand our services on the same basis by patronizing our new bookstore.

SOY SAUCE, KIMCHI AND THE GOLDEN RULE
Not long ago, Japan launched an effort to create an international food standard based on traditionally brewed soy sauce. Great was the hue and cry in Japan when American industry tried to qualify a cheap substitute under the same standard. Almost as great as it was in Korea, when Japan tried to hijack the standard for traditional Korean kimchi.

THIS MONTH'S TOP STORIES
Pundits Wonder if "Enough is Enough" with Web Services Standards; Novell Pledges to Use Patents to Protect Open Source; IBM Moves to Support Open Source and Open Standards; French Firms to Create Secure Linux; IEEE Continues Wireless Standards Blitz; Patents Complicate Wi-Fi Landscape; AIM Sees EPC Global in an "IP Mess"; Twentieth Century Fox Gives Two Thumbs Up to Blu-Ray; W3C Throws Itself a Birthday Party; IT Health Standards Proliferate; Coming Soon: a Black Box for Your Car; much more
ABOUT THIS ISSUE
This month we examine the success of international, consensus-based standard setting in order to discover what lessons it may hold for diplomacy and international relations.
In our Editorial, we reflect upon the fact that the process of standard setting embodies one of the first truly international, democratic, voluntary - and successful - processes that the world has ever embraced. As such, it may well offer a model for internationalization in other fields as well. Indeed, it provides a real-life example of how such collaboration can develop and work.
In our Feature Article, we examine some of the specific aspects of standard setting that lead it to be effective. We then contrast these features to existing national and international structures to see how they compare, and what these other systems might profitably borrow from standard setting practices.
Our Standards Blog entry brings us back to earth, as we reflect on the fact that there are limits to goodwill and good behavior in standard setting – especially when the recipe of your national food is at issue.
We also take pleasure this month in announcing a new feature of the ConsortiumInfo.org site: a technology bookstore that we hope you will patronize in order to help us continue to bring you this journal of news, ideas and analysis free of charge.
As always, we hope you enjoy this issue.
Editor and Publisher
WHY DOES STANDARD SETTING WORK?
As this editorial is being written, Americans are hurtling towards an election that contrasts two dramatically different views of the role that the United States should be entitled to play in the modern world.
On the one hand, Neo Conservatives in the Bush administration believe that power is reality, and that it is the prerogative of those with power to reshape the world to match their vision of what the world should be.
Senator Kerry, on the other hand, claims that a better world would be one in which the United States leads, but does so by building consensus among nations.
Leaving aside the important issues of which approach is “right” in a philosophical, moral or legal sense, one might ask which approach is more likely to produce durable and desirable results? With the rise of global terrorism and the proliferation of weapons of mass destruction, the answer to this question is more vital today than ever.
But where does one turn to answer such a question? Traditionally, the history of governments and nations has most often been consulted for guidance in such matters, and certainly it would be foolish not to continue to look to the past in this fashion. But the history of formal international relations is not the only reference point available to determine how peoples can interact most productively on a global scale.
We believe that the modern process of consensus-based, international standard setting can also shed valuable light on this important question, since it subsumes many of the important attributes of international relations. It is, after all, a global process; it involves fiercely competitive participants; it demands compromises that require the cession of some rights in order to enjoy the benefits of common agreement; it advances the public good; it can protect the many from the monopoly of the few; and, most important of all, it has been demonstrably successful over a diverse range of issues on a global scale, all without the use of force.
It is not usual to look at standard setting in this light. Most often, this important process is viewed situationally, by analysts and participants that seek to determine if this or that organization is effective, and whether one standard or another is likely to be widely adopted. Those that take a systemic approach typically address the topic from an economic or historical perspective, in each case limiting their focus to the importance of standards to commerce and innovation.
While these perspectives are important and necessary, they fail to appreciate what may be the most interesting aspect of standard setting: that it is based upon a common belief that there is more to gain from voluntary cooperation than from going it alone. Viewed dispassionately, the success of the process is all the more remarkable because those that engage in it are rarely interested in the common good, but rather in achieving as much commercial success for themselves as possible. In other words, it is a process that benefits all as a result of the effective pursuit of self-interest.
Needless to say, there are too few examples of such systems in the world today. Can a careful examination of standard setting help us learn how to reshape other processes in the pursuit of different international objectives?
We think the answer to this question is “yes.” We hold that opinion for two reasons.
The first is that history is rife with high-minded movements that have failed. Utopian visions have never, to date, been successful in achieving stability and longevity. Similarly, those international alignments that have been based on the projection of force have persisted only for so long as that force has remained credible. Where these two approaches have come into direct contact, as in the United Nations, the results have all too often been based upon success in forming coalitions rather than upon a shared commitment to further the common good, or upon preventing action from being taken at all rather than inspiring it.
For better or worse, it is extremely difficult to modify behavior, whether at the level of the individual or the state. Those social and political movements that have sought to tame innate human instincts have either required the power of the state to enforce them, or have eventually degraded back into an acceptance of pre-movement behavior. Simply put, systems that are based upon how people naturally act are more durable and stable than those that require people to operate in ways that require unfamiliar forms of sacrifice or conscious conformance to non-intuitive norms.
Another way of saying this is that fighting the gravity of natural human behavior is not only exhausting, but on the international level impossible, without the threatened or actual use of economic or military force to compel compliance.
The second reason that we find the standard setting example to be meaningful is that the concept of standards is so intrinsic to the human condition that it has become a part of our beings. From the earliest days of civilization, there has been an acceptance that common norms of understanding (speech), exchange (bartering, based on a common understanding of equivalents) and behavior (taboos) are both necessary and beneficial. These early standards arose before there was a state that could require agreement on such abstractions. With the rise of states, legal, measurement, and monetary systems were accepted domestically, and then internationally, due to the realization that more could be gained than lost through consensus.
When these two observations are combined, modern, global standard setting may be viewed in a new light. Even when all of the game playing and trade barriers are acknowledged and accounted for, we are still left with a rather remarkable reality: the participation of the peoples of virtually all of the nations in the world in a voluntary, consensus-based process that leads to the near-universal use of a myriad of standards.
How much can be learned from studying this phenomenon? We think that the answer may be "a great deal". The evidence is both broad and deep, as non-regulatory standards cover every conceivable area of commerce, and those that help create them come from an ever-increasing number of nations.
Why is this process so successful? One key may be that all that participate in the creation of a given standard have agreed upon the goal to be achieved.
It is true that some of the motivation for such participation may arise from the knowledge that a failure to join in may work to corporate or national disadvantage. But so long as access to the process is guaranteed and the process is legitimate, the results are likely to be respected, and there is an incentive for most to support the process and not to subvert it. And while the system is far from perfect, it continues to spread and to be reinforced through ever-broader voluntary participation.
Another key to this success is that the standard setting system does not try to modify human behavior, but to harness it. Rather than seek to fight the gravity of natural human behavior, standard setting organizations use that force to make the process more secure. Certainly this is the type of model that other international processes could well emulate. After all, the global standards infrastructure is equivalent to a United Nations with a more limited focus. Indeed, the United Nations itself has chartered standard setting organizations such as UN/CEFACT to help it achieve its goals.
We believe that a careful study of why standard setting works so well might hold valuable lessons for international relations and diplomacy. After all, the urge to band together for security is certainly as innate as the willingness to collaborate on standards. Perhaps such an examination could help nations come together to create the type of beneficial, lasting and internationally supported policies that seem so elusive today.
Copyright 2004 Andrew Updegrove
STANDARD SETTING AND DIPLOMACY
Introduction: Private sector standard setting is a recent innovation. Historically, the development of rules has been a governmental prerogative, and adherence to such rules was enforced through the power of the state. Not only was compliance compulsory, but punishment for a failure to conform could be severe. True, religions also fostered rules of behavior, and voluntary societies (such as trade guilds) created strictures binding upon their members. But those that entered into such communities had little or no voice in creating these rules, or freedom of choice in deciding whether to conform to them.
In the late nineteenth century, a new type of rule making came into being, inspired by the emergence of industrial society. The precursor to this process was the creation of interchangeable parts, which had the potential to allow the goods of unrelated vendors to be used by a single customer. The means of achieving this end was not found through the intervention of the state, but through the voluntary, organic efforts of those with an interest in the outcome. This technique – standard setting – rapidly became pervasive, spreading both across industries and geographically.
Not only was this process novel, but it was also in many respects contrary to strong traditions. Previously, competitors jealously guarded their trade secrets, and often sought differentiation through distinctive differences in their wares. Now, competitors voluntarily associated with each other, agreed upon common (although sharply defined and limited) goals, often shared the valuable intellectual property necessary to achieve these goals, agreed upon final standards through a process of collaboration and compromise, and then implemented these standards of their own free will. Notwithstanding the concurrent evolution of antitrust laws, governments came to condone, and even encourage, this practice through the creation of permissive exceptions within those same laws.
How this novel process came into existence, and why participation in standard setting has become so pervasive in such a short period of time, bear examination for the lessons that they may offer to society and diplomacy.
Rule making and society: Throughout most of human pre-history, anthropologists believe that governance was consensual. The basic societal unit was (and still is, in those few hunting and gathering societies that still survive) the band, comprising a small number of extended families and at most a few score individuals. The survival of such a group depended on the cooperation of all of its members, and in the mutual benefit of that cooperation lay the difference between life and death.
In many societies, this same consensual relationship was maintained through the next step of social evolution, with the rise of chiefdoms. Only with the advent of agriculture and the formation of larger polities did the concept of kingship emerge, and with that concept, the cession of control over many aspects of one’s personal existence to government.
With the rise of democracies, the pendulum began to swing back, and the importance of personal freedoms gained a higher priority. But within these democracies, the power to make rules was still delegated to others (elected representatives and appointed agency staff). True, sophisticated controls such as an independent judiciary provided safeguards against abuses, but the individual was still bound by law to conform to an increasing range of rules created by others.
The concept of voluntary rule making (both in the sense of creation as well as compliance) is thus both revolutionary and traditional. Revolutionary, because the creation of standards has been the province of the formal governments that have maintained systems of weights, measures, coinage and laws for the last several thousand years; traditional, because the practice of consensus-based rule making preexisted modern society by countless millennia.
The first standards: The re-birth of non-governmental standard setting was both organic and logical. It began when a gunsmith realized that assigning one person to expertly make multiple copies of a single part could enable the creation of more weapons in less time than could the same number of gunsmiths, each making every part of a gun. This practice also permitted the creation of spare parts, which in turn allowed armies for the first time to repair weaponry in the field. Employing this new technique, however, required that each replacement part be fabricated to exacting tolerances.
Once the concept of interchangeable parts became accepted, it was only a matter of time before manufacturers came to realize that the purchase, rather than the fabrication, of component parts might be desirable. Commodity parts of various types had been in existence for some time, but interlocking commodity parts from different vendors were not. For example, a shipbuilder might purchase a capstan from another artisan, but not the rack in which the capstan bars would be stored. That rack would still be fabricated by the shipwright to the size and shape of the bars that were delivered. Other parts, such as spars and ironwork might also be made by other tradesmen, but these goods would either be custom work, or might need to be resized by the shipwright as they were incorporated into the fabric of the ship.
With the increasing complexity of locomotives, looms, pumps and other types of machinery in the maturing industrial age, the final breakthrough came in the form of a pair of humble items: the nut and bolt, thousands of which might be needed for a single project. Concurrent with the need for such items was the development of the machinery required to fabricate these parts. No longer did a blacksmith make simple spikes and nails one at a time to visual measurements. Instead, machinery could cut and mill more complex fasteners – and could make each product to the same specification and tolerances. Why not have everyone use the same thread count and bolt diameter when they created fasteners, so that nuts, bolts, taps and dies could all be compatible?
Once this type of reasonable uniformity became feasible, then standards could truly come into their own, and manufacturers could seek multiple sources of supply at more competitive prices. For their part, suppliers could bid on more business, and create products at lower cost due to the ability to make far larger runs of a single product.
A new paradigm: But who would set such standards? Government was already beginning to create standards of its own choosing, in the form of regulations that addressed issues such as sanitation, safety and transportation. Those areas fell within traditional boundaries of governmental action, and also involved situations where compliance without sanctions might be unlikely.
Uniform specifications for bolts, on the other hand, were not high on the governmental agenda, nor was the governmental infrastructure sufficient to create the volume of standards that would be needed as industry ever more rapidly evolved. The result was that the commercial sector was faced with a situation where there was a clear need for a solution, and no one to look to but itself.
At the same time, the sale of some new products was being inhibited by safety concerns. Boilers, for example, were exploding at such a rate that design criteria were clearly needed in order to permit manufacturers to create products that could be used without undue risk. But in the late 1800s, government was not yet interested in regulating product safety.
Such needs brought recognition to industry that interoperability and safety requirements existed that could only be achieved through joint action. Thus, out of need came the realization that new opportunities, efficiencies and, indeed, entire markets could only be attained through collaboration with one’s competitors.
The result was the evolution of the modern standard setting process. Initially, standards were set on a national basis, and often asserted defensively to erect trade barriers rather than to facilitate international commerce. But eventually the need for global standards became evident, and the same methodology was soon deployed globally through the creation of national standards bodies participating in international organizations such as ISO. With the increasing globalization of trade and the advent of new boundary-indifferent technologies such as telecommunications, the number and importance of the standards recognized by such international organizations inevitably increased.
Today, organizations such as ISO are more effective and respected than some agencies and programs of the United Nations, despite the fact that there is no central mechanism to enforce their standards, and participation in their programs is wholly voluntary. Indeed, members of standard setting organizations are not even required to implement the standards that they help to create.
Why does standard setting work? The rapid emergence and success of the modern standards infrastructure is not much short of miraculous, for all of its imperfections. As communications, information technology, defense, and other heavily standards-dependent areas have become increasingly essential to modern society, this same infrastructure has begun to assume a quasi-governmental importance.
For example, by enacting the National Technology Transfer and Advancement Act of 1995, the United States Congress instructed all federal government agencies (including the Department of Defense and the Department of Energy) to use voluntary consensus standards created by the private sector in preference to "government unique" standards whenever possible. Similarly, while traditional utilities such as electricity and water remain subject to government regulations, the Internet and the Web are maintained by independent, non-profit consortia, notwithstanding the fact that they are swiftly becoming the lifeline of communication, government, finance, and just about everything else. And again, while radio frequencies remain under the control of national governments, all of the standards that are enabling the explosive growth of new wireless devices and services are maintained by accredited and non-accredited standard setting organizations (SSOs).
In short, more and more of the power to control the rules that enable vital societal functions is being assumed by private sector SSOs.
At the same time, the effectiveness of the standard setting process is high, and complaints of inequities and abuses are surprisingly infrequent. What is it about this methodology that allows a voluntary process with no method of enforcement to be so successful? And how can so effective a system have arisen, given that the evolution of this process has been so ad hoc?
The success of this nouveau methodology of standard setting is doubly impressive, when it is recalled that after thousands of years of experimentation, and endless philosophical examination of formal governmental systems, the vast majority of the peoples of the world still live in countries ruled by governments that are at best unresponsive to the will of their peoples, and at worst outright abusive of human rights. Similarly, the evolution of a fair and effective system of international relations is still seemingly in its early stages. What, then, are the differences between governmental and SSO processes that lead to such divergent results?
A different path: The following characteristics, among others, lie at the core of this dichotomy. In each case, the contrast between the reality within SSOs and that within national and international governmental processes is pronounced:
Lack of alternatives: Without SSOs, there would be no way to create standards unless governments were persuaded to take up the task. Given that most industries prefer to be self-regulating whenever they can, this leaves SSOs as the only palatable alternative. While the basic process of standard setting continues to evolve (e.g., with the rise of non-accredited consortia, and now Open Source projects), the central concepts underlying standard setting remain unchanged. Given that the need for standards is undeniable, those that depend upon standards to create their market opportunities have no choice but to support that process. In short, necessity drives behavior, because self-interest is best served by being part of the system.
In the case of governmental systems, however, there are multiple methodologies to choose from, each with its passionate adherents. These methodologies not only have major philosophical differences (e.g., communism, socialism and democracy), but there are variations within each system (e.g., some democracies opt for parliaments and prime ministers, while others have directly elected presidents). While most first world countries have enjoyed consistency in their governmental systems since World War II, many third world countries have seen only turmoil and upheaval in the same time period, particularly while European countries gave up their colonies, and the Cold War played out through the proxies of East and West. Hence, many nations have suffered from the fact that there are too many alternatives, and an inability to gain the commitment of all to any single choice.
On the international stage, the situation is somewhat different, in that the United Nations is the single globally recognized governance body. But regional alliances offer an alternative for some purposes (witness the rise of the European Union, and the peacekeeping action in the Balkans under the auspices of NATO rather than the U.N.). Similarly, the projection of power by individual countries permits those nations to achieve unilateral, or alliance supported, goals that weaker countries could only secure through a world body. Absent common agreement that a single body (e.g., the United Nations) can be the only authorized entity to act in certain fashions, these alternatives provide viable opportunities to pursue nationally, regionally, or politically unique goals. Hence, until the United Nations provides a more universally attractive venue, some nations will be more inclined to pursue alternatives whenever they appear to be more advantageous.
More to gain than to lose: Participants in SSOs have concluded that they will reap greater rewards by giving up certain choices, and even valuable rights to earn a return on their patents, than by going it alone. This is because the targeted work product that has been agreed upon is not only necessary, but will be available to all on comparable and reasonable terms. As a result, all have the same level playing field. While there may be winners and losers in the sense that the proponents of one alternative solution may succeed while those that support another may fail, the risk for any individual participant is bounded by the fact that all may make use of the finally approved solution.
But in the case of governments, the system is too often played to create binary results that ensure that one side will win, while the other will lose. Often, the benefits of specific pieces of legislation or international action will only be enjoyed by a minority. Even where all may well benefit, those benefits are often hard to prove, and therefore may not be appreciated by those that are philosophically opposed to the method employed to achieve a specific end. Worse yet, the riders and compromises added to many pieces of legislation in order to garner a majority of votes for passage often lead to expensive “pork” provisions that work to the benefit of only small, but politically significant, interest groups.
The lessons to be learned under this category are necessarily more obscure. Certainly there is no easy corollary to turning every government goal into the equivalent of a standard. Internationally, perhaps closer ties between economic opportunity and the exercise of political influence might more closely align desired results with incentives to cooperate. Similarly, perhaps there is a way to reset processes to provide greater rewards from cooperation than contention, and to engage in a deconstructive process intended to eliminate as much opportunity for partisanship and ideology as possible.
For example, in a domestic setting, requiring a bipartisan legislative committee to first agree on what priorities should be addressed before voting begins on how those priorities should be achieved would help warring political factions focus on the issues themselves. Once agreement was reached on such priorities (e.g., to create jobs), then the next step could be to determine what percentage of a balanced budget should be dedicated to that task, and so on. By the time it became appropriate to agree on the actual implementation steps to achieve the identified goals, the opportunities to manifest competing political philosophies would have been dramatically reduced.
Is such a proposal politically feasible? Probably not. But it does emphasize the fact that without alteration, the current system provides more incentives to work towards partisan solutions than towards common goals in the most efficient and effective fashion possible.
Self-correcting: A standard that is not expected to be useful is simply not adopted. As a result, those that create a new SSO or propose a new standards initiative within an existing organization must temper their desire to exercise too much influence. Otherwise, the resulting standard may not be implemented by one's competitors or by others whose cooperation is necessary (the recent failure of Microsoft's Sender ID specification, due to Microsoft-required license terms deemed objectionable by the Open Source community, is an example of such a result).
In contrast, under most political systems the reelection of an individual representative has too little to do with the effectiveness of the legislation actually supported by the same representative. The current system instead rewards a representative for reflecting the political beliefs of the majority of her constituents, regardless of whether the voting record of that representative actually produced desirable results. In a better system, we would not only have interest groups that tracked whether a given representative voted in favor of left leaning or right leaning legislation, but whether she voted for legislation that proved to be effective, regardless of how it was viewed politically.
On the international stage, lack of assurance in the commitment of nations to support collective decisions ultimately undermines the ability of any such decision to be effective. When a participant has no confidence that others will truly commit to support collective action, then the safest course of action is to hedge one’s bets as well.
Confidence in the process: Because anyone can opt out of the standard setting process, that process must inspire confidence in those that choose to participate. Since those with the greatest commercial power still need the buy-in of those that have less, there is a powerful incentive for the strong not to overpower the weak, and therefore to agree to a process that will seem likely to ensure fairness in results.
The contrast in this regard is perhaps most dramatic internationally. The United Nations would doubtless be more effective if the Security Council did not exist, since a majority of the nations in the world would then need to support a decision before it could be implemented, and no proposal would need to be tailored to the goals of any single nation in order to avoid a veto. All nations (great as well as small) would therefore have an incentive to bring before the organization only those proposals that were deemed to be beneficial (or at least not harmful) to all of the world's peoples, or there would be no point in proposing the action at all.
While the current Security Council system ensures that the largest nations will participate (at least nominally), it also means that many of the most important initiatives that are proposed will either be watered down, or will be proposed only for the purpose of highlighting the veto of a given Security Council member. Those resolutions that do pass successfully through the Security Council may thus be so tainted with proprietary intent that they are deserving of little respect, and there may therefore be little incentive for others to support these same actions.
Proven success: There are few incentives to invest resources, or to make important strategic commitments, in outcomes that are uncertain. One of the principal reasons that the standard setting process is so widely employed is that it has a track record of proven success.
In contrast, the record of United Nations initiatives in matters that would restrict the rights and actions of sovereign powers (as compared to humanitarian projects) is mixed at best. On the other hand, where nations seek to further their agendas outside of the U.N., success may be problematic where broad support is still required, and far more resources must be provided by the proponents to achieve success than would otherwise be the case.
Conclusions: Certainly the above line of thinking can only be taken so far. Standard setting is only comparable to other international situations up to a point, and the SSO process has its own weaknesses and failings. It is true, for example, that there are often too many SSOs trying to solve the same problem, thus failing the “no alternatives” test. Similarly, there have been times in some organizations that more participants seem to be seeking to game the system than to observe the rules.
But it is also true that there is something fundamentally effective about the system that sets standards that makes the constituent pieces of the standard setting infrastructure want to fall together rather than to fall apart. At root, the gravity that brings about this result is the fact that everyone involved has concluded that they have more to gain than to lose by participating, even when being part of the process requires ceding freedoms and (sometimes) even sacrificing valuable intellectual property rights.
What sorts of lessons, then, may the success of standard setting have for nations domestically, as well as for the United Nations internationally? Perhaps the strongest lesson may be that a system that does not have its incentives aligned with human behavior can never be truly effective and fair. Given the right design, good results will likely follow. But with a flawed design, effective results can only be achieved by extraordinary effort, and consistently favorable action will be difficult or impossible to achieve.
What is the strongest foundation for such a design? The motivation of action through enlightened self-interest is perhaps the most useful political force in the world. Creating international policies that can lead (for example) to increased security for every nation should certainly make it possible to harness this same force if all are willing to come to the table in the same spirit. The result could be a world that truly wants to work together, rather than to perpetually strive at cross-purposes and run the risk of falling apart.
But perhaps the greatest and most heartening lesson to be learned from standard setting is by way of example. In other words, an international system actually exists within which the most powerful corporations and nations have given up some of their rights, and even their valuable property at times, because they are convinced that the will of the majority will serve, rather than threaten, the interests of the individual participant. Certainly there is reassurance to be taken from the fact that such a system can be, and indeed has been, successful.
Copyright 2004 Andrew Updegrove
BIFF’S TECHNOLOGY BOOKSTORE
ConsortiumInfo.org was launched in September of 2002 with the ambitious goal of becoming “the most comprehensive and detailed resource on (or off) the Internet on the topic of standards and standard setting.” Since that time, we have added a variety of features, including:
- The Essential Guide to Consortia and Standards, which provides book-length content and primary resources on a variety of topics intended to educate anyone on these subjects.
- The Consortium and Standards List, which now abstracts and links to over 400 standard setting organizations of all types, as well as their intellectual property policies and adopted standards.
- The Standards News portal, updated daily and complete with an RSS feed, which links to the latest developments in the areas of standards and Open Source.
- The Standards Blog, which provides a unique look at the role of standards in society.
- The Consortium Standards Bulletin, now in its twenty-first issue.
While this site enjoys the generous sponsorship of the law firm of Gesmer Updegrove LLP, that sponsorship covers less than half of the total cost of creating and maintaining ConsortiumInfo.org and the Consortium Standards Bulletin.
At the same time, we have ambitious plans to continue to expand the site. For example, using a generous seed grant from Sun Microsystems, we are currently constructing a “Standards and Standard Setting Metalibrary.” This new section of the site is intended to abstract and link to all of the literature that is currently available on the Web on standards and standard setting, and will be fully categorized and searchable. When completed, it will provide an invaluable resource for academic and government policy research, as well as on-the-job training for those who are active in standard setting. Look for the Metalibrary to go live in early 2005.
From the beginning, we have been committed to bringing all of this to the public free of charge. We hope to continue to do so for as long as possible, and here is what you can do to help.
Today we are launching a new resource, with the assistance of a new mascot. That new resource is called Biff’s Technology Bookstore. What is Biff’s all about? Biff’s is about being able to find all of the technology books that would be of interest to an information technology professional (or amateur) at the lowest prices, and in the easiest way possible. 100% of the profits from the bookstore will be used to support the maintenance and expansion of ConsortiumInfo.org, and to continue to distribute the Consortium Standards Bulletin free of charge.
Here’s what you’ll find when you Buy Your Books at Biff’s:
- Every computer technology book from the most popular IT publishers. At launch, we are offering over 2,000 titles from John Wiley and Addison Wesley; as we expand the site, we will make the offerings of all other major IT publishers available as well.
- A unique indexing system, allowing you to search by technical area (e.g., Internet and Web, Wireless, and Graphics), by standard (e.g., XML, 802.11, and OpenGL), by key word, or by ISBN, title or author.
- Explanatory definitions of every standard and technology on each subcategory page.
- The deepest discounts that we are able to make available (many affiliate programs allow the affiliate to decide how much of the reseller’s discount to keep, and how much to pass along to customers).
- And, of course, Biff.
So there you have it. Help keep ConsortiumInfo.org and the Consortium Standards Bulletin free and on the air. If you need a book, buy it from Biff. At holiday time, remember those you love with a book from Biff’s. And while you’re thinking about it, tell your friends where to go, too.
Copyright 2004 Andrew Updegrove
FROM THE STANDARDS BLOG:
|“It's national pride. They want their country standard to be the international standard.” Ellen Matten, international issues analyst for the U.S. Codex Office, on Japan's reaction to a soy sauce dispute with the U.S., which wants all soy sauces created the same way. (September 29, 2004, Washington Post)
#21 Soy Sauce, Kimchi, and the Golden Rule
Those of us who live in the ICT standards space tend to forget that there are parallel universes of standards almost too numerous to mention. The inhabitants of these alternative realities concern themselves with specifications relating to construction, safety, fuel, and just about anything else you can (or can’t) think of.
But while the standards of these other standards spheres may vary widely, the behavior of those that set them remains comfortably, if sometimes regrettably, familiar.
All of which brings us to the vital subject of the latest and most contentious battle to roil the global standards marketplace: I refer, of course, to the donnybrook over that most piquant and ubiquitous of all Japanese condiments: soy sauce.
Yes, Virginia, there is (or at least shortly will be) an international standard for soy sauce. But will it describe the real deal – the traditionally brewed and fermented condiment beloved of the Japanese? Or will it specify some mass-produced, flavor-enhanced, artificially colored American concoction instead? Finally, will U.S.-Japanese relations take an irreparable hit in consequence, or will diplomacy once more succeed in reconciling East and West?
If all this sounds a bit overwrought, consider this: ICT standards invoke high emotions based entirely on economic concerns, but rarely involve national pride. The composition of traditional food, on the other hand, combines both cultural identity as well as profits – a recipe for a highly combustible mixture.
In fact, food standards are neither rare nor economically insignificant. Champagne, for example, may only be so labeled if it comes from the Champagne district of France (a requirement best remembered, mon ami, lest you provoke litigation). Until non-French vintages produced using the méthode champenoise attained a following of their own, the right to apply the Champagne appellation to a bottle of sparkling white wine conveyed the ability to command a premium price.
In the case of soy sauce, however, it is expediency rather than geography that is driving the dispute. Traditional soy sauce production is a months-long affair, involving wheat, soybeans, a special mold and a three-step process: koji-making (a blending step), brine fermentation (which converts the koji to a mash called moromi over a period of several months), and refining (through filtration and pasteurization). According to a Website maintained by Kikkoman, the largest seller of traditional soy sauce in the United States, it is the second step in this process “that creates the many distinct flavor and fragrance compounds that build the soy sauce flavor profile.”
But there is also an upstart formulation that is sold in greatest quantity under the ConAgra “LaChoy” brand. This faux soy (as the Japanese regard it) is produced by a brute-force industrial process: soybeans are boiled with hydrochloric acid for 15 to 20 hours, after which the mixture is rapidly cooled, neutralized, filtered, colored, sweetened, salted and refined. In the view of Kikkoman, the resulting product is “harsh and one-dimensional.”
Not surprisingly, it is not only much faster, but also cheaper to utilize the non-traditional technology to create the condiment. In the view of the Japanese, only a product produced by the centuries-old fermentation methodology should be entitled to be called “soy sauce.” Any effort to label the brown substance created through the ConAgra process as “soy sauce” should be banned as false advertising, and a debasement of a cultural epicurean icon.
Still, how does one go about defending one’s culinary heritage? Only an international standard, created by a global, treaty-backed organization will suffice.
So it is that in 1998, the Japanese food industry approached the imposingly named Codex Alimentarius Commission (CAC), and asked that august body to create a standard that would be based upon the recipe and production method used to create traditional Japanese soy sauce. In other words, the standard would be not only a design standard (specifying the ingredients that would be required to constitute “soy sauce”), but a process standard as well (mandating the method to be used to produce the final product). Those that used the quicker process could still call their product “soy sauce”, but would have to add the words “non-brewed” or “short-term brewed” to the label as well.
Seeking recourse to the CAC entails some degree of risk, however. The CAC was commissioned in 1963 by no less than the United Nations. Once a standard is adopted, all treaty countries must revise their domestic regulations to conform to the standard adopted. Thus, although a given nation may propose its own formulation as the basis for an international food standard, that nation will be bound by the result – whether its own offering becomes the basis of the eventual recipe or not. The Japanese, for example, immediately encountered strong opposition from imposing American trade associations, such as the formidably, if not very delectably, named International Hydrolyzed Protein Council.
How might the Japanese feel about the United States telling them how to make soy sauce? Bruce Silverglade, legal affairs director of the Center for Science in the Public Interest, who is assisting Japanese consumer groups in the dispute, put it this way, as quoted in a San Francisco Chronicle article: "It's something to tell Japan how to make soy sauce. Next we'll be telling France that Spam should be labeled pate."
Up to this point, one might justifiably conclude that Japan owns the moral high ground on the soy standard issue. Rather than permit Fast Food America to run roughshod over a hallowed Japanese comestible, Japanese industry is seeking redress before the United Nations of food standards, there to plead the case for quality over cost, and culture over international corporate carpetbagging.
So the Japanese would indeed seem to have the equities on their side. At least, until one looks into the sordid history of Japan’s own recent effort to hijack the recipe and process for making kimchi, the national epicurean treasure of its neighbor, Korea. Japan, it seems, sought to promote the adoption of an ersatz kimchi made using cheaper ingredients and a quickie industrial process. Do you hear the echoes?
Consider the déjà vu-ridden parallels to the current soy standard wars when you read the following excerpt from the Mandala Project’s Website on the effort by Korea to launch a kimchi standard to protect its own cultural heritage:
For the Koreans, Japanese kimchi is not genuine kimchi. It is nothing but copycat kimchi. Korean kimchi is made with Chinese cabbage, red pepper, garlic, salted fish and ginger, and then stored in clay containers to ferment for at least four weeks….However, Japanese kimchi is made with Chinese cabbage and artificial flavor, skipping the fermentation process.
Sound familiar? So also are the emotions at issue. The mission of the Mandala Project is to “use new technologies and new research approaches to [address the] critical issues of the time ’s [sic].” Kimchi, it would appear, is nothing if not a critical issue of the “time’s”. Read further from the same website:
Kimchi is more than a food for the Koreans. It is a kind of national symbol and part of the national identity for Korea. Kimchi is Korean traditional culture itself. Korea has a saying that "the taste of kimchi is the taste of your mother's fingertips."
Passing for the moment on what role the taste of one’s mother’s fingertips should play in one’s adult life, let us see how the kimchi battle was fought. Compromises and tensions typical of an ICT standards process were the order of the day, as evidenced by this update midway through the process:
So far, neither the Japanese nor the Koreans seem satisfied with Codex's draft standard. It defines kimchi as a "fermented" product but permits the use of citric, acetic and lactic acids, none of which are used in the traditional kimchi process. The dispute is expected to intensify as Codex moves closer to ratification.
When the standard was eventually adopted, of course, Korea claimed complete victory, focusing on the attributes that most reflected its traditions. Consider this from the official Korean kimchi site:
By establishing international food standard of Codex kimchi in 2001 centering on Korean cabbage kimchi, Codex admitted that Korea is the suzerain state of kimchi, and now kimchi is international food, not only for Korean but also world people.
Japan is still smarting from its defeat in the battle to become the suzerain state of kimchi, and hopes to be more successful in its effort to defend the honor of soy sauce. And why not? This time, Japan has tradition on its side. Will it not be destined to enjoy global hegemony when it comes to its national sauce?
We shall see. Japan presented its case before a CAC committee on September 27; that committee will vote whether or not to recommend the issue to the CAC board, for deliberation in Rome next June. Until then, the dispute can only, well, ferment.
But as with standards anywhere, setting the standard does not automatically lead to reaping the economic benefits. Standards in all spheres, from food to ICT, have often failed when the fickle tastes of real customers enter into the equation. As the Koreans found, just because the CAC was persuaded to standardize on real, Korean kimchi (or close enough), that doesn’t necessarily mean that “world people” will actually eat it.
Consider the following recent thread in the Food and Dining forum of www.koreabridge.com. Those on line had just discovered a McDonalds in Busan, South Korea that serves pancakes. As benstine21 observed, “i have now had them twice including this morning and they are the standard Mcd's fare. really good if you ask me but so is anything compared to kimchi.” Another poster took a theological approach to the issue, noting that:
…it very clearly states in the Letters of St. Paul to the Romans that pancakes are better than kimchi for breakfast. The garlic of the kimchi does have some proven health benefits, but you can't beat raw, unadultered SUGAR for that morning blast that is guaranteed to knock you off your horse on the road to Damascus.
Sadly, we were unable to learn which translation of the Epistles the author relied upon for this intriguing interpretation, as the entry was simply signed, “Today's post was brought to you by the letter I and the number 18.”
So there you have it. We in the ICT space are not alone. Even when the stakes are as high as preservation of the integrity of soy or kimchi, the standards process may involve people behaving badly, as the temptation to game the system becomes too great to bear. So also, nationalism may stand in the way of globalism, and profit may be the thief and the enemy of piquancy. But there is yet reason to hope that this saga of soy will end as happily as the kimchi caper – thanks to the CAC.
If there is a moral for readers of this humble Blog to learn from such food standards wars, it may be this: Standards, like fine foods, are the product of careful and often laborious processes. Spare the process, and you’re sure to spoil the “soy”.
Copyright 2004 Andrew Updegrove
# # #
Useful Links and Information:
I. The Wonderful World of Traditional Soy Sauce:
Origins: Jiang, a soy sauce precursor, was first produced in China as far back as 500 B.C. In the classic mode of seemingly all oriental stories that find their way to the occident, it was a Zen priest who brought epicurean enlightenment from China to Japan some thousand years later. Thereafter, soy sauce evolved as a distinctly Japanese culinary institution. It was not until the 1800s that soy sauce began to find its way to the United States, along with oriental laborers. By 1972, demand for traditionally prepared soy sauce was robust enough to lead the Kikkoman Corporation (purveyors of fine soy sauce since the 1600s) to open its first American production facility, in Walworth, Wisconsin. For more on the history, ingredients, and manufacture of traditional soy sauce, see: http://www.shejapan.com/jtyeholder/jtye/living/shoyu/shoyu_index.html
East (doesn’t) Meet West: For a comparison of traditional and “brute force” soy sauce production techniques, click here.
www.japantoday.com Thread on the Soy Standards Wars:
If the above food standards tales are still not familiar enough, we should observe that China has set its own soy standard relating to certain ingredients. One article at the People’s Daily website sternly states that Chinese condiment factories “should not fight a suicidal war among domestic partners,” a sentiment that many married couples would do well to heed. See:
EU’s Ban on China’s Soy Sauce is a Sheer Rumor: Official
II. Kimchi and the Korean National Identity:
Kimchi Wars: The Mandala Project examines the Japan-Korea Kimchi dispute (author: Misuzu Nakamura, May 2001) : http://www.american.edu/TED/kimchi.htm#r1
III. Codex Alimentarius Commission: The Codex Alimentarius Commission was formed in 1963 by the Food and Agriculture Organization and the World Health Organization, each an agency of the United Nations. It is based in Rome, Italy. As stated on the CAC home page, its charter is: “to develop food standards, guidelines and related texts such as codes of practice under the Joint FAO/WHO Food Standards Programme. The main purposes of this Programme are protecting health of the consumers and ensuring fair trade practices in the food trade, and promoting coordination of all food standards work undertaken by international governmental and non-governmental organizations.”
Codex Alimentarius Commission Website:
Postings are made to the Standards Blog on a regular basis. Bookmark:
THE REST OF THE NEWS
Web Services Update
Same Old, Same Old: The Web Services news this year has settled into some predictable patterns, but new subpatterns are emerging all of the time. Our selection of news this month demonstrates one ongoing trend, the reemergence of an old issue, and a new discussion that is gaining strength.
The ongoing trend is the continuing barrage of specifications, many of which continue to be created outside of the consortium process and then are offered into it. The old issue is the overlap among the consortia to which these specifications are offered (in this case, the W3C and OASIS, which early on in the Web Services saga were the subject of debate over who should have been offered the BPEL specification that had been created outside the process by the usual suspects). The new trend, represented by the two stories that bracket our selection, raises the question of whether the whole situation is getting out of hand. Whether this is a sign of a maturing Web Services environment or a process that is losing its way because so much is being done by so few outside of the normal consensus process is an interesting question to consider.
Web services standards - Is enough enough already?
InfoWorld, October 13, 2004 -- The World Wide Web Consortium (W3C) this week is holding a workshop on "Web services Constraints and Capabilities." The organization is striving toward a common vocabulary on the constraints of what a Web service can do and may develop a standard to address the issue. Once again, I have to ask, does the world need any more Web services standards? Aren't there too many already? Well, officials at the two leading Web services standards organizations, the aforementioned W3C and OASIS, have different perspectives on this. ...Full Story
WS-Management's Success Depends on Wide, Deep Vendor Support
Gartner, October 11, 2004 -- Microsoft and its partners are introducing yet another standard definition for management that will overlap with Hewlett-Packard's and IBM's work with OASIS -- a specification known as Web Services Distributed Management, or WSDM. Microsoft hopes to get element suppliers of servers, storage, networks and application software to use WS-Management to enable access to embedded management information via a standard protocol. On 8 October 2004, AMD, Dell, Intel, Microsoft and Sun Microsystems announced the publication of WS-Management (based on Microsoft's earlier WMX), a Web services standard that defines a common approach to exchanging management information across hardware, software and applications. ...Full Story
AMD, Dell, Intel, Microsoft, and Sun Release Web Services for Management (WS-Management). The Cover Pages, October 8, 2004 -- A new Web Services for Management (WS-Management) specification edited by Alan Geller (Microsoft) has been published. This initial joint publication of the specification names Advanced Micro Devices (AMD), Dell, Intel and Sun Microsystems as co-developers. A version of the specification was previously demonstrated at the WinHEC 2004 conference in Seattle under the title Web Services Management eXtensions (WMX). It is currently "a key part of the Microsoft Dynamic Systems Initiative (DSI)." ...Full Story
W3C Announces Formation of New Web Services Addressing Working Group.
The Cover Pages, October 7, 2004
-- W3C has chartered a new Web Services Addressing Working Group as part of the W3C Web Services Activity, under the W3C Architecture Domain. The TC Chair is Mark Nottingham (BEA), while Hugo Haas and Philippe Le Hégaret have been designated as W3C Team Contacts. The charter extends through 28-February-2006. The goal of the new Working Group is to produce a W3C Recommendation for Web Services Addressing by "refining the W3C Member Submission WS-Addressing based on consideration of the importance of this component in the Web Services architecture, implementation experience, and interoperability feedback." ...Full Story
Where's the simplicity in Web services?
By: Martin LaMonica
ZDNet, October 5, 2004 -- A debate is raging over whether the number of specifications based on Extensible Markup Language (XML), defining everything from how to add security to where to send data, has mushroomed out of control. Defenders of advanced Web services specifications say they are needed to ensure that new computing architectures are flexible enough to accommodate both sophisticated and smaller-scale applications. Detractors say that simpler application development methods are good enough. The rallying cry for people who favor simplicity is a technology approach called REST, or Representational State Transfer, a method of building applications by sending XML documents over existing Internet protocols. This allows programmers to construct applications with existing tools and computing infrastructure, notably HTTP (Hypertext Transfer Protocol). ...Full Story
Big fish swimming in the same school: One interesting trend to observe in the Open Source world this month was the actions of the largest traditional software and hardware vendors to align themselves with, support, and take advantage of the Open Source model. The following selection of articles presents a range of examples of how this process is evolving up and down the stack of companies great and small.
In the first two items, Novell and IBM (each of which has heavily committed to Open Source in their core strategies) are seen to take internal and external actions consistent with that commitment. In the third, Microsoft (which is hardly a proponent of Open Source, for obvious reasons) is taking another baby step as it feels its way around this new and threatening reality. The fourth article reports on the activities of the “middlemen”: the collaborative associations that are creating and promoting the wherewithal for vendors to make their moves. The final article focuses on yet another effort to create a specific flavor of Linux, tailored to a particular use. In this case, the French government is supporting that effort.
Novell to Use Its Patents to Protect Open-Source Programs
By: Steven J. Vaughan-Nichols
eWeek, October 12, 2004 -- On Tuesday, Novell Inc. announced that it will use its patent portfolio to protect its open-source software offerings. In a policy statement, Novell said it will utilize its patent portfolio to defend against potential intellectual property attacks on its open-source products. Software patent issues have increasingly become significant in both open- and closed-source programming circles. Regarding Linux, Open Source Risk Management, a provider of open-source consulting and risk mitigation insurance, said in early August that there were 283 issued, but not yet court-validated, software patents that could conceivably be used in patent claims against Linux. ...Full Story
IBM Ratchets Up Attention To Open-Source And Standards-Based Software
By: Paul McDougall
InformationWeek, October 4, 2004 -- IBM has quietly placed some of its most senior executives in a new unit that will develop a strategy to more precisely define the role the company will play in an IT market in which big business customers increasingly look to open-source and industry standards-based software to build their next-generation computing networks. The group will be headed by John Kelly, senior VP and group executive for IBM's Technology Group. Under the plan, Kelly's title becomes senior VP for technology and intellectual property. He will continue to oversee the company's Technology Group. Irving Wladawsky-Berger, general manager for IBM's E-business On Demand group, will also be part of the effort, as will Linux general manager Jim Stallings. Both executives will also retain their current responsibilities. ...Full Story
Microsoft open sources Web authoring application
By: Joris Evers
InfoWorld, September 28, 2004 -- Continuing its flirtation with open source, Microsoft (Profile, Products, Articles) Corp. on Monday posted the code of a little-known collaboration application to open-source development site SourceForge.net. Microsoft is sharing the source of FlexWiki, a program for creating Web sites called wikis that allow users to add and edit content. It is Microsoft's third open-source code contribution, but the first time the company is sharing code for an actual application, said Jason Matusow, director of Microsoft's Shared Source Initiative. "We want to make sure that we continue to push our spectrum in terms of how far we go with different types of technologies and different types of licensing choices, and understand the benefits," he said. ...Full Story
The Free Standards Group and Open Source Development Labs Collaborate on Enterprise Linux Standards
DM Review, September 22, 2004 -- The Free Standards Group (FSG), a nonprofit organization dedicated to developing and promoting open source software standards, and the Open Source Development Labs (OSDL), a global consortium dedicated to accelerating the adoption of Linux in the enterprise, announced a collaboration to accelerate enterprise adoption of the Linux Standard Base (LSB) with new services to support software vendors developing applications for Linux. The two organizations already work closely with Linux distribution and independent software vendors. FSG partnered with OSDL on a new open project called the OSDL Working Set that aims to identify and quantify the most commonly deployed packages with Linux in enterprise production environments. FSG also helped create the conference program for the upcoming January 2005 OSDL Enterprise Linux Summit in Burlingame, California. At this Summit, FSG will conduct a tutorial on developing LSB 2.0 compliant applications. ...Full Story
French Firms Aim To Beef Up Linux Security
By: Elizabeth Millard
TechNewsWorld, September 24, 2004 -- A consortium of French Linux firms is poised to work on developing a highly secure Linux operating system for business, defense and government use. The effort is being funded by the French Ministry of Defense, which chose Paris-based Linux vendor Mandrakesoft as the project leader. The other French companies include Bertin Technologies, Surlog, Oppida and Jaluna. The contract is a US$8.58 million, three-year deal. Mandrakesoft spokesperson Gael Duval told LinuxInsider that the consortium is important, because it will heighten open-source security. The other companies in the group echo Duval's sentiment. ...Full Story
Where to begin? Ah, wireless. Not so long ago, if you scanned the ICT press coverage of standard setting organizations, the venerable IEEE would have garnered a rather modest market share. That’s hardly the case now, as all things wireless continue to command attention in the media airspace. These days, it seems like there’s nary a frequency that the IEEE isn’t working with, or a device that could be putting out a signal that doesn’t have its own working group beavering away to exploit it.
Our sampling of wireless articles this month amply demonstrates just how hot wireless and the enabling IEEE standards are, beginning with an item reporting on IEEE efforts to exploit unused television frequencies to set up “wireless regional networks.” The next two pieces focus on vendor hunger to make money from each new wireless standard as quickly as possible, whether it’s ready to roll or not. Since many of the wireless applications that these new standards are intended to support are in the consumer space, rushing products to market before certification programs are in place will certainly lead to much angst at the customer level when early release products fail to perform as their vendors have promised, or where configuration proves to be a tricky affair. Our last item in this array is intended to remind us that IEEE is active in a great many other areas besides wireless standards (okay, so this one is about a cellphone battery standard, but at least it’s a battery).
Now if the engineers at IEEE could just come up with some new and easier names for their endless array of wireless standards, before we get to a specification with a name like 802.11.z.AAA.742….
IEEE to develop wireless broadband standard for open TV spectrum
RCR Wireless News, Piscataway, NJ, October 12, 2004 -- The Institute of Electrical and Electronics Engineers said it has set up a working group for a standard to tap open channels in the television spectrum for wireless broadband applications and services. Known as 802.22, or wireless regional area networks, the new standard will not interfere with existing licensed TV bands but its coverage will reach 40 kilometers and more from a base station, according to the IEEE. "This is ideal spectrum for deploying regional networks to provide broadband service in sparsely populated areas, where vacant channels are available," said Carl R. Stevenson, interim chair of the IEEE P802.22 Working Group. "Our goal is to equal or exceed the quality of DSL or cable modem services, and to be able to provide that service in the areas where wireline service is economically infeasible, due to the distance between potential users." ...Full Story
ZigBee Buzzes on Standard
Unstrung.com September 30, 2004 -- The ZigBee Alliance has fired a warning shot to vendors guilty of creating "market confusion" in their launch of pre-standard products. The Institute of Electrical and Electronics Engineers Inc. (IEEE) last year finished work on standardizing the physical layer and media-access control for the low-power wireless data technology (otherwise known as 802.15.4), but the ZigBee Alliance is still working on the software stack and upper layers. Official ratification is expected in this year's fourth quarter (see ZigBee Ready to Buzz?). Despite the imminent specification release, a number of vendors have launched pre-standard products in an effort to gain an early market lead. In an email note received by Unstrung, the chair of the Alliance expresses concern over the marketing of such kit. ...Full Story
Wi-Fi Alliance Will Not Certify Pre-Standard 802.11n Features
Wi-Fi Alliance Press Release, Austin, TX, October 11, 2004 -- The Wi-Fi Alliance today announced that it will not certify data rate enhancement features based on the IEEE (Institute of Electrical and Electronics Engineers) 802.11n amendment to the 802.11 wireless LAN standard until the standard is ratified. No IEEE 802.11n products currently exist, and none are expected to exist until the standard is completed in approximately two years (November 2006). Due to the potential for customer confusion, the Wi-Fi Alliance strongly discourages use of the term “IEEE 802.11n” in association with any Wi-Fi CERTIFIED™ product. To help assure that Wi-Fi technology users continue to have a positive experience, the Wi-Fi Alliance will revoke the Wi-Fi certification of any product with claims of IEEE 802.11n capabilities if that product is proven to adversely impact the interoperability of other Wi-Fi CERTIFIED products. ...Full Story
IEEE Starts Cellular Phone Battery Standard; Standard to Take Systems Approach to Make Lithium-Ion Battery Cells and Packs in Mobile Phones More Reliable
Business Wire, Piscataway, NJ, October 7, 2004 -- The proliferation of cell phones continues to push demand for their lithium-ion and lithium-ion polymer batteries to new highs and has prompted the development of a new standard at the IEEE to improve their reliability. The standard, IEEE P1725(TM), "Standard for Rechargeable Batteries for Cellular Telephones," will be developed within the IEEE Standards Association Corporate Program. The new standard will seek to make cellular phone batteries more robust by setting uniform criteria for their design, production and evaluation. It will consider battery and battery pack electrical and mechanical construction, chemistries, process control, qualification and packaging technologies, among other areas. It will be developed by companies that manufacture batteries, cells and handsets, as well as by carriers. ...Full Story
Meanwhile, back at the courthouse: All things wireless may be one of the hottest areas today, but wireless technology also inhabits some of the densest patent thickets around. While the intellectual property rights (IPR) policies of the standards bodies involved are trying to scope out the IPR landscape as the standards in question are developed, not all infringement issues (especially those involving non-members) can be resolved before products move into the marketplace. The following article reports in depth on what’s happening on the wireless legal front, as patent owners and advocates of royalty free licensing spar over the emerging areas of 802.11 and RFID.
Patent landrush threatens Wi-Fi standards
The Register, October 4, 2004 -- We have examined before how patent lawsuits are threatening to stifle the adoption of wireless standards. Symbol, fresh from an intellectual property victory over rival Proxim, is the latest to assert sweeping licensing rights in 802.11 technology, while VIA is seeking to extend its proposed ‘intellectual property pool’ to WiMAX. With the emerging WiMAX and RFID wireless technologies both subject to major patent claims, as well as numerous intellectual property disputes in Wi-Fi, the arguments are growing louder that standards should be kept royalty-free. This would basically give companies – particularly start-ups – the choice of keeping their inventions proprietary and seeking to build a de facto standard with a full royalty revenue stream, Qualcomm-style; or donating the innovations to industry bodies for free, but with the hope of creating a far larger market in a shorter timescale, in which to sell products and services. ...Full Story
And (also) meanwhile, over in the mini-wireless space: The RFID tag marketplace is also having other types of interesting experiences these days: on the upside, it is enjoying increasing success in the marketplace. But on the downside, it is also experiencing headaches in grappling with the IPR rules under which standard setting efforts should operate.
In our first item, we see strong support for the deployment of RFID tags in European supply chains, while in the second, AIM (which is involved in all types of identification technologies, and not just RFID) comments on the woes of EPC Global, another standards organization that focuses particularly on the RFID space. EPC Global is struggling with the issue of whether it can successfully maintain a royalty-free IPR policy, and is finding itself in what AIM characterizes as an “IP Mess.”
New ETSI Standard approved for the use of RFID in UHF Frequencies
ETSI Press Release, Sophia-Antipolis, France, October 18, 2004 -- The use of RFID tags in the European Supply Chain has taken a great step forward with the approval by the European Telecommunications Standards Institute (ETSI) of a new standard for the use of RFID in UHF frequencies. This news has even greater impact now that the Frequency Management Working Group of the European Conference of Postal and Telecommunications Administrations has approved the recommendation to make the frequency band associated with this standard available in their 46 Member countries. The ETSI Technical Committee - Electromagnetic compatibility and Radio spectrum Matters (ERM), has delivered a two-part standard (EN 302 208) that gives the industry much needed guidance on the minimum characteristics considered necessary to make the best use of the available frequencies for RFID. This new standard will allow companies to market High Power RFID tags and readers in all national markets of the European Union and EFTA, by showing compliance with the European Union Radio & Telecommunications Terminal Equipment Directive (R&TTE Directive). ...Full Story
EPC™, ISO and IP
AIM Global Press Release, October 5, 2004 -- Despite having made real progress in moving its intellectual property (IP) policy forward, EPCglobal may be involved in a time-consuming waste of time that will slow eventual adoption of a Gen2 standard. The EPCglobal policy of requiring royalty-free IP from its participating members is laudable, from one perspective, but incomplete from all others. Why? Because it is neither inclusive nor enforceable. And the IP policy is just part of the problem. THE IP MESS 1. There are many holders of fundamental IPs that are not EPCglobal participants. EPCglobal cannot indemnify manufacturers or users against infringement suits from these IP holders. ...Full Story
Next Gen Video Update
Will this video never end? Well yes, it will, and it won’t be that long before a likely winner becomes clear. That’s because the end game in the DVD standards wars is beginning, as content developers finally start to place their bets on which format they wish to support.
Accordingly, our first story in this category reports that 20th Century Fox has come out in favor of Blu-ray, following in the footsteps of Metro-Goldwyn-Mayer, which just happens to have accepted the acquisition bid of Blu-ray camp proponent Sony. But few things in standards are ever that simple, and in our second item John Dvorak reflects on the fact that while the standards wars are being fought, the standards themselves may not be receiving the type of attention that they should. Meanwhile, the audio technology of perennial favorite Dolby Laboratories has been declared mandatory under both the Blu-ray and the HD DVD format standards, as celebrated in the Dolby press release that next follows. (Which only goes to show that the only thing better than backing the right standard is owning the right standard.) Our final item, from The Japan Times Online, looks ahead to the next Japanese consumer electronics battle: this time over large screen TVs, where vendors in various camps hope to score “An Early Mortal Blow.”
20th Century Fox to adopt Blu-ray disc standard for new DVD
Japan Today, Tokyo, October 3, 2004 -- Twentieth Century Fox Film Corp, a major U.S. movie distributor, will adopt the Blu-ray disc standard for next-generation DVD players developed by Sony Corp and Matsushita Electric Industrial Co and other companies, the Nihon Keizai Shimbun reported Sunday. The economic daily said Twentieth Century Fox's move may give the Blu-ray disc standard an advantage over HD DVD, another format for DVD players developed by Toshiba Corp, NEC Corp and other companies. (Kyodo News) ...Full Story
Suspicious Battle Dept.
By: John C. Dvorak
PC Magazine, October 6, 2004 --
The Blu-ray disc standard continues to battle with the HD-DVD standards in the back alleys of Asia. The final spec for HD-DVD should appear early next year. Though the Blu-ray group has a final standard, they are now adding to it. This process of creating constant additional standards within standards makes a mockery of the concept of standards. This problem is rampant in the current crop of DVD players—in case you haven't noticed. These incorporate different sound standards and playback schemes, and some DVD players can read writable discs while others cannot. It's amazing that these things manage to work at all. Most work poorly. ...Full Story
Dolby Technologies Mandatory on both HD DVD and Blu-ray Disc Next-Generation Packaged Media Formats
Dolby.com, San Francisco, CA, September 23, 2004
-- As evidence of its pioneering efforts in multichannel audio entertainment, Dolby Laboratories announces that Dolby® audio technologies have been selected as mandatory formats for both High-Definition Digital Versatile Disc (HD DVD) and Blu-ray Disc. The DVD Forum has selected Dolby Digital Plus and MLP Lossless™ as mandatory audio formats for HD DVD. The Blu-ray Disc Association announced that Dolby Digital will be a mandatory technology on its new format, the Blu-ray Disc. Both discs are next-generation packaged media formats designed to deliver high-definition picture quality. "Dolby is a recognized leader in multichannel audio technologies. ...Full Story
Firms learn from VCR war, seek early mortal blow
The Japan Times Online, October 16, 2004 -- Japanese electronics makers are waging battles in various digital home appliance sectors, aware that those who claim initial victories will likely remain dominant. Manufacturers such as Sharp Corp. and Matsushita Electric Industrial Co. showed off their latest technologies earlier this month at Ceatec Japan 2004, one of the biggest digital consumer electronics exhibitions in Asia. The event was held in Chiba through Oct. 9. ...Full Story
And about that content: Not all of the action in the content space has to do with which format it will be incorporated into. There’s also the threat of illegal copying to contend with. The following article reports on a new consortium that has been formed to bridge the gap between competing digital rights management systems, to ensure interoperability between products that use different copy protection mechanisms.
Tide turns in DRM wars with the creation of Coral Consortium
By: Peter White
DM Europe, October 7, 2004 -- Intertrust, Philips and Sony have added more top consumer electronics, content and technology heavyweights to their attempt to create an open interoperable Digital Rights Management environment. The system promised at the turn of the year has taken a step closer to becoming a reality with a new DRM clustering of companies calling itself the Coral Consortium. Lining up with the expected triumvirate of Intertrust and its two owners Philips and Sony are more powerful names in the form of Panasonic, Samsung, Hewlett-Packard and the News Corp controlled film company Twentieth Century Fox. ...Full Story
What a difference a decade makes: On the first of December, the World Wide Web Consortium will be throwing itself a party. And well it might, given the revolution in global information sharing that it has helped to create. Tim Berners-Lee and his crew will doubtless look forward as well as backwards, as they celebrate past victories and project what the future may hold for their work, and for our continuing enjoyment of it.
W3C Celebrates Ten Years Leading the Web
W3C Press Release, October 13, 2004 -- The World Wide Web Consortium (W3C) will mark the ten year anniversary of its founding with a symposium on 1 December 2004, at Boston's Fairmont Copley Plaza hotel. The day-long event will feature presentations by and discussions with luminaries from around the world whose contributions have played a key role in creating the Web, ensuring that the Web is open and accessible to everyone worldwide, and helping the Web to reach its full potential. Although attendance to the event is limited to W3C Members and invited guests, plans are being made to make the content of the day's lectures and talks available to the public. ...Full Story
| Creative Commons continues to catch on: In our September issue this year, we focused on collaborative efforts of all kinds, including the Creative Commons project, an ambitious effort to facilitate content sharing on flexible, non-economic conditions. Since that issue of the CSB, the Creative Commons organization has launched efforts in the United Kingdom, and also, in the article below, reports that the use of its creative copyright licensing alternatives continues to enjoy robust growth.
Creative Licensing Scheme Grabs Artists' Attention
By: Chris Nolan
eWeek, September 29, 2004 -- Is the intellectual property licensing scheme known as Creative Commons picking up steam? The answer, it seems, is a cautious "yes." And that—despite the organization's demurring—could have political implications. "It's picking up," Commons director Glenn Otis Brown says. "The last six months, we feel like it's a completely different organization." The licensing scheme's popularity is clearly growing, increasing by a steady 50 percent every fiscal quarter for the past year, according to the Commons' traffic and other records. More than 4 million sites—of the 5 billion regularly searched on the Web—have some kind of license. ...Full Story
| Trying to turn the tide: In our June issue this year, we focused on Standards and Security, including the rising tide of spam. The following article reports on a meeting co-hosted by the FTC and NIST to be held in early November. The meeting agenda acknowledges the fact that technology, rather than traditional legislative prohibitions, holds the key (if there is one) to finally curbing the spread of spam.
FTC, NIST to Host E-mail Authentication Summit
Linux Electrons, September 16, 2004 -- The Federal Trade Commission and National Institute of Standards and Technology (NIST) will co-host a two-day 'summit' November 9-10 to explore the development and deployment of technology that could reduce spam. The E-mail Authentication Summit will focus on challenges in the development, testing, evaluation, and deployment of domain-level authentication systems. A Federal Register Notice to be published today notes that the FTC’s National Do Not E-mail Registry Report to Congress stated that “significant security, enforcement, practical and technical challenges rendered a registry an ineffective solution to the spam problem.” ...Full Story
Standards in Health
Stop, now, what’s that (ultra) sound? While healthcare has historically been the province of regulations rather than consensus-based standards, more of the latter are being created as healthcare incorporates ever more technology into its delivery systems. The following two items are representative of the work being done in this area on two fronts: standards that relate to diagnostic equipment, and standards that relate to managing the ever-growing amount of data that the health system needs to create, analyze, store and access.
NEMA Releases Two Revised Diagnostic Ultrasound Standards
NEMA Press Release, Rosslyn, VA, September 23, 2004 -- NEMA, The National Electrical Manufacturers Association, has released two standards for users in hospitals, clinics, and medical offices who want to measure the performance of their diagnostic imaging equipment: UD 2-2004, Acoustic Output Measurement Standard for Diagnostic Ultrasound Equipment, and UD 3-2004, Standard for Real-Time Display of Thermal and Mechanical Acoustic Output Indices on Diagnostic Ultrasound Equipment. UD 2-2004 describes a set of measurement procedures for ultrasonic output parameters. The standard achieves this objective by setting forth precise definitions of quantities, primarily those relating to acoustic output levels, and specifying standard procedures for measuring them. ...Full Story
OASIS Members Form International Health Continuum Technical Committee
Cover Pages, September 23, 2004 -- OASIS has announced the creation of a new International Health Continuum Technical Committee as a "forum for companies on the Healthcare continuum internationally to voice their needs and requirements with respect to XML and Web Services." OASIS member sponsors of the IHC TC include CommerceNet, BT, National Insurance Administration of Norway, ReadiMinds, Webify Solutions, and SeeBeyond. DeLeys Brandman (CommerceNet Consortium) is the TC Convener and Proposed TC Chair. A principal motivation for the TC activity is that many standards organizations are working to standardize transactions in the healthcare vertical space but "little attention is being paid to the continuum of health, viz., to horizontal standards allowing all related verticals to interoperate through the use of web services tools and technologies." ...Full Story
The Business of Standards
So that’s where they come from: The standards of accredited standards bodies, in contrast to those of consortia, are almost always sold, rather than being made available without charge – which means that someone has to provide the fulfillment function. One of the biggest players in that space is IHS, which has just gotten bigger with its acquisition of USA Information Systems.
IHS Acquires USA Information Systems
IHS Press Release, Englewood, CO, September 20, 2004 -- Information Handling Services, Inc. (IHS - www.ihs.com), the leading international provider of content integration and decision support tools, technical standards, codes and product specifications, parts management and logistics solutions, today announced that it has acquired USA Information Systems, Inc. (USAInfo - www.usainfo.com). The move augments the world's largest published collection of engineering standards, military specs and parts logistics solutions from IHS, which has served worldwide engineering communities and customers for nearly 45 years. "This acquisition is about industry leadership and customer choice," said Charles Picasso, president and COO, IHS. "USAInfo has a successful track record of product innovation and customer focus and will be a complement to our existing logistics and government document products and services. ...Full Story
Standards and Society
| Your very own black box: For many years, valuable data has been recovered from the famous “black boxes” that are mandatory on passenger aircraft above a certain size. That data is essential not only to learn why a specific disaster occurred, but more importantly, how such an event can be prevented in the future. Now the same technology may be on the way for passenger cars, augmenting “crash dummy” information with field data from actual accidents in real, as compared to simulated, road conditions.
World's First Motor Vehicle 'Black Box' Standard Created At IEEE
Business Wire, Piscataway, N.J., September 23, 2004 -- Driven by a lack of the uniform scientific crash data needed to make vehicle and highway transportation safer and reduce fatalities, the IEEE has created IEEE 1616(TM), the first universal standard for motor vehicle event data recorders (MVEDR), much like those that monitor crashes on aircraft and trains. National Safety Council statistics show that motor vehicle accidents are the leading cause of death in those between one and 33 years in the U.S. They are the nation's largest public health problem, causing a death every 12 minutes and a disabling injury every 14 seconds. Worldwide, someone dies in a motor vehicle crash each minute. Road crash fatalities have claimed about 30 million lives globally since 1896. ...Full Story