Consortiuminfo.org Consortium Standards Bulletin- September 2004
September 2004
Vol III, No. 9

Standards Alternatives

EDITOR'S NOTE: WE’RE NOT IN KANSAS ANYMORE
There’s more to standards today than just specifications.
   
EDITORIAL: THE MEDIUM IS THE MESSAGE
Not only Open Source projects, but all manner of non-profit collaborative projects are springing up across the Web, bringing kindred spirits together to create valuable tools for the networked world. The unique attributes of the Internet and the Web are at the heart of a revolution in cooperative creativity and value creation.
   
FEATURE ARTICLE: A LEVER LONG ENOUGH: VALUE DRIVEN ENTERPRISE IN THE NETWORKED ECONOMY
The IT economy enables a new form of non-market, non-corporate activity to exist: “networked peer production”, of which Open Source software is but one example. Networked peer production makes possible the realization of an alternative, post-capitalist economic vision based on value, not profit, working alongside traditional markets and businesses.
   
TRENDS:

NETWORKED PEER PRODUCTION:  A NEW WAY TO SOLVE OLD PROBLEMS

Standards are great, and standards have problems. A new way of creating “commonalities” solves many of these problems and provides a way to create tools that never existed before.

   
STANDARDS BLOG: PREDICTABILITY AND STANDARDS DENIAL
What do diets, mutual funds and DVD standard wars have in common? More than you’d think.
   
NEWS SHORTS: THIS MONTH'S TOP STORIES
Market Filters Reject MS Sender ID; Sony Puts Billions into Standards War; Voice Standards in the News; Ericsson Abandons Bluetooth; Microsoft Wins Another Round in Eolas Dispute; Latest Web Services News; ETSI Relevant Patent Database Surpasses 12,500 Claims...and much more





EDITOR'S NOTE:

WE’RE NOT IN KANSAS ANYMORE

Andrew Updegrove

Anyone returning to the standards scene after a long absence might feel strangely like Dorothy arriving in the land of Oz: the landscape in the distance looks very different from the way it used to. Continuing where we left off in our last issue, we explore what may lie ahead.

Last month we reviewed the Open Source phenomenon from a business perspective, looking at what must be done before Open Source software becomes ubiquitous in the business world (see: Open Source – Coming of Age?).

This month, we take a broader view, examining why Open Source and similar non-remunerative collaborative projects have become so common. Is it just because the Internet and the Web have made them possible, or is there something more profound going on?

In doing so, we look at how collaborative projects fit into the standard setting infrastructure, highlighting what is different and what is the same in comparison to more traditional means of achieving kindred goals – and what unique advantages can be gained by employing these new methodologies in preference to legacy processes.

We also take pleasure in welcoming a guest author to this issue: David Galiel, a visionary with his own non-profit, commons-based Web-deployed project, and some intriguing ideas about why “networked peer production” has become such a popular and productive model for creative endeavor.

As always, we hope that you find this issue intriguing. If you have your own ideas to share, please let us know.

    Best Regards,
   
    Andrew Updegrove
    Editor and Publisher


EDITORIAL

THE MEDIUM IS THE MESSAGE

Andrew Updegrove

Twentieth century visionary Marshall McLuhan is remembered today not only for his probing and creative thinking about modern life, but also for his ability to coin pithy word images that aptly summed up matters of social import. Examples still in use today include the phrase “global village” and the now ubiquitous words “the media”, which rapidly supplanted “the press” as proper usage in the increasingly TV-driven world of the late twentieth century.

Another McLuhan phrase that is less remembered today is this: “The Medium is the Message” (the title of a book he coauthored in 1967). But while the phrase has faded, the observations he presented in that work remain of profound relevance today. In explaining the title, the authors observed that "societies have always been shaped more by the nature of the media by which men communicate than by the content of the communication."

"Somehow, the speed, power and democracy of the Web allow a reordering of relationships and enable outcomes that motivates those involved to build virtual pyramids that would never have been erected any other way."

Although McLuhan died in 1980, he would have found the Web to be of profound importance. One reason is this: unlike television and radio, which allow commercial interests to unilaterally push their views at the populace, the Internet and the Web permit unmediated peer-to-peer interactions responsive to the desires and creativity of those at the ends of the network connections. How different from the commercial reality of radio and television, which McLuhan observed permitted "the media [to] work us over completely…so pervasive are they in their personal, political, economic, aesthetic, psychological, moral, ethical and social consequences that they leave no part of us untouched, unaffected, or unaltered."

In a very different way, the phrase “the medium is the message” is even more meaningful when applied to the Web than it was when applied to the media of the 1960s. While television and radio permit a more vivid form of communication than newspapers, the dynamics are otherwise much the same. True, the potential for influence and manipulation is augmented by the power of the television medium, but the power relationships among those involved remain the same.

In contrast, the Web fundamentally changes the relationships of those who choose to take advantage of its potential. More importantly, it also allows relationships to emerge that were impossible to create in a pre-networked world. Not surprisingly, new practices and aspirations that could not have found expression before have found it in this radically different enabling medium.

How radical are these new practices? Witness projects as varied as the Open Source movement, the Wikipedia, and the Project Gutenberg electronic publishing project, each of which is the work of many uncompensated individuals who are unknown to one another, and often publicly unacknowledged as well. What motivates them to contribute their time and energy to such novel endeavors?

What indeed? People are still people, and therefore it must be the medium that makes the difference. Somehow, the speed, power and democracy of the Web allow a reordering of relationships and enable outcomes that motivate those involved to build virtual pyramids that would never have been erected any other way.

As these projects grow and proliferate, it appears that this is no transitory phase in the evolution of the Web. While experimentation is certain to continue (with ever more interesting results), something fundamentally new has been introduced to the human condition. Unlike so many other modern innovations that have enabled violence, degradation and other adverse consequences, this new medium promises to give expression to what is best about humankind, rather than what is worst.

And that is a fine message indeed.


Copyright 2004 Andrew Updegrove



FEATURE ARTICLE

A LEVER LONG ENOUGH:

Value driven enterprise in the networked information economy

David Galiel

Summary: Today’s networked information economy enables a new form of non-market, non-corporate activity, networked peer production, of which Open Source software is but one example. This activity, in turn, makes possible the realization of an alternative, post-capitalist economic vision, with self-sustaining systems deliberately based on value, not profit, working alongside traditional markets and firms.

"Five-thousand people were given paint-stirring sticks, and on the top of the stick was a card, one side of it reflective green and the other side was reflective red. At the front of the theater there was a [video] camera, mapping that red and green pixel for pixel onto a huge display… when people turned their wands they would change the color of their pixel, but there were five-thousand pixels, so you literally weren't able to find yourself… [W]ithin five minutes the audience was collectively playing a game of Pong by moving a red paddle up and down through a sea of green. It defied any notion of how one might actually go about managing that many people to do a collective activity. By the end of the half-an-hour the audience was flying a flight simulator."

Brenda Laurel, describing Cinematrix, an experiment by Loren Carpenter conducted at SIGGRAPH '93, an international computer graphics conference and trade show.

Networked Peer Production: We’ve all heard of Open Source software. Much has been written about it in this very forum. Its ubiquity, less often noted, is remarkable:

  • 85% of all email is transmitted using Sendmail.
  • 69% of all Web sites run on Apache software (as compared to Microsoft’s 23%).
  • 30% of the servers on which those sites run use the Linux operating system (Microsoft has 50% of the market, but Linux use is growing much faster).
There are many other examples. Most significantly, 41% of all development tools used are Open Source. Thus, Open Source serves as its own network-enabled bootstrapping technology.

Open Source software, however, is only one, and not even the most significant, manifestation of what Yale Law professor Yochai Benkler calls “commons-based peer production”, a networked model of economic production that is not organized in either markets or businesses—as virtually all other economic activity is and always has been in capitalist societies. With peer production on the Internet, distributed masses of people share open production of complex products and services, largely for no financial compensation.

While the idea of non-market, non-corporate production is not new—science has traditionally worked this way—large-scale, decentralized, sustained open production, by diverse groups of peers, on a wide variety of focused projects, is a new phenomenon that has been enabled and encouraged by the confluence of computers, networking and the information economy.

Non-market peer production has been used to:

  • Create a world-class encyclopedia: Wikipedia is a peer-edited, open development online encyclopedia. In the less than four years since its inception, Wikipedia has grown to include over 310,000 high-quality articles in English and over 530,000 in other languages (Britannica has only 85,000 articles online), with over 20,000 contributors. In the past six months alone, Wikipedia has grown by more than 50%.
         
  • Develop a superior Internet directory: The Open Directory Project, a 65,000-person human-edited directory of the Web, is a peer-production alternative to the Yahoo! Directory. As of this writing, ODP has categorized more than 4 million websites in nearly 600,000 categories. ODP’s consistent quality is such that it now powers the core directory services for many of the Web’s biggest search portals, including Lycos, HotBot, AOL Search—and Google.
         
  • Publish major news sites: OhmyNews is a leading news outlet in South Korea. Its unusual power derives from its unique open editing system, which makes “every citizen a reporter”, currently including more than 33,000 citizen-reporters a day. In the US, open production projects like Slashdot, Kuro5hin and Indymedia engage hundreds of thousands of participants in the production, editing and peer-review of breaking news, opinion and analysis.
         
  • Empower grass-roots organizing: CivicSpace, developed by the creators of DeanSpace, the community site which powered Howard Dean’s presidential campaign, allows anyone to utilize sophisticated community-organizing tools for free.
         
  • And even map craters on Mars: The Clickworkers project was an experiment run by NASA that allowed 85,000 people to collaborate on mapping Mars craters. When NASA did an analysis comparing the work performed by the Internet volunteers to the mapping previously done by trained PhDs, it described the outcomes as “practically indistinguishable”.

According to capitalist economics, not-for-profit, commons-based peer production should not be happening—at least not on this scale, producing this degree of consistent quality, in such a diversity of fields. This new dynamic in the marketplace, and the reasons for its growing economic and social significance, stem from the ways new technologies have changed the capitalist “rules of the game”.

Markets and Firms: Until recently, virtually all economic activity in our capitalist system was centered on the generation of profit. There have been two practical ways to engage in economic activity: sell something for more than you paid for it, or make something and sell it for more than it cost to make. In turn, there have been two ways to organize the exchanges that enable economic activity: markets and firms (hierarchical corporations).

In 1937, economist Ronald Coase identified transaction costs as the engine behind the formation of capitalist enterprises. When the transaction cost of a given exchange of goods can be lowered through an intermediary below its natural cost in the marketplace, an opportunity is created for a capitalist enterprise. Conversely, when the open market is more efficient in terms of transaction costs, a corporate intermediary ceases to be viable.

As Prof. Benkler has observed,

The result was that most individuals lived their productive life as part of corporate organizations, with relatively limited control over how, what, or when they produced; and these organizations, in turn, interacted with each other largely through markets. We came to live much of the rest of our lives selecting from menus of goods, heavily advertised to us to try to fit our consumption habits to the decisions that managers had made about investment in product lines.

And then came the information economy, computers, and, most recently, networking.

Information: There are two radical ways in which information behaves differently from physical property, which used to form the core of our economy. A) The marginal cost of production is zero (once information is produced, it costs essentially nothing for one individual to transfer it to another), and B) the input, or raw material (information), is of the same nature as the output (new information).

As Prof. Lawrence Lessig points out, “The crucial feature of this new space is the low cost of digital creation, and the low costs of delivering what gets created.”

"peer production is powered by a commitment to the common interest, agnostic towards profit and is both supportive of, and highly dependent upon, democracy, individual freedom and social justice."

Computing power: While there has been some debate as to the precision of Moore’s Law (does processing power double every year? Two years? Eighteen months? Will the pace continue?), the principle articulated is merely a special case of a more profound and important phenomenon in technological evolution, which R. Buckminster Fuller identified in the 1920s—the trend of “progressive ephemeralization”, or doing more and more with less and less. What this means, in practical terms, is that computers and network technology, the key media for production and distribution of information, continue to become more and more accessible, at lower and lower cost, to more and more people. The potential pool for peer production is rapidly expanding.

Network technology: Networks themselves have characteristics that make them economically unique. The now familiar Metcalfe’s law states that the value of a communication system grows at approximately the square of the number of users of the system. This means that, in defiance of the classic laws of supply and demand, the more ubiquitous and accessible a network is, the greater its value becomes.

The combination of ephemeralization and Metcalfe’s law has a decisive effect on the utility of hierarchical corporations as a vehicle for economic activity, because together they push transaction costs for information exchange towards zero. This trend does not favor corporate solutions. And yet, certain forms of information production require the kind of collective effort of many people that corporations traditionally afford. This creates a growing opportunity vacuum that peer production enterprises increasingly fill.

Group-Forming Networks: That alone, however, is insufficient to explain the explosion of not-for-profit peer production projects in recent years. Less well known than Metcalfe’s law, but of even greater significance, is David P. Reed’s Law of Group-Forming Networks (GFNs). Group-forming networks are networks like the Internet that support the development of sub-communities: affiliated sub-groupings of network members, enabled by many-to-many communication tools like email lists, discussion forums, Weblogs and chat. Reed has found that the utility of large networks, particularly social networks like the Internet, can scale exponentially with the size of the network. If the value of physical networks grows at a rate of n², the value of GFNs grows at the far faster rate of 2ⁿ.
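
To see how different those two growth curves really are, consider a minimal sketch (our illustration, in Python, not Reed’s; the pairwise count behind Metcalfe’s law is n(n − 1)/2, and the subgroup count behind Reed’s law is commonly given as 2ⁿ − n − 1, after discarding the empty set and the single-member “groups”):

  # Illustrative only: comparing how network value scales under Metcalfe's
  # law (pairwise connections, ~n^2) and Reed's law (possible subgroups, ~2^n).

  def metcalfe(n):
      """Distinct pairwise connections among n members: n(n-1)/2."""
      return n * (n - 1) // 2

  def reed(n):
      """Nontrivial subgroups of n members: 2^n minus singletons and the empty set."""
      return 2 ** n - n - 1

  for n in (10, 20, 30, 40):
      print(f"n = {n:>2}: pairs = {metcalfe(n):>6,}  groups = {reed(n):>16,}")

At n = 40 the pairwise count is still only 780, while the subgroup count has passed a trillion; that gap is Reed’s point about why group-forming tools change the economics of networks.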

This confluence of new conditions enables peer production to emerge as a significant alternative to the traditional market/firm system, when productive activity is not best served by that system.

It still does not explain, however, why people choose to participate in such activity. There is largely no profit in it, and there is often no fame in it (major projects like Apache don’t even credit individual contributors). And while most Open Source coders are motivated by the sheer pleasure of tackling complex technical challenges, that doesn’t quite explain why masses of people voluntarily participate in a rather tedious process like Distributed Proofreaders, created independently to support Project Gutenberg.

Project Gutenberg is a peer production effort to digitize public domain books. Volunteers manually enter the text of a book, and other volunteers proofread it for accuracy. Distributed Proofreaders breaks down works entered by Project Gutenberg participants into individual pages, which are then randomly assigned to volunteers who sign up at the Distributed Proofreaders site. The project has dramatically accelerated the output of Project Gutenberg, becoming the main source of its e-books in less than two years. Clearly, the rewards (in any traditional sense) for such activity are not self-evident.
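
As a rough illustration of that division of labor (our own sketch; the class and function names are invented for the example, not taken from the Distributed Proofreaders codebase), the core mechanism is just a pool of pages handed out one at a time:

  # Toy model of page-at-a-time work distribution: a book is split into pages,
  # and each volunteer who asks for work receives a random unclaimed page.

  import random

  class ProofingQueue:
      def __init__(self, pages):
          # Pages still awaiting a proofreader, keyed by page number.
          self.pending = dict(enumerate(pages, start=1))

      def assign_page(self):
          """Hand a random pending page to the next volunteer (None when done)."""
          if not self.pending:
              return None
          page_no = random.choice(list(self.pending))
          return page_no, self.pending.pop(page_no)

  queue = ProofingQueue(["text of page 1...", "text of page 2...", "text of page 3..."])
  print(queue.assign_page())  # e.g. (2, 'text of page 2...')

The page-sized unit of work is what matters: no volunteer has to commit to an entire book, which is how thousands of casual contributors can add up to a very large output.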

The value-driven enterprise: Most discussion about peer production has focused on the novel means—production occurring on a large scale outside the market/corporate system.

I believe that there is something else going on. This newly available means of networked peer production is catalyzing activity that is not just a new means to make the same old stuff. To understand it—to understand how the ends of the process of production will be critically changed by these new means—we need to look at the most neglected part of human economic activity: purpose.

In 1995 a former IBM programmer named Craig Newmark started keeping track of interesting events in the San Francisco area and telling people he knew about them. Word spread, and Craig started an email list to keep everyone notified. He called it “craigslist”.

There are now craigslists for more than 45 major cities in the US, providing “a trustworthy, efficient means for folks to get the word out regarding everyday stuff, and connect with others in the local community to find jobs, housing, companionship and community”. A Forrester report in 2000 confirmed that craigslist was the most effective job site in San Francisco. Craigslist, running entirely on Open Source software, serves as many pages every day as Amazon.com—with a staff of just 14 people.

There have been estimates that if craigslist were ever offered for sale, it could be worth as much as $100 million. But craigslist won’t be sold. In 1997, Newmark decided not to make craigslist commercial. Craigslist does not even accept banner ads, despite lucrative offers. Nor does craigslist charge the public for using its myriad services, except for a small fee required in three major cities for job postings—just enough to keep craigslist sustainable, with enough left over to make a difference with the causes important to Craig—including helping other non-market efforts get off the ground.

When you listen to the founders of enterprises like craigslist, you sense a very different motivation than the profit-driven rationale that powers traditional market/firm activity.

Craigslist CEO Jim Buckmaster told BusinessWeek earlier this month,

One thing that people often don't get about us: We're a public service first. Most businesses are conceived as a way of making money. Our primary mindset is philanthropic -- to offer what we see as a public service. On our site you don't see banner ads, text ads, cookies, co-marketing agreements, selling user information to third parties -- all of those nuisances users encounter on the majority of sites.

Similarly, Jimbo Wales, the founder of Wikipedia, has paid for all Wikipedia operations since its inception, with no financial return whatsoever. In a recent interview on Slashdot, he stated:

It is my intention to get a copy of Wikipedia to every single person on the planet in their own language…. [This kind] of big picture ideal makes people very passionate about what we're doing. And it makes it possible for people to set aside a lot of personal differences and disputes…and just compromise to keep getting the work done. Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. That's what we're doing.

CivicSpace Labs, in turn, describes its mission as, “Building a new kind of civic infrastructure, tools and technologies to connect citizens and help them organize themselves”, while Project Gutenberg founder and CEO Michael Hart says, “Project Gutenberg is powered by ideas, ideals, and by idealism. Project Gutenberg is NOT powered by financial or political power.”

Even Linus Torvalds, who, like many “geeks” in the Open Source community tends to emphasize the intrinsic pleasures of solving problems and writing code, describes his “basic rule of life” thus:

If you ever wonder, "What should I do?" and you ask yourself [the] question - "What would I want somebody else to do?" - suddenly you know the right answer.

None of these value-driven projects fit into the traditional capitalist model of profit-driven enterprises, organized in hierarchical corporations, competing in the marketplace. No corporate leader of today could articulate management goals in such a manner, so devoid of consideration for “maximizing shareholder profit”, and keep their job.

So what else can we learn from the unique characteristics of these new enabling technologies discussed above? And where can the example of individuals like Craig Newmark and Jimbo Wales lead society in the future?

The implications for democracy: Historically, the benefits conveyed by capitalism to society have come at the cost of significant tradeoffs. The pursuit of profit, organized in corporations competing in markets, is at best agnostic to (and often in direct conflict with) the public interest. That is why society has developed a system of checks-and-balances that, in theory, elevates the rule of law above individual and institutional power.

At the same time, we sacrifice certain individual freedoms, tolerate certain social inequities, and relinquish a degree of democratic oversight, in order to enjoy the benefits of capitalist production. We don’t produce wheat democratically, because we are more interested in everyone eating bread.

The layers of authority in hierarchical organizations reduce individual autonomy, while markets introduce social inequities (in fact, they depend on them—if wealth and the means of production were distributed equally, there would be no market incentive to excel). And neither corporations nor markets are run by democratic means.

We tolerate and even embrace these tradeoffs, because all historical alternatives have dramatically failed. Nevertheless, all but the most zealous of ideologues recognize the imperfections inherent in capitalist economies. Over time, free societies have employed varying degrees of government intervention to regulate markets and firms, limiting the loss of autonomy and reducing social inequities. Most discussion of capitalism centers on the appropriate degree of such intervention.

These controls have been necessary because markets and businesses are fueled by the pursuit of profit, are agnostic towards the social interest, and require compromising the principles of democracy, autonomy and equality. Peer production, in contrast, can serve the public interest and enable the creation of significant value, with much less surrender of democracy and loss of individual freedom. As it has evolved, peer production is powered by a commitment to the common interest, is agnostic towards profit, and is both supportive of, and highly dependent upon, democracy, individual freedom and social justice.

Peer production offers more, however, than mere liberation from the moral conflicts of a profit-driven world. Sadly, the mere will to “Do Good” does not seem sufficient in this world to motivate “right” conduct. If it were, charity alone would solve all problems of want and social inequity. In reality, given the magnitude of our problems and the inertia of our institutions, the resources of the nonprofit realm (total charitable donations in the US equal just 2% of our GDP) are inadequate to the task. Many dedicated nonprofit leaders in consequence spend a majority of their time and energy securing donations to keep their organizations afloat, rather than devoting all of it to the causes to which they are committed.

The inherent qualities of networked peer production systems—the multiplier effect of online collaboration, the democratized production process, and the growing accessibility and low barriers to entry—provide a powerful platform upon which to build transformative, value-driven enterprise. I believe that this insight is the key to understanding the emergence, and significance, of value-driven enterprise on the Internet. All things being equal, most people would prefer to do work that makes a difference. All things are rarely equal, however, in the corporate world.

Trimtabs to the rescue: Futurist R. Buckminster Fuller often talked about the concept of a “trimtab”. Large ocean liners, tankers, aircraft carriers and other massive vessels have equally massive rudders. However, even the sturdiest rudder axle would snap under the pressure of battling the inertia and water resistance that oppose its attempt to change the direction of one of these behemoths. That is why, on the trailing edge of these giant rudders, there is a thin strip of steel called a trimtab. A relatively small motor turns the trimtab, which needs to withstand only the relatively small water pressure against it. That pressure, however, is translated to the massive rudder, which responds by turning in the opposite direction, and, in turn, redirecting the entire ship.

With what we know today about the structural dynamics of tipping points, it is easy to see how the intentional application of trimtab sensibility can hope to effect change in even the most intractable problems.

Enterprises dedicated to generating profit tend, over time, to deemphasize the intangible, hard-to-measure quality of “Doing Good” in favor of the precisely quantifiable monetary metric of “maximizing shareholder profit”. On the other hand, charitable enterprises committed to making a difference generally reconcile themselves to living off the scraps capitalist enterprises throw their way, constantly at the mercy of the next fundraiser to continue their vital work.

Networked peer production offers a way to make a real difference without massive financial resources, political clout or business connections. It levels the playing field; the challenge is to use it to make a difference.

Peer production has already been incorporated into profit-driven enterprises like Amazon.com (member book reviews), eBay (peer-rating of sellers) and Epinions.com (product ratings and reviews). Clearly, commons-based peer production has a role complementary to traditional profit-driven activity.

It is also apparent that peer-production, enabled by networks like the Internet, makes possible purely-value-driven, non-market enterprise on a scale that was previously impossible. It makes each of us a potential trimtab.

If not for the power of networked peer-production, Michael Hart, who founded Project Gutenberg in 1971 while still a student at the University of Illinois, might still be laboring in obscurity, keying in whole books, one at a time, much as the monks of old did, transcribing manuscripts before the era of the printing press. He and his handful of helpers likely would not have made significant progress towards the goal of making a rich library of public domain books electronically available to all in his lifetime.

In fact, it took twenty years for the first 1,000 books to be entered. In the following two years, powered by networked peer production, the number of completed Project Gutenberg books doubled to 2,000. In 2001, two years after that, the number doubled to 4,000. The Distributed Proofreaders Project further utilized peer production techniques to accelerate the process beginning that year. By October 2003, the number of digitized books had topped 10,000. Project Gutenberg is now international - PG Europe was launched earlier this year. Michael Hart looks forward to the digitization of the millionth book in 2015. His projection might just be conservative.

Tiny levers, loosely joined: My own value-driven effort, the Public Interest Entertainment Corporation (PIECORP), is a non-profit attempt to move non-market, value-driven enterprise to the next level. Rather than focusing on production, or individual project sustainability, we seek to act as a trimtab, to address the viability of the value-driven enterprise in general. We address these issues through design.

  • First, our own Open Source product, Mars First!, is designed as an environment for experiential learning of civil society, non-violent conflict resolution, critical thinking and science, thus directly addressing a social need.
  • Second, all the tools, technology, platforms, infrastructure, research, designs and content we produce will be released to the public domain. In this way, we hope to make development of virtual worlds, currently a prohibitively expensive and highly proprietary process, available and affordable to all. We will also freely share the lessons we learn and the tools we build to help facilitate constructive, sustainable, democratic community management.
  • Third, via player subscriptions, we will leverage our work to make an even greater difference in the world. First, by becoming rapidly self-supporting, we will sever our final dependence on the market/firm/charity system. Then, we will donate all net revenue to allied, worthwhile purposes: 90% of it to nonprofit organizations working in the fields of civil society, non-violent conflict resolution, democratic empowerment, science education and critical thinking; and a critical 10% of net revenues to support the creation of other non-market open development enterprises that feature a similar revenue-generating, self-reproducing model.
Our ultimate goal is to help jump-start a new, alternative economy, alongside the traditional, profit-driven, market/firm system, that supports, sustains and develops value-driven enterprise in non-market, non-corporate networks. Each value-driven venture may only be a tiny lever in the scheme of things, but, aligned end to end, using open, Internet-based peer production as the fulcrum, we can form a lever big enough to move the world.

After all, if it doesn’t make a difference, what’s the point?

David Galiel is Executive Director of PIECORP, a nonprofit creative studio using popular culture and digital entertainment technologies to promote civil society, non-violent conflict resolution and critical thinking. Please share your thoughts about this essay with him, david at galiel dot com. If you are interested in Mars First! and would like to learn more, visit http://www.piecorp.org/

This article is licensed under a Creative Commons License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/2.0/. For other uses, contact the author.

References:
  Yochai Benkler: http://www.benkler.org/
  Lawrence Lessig: http://lessig.org/
  Brenda Laurel: http://www.tauzero.com/Brenda_Laurel/index.html
  Reed’s Law: http://www.reed.com/Papers/GFN/reedslaw.html
Cited projects:
  Craigslist: http://www.craigslist.org/
  Wikipedia: http://wikipedia.org/
  Open Directory Project: http://dmoz.org/
  Project Gutenberg: http://www.gutenberg.net/
  Distributed Proofreaders: http://www.pgdp.net/
  CivicSpace Labs: http://www.civicspacelabs.org/
  OhmyNews: http://english.ohmynews.com/

TRENDS:

NETWORKED PEER PRODUCTION:

A NEW WAY TO SOLVE OLD PROBLEMS

Andrew Updegrove

Introduction: Traditional, specification-based standards have many virtues. The process whereby they are created is well developed and time tested, and the beneficial results that can be obtained from them are similarly established.

But while such standards can make new products possible, accelerate their market acceptance, increase consumer choices and lower costs, they are not without their weaknesses. Those weaknesses include the difficulty of addressing intellectual property rights (IPR) issues, the vulnerability of the standard setting system to being “gamed” by devious participants, and the launching of rival standards efforts for reasons of proprietary advantage rather than technical merit, among others.

Many of these vulnerabilities arise from the nature of this type of standard, which is well suited for creating interoperability among users. As the number of compliant products grows, so do the benefits to the owners of those products, as the “network effect” takes hold (e.g., the value to the owner of a telephone or a desktop computer increases with the number of users linked together on the same network).

What distinguishes a commonality? A commonality has three essential attributes. It is: (1) whatever tool(s) we need, (2) that we need to agree on, (3) in order to do what we agree needs to be done.

This network effect also enables the creation of many different implementations from one point of origin, each with its own virtues of lowered cost, integration with other features and supporting services. But until a critical mass of vendors actually adopts and builds to the specification, a user is no better off using a standards-based product than any other market offering.

As we have pointed out in the past (see A Look into the Future: Not “Standards” but “Commonalities”), we believe that the traditional concept conveyed by the word “standard” is too narrow to address the needs of a modern, networked world. While specifications serve a vital purpose, they are not optimal for (and in some cases, not even capable of) solving all problems.

Instead, we think that a better term is “commonalities”, a word that we believe is both broader and more conducive to thinking creatively. What distinguishes a commonality? As we have proposed before, a commonality has three essential attributes. It is: (1) whatever tool(s) we need, (2) that we need to agree on, (3) in order to do what we agree needs to be done. Depending on the goal to be achieved, traditional tools that fall within this definition include not only specifications, but also reference implementations, test suites and best practices guides.

But with the emergence of the Internet and the Web, a new type of commonality has become possible: the end product itself, rather than the tools to create or connect products. While this new type of commonality, created through “networked peer production” (NPP), naturally has its own issues to address, it also has distinct virtues that avoid or address many of the problems inherent in traditional specification-based standards. This article will explore some of those differences, as well as the advantages that employing NPP methodology may offer to create new and useful commonalities.

Starting at the end and working back: An increasing number of Web-based projects are being launched that involve networks of people (often in large numbers) working to create something that is typically made available to all without charge. While the most high-profile example is Open Source software, which is made available using the GNU General Public License and other variations on the same theme, there are many other projects that do not involve creating software.

These projects are as varied as they are numerous, ranging from Project Gutenberg (which is making thousands of “out of copyright” books electronically available without charge), to the Creative Commons (which is creating the legal tools that permit authors to make their work product freely available on less than a full copyright basis). New efforts are being launched virtually on a daily basis.

By starting at the end (the product itself) rather than the definition of a product or an interface (the traditional specification approach), many of the issues that arise under traditional standards are rendered moot. Consider the following:
       

  • Lack of Profit Motive: First and foremost, when the goal is implementation without remuneration, many of the normal competitive forces that can subvert traditional standards efforts disappear. These forces include the following:
          
    • Subversion of Quality: In traditional standard setting, some participants have a greater incentive to adopt a particular technical approach than to achieve the best technical result.
           
    • Delay of Uptake: The broad adoption of standards creates losers as well as winners. Those that fear the erosion of existing proprietary advantages often work to delay the launch or thwart the success of a new standard.
           
    • Incentives to Launch Competing Standards: While avoiding standards implementation royalties is often a goal in some industries (e.g., software), the ability to charge patent royalties is tolerated (or even expected) in others, such as the telecommunications space. Where the royalty potential is very great, a “winner take all” competition can result (as in the current battle over the next-generation DVD standard). In such a case, the players vie for licensing income rather than settle for the benefits of smooth and rapid market adoption of the next generation of their products. The results can be slower uptake, higher prices, and (at times) dual standards competing in the marketplace to the detriment of vendors, end-users and intermediaries alike.
           
    • IPR Issues: Because traditional standards efforts are intended to foster volume sales, there are incentives for participants to embed patent claims in standards, and for non-members to assert similar claims that they may own.
           
  • None of these motivations need exist in the case of an NPP. With a single, free implementation, no participant has a reason to work against quality or immediate availability. Similarly, there is no motivation for competition (besides human nature). For example, if a second group of individuals decided that they wished to make copyright-lapsed books available on the Web, they would have no obvious motivation to input books that were already available at the Project Gutenberg site. Instead, they would be more likely to collaborate with Project Gutenberg, offering to take responsibility for a particular genre of literature, or perhaps add commentary or bibliographic material in support of the library. And (except in the case of Open Source software) there is likely to be little potential for patent infringement, and little profit to be gained from asserting a patent even if infringement did exist.
         
  • Independence from the Network Effect: The output of most NPP collaborations is valuable at the moment of creation, because it is an end product. While some work product (such as Open Source software) can increase in value with broad adoption, much of this software is immediately useful even on a single platform. Other work product (such as this website or Project Gutenberg’s output) is not only valuable and usable upon creation, but is totally independent of the efforts of third parties.
         
  • Easy initiation: The nature of the Web makes the unification of kindred spirits with public-spirited goals uniquely possible. Historically, non-profit activities were launched through one-on-one interactions. Necessarily, this limited the pool of potential supporters to those who were local and known – a sufficient constraint to relegate many fine ideas to the status of unfulfilled visions. In contrast, Web-based NPPs can draw on a global reservoir of potentially kindred spirits. In the case of Open Source, venues have already grown up (such as SourceForge) that make gathering a community of interested parties quite easy.
         
  • Low Overhead: While historical standard setting efforts have not been large-budget affairs, they have nonetheless required the investment of time and travel expenses by those that have wished to participate, as well as the funding of the infrastructure needed to host the process, and publish and maintain the standards developed. The result is that they have been primarily commercial endeavors, with some input from academia and government but little end-user involvement. Web-based NPPs, in contrast, require almost no infrastructure at all.

Where do we go from here? Of course, NPP projects are not without their own limitations. Without a profit motive, there is far less market-imposed discipline, and an NPP can be as on-target, or as clueless, as its members lead it to be. In the case of Linux, for example, there are a number of missing features that are of greater interest to end-users than to the volunteer engineers who have contributed to the creation of Linux to date. Similarly, NPP projects can wither as well as flourish, and are dependent upon the visionary leadership of their founders.

But as with the consortia that burst upon the traditional standard setting scene in the late 1980s, the NPP model will evolve. We are still in the early exploratory stage of the NPP concept, and a wide variety of governance and process experiments can be expected to arise before consensus begins to emerge over what practices may best support the success of such an effort.

We believe that the NPP model is one of great interest, and holds profound promise across many areas of non-profit endeavor. We also believe that this methodology is uniquely suited to create useful commonalities that at times will be superior surrogates for traditional specification-based standards. In consequence, we hope that the creation of NPPs and the adoption of their work products will be embraced and supported by the standards community.


Copyright 2004 Andrew Updegrove



FROM THE STANDARDS BLOG:

#20 Predictability and Standards Denial  

What is it about human nature and standards, anyway? Consider two seemingly unrelated benchmarks, and the relationship of the typical American to each of these under-appreciated tools.

The first is what is referred to in common parlance as the “calorie” (in fact, the dietary calorie is a “kilocalorie”, properly so called, and is equal to 1,000 of the “small” calories used as a measurement in fuel research). A calorie (small or large) is an extremely precise measurement: the large, economy-size version used in dietary circles is the amount of energy required to raise the temperature of one kilogram of water, at one atmosphere of pressure (itself a precise standard), by one degree Celsius.

When it comes to thermodynamics, the human body is a machine. It converts fuel into energy, and that energy can be measured in calories. The fuel itself, therefore, can be expressed in terms of the calories of energy it can release in the process of that conversion. Similarly, the body can be analyzed in terms of the amount of energy that is required to keep it running for a given time under stated conditions.

Dieting, therefore, is a pretty simple proposition: If the energy created by the conversion of the food ingested in a day is less than the energy requirements of the body on the same day, weight loss results. Invert the relationship, and weight gain occurs instead. As Ross Perot used to say, “Pretty simple”. Right?

Well, you’d never know that it was this simple today.

Not so long ago (say the 1960s), when you wanted to diet, you bought a little pamphlet that had two tables in it. In the first table, you could look up a given type of food, and find out how many calories were represented by a standard serving size. The second table told you how many calories, on average, a person of a given weight and height needed to ingest to meet 24 hours of energy demand. A deluxe version of the same type of pamphlet might also contain a third table, indicating how many calories a given type of activity (e.g., swimming or walking) might burn up.

All you needed to lose weight was this little pamphlet, based on the humble but precise standard known as the calorie, and a measuring cup. If you wanted to go really wild, you could buy a food scale as well. Oh yes – you also needed something called “willpower”.
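
In fact, the entire pamphlet system reduces to a few lines of arithmetic. Here is a minimal sketch (ours, not from any pamphlet; the figure of roughly 3,500 kilocalories per pound of body fat is the era’s standard rule of thumb, not a number taken from this article):

  # Back-of-the-envelope energy balance, as the old pamphlets implied it.
  KCAL_PER_POUND = 3500  # rough energy content of one pound of body fat

  def weekly_weight_change(intake_per_day, burned_per_day):
      """Estimated pounds gained (+) or lost (-) over one week."""
      return (intake_per_day - burned_per_day) * 7 / KCAL_PER_POUND

  # Eat 500 calories a day less than you burn: about one pound lost per week.
  print(weekly_weight_change(2000, 2500))  # -1.0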

The second benchmark that we’ll examine is the S&P 500 Stock Index. This familiar index aggregates the performance of a representative sample of 500 leading companies in a balanced variety of industries of importance to the U.S. economy. From time to time, companies are added and subtracted to maintain the representative nature of the mix.

The significance of the S&P 500 for current purposes is that it is the index that investment professionals most often reference as the reason that potential customers should come to them to invest their savings. Only by doing so, it is usually said, can an investor provide for adequate retirement savings. Over and over again we are reminded that the 50-year average annual return of the S&P is approximately 11%, and that there is not a single ten-year period in which the S&P did not beat the yield of conservative bond holdings.

So what we can learn from the above is that the only tools anyone really needs to retire in a state of svelte comfort are a simple diet table, a measuring cup, and a no-load S&P index fund. Guaranteed good looks and financial security, all made possible by two simple, well-respected, well-documented standards. Who could ask for more?

Similarly, we also learn that there is no need for either diet books or other types of mutual funds, right?

Let’s take a look and see.

Type the word “diet” into the “books” search field at Amazon.com and you’ll get 106,154 titles to choose from (as of this writing). That’s quite a lot of authors anxious to tell you how to read a calorie chart and wield a measuring cup. But no, calories are passé. The offerings of today’s diet gurus range from pure absurdity to more scientifically based diets that, at best, may make dieting marginally more effective.

How about mutual funds? Well, at the end of 2003, there were an impressive 8,126 mutual funds available to the investor trying to beat the S&P 500. In fact, the majority of these funds rarely beat the humble, mindless S&P, even before taking loads and fees into account.

So why is it that we don’t stick to simple basics, but are always trying to beat the system? Why are we convinced that if we try something new and different, we can eat more and still lose weight, or invest the same amount and make more?

Why isn’t predictability enough?

The answer in part is that dieting and financial investments are huge industries, each of which is intent on pulling more dollars out of our pockets with new gimmickry. One day carbohydrates mean salvation; the next day they’re damnation (while we all grow fatter by the day).

So also with investment vehicles: junk bonds, Internet stocks and hedge funds - each gets its season in the sun, each promising a superior road to riches. Perhaps it was only a matter of time before investment professionals and authors noticed the ability of the fashion industry to drive consumers through hemline hoops like trained poodles.

In fact, the example of diets and mutual funds is symptomatic of much behavior in the commercial standards market as well, albeit for more deliberate and calculating reasons.

Standard setting is rife with competing efforts to set standards that will uniquely advantage the proponents of those standards. Witness this week’s news that a consortium of investors led by Sony has placed the winning offer – approaching US $5 billion – to purchase the MGM film portfolio. It is believed that a leading goal behind Sony’s bid is to gain the upper hand in a standards war. Once Sony takes control of the thousands of video titles now owned by MGM, it will be more likely that the Blu-ray next-generation DVD specification it backs will win out over the competing HD DVD specification being promoted by arch-rivals NEC, Toshiba and Sanyo.

Sony lost a similar standards battle in the same market space in the late 1970s, when its proprietary Betamax video format fell victim to JVC’s competing (and many believe inferior) VHS offering. Still smarting from that debacle, Sony appears willing to spend heavily this time around to avoid a similar defeat.

But not only Sony suffered in the Betamax/VHS wars. Video stores and consumers took a beating as well, since stores needed to stock both formats for years, and Betamax owners ultimately were forced to abandon their expensive, and now obsolete, video players. Due to the confusion, vendors and store owners as a group made less money, consumers spent more, and the market matured more slowly.

Wouldn’t it be better for all if consumer electronics makers had learned a few lessons from the Betamax/VHS battle? The content owners have – they want a single standard. That is why Sony is buying up the MGM catalog: so that it can offer that catalog (if it so chooses) only in its favored Blu-ray format, making it a safer bet for other content owners to tip in the same direction.

So what do DVD formats have to do with diets and mutual funds? Unfortunately, it seems to be human nature not to be satisfied with the predictability that standards can offer, whether we’re a dieter, an investor, or a consumer electronics manufacturer. Instead of playing it safe and sure, we gamble on a more problematic, and sometimes illusory, greater return.

Too many of us are fatter as a result. And too many vendors, as well as investors, are poorer for the same reason.


Copyright 2004 Andrew Updegrove

# # #

Useful Links and Information:

Calorie information: Typical User-Friendly Calorie Table: http://www.bodyfatguide.com/foodcalorie2.html

The source of most commercial calorie and nutrition tables is the USDA Home and Garden publication Number 72, which includes not only calorie information, but data on fat, protein and vitamin content as well: http://www.nal.usda.gov/fnic/foodcomp/Data/HG72/hg72_2002.pdf

Mutual Funds Fact Book summary of mutual fund facts and figures: http://www.financialservicesfacts.org/financial2/securities/mutualfunds/

Standard & Poor’s S&P 500 information page

50 year S&P 500 Performance Table:
http://www.mutualofamerica.com/articles/CapMan/October03/SandP500.htm

Postings are made to the Standards Blog on a regular basis.



THE REST OF THE NEWS

For up to date news every day, bookmark the ConsortiumInfo.org
Standards News Section


Or take advantage of our RSS Feed

Open Source

Is this is or is this ain't "Open Source"? If the old saw is true that "the nice thing about standards is that there are so many of them," then open source licensing regimes must be really terrific. Notwithstanding the many variations on Open Source licensing, experts in the area believe that they, like the Supreme Court when it comes to pornography, "know it when they see it." When it came to the terms upon which Microsoft was willing to make its new Sender ID anti-spam framework available to the industry, Open Source leaders were emphatic that they didn’t like what they saw.

The first article below focuses on the terms that Microsoft proposed, and on the elements of those terms that open source experts view as being unacceptable. One after another, multiple influential Open Source groups not only came out against the Sender ID terms, but also called upon the IETF to amend its policies to preclude what they viewed as unacceptably encumbered material from being incorporated into core Internet infrastructure. Within a matter of days, the IETF agreed. The last article below describes the final indignity: it indicates that Sender ID isn’t even effective in thwarting spam.

Open-Source Community Skeptical About Microsoft's Sender ID License
By: Steven J. Vaughan-Nichols
eWeek, August 26, 2004 -- Microsoft this month is moving forward with the developer implementation of its anti-spam Sender ID framework, but open-source advocates and mail vendors doubt whether the software giant's new proposed license meets open-source requirements. Sender ID has already gained market support. Both ISPs, such as AOL, and mail software and support companies, such as Cloudmark Inc. and Tumbleweed Communications Corp., have announced support for it. Microsoft has also announced that it will start using Sender ID for inbound e-mail to its hotmail.com, msn.com and microsoft.com domains in October. Despite this groundswell of commercial support, Microsoft's licensing requirements are incompatible with many open-source licenses, according to experts. This, in turn, means that Sender ID couldn't be implemented in open-source mail applications. ...Full Story

Apache Software Foundation Rejects Microsoft Patent License Agreement for Sender ID.
The Cover Pages, September 3, 2004 -- An open letter from Apache Software Foundation (ASF) to the IETF MTA Authorization Records in DNS (MARID) Working Group announces the decision of ASF projects not to implement or deploy the IETF Sender ID specification under terms required by Microsoft's Patent License Agreement. The letter from Apache also expresses concern that "no company should be permitted IP rights over core Internet infrastructure" and urges the IETF to "revamp its IPR policies to ensure that the core Internet infrastructure remain unencumbered." …Full Story

Debian refuses to add Microsoft anti-spam technology
By: Matthew Broersma
TechWorld.com, September 6, 2004 -- The Debian operating system project will not implement Microsoft's proposed Sender ID anti-spam specification under the current licensing terms, it has announced, because they are not compatible with open-source licenses. Debian's rejection of Sender ID follows a similar statement from the Apache server project on Thursday and criticism from the maintainers of open-source projects such as Postfix, Exim and Courier. "We are... concerned that no company should be permitted intellectual property rights over core Internet infrastructure," Debian's message said....Full Story

IETF deals Microsoft's e-mail proposal a setback
By: Paul Roberts
InfoWorld, September 14, 2004 -- A proposed technology for identifying the source of e-mail messages suffered a blow last week when a group within the Internet Engineering Task Force (IETF) established to study the proposal sent it back for more work, citing concerns over vague intellectual property claims made by Microsoft Corp. covering some of the technology. Members of the IETF's Mail Transfer Agent Authorization Records in Domain Name System (DNS) working group, also known as MARID, voted last week not to proceed with standards documents for the Sender ID authentication technology that were submitted by Microsoft to the IETF for approval in June. ...Full Story

AOL Dumps Microsoft's Sender ID
By: Wayne Rash
eWeek, September 16, 2004 -- America Online Inc.'s announcement Wednesday that it would abandon its attempts to support Microsoft's Sender ID e-mail authentication standard is a serious setback for the Redmond, Wash., software company. "Given recent concerns expressed by the Internet Engineering Task Force [IETF], coupled with the tepid support for Sender ID in the open-source community, AOL has decided to move forward with SPF-only checking on inbound e-mail at this time," AOL spokesman Nicholas Graham said in a statement. AOL still will provide Sender ID information for outgoing mail so that its users can communicate with e-mail providers using that system, but that will be the limit of support for the standard. ...Full Story

Spammers using sender authentication too, study says
By: Paul Roberts
InfoWorld, August 31, 2004 -- New technology for identifying the sender of e-mail messages has not been widely adopted despite backing from software giant Microsoft Corp. and may not be effective at stopping unsolicited commercial e-mail, otherwise known as spam, according to a survey by e-mail security company CipherTrust Inc. A check of approximately two million e-mail messages sent to CipherTrust customers between May and July showed that only about 5 percent of all incoming messages came from domains that published a valid sender authentication record using Sender Policy Framework (SPF) or a newer standard, backed by Microsoft, called Sender ID. ...Full Story
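
For the curious, the mechanics behind these stories are simple: a domain owner publishes a text record in the DNS, and receiving mail servers compare the connecting sender's address against it. The Python sketch below gives the flavor of that check; the record, the addresses and the helper function are invented for illustration, and a real validator implements the full SPF/Sender ID mechanism set (and actually queries the DNS) rather than hard-coding a single record.

    # Minimal sketch of SPF-style sender checking. The record and IPs are
    # hypothetical; "v=spf1" is the classic SPF version tag, while Sender ID
    # records use "spf2.0/pra".
    import ipaddress

    SPF_RECORD = "v=spf1 ip4:192.0.2.0/24 -all"

    def check_sender(record, sender_ip):
        """Return 'pass' if sender_ip matches an ip4 mechanism, else 'fail'."""
        ip = ipaddress.ip_address(sender_ip)
        for term in record.split()[1:]:          # skip the version tag
            if term.startswith("ip4:"):
                if ip in ipaddress.ip_network(term[4:]):
                    return "pass"
            elif term in ("-all", "~all"):       # default: reject or soft-fail
                return "fail"
        return "neutral"

    print(check_sender(SPF_RECORD, "192.0.2.55"))   # pass
    print(check_sender(SPF_RECORD, "203.0.113.9"))  # fail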

And then there’s the other way: While Microsoft was finding out how not to win over the open source community, IBM was having better success. The Apache Software Foundation, which spurned Sender ID, was happy to be a co-recipient of speech recognition software from the computer giant (IBM’s consortium spin-off, the Eclipse Foundation, was the other beneficiary).

IBM Contributes XML-Based Speech Software to Apache and Eclipse Open Source Projects.
The Cover Pages, September 14, 2004 -- At the SpeechTEK 2004 Conference IBM announced a major contribution of software to open source initiatives at the Apache Software Foundation and the Eclipse Foundation. The new software projects are intended to "spur the availability of speech-enabled applications by making it easier and more attractive for developers to build and add speech recognition capability in a standardized way. Supported by more than 20 key industry players from speech vendors to platform providers, the initiative is aimed at ending the battles over competing, proprietary specifications." An Eclipse Voice Tools Project will "focus on Voice Application tools in the JSP/J2EE space, based on W3C standards, so that these standards become dominant in voice application development." ...Full Story

Standards and Your Business

Standards for the rest of us: Participating in standards development and adopting standards early are usually the province of large companies that have the resources to dedicate to such efforts. The result is that such companies reap disproportionate benefits from influence and early adoption. RosettaNet has decided to promote more rapid adoption of its work product by taking its offerings straight to the little guy, thereby doing well by doing good.

RosettaNet takes standards push to small firms
By: Winston Chai
CNET News.com, Singapore, September 17, 2004 -- RosettaNet, a consortium pushing to establish a universal e-business language, is hoping to broaden its appeal to smaller companies by slashing the costs of standards adoption. The consortium is looking for ways to make it easier and cheaper for businesses to automate the exchange of data, including information about purchase and delivery orders, inventory levels and other business matters. This effort will be spearheaded by RosettaNet's first architectural design and research facility outside the United States, which was officially unveiled here this week. ...Full Story

Anticipating a meta communications bill: The Holy Grail of consumer and business communications being held out to us today is VoIP, and the promise of satisfying all of our telecom needs over a single fat pipe. Pressed by vanishing margins and upstart competitors, the majors are getting on the bandwagon, even if the technology is not yet as reliable as one might hope. The following article describes a new protocol that promises to bring us closer to the day when we can pay just one bill a month for all of our electronic needs (and have it all work, as well).

When Do We Start Slurping SIP?
By: Beth Cohen
Wi-Fi Planet.com, August 31, 2004 -- Imagine if your Voice over IP (VoIP) phone administration was as easy as using the Web. No more dropped connections, insecure sessions, lack of integration, or dependence on one vendor for systems. With Session Initiation Protocol (SIP), the long-awaited promise of unified messaging may finally come true. No, SIP is not the latest in silly soft drinks; it is the latest emerging standard to address how to combine data, voice and mobility into one neat package. With its simple and integrated approach to session creation, SIP has the potential to transform how companies do business. ...Full Story
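
Part of SIP's appeal is that, like HTTP, it is a plain-text protocol that is easy to read and debug. The short Python snippet below does nothing more than assemble and print the general shape of a SIP INVITE request (modeled on the example in RFC 3261); all of the addresses, tags and branch values are made up.

    # Illustrative only: the text-based shape of a SIP INVITE (RFC 3261).
    # Every address, tag, and branch value below is invented.
    invite = "\r\n".join([
        "INVITE sip:bob@example.com SIP/2.0",
        "Via: SIP/2.0/UDP alicepc.example.org;branch=z9hG4bK776asdhds",
        "Max-Forwards: 70",
        "To: Bob <sip:bob@example.com>",
        "From: Alice <sip:alice@example.org>;tag=1928301774",
        "Call-ID: a84b4c76e66710@alicepc.example.org",
        "CSeq: 314159 INVITE",
        "Contact: <sip:alice@alicepc.example.org>",
        "Content-Type: application/sdp",
        "Content-Length: 0",
        "",  # blank line ends the headers
        "",
    ])
    print(invite)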

Yes, that’s billions: From time to time, NIST publishes studies underlining the economic impact of standards – and of their absence. These studies highlight the vast disparity between the extremely low cost of creating standards – which, after all, are developed through volunteer, consensus-based processes – and the savings that can flow from well-developed, widely deployed specifications. Here’s another in the series, with an eye-popping number relating solely to software interoperability shortfalls in a single industry.

Software Difficulties Cost Builders Billions
NIST Tech Beat, August 30, 2004 -- Inadequate software interoperability in the capital facilities industry cost the commercial, institutional and industrial building sectors $15.8 billion in 2002 in lost efficiency, according to a newly released study commissioned by the National Institute of Standards and Technology (NIST). Conducted by RTI International (Research Triangle Park, N.C.) and the Logistics Management Institute (McLean, Va.), the report places a price tag on avoidance, mitigation and delay activities due to data-exchange problems. It also takes into account the cost of redundant paper management. The analysis, expected to benefit key stakeholders throughout the construction industry, breaks down data exchange-related losses for architects and engineers, general contractors, specialty fabricators and suppliers, and owners and operators at three different stages of a building's life: ...Full Story

Download the full report: http://www.bfrl.nist.gov/oae/publications/gcrs/04867.pdf

Who’s Doing What to Whom

What’s a few billion dollars between enemies? Few standards battles have been as fierce in recent years as the conflict that has been raging in the video player market. The carnage focuses on the next generation DVD standard, and one camp (led by Sony) has been locked in mortal combat with the other (led by NEC and Toshiba), like Godzilla and Mothra in a Japanese monster movie. In the latest development, a consortium led by Sony has bid almost $5 billion to win the auction for the MGM studio and its film library, in part because control over this large block of content may help turn the standards tide in favor of the Blu-ray format that Sony backs.

With MGM, Sony Gains in Fight for New DVD Standard
By: Ken Belson and Andrew Ross Sorkin
TechNewsWorld.com, September 19, 2004 -- The purchase of Metro-Goldwyn-Mayer by a group led by Sony (NYSE: SNE) will not only give the company an enormous film library but also considerable power in its fight to set the format for the next generation of digital video discs. The transition to the new discs, which are not expected to be widely available until next year at the earliest, could generate billions of dollars in royalties to the developers of the technology that runs them. Sony, as part of the Blu-ray Disc Association, a consortium of major electronics makers, is at the forefront of efforts to develop the new technological standard. As a major consumer electronics company, Sony could also reap the benefits of selling the new generation of disc players the new format would require. Sony's success in the standards battle is far from certain, though, because the rival HD DVD group, led by Toshiba and NEC (Nasdaq: NIPNY), is championing its own format. ...Full Story

Web Services Update

Business as usual: It’s been a typically busy month in the Web Services standards neighborhood. The following selection of news items includes a variety of stories that, together, provide a good picture of the various currents at work in this dynamic area, including: the release of two more draft specifications by traditional partners BEA Systems and Microsoft and a supporting cast of other companies, independent of any standards body; the submission of another specification to the W3C by the same two partners and a different group of companies; and the release of a bouquet of specifications and schemas by traditional consortia OASIS and the W3C, all developed through the traditional process.

WS-Enumeration and WS-Transfer Published as Web Services Messaging Specifications.
Cover Pages, September 17, 2004 -- Two new Web Services messaging specifications have been published under terms of co-development and joint authorship by BEA Systems, Computer Associates, Microsoft, Sonic Software, and Systinet. The documents have been released as-is, for review and evaluation only, with no further warranties or representations. Web Service Enumeration (WS-Enumeration) "describes a general SOAP-based protocol for enumerating a sequence of XML elements that is suitable for traversing logs, message queues, or other linear information models. It brings enumeration capabilities to the WS-* suite of specifications, enabling an application to ask for items from a list of data that is held by a Web service. In this way, WS-Enumeration is useful for reading event logs, message queues, or other data collections." ...Full Story

OASIS WSRM TC Releases Web Services Reliable Messaging (WS-Reliability) Version 1.1.
The Cover Pages, September 10, 2004 -- The OASIS Web Services Reliable Messaging Technical Committee has published a milestone version of its Web Services Reliable Messaging (WS-Reliability) specification, including a prose document and four supporting XML schemas. WS-Reliability is a "SOAP-based specification that fulfills reliable messaging requirements critical to some applications of Web Services. It is needed because SOAP over HTTP is not sufficient when an application-level messaging protocol must also guarantee some level of reliability and security." Reliable Messaging in this context refers to "the act of processing the set of transport-agnostic SOAP Features defined by WS-Reliability, which results in a protocol supporting quality of service features such as guaranteed delivery, duplicate message elimination, and message ordering." ...Full Story

Submission of WS-Addressing to W3C
The Cover Pages, August 27, 2004 -- The big news in the standards world this month was the highly anticipated submission of WS-Addressing to the W3C by BEA, IBM, Microsoft, SAP, and Sun. Furthermore, the specification was submitted in accordance with the W3C's strict intellectual property (IP) policy, turning over all copyrights to the W3C and explicitly waiving potential patent royalties and licensing fees. WS-Addressing is a critical specification for the current generation of extended WS-* specifications, since so many of them depend upon an addressing solution. WS-Addressing is also required for all but the simplest message exchange patterns (MEPs), defining a standard format for routing, reply, and error messages. MEPs such as those needed for publish/subscribe, event notification, and long-running business processes need a standard for addressing. WS-Addressing provides mechanisms for specifying and correlating a message reply, and for defining a fault address. ...Full Story
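
To make the idea concrete, the sketch below uses Python's standard library to build the sort of SOAP header blocks that WS-Addressing defines: a destination, an action, a message identifier for reply correlation, and an endpoint reference telling the recipient where to route the reply. The endpoint URIs are hypothetical; the namespace shown is the one used in the 2004 member submission.

    # A sketch of WS-Addressing-style SOAP header blocks, built with the
    # standard library. All endpoint URIs below are invented.
    import xml.etree.ElementTree as ET

    WSA = "http://schemas.xmlsoap.org/ws/2004/08/addressing"
    ET.register_namespace("wsa", WSA)

    header = ET.Element("Header")
    for tag, text in [
        ("To", "http://example.com/purchasing"),        # destination endpoint
        ("Action", "http://example.com/SubmitPO"),      # intent of the message
        ("MessageID", "uuid:6b29fc40-ca47-1067-b31d"),  # for reply correlation
    ]:
        ET.SubElement(header, "{%s}%s" % (WSA, tag)).text = text

    # ReplyTo is an endpoint reference: where the response should be routed.
    reply_to = ET.SubElement(header, "{%s}ReplyTo" % WSA)
    ET.SubElement(reply_to, "{%s}Address" % WSA).text = "http://example.org/replies"

    print(ET.tostring(header, encoding="unicode"))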

World Wide Web Consortium Issues SSML 1.0 as a W3C Recommendation
W3C.org, September 8, 2004 -- Strengthening the voice of the Web, the World Wide Web Consortium (W3C) has published the Speech Synthesis Markup Language (SSML) 1.0 as a W3C Recommendation. SSML 1.0, a fundamental specification in the W3C Speech Interface Framework, elevates the role of high-quality synthesized speech in Web interactions. Application designers for mobile phones, personal digital assistants (PDAs), and a host of emerging technologies use SSML 1.0 to achieve both coarse- and fine-grain control of important aspects of speech synthesis, including pronunciation, volume, and pitch. Like its companion W3C Recommendations VoiceXML 2.0 and Speech Recognition Grammar Specification (SRGS) published by the W3C Voice Browser Working Group, SSML 1.0 is built for integration with other Web technologies and to promote interoperability across different synthesis-capable platforms. ...Full Story
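
For readers who have not seen SSML, the short Python script below simply prints a minimal (and invented) document illustrating the kind of prosody and pronunciation control the Recommendation provides; the elements and attributes come from SSML 1.0, while the announcement itself is made up.

    # A minimal SSML 1.0 document held in a string; the flight announcement
    # is invented, but <prosody> and <say-as> are defined by the spec.
    ssml = """<?xml version="1.0"?>
    <speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis"
           xml:lang="en-US">
      Your flight departs at
      <prosody volume="loud" pitch="high">nine thirty</prosody> from gate
      <say-as interpret-as="characters">B</say-as>12.
    </speak>
    """
    print(ssml)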

Web Services are growing up: While the IT standards world has often been known in the past for efforts that fizzled, it is becoming increasingly common for initiatives that meet with early skepticism to move rapidly toward commercial reality. RFID is one example that we have been following where this has proven true, and Web services is another. The following article reviews the recent announcement by the Web Services Interoperability Organization that it has progressed three key profiles to "Final Material" status (in WS-I lingo, that means "We're done - go to it"). Analysts complimented WS-I for addressing concerns that may have kept some vendors from getting on the bus.

WS-I Rearchitects Basic Profile
By: Darryl K. Taft
eWeek.com, August 24, 2004 -- The Web Services Interoperability Organization, or WS-I, announced Tuesday the publication of its Basic Profile 1.1, Attachments Profile 1.0 and Simple SOAP Binding Profile 1.0 to Final Material status….The new WS-I profiles show that "Web services are finally growing up," said Ronald Schmelzer, an analyst with ZapThink LLC, of Waltham, Mass. "We have seen that companies have seemed to be hesitant to implement Web services and SOAs [service-oriented architectures] on a widespread basis until some of the major roadblocks, such as standards definition, have been cleared out of the way. Also, they are looking for signs of adoption by their customers, partners and software vendors. Now that the WS-I has taken the final step with their Basic Profile, they have eliminated one of the potential stumbling blocks, namely that of standards convergence."… Full Story

XML Update

XML Everywhere: About the only standards topic to rival Web services and wireless for level of activity is the equally widespread and inexorable, but lower-profile, march of new XML schemas out of a myriad of venues. This month’s crop of news includes notice not only of a new XML standard to manage human resources data, but also of an article that asks whether there’s a limit to how far the trend should go.

The HR-XML Consortium Approves HR Metrics Data Interchange Standard
PR Newswire, Raleigh, N.C., August 31, 2004 -- A new standard designed to improve the management of HR metrics data has been approved by the more than 100 member organizations of the HR-XML Consortium. The standard provides an important tool for employers and HR solution providers wanting to access and integrate HR performance data from different computer systems. The HR-XML Consortium's Metrics Interchange Specification consists of a simple yet flexible XML schema capable of supporting a wide variety of integration scenarios. ...Full Story

XML: Too much of a good thing?
By: David Becker
ZDNet.com, September 7, 2004 -- It's hard to find an industry or interest that isn't taking advantage of the fast-growing standard for Web services and data exchange. In the six years since the main XML specification was first published, it's spawned hundreds of dialects, or schemas, benefiting everyone from butchers to bulldozer operators wishing to easily exchange information electronically. While some industry observers worry proliferation has gone too far, potentially creating new instances of the interoperability problems that XML was meant to solve, proponents say the explosion of schemas is a testament to the format's success. ...Full Story

Miscellaneous

Who’s doing what?? The most surprising news of the month was the announcement by Ericsson that it would no longer be supporting the Bluetooth wireless standard, which it had launched and nurtured for years. Despite the fact that, after many years of hard slogging to windward, the standard is making steady inroads in various areas, Ericsson has decided to move on. According to the following article, the decision was motivated by a desire to redirect efforts to higher-margin opportunities.

Ericsson ditches Bluetooth
The Register, September 6, 2004 -- Ericsson’s decision to pull the plug on its Bluetooth design and manufacturing activities does not sound a death knell for the short range wireless technology, but it does show that the standard has reached maturity – with no obviously viable next generation. This means that innovators like Ericsson will turn to other technologies with greater market potential and Bluetooth, within a few years, will be confined to a few niches. Ericsson spun off its Bluetooth group, Technology Licensing, which invented the technology, in 2000 ...Full Story

New frontiers in standard setting: Notwithstanding the explosion of consortia in the ICT space, other disciplines have been slow to get on the bandwagon. Inevitably, however, all other aspects of life and commerce – from academia, education and health sciences to government – have become more and more dependent on IT infrastructure. Perhaps as a result, the concept of consensus-based standards, developed within a consortium structure, is beginning to find traction in these other disciplines. The following article focuses on one such extension, describing how the world of biology is exploring how traditional standards processes may be useful in the bioinformatics arena.

Committee Aims to Develop Bioinformatics Standards
By: M.L. Baker
eWeek, August 20, 2004 -- At a panel discussion on the last day of the IEEE-sponsored Computational Systems Biology Conference at Stanford University, researchers concluded that the standards-making process is painful and arduous, but many clearly felt that a standards project could foster efficiency and perhaps even reverse the rapid fragmentation of life sciences. Part of the purpose of the meeting was to gauge whether the relatively new field was ready for standards. Standards are already being developed, in an ad hoc fashion, within various pockets of the life sciences community. There is already considerable overlap in standards being codified by societies such as W3C, IUPAC, and I3C; part of the committee's job will be to figure out what all the other groups are doing. The issues are sometimes as fundamental as developing a common language, according to Sylvia Spengler. For example, the same protein or gene could have different names in different communities, such as those that think about pathways, gene products, gene expression or gene sequences. ...Full Story

Intellectual Property

That’s some thicket you got there: Those who follow intellectual property matters are doubtless familiar with the phrase “patent thicket,” which graphically describes the situation in which a host of owners holds a dense undergrowth of interlocking patents. Such thickets make implementing a technology not only difficult, but at times prohibitively expensive for those that own no patents of their own to offer in the cross-licensing arrangements that lower or eliminate the actual amount of money changing hands. The article below shows not only how dense such thickets can be, but how large the forest itself can become: it marks the fact that the ETSI database of patents identified by their owners as essential, or potentially essential, to ETSI standards has passed the 12,500 mark.

ETSI's IPR On-line Database exceeds 12,500 entries
ETSI Press Release, Sophia-Antipolis, France, September 1, 2004 -- 10 years after the adoption of the ETSI Intellectual Property Rights (IPR) Policy, the ETSI IPR on-line database now displays more than 12,500 entries reflecting how the Institute is taking the pulse of innovation, as embraced by IPRs stemming from the research and development efforts of the Information & Communication Technology (ICT) industry. The ETSI IPR on-line database contains IPRs, especially patents and patent applications, which have been notified to ETSI as being essential, or potentially essential, to ETSI standards. Although the database relates only to those Information Statements and Licensing Declarations actually received by ETSI, and is therefore not necessarily exhaustive, there is no doubt that there is a lot of interest in it as the value of IPRs is increasingly apparent in the ICT domain. ...Full Story

Standards and Society

Now even your appliances will be talking behind your back: Much has been made of the coming “digital home,” but usually the focus has been on the owner commanding the appliances, rather than on the appliances talking amongst themselves. The following articles focus on two component technologies of the digital home, now falling into place, that partake of the latter. The first reports on a new standard that will allow devices to “talk” to each other, while the second examines another standard that will help them decide what to say. The suite of such standards surrounding the digital home will grow as more and more wirelessly enabled devices take their place all around us. (We’ll probably be happier not knowing what they’re saying.)

New DECT standard makes "machines" talk to users
ETSI Press Release, Sophia-Antipolis, France, September 6, 2004 -- ETSI has published an important new Digital Enhanced Cordless Telecommunications (DECT™) standard. The DECT Open Data Access Profile (ODAP) specification (TS 102 342) builds upon the tremendous success of the DECT GAP (Generic Access Profile) standard (EN 300 444), which has been implemented in 99% of the DECT products on the market today. ODAP aims at providing the means to include in the communication process all kinds of "things" that surround us at home and at work. ODAP allows the creation of an accessories market for alarms, sensors and similar devices, which can be connected through a DECT base station to users and/or servers in either a home or industrial environment. This enables home applications such as automatic voice calling or messaging when a fire or smoke alarm goes off, as well as remote control for home appliances...and this is just a start. ...Full Story

Wireless sensor networks looking to Zigbee Alliance
By: Tom Krazit
InfoWorld, August 18, 2004 -- Imagine a golf course that can sense rainfall, and adjust the automatic sprinkler system to delay a scheduled watering session or focus on parts of the course that didn't get as much rain as others. Or a hotel that can detect when a room is vacant, and turn off the heating or cooling systems in that room to save energy. Later this year, vendors will start releasing products based on a wireless standard called Zigbee that enable these types of sensor networks. The Zigbee Alliance plans to certify products with a Zigbee logo to ensure that products from different vendors are interoperable and easy to manage, said Bob Heile, chairman of the Zigbee Alliance, at a briefing for reporters and analysts at the Zigbee Alliance Member Meeting in Boston Wednesday. ...Full Story

It had to happen (bummer): What do you get when you add a better screen and faster data transfer to your kid’s web-enabled cellphone? A pocket pornography browsing device, of course. While the technical challenges of filtering are considerable, governments in multiple countries are passing laws to require cellphone operators to filter what is being pushed across their networks to under-age mobile phone users.

Wireless: Dial-up pornography spurs search for filters
By: Jennifer L. Schenker
IHT, Paris, August 23, 2004 -- Now that cellphones are offering Web access on a par with services offered via personal computers, the mobile phone sector is grappling with the same thorny question faced by Internet providers in the mid-1990s: how to safeguard children while protecting civil liberties. Governments in Japan, Germany, Australia and Taiwan are proposing or passing legislation that requires mobile operators to protect minors from pornographic or violent content on phones and to put controls on cellular chat and dating services. Cellphone operators in Britain have voluntarily adopted a code of conduct and agreed to implement filtering systems by year-end. ...Full Story

Right. Now give me that one more time? When you’re all alone, do you admit to yourself that you still don’t have the foggiest notion what the “Semantic Web” really is? Here’s your chance to gain enlightenment, gather ammunition to push for your favorite candidate in an election year, and learn about Web services, all at the same time. The following article offers a chance to watch the author create a real-life example of the Semantic Web in action, using the United States government as both the source of data and the object of the exercise.

Screenscraping the Senate
By: Paul Ford
XML.com, September 1, 2004 -- The United States government and the Semantic Web are a perfect match: imagine all of those senators and representatives, each query-able by age, party affiliation, bills proposed, committee membership, and voting record. For the last few years, I've wanted to collect as much data on the U.S. government as I could, convert it to RDF, and build a site and a web service that make it possible to explore that data. This will be my goal over the next year, and I'll document my progress here on XML.com. I am aware that I am reinventing the wheel with this project. Several other sites attempt to map the government, most notably the Open Government Information Awareness project. ...Full Story
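
As a taste of what the author proposes, the toy Python script below turns a scraped record into RDF serialized as N-Triples, the simplest of the RDF syntaxes. The senator, the base URI and the vocabulary are all invented for illustration; a real effort would use established vocabularies and, of course, real data.

    # Toy version of the article's plan: scraped records out, N-Triples in.
    # All URIs and the sample senator below are made up.
    senators = [
        {"id": "smith", "name": "Jane Smith", "party": "Independent", "state": "ME"},
    ]

    BASE = "http://example.org/us-gov/senator/"
    VOCAB = "http://example.org/us-gov/vocab#"

    def to_ntriples(rec):
        """Emit one N-Triples line per property of the record."""
        subject = "<%s%s>" % (BASE, rec["id"])
        lines = []
        for prop in ("name", "party", "state"):
            lines.append('%s <%s%s> "%s" .' % (subject, VOCAB, prop, rec[prop]))
        return "\n".join(lines)

    for s in senators:
        print(to_ntriples(s))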

New Initiatives

Let's all get small: As the wireless telephone becomes the new killer platform, a host of traditional applications, services and enabling technology must adapt to the unique characteristics of mobile devices: small screens, slow data transfer speeds (for now), limited battery power (until fuel cells become readily available), and fewer and more cumbersome controls. But the market is huge, innovation is proceeding at a breakneck pace, and those with the most skin in the game are scrambling to form the alliances and launch the initiatives that will be needed to provide users what they want - and seem eager to pay for. The following article, which focuses on Java adaptation, is only one of this month’s developments in this area.

Nokia, Vodafone Push for Mobile Java Standards
By: John Blau
InfoWorld, August 26, 2004 -- The world's largest mobile phone manufacturer and Europe's largest wireless operator have launched an initiative to simplify Java standards for mobile devices in a move aimed at helping developers create software for multiple Java-enabled devices, thus providing users with a wider choice of Java-based applications. Under the initiative, Nokia Corp. and Vodafone Group PLC hope to drive the development of specifications for an open standards-based mobile Java services architecture. The group intends to establish a number of new component Java Specification Requests and clarifications to existing specifications in a move to define a consistent Java API services architecture, according to the companies. This unified services architecture, they said, will enable Java-based applications to run on mobile devices from multiple vendors. The objectives and responsibilities of the unified mobile Java services architecture will be aligned and coordinated with several organizations, such as the OSGi Alliance, the Open Mobile Alliance, the Open Mobile Terminal Platform and the World Wide Web Consortium. ...Full Story

Nokia press release: http://press.nokia.com/PR/200408/958311_5.html

Story Updates

Y’er out! In the first several innings, tiny Eolas scored impressively against Microsoft in their patent battle. But later in the game, Microsoft came back strong. Since our last issue, Eolas has struck out for the second time in a row, this time before the U.S. Patent and Trademark Office. All will be watching to see what happens in its third, and possibly last, at bat.

Microsoft Wins Again in Eolas Patent Dispute
By: Paul Festa
CNET News.com, August 18, 2004 -- The U.S. Patent and Trademark Office has handed Microsoft a second victory in its dispute with Eolas, rejecting browser patent claims that could roil the Web if upheld. The patent in question, owned by the University of California and licensed exclusively to its Eolas software spinoff, describes the way a Web browser opens third-party applications, or "plug-ins," within the browser. The decision is a big victory for the software giant and another setback for Eolas, which claimed the rights to the way browsers open third-party applications. Eolas has at least one more opportunity to argue its case. Patents and copyrights have been taking on a higher profile in the software industry in recent months. The issue is especially contentious in the open-source arena, where the Linux operating system has become embroiled in a number of intellectual-property disputes. ...Full Story

If it’s good enough for Tom Ridge… In our July issue of this year (Open Source - Coming of Age) we looked into what the open source process still needs to accomplish before its products become ubiquitous. While there are some areas where real issues exist (e.g., adequate support for commercial users and sufficiently complete feature sets for given usages), part of the problem is still perception. Telecom carriers, which have to date shied away from Linux, might find the first article below heartening: it focuses on the increasingly wide government usage of an Emergency Response Network system that relies on four well-tested open source elements: Linux, Apache, MySQL and PHP. The second article shows how open source and traditional standards must work together to achieve solutions: when the nation's emergency broadcast system was launched over 50 years ago, radios were the only delivery devices that most people owned and listened to. Today, more people are spending more time online, and accessing audio and video by that means as well. The result? They're spending less time in front of the tube and listening to the radio. The solution includes CAP - the Common Alerting Protocol, an OASIS standard.

LAMP is at the heart of Emergency Response Network Systems and is saving lives.
Linux Journal, August 20, 2004 -- Like many government contractors, the provider of ERN (Emergency Response Network) Systems maintains a low profile. When you ask the CEO, Jo Balderas, for references she politely says, "the Federal Bureau of Investigation, the Department of Public Safety and the Department of Homeland Security". That's quite an impressive list, and it represents only a few of the company's clients. When you ask for a technology snapshot Jo says, "currently we use an enterprise open-source software stack known as LAMP (Linux, Apache, MySQL and PHP). We also use an appliance to support rapid deployment and to minimize total cost of ownership. Our roadmap has us integrating the OASIS Common Alerting Protocol (CAP) version 1.0 and Justice XML standards within six months." ...Full Story

FCC: Alert system so last century
By: Dibya Sarkar
www.fcw.com, Aug. 23, 2004 -- Federal Communications Commission officials said they intend to correct deficiencies in the nation's emergency warning capability, with the support of industry and nonprofit groups whose leaders say they have sought the FCC's attention on the matter for several years. Critics of the Emergency Alert System said the commissioners' Aug. 12 notice in the Federal Register of proposed rulemaking is the first significant federal step in years to create an effective nationwide system for warning the public of emergencies…. Lucia said one technical solution under consideration is the Common Alerting Protocol (CAP), a nonproprietary data interchange format that can simultaneously transmit emergency alerts through different communication networks. The Organization for the Advancement of Structured Information Standards, an international standards body, has adopted CAP as a standard….Full Story
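
CAP itself is a compact XML envelope, which is what lets a single alert feed radio, television, web and wireless channels alike. The Python snippet below prints a stripped-down example of what such an alert looks like; the element names follow the CAP specification, but the values are invented, and the required namespace declaration and several mandatory fields are omitted here for brevity.

    # A stripped-down sketch of a CAP alert. Element names follow the CAP
    # spec; all values are invented, and required fields are omitted.
    cap_alert = """<alert>
      <identifier>EXAMPLE-2004-001</identifier>
      <sender>alerts@example.gov</sender>
      <sent>2004-09-20T14:30:00-05:00</sent>
      <status>Exercise</status>
      <msgType>Alert</msgType>
      <scope>Public</scope>
      <info>
        <event>Flash Flood Warning</event>
        <urgency>Immediate</urgency>
        <severity>Severe</severity>
        <certainty>Likely</certainty>
      </info>
    </alert>
    """
    print(cap_alert)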

Standards are Serious (Right?)

Show me your cards: If you thought that games weren’t serious, take a trip to Las Vegas sometime and imagine the cash flow. Not surprisingly, casino owners play their business cards pretty close to the chest, and it has apparently taken a long time for open standards to make their way into the back room. But now, according to the following article, you don’t have to be a high roller any more to be able to play at the standards table.

New Policy Allows Immediate, Widespread Adoption of Groundbreaking Standards
PR Newswire, Las Vegas, August 26, 2004 -- The Gaming Standards Association (GSA) has announced a new policy that allows non-GSA members to license GSA's groundbreaking standards. The policy takes effect immediately and opens the door for virtually instant, global adoption of GSA's protocol standards. The policy applies to all GSA standards, including the Best of Breed (BOB) and the System-to-System (S2S) standards. Previously, GSA's standards were available only to members, including supporting members. The new policy allows non-member companies to license any or all of GSA's standards. GSA Board President Gregg Solomon said, "This is a landmark decision in the history of the gaming industry. What this means is, starting right now, companies around the world can license and begin implementing protocol standards immediately. That is to say, the entire industry has now moved on past the old days of expending resources and development on company-specific protocols to a new era, where creativity in product development can flourish, moved forward by global standards."