ConsortiumInfo.org Consortium Standards Bulletin - October 2005

 

OCTOBER 2005
Vol IV, No. 10

Standards for a Small Planet

EDITOR'S NOTE: GOVERNMENTS, SSOS AND SOCIETY (PART III)
In this issue we continue our examination of the relationship between the public and private sectors, focusing this time on the future.
   
EDITORIAL: THE SMALL BLUE SPHERE
At key points in history, governments have played a vital role in accelerating adoption of crucial standards by the private sector.  One such instance was the standardization of railway gauges in the 19th century.  Today is another, as governments around the world advance the cause of open standards and open source software.
   
FEATURE ARTICLE:

STANDARDS FOR A SMALL PLANET

Standards have traditionally addressed discrete, immutable problems that derive from the laws of physics and nature. The regulations and standards that will be needed to address complex environmental issues will face a greater challenge: how to achieve desired results when there are not only multiple variables, but we do not even know what all of the variables are.
   
TRENDS: OPENDOCUMENT VS. OFFICE: THE LINES ARE DRAWN
It’s a rare event when competitors have a chance to break a monopoly, and with the adoption of OpenDocument by Massachusetts, many of the largest IT companies in the world are pressing their advantage for all they are worth.  We present a timeline of what has happened since our last issue.
   
STANDARDS BLOG: TELL THE GOVERNMENT WHAT YOU THINK ABOUT THE FUTURE OF THE INTERNET
The WSIS process is building towards a climactic meeting in Tunis in November, with Internet governance as the most contentious issue, and the U.S. standing alone in its position that it should retain control of the Internet root directory. A little-noticed posting at the State Department's informational Website tells you how you can tell the U.S. Ambassador what you think.
   
CONSIDER THIS: DO STANDARDS MATTER?

Of course they do. Here are three reasons why that you've probably never thought about before.

 
FEATURED MEETING: STANDARDIZATION: UNIFIER OR DIVIDER?

The latest in the series of Standards Edge conferences will be held in Vancouver, British Columbia on December 5 – 7. It will focus on the standards-related forces that unify and divide nations, industries and people, and what can be done to tip the balance to unification.

   
NEWS SHORTS:  
Washington Disses WSIS; the Semantic Web Picks up Steam; EWC Forks IEEE Standard; Congress Forgets About Wireless Standards; Rambus Seeks More Documents; Get Ready for Two DVD Formats; Multiple OpenDocument Format Compliant Products are Released; Scientigo Says it Holds Patents on XML; and, as always, much, much more.
 

 




EDITOR'S NOTE:

GOVERNMENTS, SSOS AND SOCIETY (PART III)

This issue is the third in a series exploring the evolving relationship between the private and public sectors, as they confront the increasingly intractable challenges of a modern world. 

This series began in August, with an examination of the poor coordination that too often characterizes the standard setting partnership between the government and the private sector in the United States.  That issue included a call to optimize that relationship to national advantage.  The September issue focused on a current drama unfolding in the Massachusetts state government, involving both open standards and open source software.  The current experience of the Commonwealth highlights the importance of these tools in the modern world, as well as the significant role that governments can play in accelerating their adoption by society at large.

In this issue, I look to the future, and ask whether we will be able to meet new challenges (such as global warming) that are of a type and magnitude that our existing political systems seem inadequate to address, and which will require a variety of standards solutions.

This examination begins in the Editorial, which contrasts the current need for universal cooperation to solve global issues with the slow progress achieved to date towards that goal, and asks whether the consensus-based standard setting process might provide a better model for the future than historical treaty negotiations.

This month’s Feature Article looks far into the future, suggesting what a long-term approach to solving global environmental issues might resemble, as we become better able to monitor the effects of regulations, and to utilize the resulting data to refine restrictions directed at preserving the environment and enabling sustainable development.  The result would be the need for dynamic rather than static standards, capable of adjusting to new information as it becomes available.

The Trends piece this month looks back to the September issue, and provides a timeline of the ongoing developments in the OpenDocument saga.  As these events demonstrate, some of the largest technology companies in the world are continuing to press their standards-based advantage in an effort to break Microsoft’s dominance in office productivity software, providing a unique opportunity to observe how a single standard can profoundly affect the marketplace.

My Standards Blog selection for this month looks in on the status of a different global governance story relating to standards: the World Summit on the Information Society, which will conclude its originally contemplated work at a meeting in Tunis in November.  Going into that meeting, the United States stands alone regarding the control of the root directory of the Internet, and in this post, you can learn how to log on and tell the U.S. Ambassador to the process who you think should "govern the Internet."

And finally, this month's Consider This tries to answer the question of "why standards matter" in the greater context of the societal concerns that have been the subject of this series of issues.

Next month, we will conclude this series on the intersection of standards, government and society with an in-depth review of what happens at Tunis in the final WSIS meetings, and what the decisions made there may portend for the future.

As always, I hope you enjoy this issue. 

    Best Regards,
  Andrew Updegrove
  Editor and Publisher
   

2005 ANSI President’s
Award for Journalism



EDITORIAL

THE SMALL BLUE SPHERE

Andrew Updegrove


The Earth was small, light blue, and so touchingly alone, our home that must be defended like a holy relic.

Soviet Cosmonaut Aleksei Leonov
 
We are as gods, and might as well get good at it.
Purpose statement, the Whole Earth Catalog

One of the most transformative images of the 1960s was the first publicly released color picture of the earth as seen from lunar orbit.  Like a small, blue-green marble lost in the immense black void of space, this view of earth became for many in 1969 a metaphor for the fragility of our planet, and a timely totem for the then-emerging environmental movement.

The years that immediately followed saw many victories for environmentalists in the United States, including passage of the National Environmental Policy Act in 1969, the creation of the Environmental Protection Agency in 1970, the extension and then amendment of the Clean Air Act (1970 and 1977), and the enactment of the Clean Water Act (1972), the Safe Drinking Water Act (1974), the Resource Conservation and Recovery Act (1976), and the Comprehensive Environmental Response, Compensation, and Liability Act, which established the so-called "Superfund" (1980).

The result of these and other pieces of legislation was cleaner air and water, the remediation of many notorious hazardous waste sites, the conservation of millions of acres of federal and state lands, and increasing awareness of the negative impact that human activities continue to have on our environment.

But the early momentum of the environmental movement of the 1960s and 1970s has not been sustained uniformly throughout the world.  Europe, for example, largely assigns a higher priority to environmental concerns than does the United States, both absolutely and philosophically.  In the U.S., "hard science" is often demanded to prove that a given action or substance will have a deleterious effect, while in Europe, the possibility of such an effect may be sufficient to result in legislative action.  These same divisions are present within the United States itself, resulting in swings of policy when control of government passes from one party to the other.  The most significant example of such a swing was the repudiation of the Kyoto Protocol on global warming by the current administration, following signature -- but not submission to the Senate for ratification -- of the Protocol by the Clinton administration, which was aware of significant opposition to the Protocol in the Senate.

Historically, such variations in policy might be of concern solely to the citizens of the nation involved.  But increasingly, the actions of one nation can impact the welfare of many, not only with respect to over-exploiting a single common resource, but by degrading the very biosphere we all share as well.

The extremely complex example of global warming therefore marks a transition from an era during which the "tragedy of the commons" was the primary international environmental concern (i.e., when everyone shares in the ownership of a resource -- such as fish stocks -- there is a greater incentive for any single user to exploit it than to conserve it) to one in which the shared biosphere itself is at risk.  And yet efforts to address even single-resource issues via international treaties have proven to be difficult enough.

Nor is global warming the only emerging area of common concern.  Currently, an initiative commissioned by the United Nations, known as the World Summit on the Information Society, or WSIS, is concluding a two-year process that includes examination of the question "Who should govern the Internet?"  Following the most recent meeting in that process in Geneva in September, the near unanimous opinion appears to be "not the United States," at least as regards the root directory of the Internet, which is currently under the indirect control of the U.S. Department of Commerce.

From one perspective, the Internet and the Web are simply a new virtual element of our environment, and in that sense present parallel concerns to environmental issues.  True, telecommunications treaties and commonly agreed upon standards have always existed.  But their impact was largely internal to the countries involved -- if you chose not to sign such treaties or adopt such standards, you lost the benefits of the network, but did not harm the interests of others (other than by taking your nation out of the network, either fully or partially).  But he who controls the root directory of the Internet can, literally, disconnect an entire nation.

As time goes on, issues that require international cooperation are bound to multiply rather than diminish, and today we have no comprehensive, binding (or even consensual and demonstrably successful) process for dealing with them. 

So what are we to do? 

Sadly, there are four seemingly intractable problems to be solved.  Using global warming as an example, the first is identifying and quantifying the problem to be addressed (is the atmosphere warming, and are we indeed the cause?).  The second is agreeing on the priority to be assigned to the problem at hand (which element should be given greater weight: the impact on the economy, or the impact on the environment?).  The third is the means of addressing the risk (should emission controls fall equally on all nations, or should third world countries be granted allowances to compensate for the damage already done by those nations that have already achieved a modern stage of development?).

And lastly, there is the hardest question: will nations be willing to yield a measure of their historical sovereignty in order to achieve effective solutions to global problems?

To date, the answer to that question has been "no," which has made reaching a decision on the first three questions that much more difficult.  Some 86 years after the founding of the League of Nations, there is still no global body that is empowered to create a law that is binding upon all.

If we assume that the answer to the sovereignty issue will continue to be "no" for some time, then we have no choice but to find a different way to solve problems such as global warming, because we cannot wait to find a solution.

Perhaps a model for that solution may be found in the standard setting process.  After all, what other system exists that is not only global, consensual and successful, but involves the voluntary giving up of rights in order to achieve a common good?  This process, in fact, has found a solution to the conundrum of the Tragedy of the Commons by identifying benefits in the exploitation of opportunities that can only be achieved by reaching consensus.

The magic of the standard setting process is that it makes the risk of standing outside the process greater than the costs and concessions required to participate.  In consequence, not only do those whose participation is vital to success willingly engage in the process, but they adopt the resulting standards without the compulsion of laws as well.

To an extent, this is how the treaty process has historically operated as well, but characteristically sanctions for non-compliance have been required in order to achieve success, because compliance may be seen to be more costly and disadvantageous than non-compliance.

The lesson of the standard setting process, then, is that it takes carrots as well as sticks to bring stakeholders -- multinational corporations as well as nations -- not only to the table, but to compliance as well, until that day when even more urgent changes force a different answer to the sovereignty question.

Perhaps the WSIS process will provide the crucible within which a first step in this direction may be taken, as the standard setting process and nationalism meet in the same room, in an effort to solve a common problem.


Copyright 2005 Andrew Updegrove



FEATURE ARTICLE

STANDARDS FOR A SMALL PLANET

Andrew Updegrove

Abstract: Traditional standards address fixed problems, and therefore are (and must be) unchanging to be useful. Environmental and resource issues, in contrast, are by their nature dynamic, and the purpose of regulations and standards in this domain will be to change, or prevent change, in such systems. A new level of data collection will be required to create such standards, and to create the type of feedback loops that will be needed to confirm and fine tune their effects. Finally, the standards that will be required in order to meet the challenges of finite systems and a growing global population will also require that new factors – cost, political feasibility and social equity – be taken into account, using new infrastructures yet to be devised in order to reach workable solutions. Such solutions might be called "Standards for a Small Planet."

Introduction: Humanity today is facing the reality that resources are finite, as is the capacity of the world to absorb the punishment that we are inflicting upon the environment. Having passed the point where the earth's resiliency was equal to our impact, we must now achieve an understanding of the workings of the environment that is sufficient to bring our effects back within the bounds of that resilience. Doing so will be a matter of unparalleled complexity, and will require an endless process of data gathering, integration, modeling, regulation, and then monitoring to determine whether the desired effects have been achieved.

Each of these steps may require the development of new standards, but the types of standards that will be needed may not be the same as those that have been sufficient to address previous challenges.  Nor, perhaps, will traditional standard setting organizations be appropriate to develop them.

Standards have historically addressed discrete tasks: from defining weights and measures in millennia past to baud rates and interfaces today.  Because the problems these standards were created to solve were finite, they could be (and were) useful in isolation, often needing no coherent connection to any other standards at all, even in the case of weights and measures, as in the chaotic, ad hoc English system that evolved over time.  Even where such relationships were coherent, as in the Metric system, the base ten relationships between the various weights and measures were provided for convenience rather than out of actual necessity. 

The creation of standards, not surprisingly, has therefore been accomplished in discrete settings, either by governments from the days of Hammurabi to the present, in the case of weights and measures, or inside individual standard setting organizations in modern times, sometimes formed to create a single specification, or at most to develop standards for a well-delimited technical or business domain.

Today, this insular, distributed system is already being challenged by many forces, most obviously by the convergence of information and communication technologies (ICT) within the same network, and even the same device.  No longer can a single standard exist in a business and technical vacuum.  Instead, it may be invoked by a mobile device that may not only provide for voice communication, but also may include camera, video, Web browsing, PDA, music and other functions as well.  The demands for versatility on a standard may also transcend technical compatibility issues, to involve economic and business concerns as well.  For example, a standard originally designed for use in a royalty-tolerant commercial setting may now be needed in another where royalties are not tolerable, and open source licensing terms are mandated.

Standards thus increasingly have meaning within larger and more complex contexts, forcing those that create a standard to accommodate new demands, and to re-balance the concerns that would have been considered in defining the same type of standard only a short time ago with ones that have only recently emerged.  Since convergence and innovation will continue, care will also be demanded to anticipate the future use to which a standard may be put.

Still, even in this broader context, a given standard typically remains only a part of a single module within a larger system.  For example, the U.S. Department of Defense desires to create a global ICT infrastructure that will support the "network centric operations" of the combined armed forces of the U.S. and its allies, wherever located, anywhere in the world.  In this model, the data from a single source (e.g., a battlefield sensor) would become not only accessible, but would also identify itself, to those among more than a million individuals with network access who have the credentials to receive that particular type -- and source -- of data.

Such a vision, while imposing, still represents the pursuit of traditional ICT standards goals writ very large, since each of the standards required to meet this grand design needs only to interact with other standards in time-honored ways.  Thus, while the vision may be grand and the task of creating such standards will become more complex, the nature of the demands placed upon an individual specification will remain familiar.

The traditional standards infrastructure, then, has proven itself capable of scaling very impressively.  Doubtless it will continue to do so, but only to meet the increasing complexity of traditional challenges.

Today, however, significant environmental challenges are emerging that involve multiple, changing variables, and for which traditional static standards will prove inadequate.  To meet this type of need, webs of standards that are dynamic rather than fixed will be required, with each individual standard interconnecting with and adjusting as required by changes in independent variables.  Some of these standards, rather than serving the network, would in a sense become the network itself. 

Agreeing upon and utilizing such standards will demand a new type of infrastructure as well, requiring new types of international relationships in order to marshal the means and the will to tackle issues that may only be solved through determined, collaborative action. These standards may appropriately be referred to as "standards for a small planet."

The need for a new type of standard:  The definition of a standard has never been static.  From weights and measures, to standards that achieve physical interoperability (e.g., rail gauges and screw threads), then safety (e.g., pressure standards for boilers), performance (e.g., tensile strengths for airframe materials), and finally virtual interoperability (e.g., radio frequencies and software interfaces), the concept of the commonly agreed upon tools we call standards has continued to expand and evolve.

Most of these tools, however, could and often did incorporate somewhat arbitrarily determined characteristics, since what was needed was any standard, rather than a specific one, for a weight, or for the distance between railroad rails, or even for a radio frequency, so long as the exact parameters chosen lay within certain bounds of practicality.  And most importantly, once fixed, the utility of a standard has historically been based upon the assumption that it would never change.

But static standards are only useful in unchanging circumstances.  Indeed, a standard relies on the assumption that the physical world and the natural laws that govern it will not change.  But can such a standard be useful when the goal is in fact to change, or to limit the change of, the physical world as part of a continuing process, or must such a standard also adjust, adapting as the desired results are achieved (or not)?

Over the last several decades there has been a growing appreciation that the world's resources, as well as the earth's capacity to absorb abuse, are both finite.  From this realization have come many new and difficult social, geopolitical and equitable questions.  If resources are finite, how should they be allocated, and who should be entitled to make such a judgment?  If given actions can damage common, global resources, or result in deleterious effects such as global warming, how can this damage be prevented or moderated, and does one nation have a right to influence the actions of another?  If not, how will such restrictions be agreed upon and enforced?

As yet, there are no wholly satisfactory answers to such questions when the actions required extend beyond national boundaries, although the need for such answers grows more urgent by the day.  But if international consensus on these questions could be achieved, how will we decide what specific actions need to be taken, and how will we know whether they are working?  And as among a variety of actions that could be taken, would we  be able to determine which would be the most economical and present the most acceptable burden upon society?

Solving these problems will involve ever more sophisticated modeling that will utilize available data to project what actions will have what results (both positive and negative) on a factor by factor basis (e.g., will carbon dioxide rise or fall if we do X?), and, in turn, what effects those reactions will have on other factors (if carbon dioxide levels can be made to fall on average by .003% between latitudes 75 and 80 north between May and June of each year, what else changes?).

Because such downstream effects can only be estimated, there will be a need for intensive data collection in order to measure the effects of environmental protection actions taken, and to learn, in part by trial and error, what effects human actions and policies are having.  As importantly, there will also need to be formulae – variable standards – that utilize these data in a feedback loop in order to adjust regulations and policies on an ongoing, periodic basis.  These standards will need to balance not only scientific data, but political and economic feasibility as well, in order to determine the most expedient and achievable means to the required ends.
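
To make the notion of a variable standard concrete, what follows is a minimal sketch, in Python, of such a feedback loop.  Everything in it is hypothetical -- the numbers, the names, and the simple proportional adjustment rule -- and nothing like it appears in any existing regulation; the point is only to show the structure: each monitoring period, the permitted cap is re-derived from the gap between measured and target levels, rather than being fixed once and never revisited.

```python
# A minimal sketch of a "dynamic standard": an emissions cap that is
# recalculated each monitoring period from observed data, rather than
# fixed permanently. All numbers and names are hypothetical.

def adjust_cap(current_cap: float,
               measured_level: float,
               target_level: float,
               sensitivity: float = 0.5,
               floor: float = 0.0) -> float:
    """Return the next period's cap, tightened or relaxed in
    proportion to how far the measured level is from the target."""
    # A positive error means we are above target, so the cap tightens.
    error = (measured_level - target_level) / target_level
    next_cap = current_cap * (1.0 - sensitivity * error)
    return max(next_cap, floor)

# Toy feedback loop: each "year," new monitoring data arrives and the
# standard is readjusted before being reissued to regulated parties.
cap = 100.0                                  # permitted emissions, arbitrary units
target = 380.0                               # desired concentration (ppm, hypothetical)
observations = [392.0, 388.0, 385.0, 381.0]  # monitored levels, one per period

for year, measured in enumerate(observations, start=1):
    cap = adjust_cap(cap, measured, target)
    print(f"Year {year}: measured {measured} ppm -> next cap {cap:.1f}")
```

In this sketch the standard is the adjustment rule itself, not any particular cap value it happens to produce in a given year -- and that is the essential difference from a traditional, fixed specification.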

In other words, unlike traditional standards that answer only to fixed, direct physical and cost issues, new environmental and sustainable resource standards will need to be developed via a process that incorporates not only scientific data, but accommodates forces as non-empirical as arguments for social equity between developing and developed nations – as has already been witnessed in the rancorous debate over the Kyoto Protocol on global warming.

The current state of the art:  Designing such standards will involve acquiring skills that we currently largely lack.  While science continues to provide greater understanding of natural processes, actual attempts to achieve sustainability have been limited to discrete resources, such as in the management of commercial fishing stocks and populations of game animals.  In the case of game, there has been reasonable success.  But game animals can be counted in the field and at harvest, and the legal take can be limited on an annual basis through managing the length of seasons and the number of licenses issued.  Fishing stocks are harder to estimate and can be more difficult to monitor, and the ecological webs within which they exist are complex and hidden as well.  As a result, success to date in this area has been limited at best, and many of the forces resulting in recoveries and declines remain largely a black box.

The concept of sustainability seems to have found its greatest purchase in the area of “sustainable development,” which conjoins recognition of current human needs with environmental concerns and seeks to reconcile the two in a fashion that does justice to the needs of our descendants.  The concept has received the formal support of the United Nations (which published the report of the Brundtland Commission in 1987 on this topic), as well as recognition by several nations (for example, France acknowledged environmental responsibility in an amendment to its constitution in 2004, as did Poland in Article 5 of its 1997 constitution).

Other activity has been taken by private industry, non-governmental organizations and other non-profits.  Examples of these efforts include initiatives in areas such as ethical investing and enabling environmentally responsible purchasing (e.g., certifying that lumber products have been harvested in a sustainable, environmentally-friendly fashion).  But since compliance with such standards is elective, their impact has to date been limited, relegating the role of such efforts to one of leadership rather than meaningful and systemic change. 

The role of government:  It would seem obvious, then, that global environmental and resource problems can only be solved through international governmental action.  The leading effort to date in this area, of course, has been the multi-year effort to agree upon curtailment of global warming via the negotiation and execution of the Kyoto Protocol, which the United States has thus far declined to ratify.  Despite this setback, the fact that widespread agreement was achieved at all represents a hope for the future.  And while the Federal government of the United States has not committed the nation as a whole to comply with the Kyoto Protocol, individual states (such as California and Massachusetts) have moved forcefully to enact environmental laws that are stricter than their Federal analogs.

Given the near-unanimous ratification of the Kyoto Protocol and the growing scientific consensus that human actions are contributing to a process of global warming, it seems likely that efforts to curtail greenhouse gases will increase.  With the commitment of governments comes the possibility of effecting real change, but the question then arises of what specific changes should be required.  In the short term, the obvious answer is to seek whatever reductions in the most pernicious gases and substances are politically achievable, since at this juncture in history only a reduction, rather than the elimination, of the impact is feasible.

But in the long term – ten, twenty or fifty years from today – there will need to be better and more calibrated goals that permit the maximum environmental protection to be achieved at the minimum economic cost.  With more precise measurements and understanding of industrial causes and environmental effects, better calculations of the true avoided cost savings of various types of actions, such as development of new alternative energy infrastructures, will also become possible. 

The concept of such an evolution in strategies for attacking environmental and resource challenges would be roughly analogous to the evolution of cancer therapy in modern times.  Following decades when surgery and radiation were the only treatment options, the first efficacious chemical agents were discovered.  However, the pathology of cancer at the cellular level was poorly understood, leading to the shotgun application of a limited number of harsh chemotherapy regimens to all patients who were diagnosed with a single type of cancer, identified only grossly by the organ of origin (e.g., liver or pancreas).  More recently, it has become clear that in fact there are many different variations within what were previously thought to be single types of cancer, and that many of these varieties have unique points of vulnerability.  With the discovery of new drugs based upon new treatment theories derived to exploit these points of opportunity, doctors are beginning to utilize more highly-targeted and efficacious therapies that match a given drug regimen with the specific sub-type of cancer involved.

Working towards such a goal in the case of resource utilization and environmental protection is neither unreasonable nor impossible.  Already a variety of different greenhouse gases have been identified, and more is being learned all the time about which ones are most potent, and in what ways the various components of the atmosphere react with each other to reach specific end results.  As more is learned, more "treatment options" will therefore become possible, as will the prioritization of the benefits and costs of controlling specific agents.

When one takes the long view (e.g., the world that our great-grandchildren will inhabit), there is little choice but to raise the level of restrictions on our activities until equilibrium is reached, and ideally to work towards recovering lost ground.  But maintaining equilibrium in a system as complex as the earth's biosphere will be a challenge indeed.  Even here, however, there are analogies to be found, such as today's efforts to control national economies through the actions (in the case of the United States) of the Federal Reserve, which closely monitors and seeks to influence the vastly complex fluctuations of the marketplace through a constant process of interest rate adjustments.

As with the economy, there will need to be regulations and standards applied that are not fixed, but variable.  A standard that is applied in one year may not be appropriate for even the next, as other variables change, or as the monitoring of causes and effects permits the refinement of the standards and regulations being applied.

What is to be done:  If the concept of empirical and dynamic global sustainability standards is to be pursued on a systemic basis, it would require an effort of a magnitude and type not previously attempted, and would involve tasks such as the following:

Data identification:  Data of many types will be required in order to support a growing global population on a sustainable basis.  The following provide a representative sample of the questions that must be answered through the gathering of such data:

  • Global warming:  What are the existing percentages of the various substances that contribute to global warming (gas, particulate, etc.)? What is the magnitude of the effect of each, both alone and in combination with other agents? How fast are production levels of these agents changing, and who is producing them?
  • Non-renewable resources:  What are the technically and economically available stocks of hundreds of non-renewable resources, both energy-producing as well as raw materials for production?   What percentage of each of these resources is recoverable without unacceptable degradation of the environment?  What will our future needs be, and can alternatives be found as resources become exhausted?
  • Renewable resources:  To what extent can we continue to harvest non-domestic flora and fauna without imperiling their survival?  How do human activities impact the environments upon which such species depend?  How can forests be harvested in a manner that is both sustainable and environmentally acceptable? 
  • Impacts:  What are the environmental impacts and remediation costs of the thousands of processes upon which modern society depends, from mineral extraction to processing to final disposal?
  • Food resources:  What is the carrying capacity of arable land?  What land can be utilized within the limits of locally available water resources?  What will be the impact of altering existing vegetation on the carbon dioxide cycle?  What will be the transportation costs and energy demands of distributing this food, and what practical limits will such costs impose on where people can live? 

Data acquisition and integration:  Once identified by type and purpose, the data would need to be acquired, presumably through international cooperation and/or via new global institutions, and then integrated in a way that permits the use of that data for multiple purposes.  New GIS and other standards would play a key role in this process.

Theorizing and technical modeling:  When collected, the data would be utilized to generate models that would test the effects of modifying individual variables against all outcomes of concern (both positive and negative, on a systemic basis).

Economic and geopolitical evaluation:  While the technical feasibility of modifying certain industrial outputs would be the first step, the economic impact and geopolitical feasibility of actually mandating such behavior would be equally vital, as would reliable, long-term public funding, immune from the impacts of changes of administrations.

Achieving consensus:  Once the best alternatives were identified, they would need to be “sold” to the global community.  This would involve not just buy-in to absolute goals, but resolution of the type of issues that complicated the Kyoto process as well, such as whether restrictions should apply equally to all nations (the U.S. position), or should impact less developed countries less severely than those nations that have already modernized (the Protocol’s approach).

Monitoring and Enforcement:  Public agreement may be more easily achieved in the future than actual compliance.  Mechanisms would therefore be needed to monitor human behavior as well as the environment.  Contemporary experience under existing regimes, such as the WTO agreements and the treaties intended to prevent nuclear arms proliferation, indicates that this will be an ongoing challenge.  On a positive note, such existing mechanisms can provide a precedent for future action.

Data updating and integration: Monitoring the success or failure of regulatory restrictions at achieving desired results will be essential, because evidence of positive effects will only become clear very slowly.  Thus, if restrictions are set too leniently or permitted resource utilization levels are too high, many years of damage may become inevitable before readjustment can be agreed upon and remedial actions take effect.  Similarly, if controls are set strictly, the ability to prove positive results as quickly as possible (however minor) will be vital in order to maintain the global will to accept such restrictions.

Standards readjustment:  Over time, the system will need to be adjusted on a periodic, ongoing basis, as normal variables take effect (e.g., natural weather cycles), as global development occurs in unexpected ways (e.g., regional population spikes and dips), as catastrophic events (e.g., volcanic eruptions) have temporary impacts, as new industrial processes are devised which have new environmental effects, as data collection efforts become more sophisticated, and as the interrelationship between variables becomes better understood.

System requirements:  What would be required to enable such an effort?  At minimum, the following:

  • Critical Mass:  Participation by a sufficient number of nations, measured by aggregate impact, would be essential in order to ensure that the effort could be successful rather than merely symbolic.  This participation would involve not only the willingness to accept restrictions, but cooperation in data collection as well.
  • Methodology:  Agreement would be needed on whether globally mandated restrictions or nationally determined strategies would be utilized.  The answer might vary on a problem-by-problem basis.  For example, a cod is a cod, but various gases contribute to a single problem, and one nation may find one more economically palatable to curtail than another.
  • Authority:  The participants would need to agree to permit international inspection and monitoring of compliance, as well as agree to submit to sanctions for a failure to comply.
  • Structure (political):  Either a United Nations agency, a new treaty organization, or a body of a wholly new design would be required to provide the venue within which agreement could be reached and through which compliance would be enforced.
  • Structure (science):  A global organization, or alliance, would be needed to collect, integrate and update data, as well as to perform modeling and ratify recommendations for political adoption, as well as to fine-tune the standards derived and approved over time.  Ideally, a single organization (rather than each nation individually) would perform this function and release the data upon which restrictions would be based, to lessen the likelihood of disputes.  Such an organization would need to be apolitical and immune from external influence to the greatest extent possible.
  • Structure (standards):  At every step of the way, there would be the need to identify and use existing standards, as well as to create traditional and dynamic standards for data measurement, analysis, limitations, and monitoring.  Creating and integrating these standards will likely require not only the creation of new standard setting organizations, but also a level and type of cooperation between existing standards organizations and governments that has not been required to address historical problems.

Conclusions:  Is such a process so Herculean as to be unimaginable?  The first reaction would be to say yes.  But the more sobering reflection is whether we have any choice but to undertake it.  Even if global populations were to stabilize in the future, it is hardly probable that stasis could be achieved at a number that is not billions higher than at present.  Certainly there can be no doubt that, absent significant changes in our behavior, our existing resources and the resilience of our environment would be insufficient to withstand the onslaught of nine billion people, each hoping to enjoy the lifestyle of a typical American.

On a more hopeful note (depending upon one's point of view), the challenge we face is not monolithic, but rather a host of nested imbalances that need to be redressed.  While global warming has garnered the lion’s share of recent headlines, there are countless smaller issues within the over-all fractal pattern of environmental degradation and resource depletion, from the die-off of coral reefs to the destruction of rain forests.  Although each of these issues represents its own set of challenges that must be understood and resolved, each also represents its own test bed from which much generic expertise will grow and transferable discoveries will be made in a new discipline that may come to be known as “sustainability science.”

As such experience does grow, scientific tools and standards will be devised to make use of this experience.  Hopefully the necessary political will and international structures will evolve as well.   Perhaps as the simpler challenges are successfully addressed, public determination will grow, and the mores of society will permanently settle into a state where maintaining sustainable balances will become an unquestionable mandate. 

And if not?  That is not a question with an answer that is pleasant to contemplate.   Truly, the credo of NASA is one that might usefully be appropriated for this effort:  Failure is not an option.


Copyright 2005 Andrew Updegrove



TRENDS:

OPENDOCUMENT VS. OFFICE:
THE LINES ARE DRAWN

 

If commerce can be theater, then certainly the current full-frontal assault on Microsoft Office by a significant number of the largest technology companies in the world is such, and political theater at that. And while the OpenOffice.org office suite, and then the OpenDocument format developed by OASIS, have been under development for years, it was the announcement by Massachusetts that its Executive Agencies, for pragmatic reasons, would be required to use products supporting the OASIS OpenDocument format that unleashed the corporate hounds in full cry in the hunt to tree Microsoft at last.

Since the last issue of the Consortium Standards Bulletin (OpenDocument and Massachusetts: The Commonwealth Leads the Way), scarcely a day has passed without some announcement, large or small, calculated to maintain the maximum momentum possible in the drive to break Microsoft's dominance in office productivity software, an immensely profitable line of business (at least for a vendor that can maintain an 85% market share).

The stakes, of course, are much greater than just stealing market share in a single product space. If Microsoft's competitors can succeed with OpenDocument, then Microsoft will not only be weakened in this product space, but both open source software and open standards will gain credibility and appeal in the marketplace as well.

This brief space in time thus presents a unique opportunity to observe a variety of fascinating dramas unfold concurrently in real time, from the combined efforts of companies like Sun and IBM to press their advantage, to Corel's quixotic effort to accomplish whatever Corel is trying to accomplish (itself an open question as of this writing), to Microsoft's public positioning regarding OpenDocument at the same time that many assume it to be pushing for a counterattack in the Massachusetts legislature.

And all of this has been made possible by the creation of a humble (if ambitious and complex) standard. If the current efforts to unhorse Microsoft in the office productivity software marketplace are even partially successful, let it not be forgotten what was the nature of the engine of destruction that eventually accomplished a decades-long quest.

With that as prelude, here is a day-by-day chronology of the more significant developments in the last 30 days in the continuing OpenDocument saga. As this issue goes to virtual press, the crescendo continues to build.

  • September 27: Long-time Redmond nemesis (and more recently partial ally) Sun Microsystems releases StarOffice 8, which supports the OpenDocument format, to much fanfare.
  • September 28: eWeek.com reporter Steven J. Vaughan-Nichols reports that Communications Manager for Corel WordPerfect Greg Wood has told him that "While Corel won't commit to a date for adding OpenDocument to WordPerfect, the company made it clear that it is working towards that goal."
  • September 29: Sun pledges not to assert any of its patents against OpenDocument implementers, thereby neutralizing a question earlier raised by Microsoft.
  • September 30: FoxNews.com posts a commentary critical of OpenDocument by James Prendergast, Executive Director of Americans for Technology Leadership. After a flood of objections from readers, Fox acknowledges that it should have disclosed that Microsoft was a founder of ATL. Fox later posts a sampling of the comments received.

  • October 3: Microsoft announces that its next release of Office, Office 12, will support Adobe PDF, the second format (besides OpenDocument) that Massachusetts will allow to be used for saving documents. Microsoft attributes the decision to "customer demand," saying that it has received 120,000 requests a month for this change.

  • October 4: Computerworld reports that a study of large government departments found, "for many sites, it is now 10 times cheaper to migrate to the new OpenOffice.org 2.0 than upgrading to Microsoft Office 12".

  • October 10: I report at the Standards Blog that Microsoft has indicated to me that support of OpenDocument is not impossible, likening the situation to PDF: "For us this has been, and will continue to be a matter of evaluating the flow of customer requirements."

  • October 10: Announcement of the formation of the "Open Document Fellowship" to support adoption of the OpenDocument format and products that support it. A ZDNet.com article erroneously reports that OASIS is a co-founder of what is in fact a volunteer-supported organization.

  • October 10: OASIS announces that it submitted OpenDocument to ISO on September 30 for adoption as a "Publicly Available Specification." Achieving this status will make it much more likely that European and other governments will adopt it.

  • October 13: OpenOffice.org, the developer of an open source office suite that has committed to support OpenDocument (and which supplied the precursor specification used by OASIS as a starting point for its development of OpenDocument) announces that the release of OpenOffice.org 2.0, which will support the OpenDocument format, will be delayed.

  • October 18: In an interview at BetaNews.com, Corel's Richard Carriere and Greg Wood promote WordPerfect's number 2 market position in office productivity software and criticize OpenDocument as "an unsupported standard" that Corel has no current interest in supporting.

  • October 19: Corel is roundly criticized for the interview in "Shame on Corel" and at other sites that pick up the story. Corel later shifts position (see October 25, below).

  • October 25: Dan Farber at ZDNet.com reports a conversation with Microsoft's Ray Ozzie, who makes eventual support by Microsoft sound much more likely.

  • October 25: InformationWeek confirms rumors that Massachusetts Secretary of State William Galvin and State Senator Marc Pacheco are critical of OpenDocument and will hold a hearing regarding the Information Technology Division's OpenDocument policy.

  • October 31: Massachusetts Secretary of State William Francis Galvin and State Senator Marc R. Pacheco hold a much-awaited hearing on the OpenDocument policy.

The above is only a sampling of what has transpired since our last issue. To see a more complete day-by-day selection of events and analysis and to follow the story as it continues to unfold, visit the OpenDocument subtopic heading at the ConsortiumInfo.org News Portal and the Standards Blog.

 


 


STANDARDS BLOG:

October 29, 2005

Tell the Government What You Think
About the Future of the Internet

For over a year I've been covering the United Nations sponsored "World Summit on the Information Society" (WSIS) process, which hopes to bring the benefits of the Internet and the Web to all peoples everywhere, a very honorable and important goal. Not surprisingly, this process has raised the question of "who should control the Internet?" That question has become extremely emotional, with the United States government saying "Us!" as regards the root directory of the Internet (which is currently under our control), and the rest of the world saying, in so many words, two words that I won't repeat here.

Amazingly enough, on November 2, you'll have the opportunity on-line to tell the U.S. representative in this process who you think should hold the keys. I'll tell you how later in this post.

But first, a little background. Who controls the Internet today?

Right now, the answer is, "it depends on what aspect you're talking about." If you're talking about the standards that enable the Internet, the answer is a number of non-profit, global consortia (e.g., the W3C and the IETF). But if you're talking about the root directory of the Internet – the country codes (in one database) and the individual addresses given to individual computers (in another) -- then you're talking about the Internet Corporation for Assigned Names and Numbers, popularly referred to as ICANN.

ICANN is, on the one hand, a non-profit that is headquartered in the U.S. and has an international Board. But on the other hand, it is subject to the will of the National Telecommunications and Information Administration, which in turn is a unit of the U.S. Department of Commerce. And there's the rub, because theoretically, someone in Washington could call someone in California at ICANN some day and tell him to turn off, oh, say, Iran or North Korea (not that they would, of course).
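
By way of illustration (and not part of the original post), here is a short Python sketch, assuming the third-party dnspython package and network access, of what the root directory actually does when queried: it refers the asker to whoever operates a given country-code zone. The root server address is real; the domain queried is just an example.

```python
# Ask a root server which name servers are authoritative for a
# country-code domain. Assumes the third-party dnspython package;
# 198.41.0.4 is a.root-servers.net, one of the thirteen root servers.
import dns.message
import dns.query
import dns.rdatatype

query = dns.message.make_query("example.ir.", dns.rdatatype.NS)
response = dns.query.udp(query, "198.41.0.4", timeout=5)

# The root does not answer directly; it returns a referral to the
# servers that run the .ir zone. Whoever maintains the root zone
# decides whether such referrals exist at all -- which is why
# control of it matters.
for rrset in response.authority:
    print(rrset)
```

A nation's country-code entry in that root zone is, in effect, its listing in the Internet's master directory, which is why a call from Washington could, in theory, make a country disappear from the network.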

So when the WSIS process began, the International Telecommunication Union (ITU), which might have, but didn't, bid for control of the Internet during its chaotic early days, put "control of the Internet" on the table (the ITU is an agency of the UN, and is also primarily responsible for organizing and running the WSIS process). As a result, the Working Group on Internet Governance (WGIG) came into being, and discussion quickly zeroed in on control of the root directory as a matter of international concern.

All that's happened in the past two years would fill a book (and doubtless some day will), but if you want to get quickly up to speed, here's how to go about it. Start with this article from July of 2004, called Who Should Govern the Internet? Then go to the ConsortiumInfo.org News Portal WSIS/Internet Governance subcategory, and you'll find some 25 articles and blog entries from the past five months tracking what's been happening leading up to the final WSIS meeting in Tunis, including the recent PrepCom3 meeting in Geneva when even the European Union deserted the U.S. A particularly good article by Kevin Murphy linked there describes how the current system works.

In truth, this is a big and important issue, because although today it's America's condescending attitude regarding ICANN that is under the microscope, in the future, it will be other issues, including, perhaps, the future technical development of the Internet or the Web, whether content could be censored, and so on. So what happens in Tunis matters to everyone, in the first world as well as the developing world, because it represents the beginning, and not the end, of the discussion on such issues.

So with that as prelude, here's how you can let the U.S. government know who you think should "govern the Internet" in the future.

The lead representative for the U.S. is Ambassador David A. Gross, a Bush appointee who has served since August 2001 as the U.S. Coordinator for International Communications and Information Policy in the Bureau of Economic and Business Affairs. If you go to an October 28 posting at the USINFO.STATE.GOV site called Expanding Internet Access Must Remain World Focus at Summit, you'll see the current U.S. position. And at the very end of the post, you'll find this:

In a November 2 Internet chat, Gross will preview the upcoming WSIS summit and discuss his views on why the current governance structure is the best way to preserve the nature of the Internet as an innovative medium.

Gross will be available to answer questions at 11 a.m. EST (1600 GMT). To ask a question or make a comment, please register at iipchat@state.gov. Questions and comments are welcome in advance of and during the November 2 program.

So there you have it – your chance to make your point to the man who will make the U.S.'s points in Tunis November 16-18, or at least your chance to see how he answers the opinions of those who do choose to ask questions and make comments.

Either way, you can follow how things progress here at the Standards Blog, as I will cover the run up to, and the outcome of, the Tunis meeting in detail. It's likely that I'll also dedicate the November issue of the Consortium Standards Bulletin to the same topic.

Bookmark the Standards Blog at http://www.consortiuminfo.org/newsblog/ or set up an RSS feed at: http://www.consortiuminfo.org/rss/


Copyright 2005 Andrew Updegrove



 

CONSIDER THIS:

October 27, 2005

#33 Do Standards Matter?

Yes, that’s a rhetorical question.  But like a lot of things we “know,” it is sometimes interesting to ask ourselves what, exactly, supports what we are sure we know.

I recently asked myself that sort of question when I was notified that I would receive a standards-related award, and that I would be expected to say a few words by way of an acceptance speech.  Since I like to present ideas in an “out of the box” fashion, I thought I would try this question out when I stood in front of a room full of people who have spent most or all of their professional careers developing standards.

As hoped, it made an impact, perhaps because I put into words what I suspect many of them had always sensed, but which was too non-traditional for them to articulate.  After all, standards are just dry, dusty, boring specifications, are they not?

I’ve never thought so, perhaps because I don’t actually help create them.  Instead, I help create the organizations that develop the standards, and perhaps this enables me to focus more on the forest of meanings and effects that standards represent, rather than on the pieces of paper themselves.

So here is what I said:

I'm very pleased to be here this evening to accept the 2005 ANSI President's Award for Journalism.  As some of you may know, I am receiving this award in recognition of my work at a website, ConsortiumInfo.org, and in a monthly eJournal, the Consortium Standards Bulletin, now in its 33rd issue.  Each of these ventures is public and free, and each addresses standards, standard setting (both SDO and consortium-based), and the role of standards in society.

You may wonder why, as a full-time practicing attorney, I have spent over a thousand hours a year for the past three and a half years in such a pursuit.  There are two possible answers to that question.  Since one reflects poorly on my mental state, I will turn to the second, which is this:  Because I believe that Standards Matter.

Since I'm sure that all of you already share that opinion, I would like to offer you three reasons that may not have occurred to you before.

The first reason I will identify with the word "Humanity." 

What I am suggesting by this is that the type of standards that you and I work with are simply the latest form of a tool that mankind has been creating since long before the dawn of history.  After all, what is a standard but an abstraction that people voluntarily agree upon for a particular purpose?  The first standards system may well have been sign language: "this sign means that thing," followed by spoken words, then written words, then weights and measures, and eventually the types of standards that we know and help create today – safety, performance and interoperability standards, and, most recently, open source software – a new way to solve old problems.  There is no question in my mind that there will be other types of standards that will come into being in the future.

It is hardly an exaggeration to say that the creation of standards, like the creation of tools, differentiates us from all other forms of life.  To create standards, therefore, is inextricably intertwined with the very concept of what it means to be human.

The second reason I will call "Hope."

What I wish to point out with this word is that there is something important and unique about standard setting:  in what other process do people from all backgrounds and from all nations come together to mutually agree on matters of common concern, and then voluntarily bind themselves by rules that restrict their freedom of action in order to make the world a more productive and useful place?  The United Nations has had nowhere near the same success in its efforts, despite a vastly greater budget and the official support of governments from all over the world.  Surely there are lessons that national and world governments can learn from the standard setting process that could have great benefit to humanity as it struggles with the challenges of a shrinking world.

The last reason I will call "Service."

There are on the order of a million supported standards in the world today, and the value that they convey is incalculable.  With the extension of the Internet and the Web to the third world – something that would be impossible and unimaginable without standards – that value is about to multiply a hundredfold, as everyone, everywhere, will gain equal access to the riches of science and the arts, to unbiased accounts of history and current events, and to a competitive education – something that only twenty years ago would have been an inconceivable goal to achieve even in the lifetimes of our children.

And yet all of these standards, old and new, have been created by a comparative handful of people, unknown, unsung, and utilizing the most modest of resources.  To be fortunate enough to be among those that serve in this pursuit is, I believe, a privilege indeed.

Thank you very much.

Comments? Email:

Read more Consider This… entries at: http://www.consortiuminfo.org/blog/

Copyright 2005 Andrew Updegrove


FEATURED MEETING:

STANDARDIZATION: UNIFIER OR DIVIDER?

December 5 - 7, Sutton Place Hotel, Vancouver, B.C.

 

Standardization can be used to unify - or divide - people, markets, nations, and technologies at every level.  Join world experts at a free conference to explore how standardization can create economic, technical, and social benefits for all.
This interactive conference will examine how standardization can be, and is being, employed to influence global and local markets from the perspectives of industry, nations, geopolitical regions, and groups and classes of people.  Speakers from the public and private sectors in a number of countries will interact with the audience to discuss:

  • What is open information and communications technology (ICT) standardization?
  • How do different standardization processes serve as unifiers or dividers?
  • How does standardization unify or divide local and global markets?
  • What impact do different approaches to managing intellectual property rights in standardization have on the market, competition, and users?
  • How can standardization be employed to encourage or discourage cooperation?

To view the agenda and speaker list, and to register, visit the event website.


 

THE REST OF THE NEWS

For up to date news every day, bookmark the ConsortiumInfo.org
Standards News Section


Or take advantage of our RSS Feed

WSIS

We cannot stand idly by as some governments seek to make the Internet an instrument of censorship and political suppression  [October 19, 2005]

 

Senator Norm Coleman (R., Minn.), on the prospect of the U.S. losing control of the Internet Root Directory...Full Story

   

In attempting to act as an advocate for developing nations, the EU has instead done little more than compromise its own common sense  [October 23, 2005]

  Information Technology Association of America (ITAA) President Harris Miller...Full Story
   
The United Nations will not be in charge of the Internet. Period  [September 30, 2005]
  U.S. Ambassador David Gross, addressing calls for ICANN to relinquish control of the Root Directory...Full Story
   
A directory system for the internet that wouldn't be controlled by the politicians, lawyers and bureaucrats  [October 14, 2005]
  Internet domain name system inventor Paul Mockapetris, when asked what else he wished he could have invented...Full Story
   

Battle lines are being drawn: In the run-up to the November WSIS meeting in Tunis that will conclude the originally scheduled process envisioned by the UN, the question of "Who shall govern the Internet?" has reached the halls of Congress, especially since the EU (the last vocal ally of the U.S. on the question) deserted the U.S. camp at PrepCom-3, the last preparatory meeting prior to the Tunis conclave. At issue is control of the root directory of the Internet, currently administered by ICANN under the oversight of the U.S. Dept. of Commerce. Now, senators and congressmen from both parties are standing up to say that only the U.S. can be trusted to prevent censorship or other free speech violations on the Internet. The rest of the world, not surprisingly, has a different opinion.

Washington demands Internet status quo
By: John Blau
ITWorld.com, October 19, 2005 -- Lawmakers in Washington, D.C., are speaking out against efforts by several countries participating in United Nations-sponsored talks to force the U.S. to relinquish control over key Internet functions. The most recent critic is Senator Norm Coleman, a Republican from Minnesota. Earlier this week, Coleman submitted a resolution aimed at protecting control of the Internet, in particular the domain name and addressing system, from being transferred to the U.N. ...Full Story

separator

A voice of reason: If the whole, unending string of "Who should govern the Internet?" stories that have littered the Web (and this news portal) for the last year has left you glassy-eyed and not quite sure what the dispute is really all about, then this piece is for you. Blogger Kevin Murphy gives an excellent overview of how ICANN works, what types of actions it can take, and a recent example of an action that its U.S. overseer, the National Telecommunications and Information Administration, asked ICANN not to take at the behest of a conservative action group. A good, clear, and informative piece of writing.

CBRO Editor's Blog [The Skinny on Internet Governance]
By:  Kevin Murphy
Computer Business Review Online, October 14, 2005 -- Paul Mockapetris, who invented the internet’s domain name system, was asked a few years ago what else he wished he could have invented. He answered: “A directory system for the internet that wouldn't be controlled by the politicians, lawyers and bureaucrats.” Tough luck, Paul. Not only is the DNS controlled by politicians, lawyers and bureaucrats, but politicians, lawyers and bureaucrats from all over the world have been spending vast amounts of time and effort arguing over which politicians, lawyers and bureaucrats should control it in future. ...Full Story

separator

There's more to WSIS than just the root directory:  Meanwhile, although one might not know it from the fact that 95% of all articles about WSIS seem to relate to only two stories (the other is the poor track record of host country Tunisia when it comes to free speech, including the rights of journalists), those involved in the WSIS process have been paying attention to other important topics as well, such as advancing the credibility and use of open source and open standards.

Open Source Agreed In UN Information Society Summit Preparations
Intellectual Property Watch, October 10, 2005 -- Encouragement for the use of free and open source software and open standards for science and technology has quietly worked its way into the draft texts being prepared for the November second phase of the World Summit on the Information Society (WSIS). Such ideas have gained significant support in recent years as potentially low-cost, easy-access solutions for developing countries, but as they are put forward in the WSIS context they are balanced by stronger calls for proprietary approaches. The draft WSIS texts are lengthy and detailed, and intellectual property (IP) issues play a comparatively small role overall, but the stakes are high enough to draw top government IP officials and industry lobbyists to the meetings. ...Full Story

separator

Intellectual Property Issues

The world has learned that you don't mess with the Internet, the Web, or anything crucial to its operation [October 21, 2005]

 

Andrew Updegrove, a partner at Gesmer Updegrove...Full Story


A disturbance in the force:  Few types of news reverberate throughout the technology galaxy like a patent threat to the royalty- and license-free nature of the Internet or the Web.  When such news does occur – such as the initially successful effort of tiny Eolas to assert a patent against a feature of Microsoft's Internet Explorer browser that utilizes a W3C standard – mighty is the response (see Patents:  Too Easy to Get, Too Hard to Challenge?).  This month, another shock wave was felt when yet another small company, called Scientigo, announced that it intended to "monetize" two patents at the expense of those that use XML.

Small company makes big claims on XML patents
Martin LaMonica
ZDNet.com, October 21, 2005 -- A small software developer plans to seek royalties from companies that use XML, the latest example of patent claims embroiling the tech industry. Charlotte, N.C.-based Scientigo owns two patents (No. 5,842,213 and No. 6,393,426) covering the transfer of "data in neutral forms." These patents, one of which was applied for in 1997, are infringed upon by the data-formatting standard XML, Scientigo executives assert. Scientigo intends to "monetize" this intellectual property, Scientigo CEO Doyal Bryant said this week. ...Full Story

separator

Semantic Web

Are we there yet? In June I dedicated the entire issue of the Consortium Standards Bulletin to The Future of the Web. That issue included an extensive interview with Tim Berners-Lee, during which I asked what it would take to jumpstart support of the Semantic Web (e.g., adoption by Google?). Berners-Lee said at that time that there were a number of projects that companies had in the works that he could not yet discuss. Is the Google project described in the first piece below one of them?  In another sign of momentum and confidence, OASIS kicked off a new technical committee to pursue a project to support Semantic Web services, combining work in two hot standards areas in one effort.

Google May Take on eBay
RedHerring.com, October 26, 2005 -- Google on Tuesday appeared close to launching a service called Google Base that would pit the search giant against eBay, Amazon, and Craigslist by allowing users to list everything from used cars to real estate.... By hosting everything from articles on current events, to used car listings, to scientific data, Google could be taking a step toward building the “Semantic Web” pushed by World Wide Web inventor Tim Berners-Lee and other computer scientists. The Semantic Web would move the web toward one that uses XML (eXtensible Markup Language) to describe the meaning of information rather than simply what it should look like on a web page. ...Full Story
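
The distinction is easy to see side by side. Below is a minimal, hypothetical illustration (the element names are invented for this example, not drawn from any of the stories above): the first fragment tells a browser only how a price should look, while the second tells any program what the data means.

    <!-- Presentational HTML: describes appearance only -->
    <b>Moby-Dick</b> <i>$8.99</i>

    <!-- Descriptive XML: identifies what each value is -->
    <book>
      <title>Moby-Dick</title>
      <price currency="USD">8.99</price>
    </book>

A service that understood the second form could answer a question like "find books under $10" directly, which is precisely the kind of capability the Semantic Web is meant to enable.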

OASIS Issues Call for Participation for New Semantic Execution Environment TC
OASIS, October 18, 2005 -- This week, the Consortium announced the formation of the OASIS Semantic Execution Environment (SEE) TC. The aim of the TC will be to provide guidelines, justifications and implementation directions for an execution environment for Semantic Web services. The work of this committee will be relevant to parties interested in Web services and SOA. The inaugural meeting will be sponsored by DERI and will be held via telephone on 11 Nov. ...Full Story

separator

Ready to Rock and Roll: This is a first for the News Portal: a product announcement. Why are we breaking with our usual policy and including a manufacturer's press release touting its new product? Because it means that the Semantic Web is becoming more real, with ISVs creating products that they believe people will want to buy in order to create Semantic Web documents. It will be interesting to see how this new product sells.

Altova Reveals Ground-Breaking Semantic Web Development Tool
Business Wire, October 4, 2005 -- Altova(R) (www.altova.com), creator of XMLSpy(R) and other leading XML, data management, UML, and Web services tools, today announced… Altova SemanticWorks(TM) 2006,… a visual Semantic Web development tool with support for Resource Description Framework (RDF) and Web Ontology Language (OWL) creation and editing…. Altova created SemanticWorks to help customers learn and work with emerging Semantic Web technologies in an intuitive way…. Altova SemanticWorks allows developers to graphically create and edit RDF instance documents, RDF Schema (RDFS) vocabularies, and OWL ontologies with full syntax checking. ...Full Story
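
For those who have not yet encountered one, an RDF instance document of the kind such tools create is itself just XML. Here is a minimal sketch (the RDF and Dublin Core namespaces are the real, standard ones; the resource described is invented for illustration):

    <?xml version="1.0"?>
    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:dc="http://purl.org/dc/elements/1.1/">
      <!-- A machine-readable statement: this resource has a title and a creator -->
      <rdf:Description rdf:about="http://www.example.org/bulletin">
        <dc:title>A Sample Newsletter</dc:title>
        <dc:creator>A. Author</dc:creator>
      </rdf:Description>
    </rdf:RDF>

Tools like the one announced above exist because writing vocabularies and ontologies this way by hand quickly becomes tedious; a graphical editor generates the markup while checking the syntax.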

separator

Wireless

What we have here is a group of silicon companies who want to try to force the standard to be what they are already trying to build  [October 10, 2005]

 

Greg Raleigh, CEO of Airgo, objecting to the current tactics of the newly formed EWC...Full Story

 

And then there were three: A long-building log jam in the IEEE working group that is developing the 100 Mbit/sec+ WLAN standard broke recently, but not through a compromise between WWiSE and TGn Sync, the two rival groups that have been facing off for months, even though companies like Broadcom had forecast just such a resolution as recently as October 3. Instead, there was an announcement only a few days later that a new group of two dozen companies had been formed (including guess who -- Broadcom) called the Enhanced Wireless Consortium. The new group includes many other heavyweights as well, such as Cisco, Intel, and Atheros, but pointedly excludes Airgo Networks, Inc., a leading chip vendor. Ostensibly, the new group was formed to put pressure on the IEEE working group to achieve a compromise agreement on the new proposal -- but the group also threatened to go directly to market with compliant products anyway if IEEE adoption did not occur. Given that all standard setting is consensual, it is not a good omen when those within a process strike out on their own -- especially if they are successful -- except to the extent that the episode leads to greater efforts to reach a compromise solution the next time around.

Wireless LAN Group Offers Spec
By: Stephen Lawson
PCWorld, October 18, 2005 -- The Enhanced Wireless Consortium (EWC) has published its draft specification for high-speed wireless LANs on its Web site, the group has announced. The group went public last week, saying its members had developed a compromise draft specification to help speed development of the IEEE 802.11n standard. The EWC includes some of the biggest players in wireless LANs, such as Intel, Broadcom, Atheros Communications, and Cisco Systems. ...Full Story

separator

Or they could just save weight and leave them home: Congress has just set the dates for the switchover to digital TV -- and for auctioning off the recovered spectrum, which could reap as much as $10 billion for the federal government. But it hasn't given any thought to requiring that those who pick up that spectrum use it to deploy services based on international standards, leaving open the prospect of yet another disjunction that leaves U.S. wireless devices (U.S. devices from cell phones to pet identification RFID tags are already out of synch) unable to work when they go abroad. Just another example of the too-often poor coordination between those things that lie within the control of government and those that arise from private sector actions -- but that involve us all.

Senate Sets Spectrum Standard
RedHerring.com, October 22, 2005 -- A U.S. Senate committee set a final deadline for the switchover of television broadcasters from analog to digital services, and also set a date for the auction of crucial recovered spectrum. There will be a wide range of interest in the spectrum from the mobile voice, mobile data, and wireless broadband industries. [Consultant] Mr. Nordgaard believes that the international implications of the digital switchover and the reuse of spectrum have, surprisingly, not been raised. “Users will want to be able to use their wireless broadband equipment abroad,” he said. “If the U.S. uses spectrum A, and Europe uses spectrum B, then having a roaming device becomes more costly.” ...Full Story

separator

Government and Regulation

The FCC must be prepared to take steps to assure continuity of service to consumers in the event that the parties fail to reach an agreement  [October 10, 2005]

 

US Rep. Edward J. Markey, ranking member of the House Telecom Subcommittee, commenting on Internet disruption caused by a dispute between two ISPs...Full Story

   

My goal is to do all of the work it takes to be explaining to the Supreme Court in 2025 why broadcasting is unconstitutional [October 18, 2005]

 

Open source advocate Eben Moglen, on why the FCC should not be allowed to prohibit open source "mesh" broadcasting by hackers

 

Is you is, or is you ain't a utility? For a long time I've wondered when government would decide that the Internet must be treated like a utility. Recently, a major ISP had the bright idea of making the U.S. government ask itself that same question. Utilities are essential services that governments regulate in order to protect the public. For example, many laws protect consumers from having their heat or water cut off suddenly for non-payment of bills. As more and more of our lives, finances, home systems and everything else come to be controlled via the Internet, can its "utilitization" be far off? Where, when, and especially how that happens will be an interesting and contentious process.

Dispute threatens to snarl Internet
By:  Hiawatha Bray
BostonGlobe.com, October 10, 2005 -- Internet connections could be disrupted for millions of people in Europe and North America as the result of a pricing spat between the world's two major service providers, raising concerns about who governs the global communications network and how it should be regulated. On Wednesday, the Internet service provider Level 3 Communications Inc. of Broomfield, Colo., broke its connections with a major competitor, Cogent Communications Group Inc. of Washington, D.C., effectively throwing up roadblocks for some e-mail communication and access to websites. ...Full Story

separator

So long, we hardly need ye?  Meanwhile, Columbia Law Professor and open source advocate Eben Moglen thinks that it's a hacker's constitutional right to broadcast via open source software, and that the FCC had better not get in the way.

Does Open-Source Software Make the FCC Irrelevant?
By: Daniel Fisher
Forbes.com, October 19, 2005 -- Columbia Law School Professor Eben Moglen wants to destroy the Federal Communications Commission. Not as some kind of terrorist act, but because technology is rapidly making it irrelevant. The agency might have made sense in the 1920s, Moglen says, when it was formed to assign specific frequencies to broadcasters so they wouldn't try to drown each other out by cranking up the transmitter power. ...Full Story

separator

Story Updates

If somebody robs the same bank you do, are you still guilty? The tangled tale of Rambus took yet another twist this month as Rambus sought to gain access to documents that it claims will prove that it was the victim of a price fixing conspiracy. There's no question that there was misconduct among several SDRAM companies, since they've already been hit with enormous fines for their deeds (see the second item below for an example of one of the large fines paid). What Rambus wants to show, however, is that the memory makers' misconduct should excuse its own actions, which remain the subject of an FTC proceeding. According to an opposition filed by the FTC in an attempt to prevent Rambus from gaining access to the documents it seeks, the chip technology company's efforts are simply an attempt to "deflect attention from its own conduct by blaming third parties."

Rambus and a Price-Fixing Tale
Arik Hesseldahl
BusinessWeek.com, October 30, 2005 -- It's a matter of public record that at least three companies participated in a global conspiracy to manipulate the prices of computer memory chips. The U.S. Justice Dept. settled the issue by handing down more than $600 million in fines against the businesses. What isn't known, though, is why they did it. And Rambus, a designer of chip technology, is intent on finding out. Rambus on Oct. 31 will urge a California Superior Court in San Francisco to release documents it says will help in that pursuit. ...Full Story


Samsung Electronics to Pay $300 Mln for Price-Fixing (Update1)
Bloomberg.com, October 13, 2005 -- Samsung Electronics Co. agreed to pay $300 million, the second largest criminal antitrust fine in U.S. history, to settle charges it took part in a global scheme to fix the price of computer chips used in personal computers, mobile phones and other electronic devices. South Korea-based Samsung, the world's largest maker of computer memory chips, and its U.S. unit will plead guilty to conspiring to fix the prices for dynamic random access memory chips, a $7.7 billion market in the U.S. last year. They are also used to make printers, video recorders, game consoles and digital cameras. ...Full Story

separator

No compromise, but many prisoners: The consumer electronics industry's longest-running game of chicken seems destined to end in a consumer smashup, with both sides committed not to blink, even at the last minute, despite pleas from companies like HP, as reported below. How likely is HP to succeed? Well, the same day, Warner Bros. became the second major studio (after Paramount) to announce that it would support both formats. But that doesn't mean supporting both standards on one disk -- that would require a compromise between the two rival format groups. So here we go again, headed towards content vendors having to produce two product lines, video shops having to stock duplicate versions of the same movies -- and you flipping a coin to decide which format your next player should support, without knowing which will be the winner, and which will eventually be the loser. Thanks, guys!

HP Tries To Bridge Blu-ray, HD-DVD Formats
By:  Spencer Chin
EE Times, October 21, 2005 -- In the latest attempt to unify the divergent Blu-ray Disc and HD-DVD optical disk formats, Hewlett-Packard Corp. has formally appealed to the Blu-ray Disc Association to incorporate two key technologies in the format. So far, the Blu-ray and HD-DVD camps have succeeded mostly in polarizing entire industries along separate camps. ...Full Story

separator

Open Document

OpenOffice.org is on a path toward being the most popular office suite the world has ever seen  [October 21, 2005]

 

Sun Microsystems President and CEO Jonathan Schwartz, on the release of OpenOffice.org 2.0...Full Story

separator

Downloadable to a computer near you: After a last minute delay, OpenOffice.org 2.0 was released to much fanfare from OOo fans and the press. As of October 4, over 47 million computer users had downloaded copies of earlier versions of OpenOffice.org's free, open source office suite. OOo community development manager Louis Suarez-Potts estimates that with the release of 2.0, this number will rapidly rise to over 100 million. (To get your own local language copy of OOo 2.0, simply visit the OpenOffice 2.0 download page.)  Meanwhile, as the quote above and the second story below indicate, Sun is doing everything in its power to give OpenDocument-compliant products a boost against Microsoft Office – even at the expense of its own OpenDocument format-compliant office suite (StarOffice 8.0), which, as reported in the third item below, was the first compliant version to reach the marketplace.

OpenOffice.org 2.0 Is Here
OpenOffice.org, October 21, 2005 -- OpenOffice.org 2.0 is the productivity suite that individuals, governments, and corporations around the world have been expecting for the last two years. Easy to use and fluidly interoperable with every major office suite, OpenOffice.org 2.0 realizes the potential of open source. Besides a powerful new database module and advanced XML capabilities, OpenOffice.org natively supports the internationally standardised OpenDocument format, which several countries, as well as the U.S. state of Massachusetts, have established as the default for office documents. More than any other suite, OpenOffice.org 2.0 gives users around the globe the tools to be engaged and productive members of their society. Available in 36 languages, with more on the way, and able to run natively on Windows, GNU/Linux, Sun Solaris, Mac OS X (X11) and several other platforms, OpenOffice.org banishes software segregation and isolation and dramatically levels the playing field. And, with its support for the OASIS Standard OpenDocument format, OpenOffice.org eliminates the fear of vendor lock in or format obsolescence. The OpenDocument format can be used by any office application, ensuring that documents can be viewed, edited and printed for generations to come. ...Full Story
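
A large part of what makes that last claim credible is that an OpenDocument file is simply a zip archive of ordinary XML files that any vendor can parse. As a rough illustration (a hand-trimmed sketch of the content.xml inside a minimal text document, omitting declarations a real file would include), a one-paragraph document reduces to something like this:

    <office:document-content
        xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0"
        xmlns:text="urn:oasis:names:tc:opendocument:xmlns:text:1.0">
      <office:body>
        <office:text>
          <!-- A single paragraph of body text -->
          <text:p>Hello, OpenDocument!</text:p>
        </office:text>
      </office:body>
    </office:document-content>

Because the format is openly specified down to this level, a document written today should remain readable by any conforming application tomorrow -- which is the heart of the "no vendor lock-in" argument.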

Sun puts patent weight behind OpenDocument
Tom Sanders
VNUnet.com, October 4, 2005 -- Sun Microsystems has promised not to enforce any of its patents covering the OpenDocument format. The Oasis Open Document Format for Office Applications is a standard backed by Adobe, IBM and Sun. ...Sun's patent support for OpenDocument is different from a move the company made earlier this year, in which it pledged 1,670 patents in support of any software governed by the open source Common Development and Distribution Licence. Simon Phipps, Sun's chief open source officer, said: "Previous attempts at patent protection using the 'patent commons' approach glorify patents, forcing anyone who would benefit from the apparent protection to become a patent expert. A blanket statement like this just says: 'no need to look, you're safe.'" ...Full Story

Sun Microsystems' StarOffice 8 Provides A Suite Alternative
By: Sean Doherty
InformationWeek, October 9, 2005 -- How do you compete against Microsoft in the office-productivity software market? You could give up, like WordPerfect did long ago. Or you could try harder, as Sun has done. Sun Microsystems' StarOffice 8 Office Suite gives users a similar word- and number-crunching experience to Microsoft Office, but adheres to open standards, runs on Linux and Solaris (as well as Windows), and costs less than $100. ...Full Story

separator

Who's Doing What to Whom?

Ultimately, there has to be one open standard. Not a couple of them  [October 25, 2005]

 

Punk, Ziegel & Co. analyst Steve Berg...Full Story

separator

"Fat access?" I'm sure we said "Thin access:" As anyone who has spent any time in standard setting knows, the opinions on one proposal over another of those actually involved in the process at hand often involve (how to say this in a non-judgmental away) more than just technical superiority. The following story involving a slow-moving standard at the IETF demonstrates not only that individual companies can and do push for results based on more than technical excellence, but that if the standards process drags on long enough, the same companies can come late to the realization that an opposing camp's proposal looks pretty good after all (usually just after their own product strategy has changed.)

LWAPP wireless standard back from the dead
By: Peter Judge
Techworld, October 22, 2005 -- The lightweight access point protocol (LWAPP) is back from the dead. It is now the leading proposal for multi-vendor wireless LANs, according to an IETF standards group. In 2003, network managers wanting a wireless LAN to cover their building were offered wireless switches as an alternative to stringing together standalone access points. The switches were often criticised because the "thin" or "dumb" access points they used were proprietary. ...Full Story

separator

BioIT

I3C is dead; long live HUPO:  One of the areas outside ICT that has begun to adopt consortium-type processes is biotechnology.  This is not surprising: bioIT is a heavy user of information technology, some of the challenges it faces can best be addressed through consensus activities (e.g., genomic development), and the field has deep academic roots.  The first item below reports on one example of an IT standard created by and for biotech use.  But while bioIT standards activities continue to move ahead strongly in other venues as well, such as the W3C, HL7 (Health Level Seven) and CDISC, it appears that the I3C has quietly died in the dark, apparently the result of the Semantic Web's potential to better solve the problems which the I3C was originally created to address.

Report from HUPO 2005 Munich
By: Sandra Orchard, Henning Hermjakob, and Rolf Apweiler
BioITWorld.com, October 4, 2005 -- The Proteomics Standards Initiative session of the Human Proteome Organization (HUPO) 2005 congress was opened by the current chair, Rolf Apweiler of the European Molecular Biology Laboratory - European Bioinformatics Institute (EMBL-EBI), who summarized the achievements of the HUPO Proteomics Standards Initiative (HUPO-PSI) to date. The Molecular Interaction XML interchange standard is already widely used, and all of the major publicly available databases now make data available in this format [1]. Five of these databases -- BIND, DIP, IntAct, MINT, and MPact (MIPS) -- have formed the International Molecular Exchange consortium to jointly curate and exchange data, with data exchange commencing early in 2006. ...Full Story


I3C: Missing in Action
By Salvatore Salamone

Bio-IT World, October 28, 2005 -- 
Sometime within the last year, the Interoperable Informatics Infrastructure Consortium (I3C) quietly disappeared. Sadly, perhaps, almost nobody noticed.  Researchers and vendors launched the I3C with the noble goal of developing interoperability standards for the life sciences that would make it easier to access, exchange, and share data…. So why did the I3C just vanish with so little fanfare? Opinions among some of the I3C founding members vary, but the consensus is that the work of the I3C is being carried out today in other standards bodies. ...Full Story

separator

Standards are Serious (Aren't They?)

Department of "Huh?"/Quote of the Month: 

In an open-standards environment, the interfaces are stable and it promotes collaboration and interoperability, whereas open source is just like a bunch of kittens tied together with rubber bands [that] move off in sort of many different positions  [October 19, 2005]

 

Nevada CIO Ted Savage, explaining open source software...Full Story

 

 

 