ConsortiumInfo.org Consortium Standards Bulletin – May 2005

 

May 2005
Vol IV, No. 5

Revolution Time (Again)

EDITOR'S NOTE: HERE WE GO AGAIN
   
EDITORIAL: THESIS, ANTITHESIS (SYNTHESIS?)
Twenty years ago, we had one, traditional, global standard setting system. Then, with the rise of consortia, the IT industry had two systems. We still do, and perhaps not the best of either.
   
FEATURE ARTICLE:

STANDARDS, CYCLES AND EVOLUTION: LEARNING FROM THE PAST IN A NEW ERA OF CHANGE

Throughout most of the Twentieth Century, the world worked to create a rational, centralized, democratic system of standard setting. Then, a significant portion of one industry opted out when the demands on that system changed. Now, the system is once again under stress. What will happen this time?
   
TRENDS:

THE RISE OF THE METASTANDARD CONSORTIUM

Standard setting organizations, like any other type of entity, can get trapped in stovepipes. It takes a new type of organization to come up with the standards-based solutions demanded by a modern, networked world.
   
STANDARDS BLOG:

ALL SOCIAL STANDARDS ARE LOCAL (OR ARE THEY?)

Throughout most of history, business behavior, like other behavior, was affected by social standards. What happens when business consolidates and quarterly profits are all that matter?
   
NEWS SHORTS: THIS MONTH'S TOP STORIES
A Subcontinental Open Source License? W3C to Enable Web Access for Mobile Devices; EU Parliament Continues to Wrestle with Software Patents; Microsoft Says "Let's Talk" to Open Source Community; White House Says Democrats Not Welcome at Standards Meetings; Vendors Scramble for Chinese 3G Business; WiMax Vendors Jump the Standards Gun; and, as always, much more





EDITOR'S NOTE:

HERE WE GO AGAIN

You say you want a revolution
Well, you know, we all want to change the world

 
John Lennon/Paul McCartney, “Revolution”

Systems exist in two states: stasis and change. In the technology area, one might argue that only the latter actually exists in nature, and that if two states exist, they are instead evolutionary change and revolutionary change.

When technology is in the revolutionary stage, there is a need for standard setting to follow suit. Right now, we are in such a stage, and that is the topic of this issue.

In our Editorial, we examine how radical shifts to new types of systems can be compared to their precursors in order to create a synthesis that is a better platform than either.

In our Feature Article, we look at how the last major shift in standard setting occurred in the late 1980s as a reference point for what is to come.

Our Trends piece, in turn, examines a new type of organization – the MetaStandard Consortium – that has recently evolved to meet the new demands of a modern, networked world.

And finally, in this month’s Blog entry, we examine how social standards of business behavior are being replaced, in a similarly radical change, with formal, modern corporate responsibility standards.

As always, we hope you enjoy this issue.

    Best Regards,
 
  Andrew Updegrove
  Editor and Publisher
     


EDITORIAL

THESIS, ANTITHESIS (SYNTHESIS?)

Andrew Updegrove

Some 200 years ago, the German philosopher Georg Wilhelm Friedrich Hegel set himself a daunting goal: to develop a philosophical theory that could be used not only to explain the past, but to predict the future as well.

Hegel believed that the universe was governed by a rational and positive process, and consequently that his goal would be achievable, if only the universe could be sufficiently understood. To achieve that ambition, he adopted a mode of thinking that, while different in technique, made him a kind of ancestor to Einstein and his successors, who embarked on a quest to refine the partial explanation of natural laws known as Newtonian physics into an empirically provable "Theory of Everything".

Hegel posited that human history demonstrated an ongoing contest between competing theories, each of which claimed to explain reality. His revelation was that such theories could be used as tools to reach a fuller understanding of life, even if each theory was known to be imperfect as a starting point. The method he developed to achieve this end has come to be known as the "thesis/antithesis/synthesis" model of analysis.

In this methodology, our understanding of reality is refined by first describing what appears to be objective reality (the thesis). Once the thesis has been explicated, its opposite (the antithesis) can also be described. True reality may be assumed to lie somewhere between these two extremes. By critically comparing thesis and antithesis, a truer understanding of reality may be achieved and described (the synthesis). The synthesis then becomes the thesis upon which the next round of the exercise is based, and the process is continued in asymptotic fashion to progressively narrow the gap between human understanding and objective reality. Hegel referred to the elusive, ultimate truth towards which this search was directed as the "absolute idea."

Hegel's methodology works best in disciplines where the goal is to understand ostensibly timeless values, such as in science ("what is light?") and philosophy ("what is truth?"). In human history, however, an interesting inversion occurs: systems often evolve from an original thesis towards what seems to be a sort of "absolute idea", and become most likely to breed a new antithesis just when that absolute idea appears to have been attained.

This phenomenon can be seen most obviously in political systems. If the leadership of a chief in a tribe makes sense, then why not that of a prince in a city-state, and ultimately a king in a nation, each with ever more absolute power? However, while the concentration of power into rigid hierarchies can bring relative stability and secure borders, it may also lead to unrest when that power is abused. In any particular case, this can lead to rebellion. Sometimes, the result can be a new antithesis that represents a new political system entirely: democracy, socialism or communism.

The same dynamic can also be observed in quasi-political systems, such as religion (think of the Reformation), and, indeed, in standard setting. Since the dawn of recorded history, elementary standards have existed (weights, measures, coinage, and so on). With the advent of international trade and more sophisticated technologies, a broader range of tools was needed, and a global, hierarchical system rapidly evolved that was capable of creating universally recognized and implemented standards of many types.

But as that standard setting system became more complete, it also became more bureaucratic. As it was global, it was also difficult for any individual company to influence an outcome too greatly, or even for any group of companies to affect results too easily. When the pace of information technology (IT) innovation accelerated and the matters at stake became highly strategic, the traditional system was not deemed to be adequate by some IT vendors that were anxious to achieve the full commercial promise of their new inventions, or to further their strategic interests.

The result was a fragmentation of the IT standard setting infrastructure that has been at once creative, frustrating, fruitful and inefficient. Multiple structural and procedural antitheses (e.g., consortia and open source projects) have not only been posited, but also implemented hundreds of times over, with demonstrable success. In some instances, synthesis can be observed in the field as well. For example, a number of open source projects (e.g., the Eclipse Foundation) now operate on top of a consortium-like infrastructure; many global consortia (e.g., the W3C and OASIS) are becoming indistinguishable from their accredited, national brethren; and some accredited bodies (e.g., ASTM and IEEE) now welcome members from beyond their historical, national borders.

But while examples of synthesis can be observed, the process of combining the best of both the new and the old methodologies has not, in our view, been completed. As a result, while consortia offer much that is desirable to those that join them, they do not offer many of the benefits that the traditional standard setting system offers, such as: the ability to centralize work in one place; the ability to coordinate related work within a single system under similar rules; greater resources and political influence; increased likelihood of global adoption; and more. Both the thesis and the antithesis still coexist, and synthesis has not yet been achieved. Indeed, there is not even any current movement in the IT industry to achieve it.

Of course, the technology world is hardly static, and therefore the question of "how should standards be set?" can never be answered in the same way that the answer to the question "what is light?" may, someday, be known. If, by some remote chance, we ever got the standard setting process "right," that virtuous state would be likely to evanesce almost before its perfection had been realized.

But the real world, unlike the virtual spheres of philosophy, does offer one advantage: the organizations that set standards offer real-time data that can be observed. Thus, as those that push and jockey in the marketplace play out their experiments of thesis and antithesis, we can work towards, if not an Absolute Idea, at least a more introspective and productive synthesis upon which to do the work of the future.

Perhaps the road into the 21st century would be smoother if we paid closer attention to the lessons of a 19th century philosopher.


Copyright 2005 Andrew Updegrove



FEATURE ARTICLE

STANDARDS, CYCLES AND EVOLUTION:

LEARNING FROM THE PAST IN A NEW ERA OF CHANGE

Andrew Updegrove

Abstract: History includes times of gradual evolution as well as sudden revolutionary change. Standard setting experienced a burst of revolutionary change in the late 1980s, when the information technology industry found the traditional, global standard setting infrastructure to be inadequate, leading to the development of a new type of standard setting organization: the consortium. Today, the consortium infrastructure is itself proving to be inadequate to the demands of a modern, networked world, and new structures will need to evolve in order to meet those needs.

Introduction: By 1985, a century-long process of rapid evolution in standard setting culminated in an orderly, hierarchical, global infrastructure – a mature and respected industry in its own right. This infrastructure was as broad as it was deep, covering not only every type of manufacturing industry, but food, telecommunications, safety, and most other aspects of modern life as well.

Twenty years later, that infrastructure continues to serve traditional industry well, but has been challenged, and in some cases abandoned, by those wishing to enable new information technology (IT) standards. What happened?

No single factor caused this fragmentation in standard setting. Instead, a number of new market dynamics contributed to the result. Those factors included a desire for greater process speed to match the rapid evolution of technology; increasing national and regional economic competition; the recognition that standards could often convey strategic advantage for those that had the greatest influence on the results; the recognition that success in some types of standard setting would also require marketing support; and efforts to confront the dominance of early market leaders in discrete product areas, such as operating systems.

The most obvious resulting challenge to traditional standard setting was presented in the late 1980s by the rise of consortia. As time progressed, many consortia became indistinguishable from accredited organizations, and their standards enjoyed increasing respect, often being referenced even by the traditional, global standards organizations that they were challenging.

Today, the global standard setting infrastructure (including these newer organizations) is finding itself once again confronted by a host of new challenges, including:

  • The convergence of information and communication technologies (ICT)
  • The rise of open source as a development model
  • Ongoing testing of the effectiveness of WTO agreements to prevent the use of standards as technical barriers to trade
  • Desires to reap the full potential of a globally networked world
  • Rising political recognition of the importance of the Internet and the Web and society’s dependence upon it
  • Spam, phishing and other Internet abuses that perhaps may only be satisfactorily defeated through coordinated, global government action
  • A new convergence of life sciences and IT

When the last wave of challenges confronted the standard setting infrastructure, new solutions evolved organically, allowing the marketplace to adapt in an adequate, if chaotic fashion. Much was gained in the process, but perhaps some benefits of the old system were lost as well. How will our world, which is even more dependent on standards today than it was twenty years ago, fare as the system is called upon to rise to these new challenges?

In this article, we review the standard setting infrastructure that existed in the late 1980s, and examine how that infrastructure failed to adequately accommodate that first wave of IT based innovation. We conclude by presenting a similarly critical review of the standard setting infrastructure of today, and offer some thoughts on how that system must once again evolve to meet the new challenges of tomorrow.

The old world order:1 In one of the less appreciated accomplishments of modern society, the developed nations and industries of the world created an extremely sophisticated, global, consensus-based process that successfully produced a bewildering array of standards – and continues to do so today.

More impressively, these same stakeholders recognized at an early stage that all would benefit if many kinds of standards were implemented globally, rather than nationally or even regionally. The result was the creation of a number of international organizations that were joined voluntarily by scores (and in some cases, even by the majority) of the world’s nations. Those that became central to the eventual IT industry were the following:

          The International Organization for Standardization (commonly referred to as ISO; a Greek derivative, rather than an acronym): Established in 1946, ISO was formed to enable the creation of industrial standards of every type through voluntary participation by national representatives (“Member Bodies”, where a national standards body exists, and “Correspondent Members”, in the case of countries that lacked a nationally chartered body). In 1985, ISO had more than 80 Member Bodies, some 70% of which were actual government agencies, or entities created by law.

In the late 1980s, the formal structure of ISO involved a General Assembly that met on a triennial basis, overseeing a Technical Board, Council and Executive Board. The Council in turn supervised a Central Secretariat (with a large staff, resident in Geneva, Switzerland), six standing Committees of the Council, and, at the bottom of the organizational structure, over 160 Technical Committees and joint Technical Committees.

Under the ISO system, new technical committees may be formed, but only if no existing committee is deemed to be suitable for the task. As of 1989, there were approximately 2,400 committees, subcommittees, working groups and study groups in ISO.2

          International Electrotechnical Commission (IEC): While its scope is limited to electrical and electrotechnical disciplines, the IEC is nevertheless of great importance in the modern world. Besides having a narrower focus than ISO, the IEC's methodology is also somewhat different: it creates “specification standards”, or minimum requirements, rather than interoperability standards.

Like ISO, the IEC is headquartered in Geneva, Switzerland, includes national bodies as its members, and has a similar organizational structure. Unlike ISO, however, every member nation is entitled to have a vote on every Technical Committee – even those in which it has not opted to actively participate. Formed in 1906, the IEC had some 40 full members (with additional informational members) in the late 1980s. In 1988, the IEC hosted 82 Technical Committees (including two shared with ISO), and more than 100 subcommittees.3

          International Telecommunication Union (ITU): Unlike ISO and the IEC, the ITU is the creation of an international treaty, rather than a body in which interested nations voluntarily participate. Similarly, it creates regulations, which are mandatory, rather than mere standards, which are voluntarily utilized (or not). Perhaps not surprisingly, the ITU has the largest membership of the three “Big Is”, with some 160 members during the period in question. Like ISO and the IEC, its headquarters is in Geneva, Switzerland.

The purview of the ITU is radio, telegraph and telephone regulation, and its work plan extends beyond standardization to coordination and planning. The ITU traces its origins to 1865, making it the most venerable of the three Big Is. At the top of the ITU organizational stack is the “Plenipotentiary Conference”, which during the period in question met at infrequent and irregular intervals. Below the Plenipotentiary Conference (in order) were an Administrative Council, followed by a layer of “Conferences”, a General Secretariat, two Plenary Assemblies, two Secretariats, and (finally) multiple Study Groups.4

Supporting and participating in these three global organizations were scores of national standards bodies. With the notable exception of the American National Standards Institute (ANSI),5 virtually all of these national organizations host most, or all, of the standard setting activities that are conducted within their respective national borders.6

On the cusp of change: In 1985, then, God was in his heaven, and all was largely right in the standard setting world, securely headquartered in Geneva, that most comfortingly neutral of all locations. The hallmarks of this highly evolved system were order, coordination, control, and predictability. None of which was surprising, given that this infrastructure had developed concurrently with the efforts to create the League of Nations, and then the United Nations; the needs it had been built to serve; and the concessions to be expected in order to create consensus.

But there were also costs to this centralized system, suggested by the multiple layers of hierarchy and the default mode of participation at the national level. These costs included the dilutive demands of consensus, and the length of the development and adoption process. In 1988, for example, the average process in the IEC took eight years to complete. In the case of screw threads (the subject matter of ISO TC 1, the first of the sequentially numbered ISO Technical Committees), such a gestation period was considered to be tolerable.

In the beginning, those seeking to set information technology standards took the existing process for granted and in stride. Technical Committee 97 (“Computers and Information Processing”) was formed by ISO in 1960. At about the same time, the European Computer Manufacturers Association (ECMA) was also launched. Partially in response, the Accredited Standards Committee for Information Processing Systems (X3) was formed by ANSI to address similar technical matters in the United States.7

In 1987, ISO and the IEC recognized the need to avoid duplication and competition in the computer area. The result was the formation of Joint Technical Committee 1 (JTC 1), into which the activities of ISO TC 97 and two IEC subcommittees were merged. Given the preponderance of computer-field innovation then taking place in the United States, ANSI was offered, and accepted, the role of Secretariat for JTC 1. ECMA in turn was offered a liaison relationship, and each of the significant organizational constituencies was, to one extent or another, thus accommodated. Most of the participants in the activities of JTC 1 were standards professionals, well versed in the ways of traditional standards development.

Superficially, then, the infrastructure needed to meet the standardization needs of IT vendors and customers had been provided for, just as those of other emerging industries had been met within the existing system. But then, things began to change.

The rise of a new world order: 1987 was the last year during which the old order held uncontested sway in the field of IT standard setting. By the end of the decade, a dramatic shift in the center of effort had begun, with the launch of a trickle, and then an increasing flood, of new organizations that were neither governmental in membership nor accredited in process, and that did not anticipate eventual endorsement of their output by any of the Big Is.

Speed in delivery has often been cited as a reason for this faulting in the world of standard setting, but it was hardly the only factor and, especially in the beginning, not necessarily even a major one. One need only look at the initial organizations formed to discern a more compelling motivation.

For example, many of the early consortia focused on a single area: operating systems. And a disproportionate number of these organizations were formed and/or joined by a core group of U.S. hardware vendors, such as IBM, HP, Motorola, NCR, Sun, and Unisys.

Instead of speed, what concerned these companies most was the rising dominance of Microsoft,8 and, to a lesser extent, factors such as the increasing unity of European competitors, acting both regionally and through ISO. Other fears were in the air as well, including the success of “Japan Inc.” in seizing market share from American semiconductor vendors. U.S. companies complained that they were unfairly hampered by domestic antitrust laws in comparison to their Asian competitors.

Congress responded with passage of the National Cooperative Research Act of 1984.9 With Ronald Reagan's re-election in the same year, a more business-friendly antitrust attitude settled into Washington. In 1987, SEMATECH was created by 14 U.S. companies – and the U.S. government – to collaboratively develop the wherewithal to turn the tide of trade in the semiconductor industry.

The stage was set, therefore, for American companies to think outside the traditional competitive box, and to envision other types of collaborative solutions (e.g., joint research and development), undertaken by self-selected groups of companies to meet commercially identified challenges.

The result was the creation of an early crop of consortia striving to bolster UNIX in general, and the platform of one or more companies in particular. Examples included 88open (creating hardware and software standards supporting and promoting the UNIX-based Motorola 88000 RISC microprocessor)10, and SPARC International (supporting a competing Sun Microsystems RISC microprocessor). Each was an effort to break the tightening “WinTel” stranglehold that Microsoft and Intel were enjoying in the marketplace. Platform vendors viewed the wealth created by Microsoft’s proprietary ownership of the core desktop operating system with envy. And as Microsoft planned to enter the server market, there was fear as well.

One strategy to meet this challenge through standards involved the concept of creating “open systems” that would be able to interoperate, regardless of the operating system upon which each was based. The effort to create a set of Open Systems Interconnection (OSI) standards, however, resulted in a system that was too constraining, and that was a business failure in consequence.

This meant that the focus needed to be on the operating system – and happily, there was a (more or less) appropriate alternative available upon which all could focus to stave off the WinTel juggernaut as it prepared to move above the desktop: the UNIX operating system developed and owned by AT&T.

Of course, at the same time that hardware and silicon vendors were concerned about the advance of WinTel, they were also continuing to compete fiercely among themselves. When AT&T and Sun entered into a cooperation agreement in 1987, other UNIX-dependent vendors banded together to form the Open Software Foundation, with the goal of creating a new, UNIX-based operating system of their own. Sun and AT&T responded with the formation of their own organization, called UNIX International, and the so-called “UNIX Wars” were on.11

None of this activity, of course, had anything to do with traditional standard setting in the sense of collaboratively seeking consensus-based solutions to common problems. Instead, a speedy solution among a limited group of like-minded companies, supported by a common marketing front, was the strategic goal – hardly the stuff of a traditional, accredited standard setting process.

While it would be too simplistic to attribute the rise of consortia solely to UNIX Wars strategies and an industry floundering in its efforts to avoid being steamrolled by Microsoft and Intel, the consortia formed for precisely such reasons were a significant factor in breaking with tradition, and in creating a comfort factor and familiarity with this novel way of collaborating. Other organizations of significance, such as the Object Management Group, or OMG12 (formed to facilitate the proliferation of object oriented programming through the development of appropriate standards), soon followed, each founded to pursue standards-related goals outside of the traditional, accredited standards development organization (SDO) infrastructure. Often, these new efforts were launched as independent, non-profit, tax-exempt organizations created not only to develop, but also to maintain, standards throughout their effective lives.

As would occur fifteen years later with the advent of open source (for different reasons), those interested in pursuing specific collaborative goals in the late 1980s found the existing system to be inappropriate, inadequate or unwilling to meet their purposes. Their solution was to simply opt out of the existing system and create one of their own.

Adaptation: The consortium genie was well out of the bottle by 1990, with consortia multiplying at the expense of JTC 1 and other SDOs active on IT matters. Rather than launch a new activity within an existing SDO, technology companies increasingly started a new organization to do the specific job at hand, with whatever degree of process, budget and help they desired. At one end of the spectrum, such an effort might involve a small, self-selected cadre of vendors engaged in a joint development project that they hoped would give birth to a de facto standard. Under this model, additional participants, if any, would be admitted only by unanimous agreement. At the opposite (and more common) end of the spectrum, as with OMG, the broadest international participation was the goal, and a formal process was created to achieve respected results.

As the IT industry became more important and traditional SDOs lost market share to consortia, SDOs adapted in various ways to make their process appear more competitive. These adaptations included the creation of the “publicly available specifications” (PAS) process by the IEC, under which a specification created by a consortium could achieve a degree of recognition by the traditional standards infrastructure (and by means of which the IEC, in turn, could become associated with more cutting-edge IT standards). However, the response to the PAS alternative was tepid.

The advent of the Internet and the Web dealt a further blow to the traditional SDO process, as speed truly became an issue. Even consortia were deemed to be too slow to be useful for a time, and de facto standards that could be seized upon with no process at all became (briefly) more attractive than those that were more open.

But even before the advent of the Internet bubble years, an “Animal Farm”-like process had set in, with many consortia becoming progressively more like their SDO brethren in process, scope and self-image (and, at times, in speed as well). Today, governmental agencies and universities as well as commercial entities are significant participants in many consortia, and the standards that the more significant consortia create are often referenced, and even solicited for adoption, by many SDOs. Consortia, in turn, are more often being urged by their members to seek formal acceptance of their standards by global standards organizations in order to facilitate the sale of products in those parts of the world (such as China) where a “Big I” imprimatur continues to carry a higher value even in the IT space.

At the same time, many traditional SDOs in the United States, such as the Institute of Electrical and Electronics Engineers (IEEE) Computer Society and ASTM International (formerly the American Society for Testing and Materials), each of which is accredited by ANSI, have come to accept foreign as well as domestic members.

In consequence, the result of the first wave of change in the IT area has been not so much a permanent schism as a blurring of edges and a gradual convergence of the two parallel systems. This blurring has resulted from new consortia as well as venerable SDOs learning from each other, competing with each other, and dynamically evolving to better address the needs and expectations of their respective stakeholders.

New challenges: Of course, while SDOs and consortia were each adapting to the initial dynamics that led to the creation of consortia in the first place, new (and equally profound) changes in the marketplace were emerging. Today, the collective weight of these new stresses on the SDO/consortium standards infrastructure is once again (in this author's view) approaching a tectonic tension comparable to that which existed in the 1980s. Already, the advent of open source has opened a third road to the creation of “commonalities”,13 and further revolutionary, as well as evolutionary, changes may be in prospect.

These new challenges include the following:

          ICT convergence: While the quest for interoperability in IT has always been challenging, the advent of the Internet, the Web and mobile devices has multiplied the challenges enormously. Not only must hardware and software interoperate within a local area network, but they are now being called upon to do so over a global telecommunications backbone. Similarly, a single wireless device is now expected to host virtually any type of activity that heretofore would have been undertaken solely on a camera, desktop computer, game console, DVD player, or telephone.

Historically, most IT challenges could be addressed in a fairly narrow technical and economic context. Convergence of this magnitude, however, gives rise to a host of new issues, of which the following are only a sampling:

  • The standards required to permit such a range of activity involve multiple industries, each with its own standard setting, intellectual property and economic customs. Often, each industry assumes, or demands, that others conform to its expectations. Obviously, only one camp can win at this game, resulting in a difficult route to resolution.
  • Power structures are also at risk: not only are hardware and software vendors meeting telecommunications players at the standards table and having to deal with their concerns, but telecommunications carriers are confronting the fact that Voice over IP (VoIP) services – based on IT technology rather than classic CT technology -- have now become commercially viable, creating new stresses on an industry that already is struggling with fierce competition and razor-thin profit margins.
  • With thousands of patents potentially being infringed by a single mobile device, the tolerance for payment of royalties becomes increasingly slim. In industries where gaining revenue from insertion of proprietary technology into a standard has historically been a strong motivation for parties, this reality is unwelcome. Similarly, in industries where patent pools have not historically been used, addressing the problem through this (otherwise useful) mechanism is unfamiliar. Finally, differences in the patentability of software between the United States and other countries create further strains.
  • Standards may become more difficult to create when what they enable must work on more types of devices (e.g., how do you fit a browser on a mobile phone? How do you display content that will be usable on both a 2-inch display and a 20-inch monitor?)
  • Due to recent court cases, intellectual property rights (IPR) policies have become both contentious in development and tedious in implementation. Harmonizing rules between two organizations sharing a liaison relationship is difficult enough. Coordinating a joint development process with discordant policies is much more so.
  • Most IT standards are set by consortia, while most telecom standards are set by SDOs. The intellectual property policies of each type of organization, while they have similarities, are less alike across this divide than between two organizations of the same type.

One virtue of the traditional standard setting system was that its centralized, multi-layered structure made issues such as the above more likely to be identified in advance, and provided mechanisms whereby multiple efforts could be coordinated and conflicts resolved. The consortium system, instead, relies on a sort of neural network of liaison relationships of varying degrees of efficiency. Each of these liaison relationships, many of which are ad hoc or dependent upon a single volunteer member representative for maintenance, represents a potential point of weakness as well as an opportunity for communication.

          Open Source: The rise of open source as a development model has challenged consortia in something of the same way that consortia challenged SDOs. This time, however, instead of SDO members striking off on their own, it was the employees of the members themselves that became the pioneers. Unresolved issues in this area include the following:

  • As the open source methodology becomes more commercially important, vendors are investing ever-greater resources in, and placing ever-greater strategic reliance on, the future of open source. With this increasing investment will come (presumably) a greater desire for control that will run counter to the individualistic, democratic roots of open source, perhaps sapping its creative energy.
  • Open source as a methodology is still in its infancy from a process perspective. Significant challenges remain to be worked out, such as how inadvertent infringement of IPR will be avoided; how open source and open standards organizations can and will work together; and whether the “benevolent dictator” model personified by Linus Torvalds will persist (and if so, how issues such as dependency, succession and trust will be resolved on a particular, as well as a systemic, basis).

          Public policy: The potential of the Web has attracted political attention at the global level. How will technical process values and the perceived interests of society be balanced?14

          Regulatory power: In recent years, governments have focused more on the improper use of standards to erect barriers to trade than on the lack of effective world standards. But standards-based efforts to control email abuses such as spam and phishing may be ineffective unless those solutions can be deployed universally. Ultimately, the power of an organization such as the ITU, rather than the apolitical powers of a consortium, may be needed to achieve an effective solution.

          New areas for standardization: New technical areas such as nanotechnology and bioinformatics are beginning to require standards. Will existing methodologies be adequate, and if so, which ones? How will cultural and other issues be addressed with the convergence of such disparate disciplines as IT and genomics?

Conclusions: In many ways, the best way to ease the tension between current market forces and the extant standard setting infrastructure may be to hearken back to the lessons of the past. Many of the challenges described above derive from the need to achieve consensus among broader constituencies. Like a teenager who rebels against authority and then returns to many of the values that seemed most constraining when she becomes a parent herself, it may be that the pendulum in IT standard setting will swing back towards a greater degree of global coordination.

To date, however, consortia have shown little desire to form a common organization, or to formalize any other sort of analogue to the SDO hierarchy. The result has been a greater degree of duplication (sometimes productive and sometimes not), less interoperability in output, and greater opportunities for individual companies (or groups of companies) to have disproportionate influence on technical outcomes.

All of which is not surprising, since consortium staffs are both small and preoccupied with the challenges of accomplishing their internal tasks. More significantly, the largest technology companies seem to be largely untroubled by this situation. Without pressure from those that pay the bills, it is unlikely that consortium managers will take the initiative to form any new congress of consortia.

And, in truth, there is a great deal of flexibility in the consortium model, and this flexibility has allowed solutions addressing the weaknesses in the current infrastructure to be created in particular cases, if not systemically. Examples include the evolution of consortium-like platforms (such as the Open Source Development Labs (OSDL) and the Eclipse Foundation) to support further progress with open source, as well as the creation of “MetaStandard Consortia” (see this month's Trends article) to address complex technical/business challenges through the creation of profiles and roadmaps of standards created by other consortia.

Whether such evolutionary innovations will be adequate to meet the need remains to be seen, but there are several reasons to believe that they may be. First, the non-SDO system is less rigid, and therefore more able to address new challenges without the need to create an entirely new system outside the existing infrastructure. Second, the non-SDO system is more results-oriented, and is perfectly happy to create not only standards, but reference implementations, test suites, open source software, registries, and whatever other commonalities are needed to accomplish a given goal. In truth, a new consortium concept is far more likely to come from the less constrained imaginations of the marketing and sales side of the corporate house than from an engineering lab.

Third, with a consortium, victory is not assumed (especially when more than one consortium has been formed to address the same challenge), as it was by SDO members twenty years ago, when everyone knew exactly which technical committee of which Big I would be assigned to create a given standard. Consequently, consortium efforts tend (at least at times) to be more attuned to customer needs.

Still, one cannot help but look back with some nostalgia to a time when the IT industry was a fully committed member of an acknowledged, global system, created for the express purpose of developing non-competing, rationally related, universally adopted standards. Perhaps a way can be found to create structures that would allow the IT industry to enjoy more of the benefits of such a system in the future, without once again becoming constrained by the lack of flexibility that led to the fragmenting of the IT standard setting world some twenty years ago.


Copyright 2005 Andrew Updegrove

Endnotes

1. For a more in-depth history of the organizations discussed below, see Carl F. Cargill, Information Technology Standardization: Theory, Process, and Organizations, Digital Press, 1989. Vintage factual data included in this article that is not otherwise attributed is derived from pages 125 – 148 of this book.

2. Today, ISO has 99 Member Bodies, 41 Correspondent Members, and 10 Subscriber (“small economy”) members. ISO's current organizational structure may be viewed at: http://www.iso.org/iso/en/aboutiso/isostructure/isostr.html

3. As of this writing, the IEC has 52 full and 13 Associate Members. The current organizational structure of the IEC may be viewed at: http://www.iec.ch/about/struct-e.htm

4. The ITU today has 189 member states. Like ISO and the IEC, its structure is now somewhat different, and may be viewed at: http://www.itu.int/aboutitu/structure/

5. ANSI follows a “federated” model of approving “American National Standards” created by individual United States standards development organizations with processes approved by ANSI. Despite being a non-profit entity independent of the United States government, ANSI is acknowledged as the national representative of the US in ISO and the IEC.

6. Over time, an increasing number of regional bodies came into being, particularly in Europe, following the decision to move to private sector standardization as the single European market year of 1992 approached. See: http://www.consortiuminfo.org/links/cats.php?ID=29

7. ANSI itself does not set standards. In consequence, X3 was managed by the Computer and Business Equipment Manufacturers Association (CBEMA), which later became the Information Technology Industry Council (ITI). X3 itself became an organization in its own right: first called NCITS, and later INCITS, its current name. For a history of INCITS and the work begun in X3, see: Andrew Updegrove, "INCITS: Then and Now," The ConsortiumInfo.org Consortium Standards Bulletin, Vol. II, No. 5, April 2003.

8. The author, who began helping create consortia in 1988, was in the habit of privately referring to each new organization as the “Not Microsoft Consortium.” Under the bylaws of each of these organizations, Microsoft would have been eligible to become a member but (not surprisingly) never applied.

9. The NCRA was amended in 1993 to protect development activities (resulting in its name being changed to the National Cooperative Research and Production Act of 1993, or the NCRPA). Most recently, it was amended once again in 2004, to explicitly extend protection to standard setting – but only by standard setting organizations themselves, and not their members. See: Andrew Updegrove, "What Does 1086 Mean to Consortia?" The ConsortiumInfo.org Consortium Standards Bulletin, Vol. III, No. 6, June 2004.

10. 88Open was the first consortium that the author helped form.

11. For a succinct history of UNIX, and the role of three consortia (X/Open, the Open Software Foundation, and their conjoined successor, The Open Group), see: http://www.unix.org/what_is_unix/history_timeline.html

12. OMG was the second major consortium the author helped to create (the total now numbers over sixty).

13. A commonality, as coined and defined by this author, is "whatever tool we need; that we need to agree on; to do the job that needs to be done." Speaking of commonalities instead of standards allows emphasis to be placed on problem solving rather than methodology, and reminds us that standards are tools, and not ends in themselves. For more, see: Andrew Updegrove, "A Look into the Future: Not Standards, but 'Commonalities,'" The ConsortiumInfo.org Consortium Standards Bulletin, Vol. III, No. 2, February 2004.

14. The multi-year World Summit on the Information Society (WSIS), conducted by the ITU under the auspices of the United Nations, is addressing this topic. See: Andrew Updegrove, "Who Should Govern the Internet?" The ConsortiumInfo.org Consortium Standards Bulletin, Vol. III, No. 7, July 2004.



TRENDS:

THE RISE OF THE METASTANDARD CONSORTIUM

Andrew Updegrove

Abstract: Most standard setting organizations rightly focus on limited subject matter areas. Historically, there has been no need for sophisticated standards suites, and hence no infrastructure evolved to efficiently create them. But in a modern, networked world, accomplishing a given set of business tasks may require the use of a set of tightly coordinated standards from many different standards organizations. As a result, a new type of organization has evolved to address this situation by assembling standards suites, rather than setting the standards themselves. This article explores the reasons why such organizations are necessary, and describes three of the first such entities to be formed.

Introduction: The creation of standards has historically been a discrete exercise. A given standard typically related to a single object (e.g., the luminosity of a light bulb, in the case of a performance standard), or at most two objects (e.g., the light bulb and its socket, in the case of an interoperability standard). Standards of a given type (e.g., not only sockets, but electrical plugs, cords, conduits, and connection boxes as well) could be created within a single organization of like-minded individuals that had little need to interface with the standards bodies of other trades.

In the early days of information technology (IT) and communications technology (CT), this was still largely the case. Computer connector specifications could be created by technical committees within the International Electrotechnical Commission (IEC) that were concerned only with connectors, and radio frequency specifications by other technical committees formed within the International Telecommunication Union (ITU). Even with the rise and proliferation of consortia, the situation did not markedly change. Computer connectors might now be specified within a consortium (the PCI Industrial Computer Manufacturers Group, or PICMG), but the need for PICMG to maintain working relationships with other standard setting organizations in order for each to do their respective jobs was still limited.

With the increasing complexity of IT and CT, and the convergence of both (ICT), however, this is no longer the case. The advent of the Internet and the Web creates the capability, and therefore the desire, to connect everything with everything. The historical challenge (daunting enough) of achieving interoperability within a network maintained by a single owner has been replaced by the urgent goal of enabling a useful degree of interoperability between the networks, and indeed the individual computers, owned by everyone on earth.

While the standards-based architecture of the Internet and the Web has already achieved this goal at the raw connection level, it does not allow a myriad of tasks to be performed without the use of additional standards, some of which are in existence, and many of which are not. Facilitating the creation, coordination and utilization of standards to enable the easy performance of such tasks is one of the challenges with which industry is currently preoccupied.

Unfortunately, the existing standard setting infrastructure was not created to facilitate this task. IT and CT standards of the type that now need to be assembled to perform a specific task are likely to be set by a variety of standard setting organizations (SSOs), some of which are likely to be consortia and others accredited standards development organizations (SDOs). Equally likely, some of these same organizations may view each other as competitors. Many will not historically have had any reason to work together. And if they have cooperated in the past, these efforts may not have involved actual joint development projects.

A new type of consortium: In an interesting display of innovation, a new type of collaborative effort has been conceived to address this need. Such an effort recognizes the need to assemble and promote a suite of standards enabling the performance of a set of cross-platform tasks, but does not itself create the standards that comprise the suite. Instead, it assembles “roadmaps” or “profiles” of standards that have already been created by existing SSOs, and then identifies them to the industry for implementation as a package, in order to do something that could not otherwise easily be done – or done at all.

The new type of organization formed to engage in such a process might best be called a “MetaStandard Consortium,” because its output is, in a manner of speaking, still a standard of sorts: a standard made up of standards.
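
In data terms, such a profile can be pictured as a simple structure that defines no technology of its own, but binds named versions of other organizations' standards to the tasks in a target scenario. The hypothetical Python sketch below illustrates the idea; the entries are loosely modeled on the kinds of Web services standards discussed next, and the pairings should be read as illustrative rather than authoritative.

    # A minimal sketch of a "standard made up of standards": the profile
    # defines no technology itself; it binds specific versions of other
    # organizations' standards to the tasks in a target scenario.
    # The entries below are illustrative examples, not a normative list.
    from dataclasses import dataclass

    @dataclass
    class ProfileEntry:
        task: str       # what the implementer needs to accomplish
        standard: str   # the existing standard (and version) to use
        source: str     # the SSO that owns and maintains that standard

    example_profile = [
        ProfileEntry("message envelope", "SOAP 1.1", "W3C"),
        ProfileEntry("service description", "WSDL 1.1", "W3C"),
        ProfileEntry("service discovery", "UDDI 2.0", "OASIS"),
        ProfileEntry("transport", "HTTP/1.1", "IETF"),
    ]

    for entry in example_profile:
        print("%-20s -> %s (%s)" % (entry.task, entry.standard, entry.source))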

To understand how a MetaStandard Consortium achieves its goals, we will examine three examples, in order of their formation: the Web Services Interoperability Organization (WS-I), the Mobile Imaging and Printing Consortium (MIPC), and the Network Centric Operations Industry Consortium (NCOIC).

Web Services Interoperability Organization: WS-I was formed to allow a new type of Web-based computing, called “Web services,” to emerge. Under the Web services model, existing applications become “services” that may be called upon in an interoperable environment involving disparate operating systems. Achieving this goal requires a variety of standards and protocols that had already been developed by consortia such as the W3C, IETF and OASIS prior to the formation of WS-I, as well as the development of many more new specifications specifically tailored to make Web services possible. The WS-I “Web services standards stack” arranges seventeen modules of standards, horizontally and vertically, into a single layered diagram.
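
To make the interoperability target concrete, the sketch below shows the style of exchange that sits at the bottom of that stack: a SOAP 1.1 message sent over HTTP, the combination the WS-I Basic Profile constrains. The endpoint URL, operation name and namespace are hypothetical; only the envelope structure and HTTP headers follow actual SOAP 1.1 conventions.

    # Illustrative sketch only: a SOAP 1.1 request over HTTP, the kind of
    # exchange the WS-I Basic Profile constrains for interoperability.
    # The endpoint, operation and namespace below are hypothetical.
    import urllib.request

    SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 namespace

    envelope = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="%s">
      <soap:Body>
        <GetQuote xmlns="urn:example:stockquote">
          <Symbol>IBM</Symbol>
        </GetQuote>
      </soap:Body>
    </soap:Envelope>""" % SOAP_ENV

    request = urllib.request.Request(
        "http://example.com/stockquote",  # hypothetical service endpoint
        data=envelope.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",  # SOAP 1.1 over HTTP
            "SOAPAction": '"urn:example:stockquote/GetQuote"',
        },
    )
    with urllib.request.urlopen(request) as response:
        print(response.read().decode("utf-8"))

Because any vendor's toolkit can produce and consume messages of this shape, the profile's constraints, rather than any single vendor's implementation, become the guarantee of interoperability.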

Historically, there would have been three ways to accomplish this task: work within each organization to create the standards that seemed most appropriate to that consortium; work within one of the organizations to “build out” all of the missing standards; or start a new organization to do the job. Each of these approaches, however, had shortcomings.

Working within three organizations to achieve a goal that had not been endorsed by any of those organizations under a common plan would not be likely to result in a timely, coordinated suite of standards. Working within a single consortium, on the other hand, would not only threaten the other two, but would also fail to take advantage of the strengths of each available organization. Finally, starting a new consortium would have antagonized all three groups, while increasing the likelihood that competing standards would be produced to accomplish at least some of the same tasks.

While WS-I prefers to refer to itself as an “integrator” that sits “downstream” from the standard setting process, its role has in fact been far more active. Where gaps in the stack exist, the founding members of WS-I have moved aggressively to fill them, creating specifications in prototype form and then shopping each one to the organization deemed most appropriate to take it to final, consensus-based adoption.

In the more than three years since the launch of WS-I in February of 2002, a veritable blizzard of specifications has been created by the so-called “Men In Black” (Microsoft, IBM and BEA Systems) that have most aggressively committed to the achievement of the WS-I mission. A varying group of additional companies participated in the creation of many of the individual specifications, and each specification was offered to (and, to date, accepted by) an existing consortium, most frequently the W3C or OASIS.

In order to jump-start the hoped-for industry-wide adoption of Web services, WS-I has also assumed an active promotional role, and created a variety of tools for implementers to use, including (besides standards profiles) guidelines and conventions for facilitating interoperability, sample applications (use cases, usage scenarios, sample code, and more), testing tools, and white papers. The concept has proven to be attractive: within fifteen months of its formation by nine companies, WS-I membership grew to 170.

In effect, the founders of WS-I conceived a supporting layer of structure both upstream and downstream of the actual standard setting organizations, rather than seeking to coordinate the creation of the standards from inside the organizations themselves (of which they were already members). These layers permitted the core standards structure needed (in the eyes of the Men In Black) to be built out in record time, and with a coherency not otherwise obtainable.

At the same time, the members of WS-I bestowed upon themselves – and in particular upon the Men In Black -- an unprecedented level of control that they would not otherwise have enjoyed.

Mobile Imaging and Printing Consortium: The goal of MIPC is both prosaic and challenging: facilitating the printing of pictures taken by “mobile terminals,” and particularly cell phones. And while that goal may seem ordinary, the economic value of encouraging such behavior for mobile terminal and printer vendors, as well as telecom carriers, is enormous, with some 1.4 billion mobile phones in use worldwide – a number that continues to grow rapidly.

The expected volume of mobile device image printing was predicted by InfoTrend to be five billion pictures in 2004, increasing to 37.2 billion printed images in 2008, driven by the availability of increasing image capture resolution at decreasing cost, and the fact that 85% of all wireless phones sold in 2004 were expected to incorporate cameras. Future phones are expected to offer zoom lenses, intelligent applications and other enhancements.

While almost all of the basic pieces in the technology chain exist (camera; wireless delivery; software; printer), none of these elements was specifically designed to work effortlessly with the other elements to produce the desired result. Similarly, no single SSO exists that is concerned with more than a piece of the chain that begins with capturing an image with one device, and printing it with another. Thus, while the goal to be achieved is quite specific (printing pictures) rather than systemic (enabling generic interoperability, as with WS-I), the challenges are similar.

In order to solve the problem, Canon, Epson and HP founded MIPC in 2004. Other companies (e.g., Brother, Kodak, Lexmark, Motorola, NEC, Nokia, Samsung, etc.) came on board thereafter. The target acquisition devices were defined as “Mobile Terminals”, a term that includes camera phones and PDAs with full telephone capabilities, but excludes laptops (however enabled) and PDAs unable to make long distance connections.

Like WS-I, the output of MIPC is a set of standards (referred to by MIPC as “Guidelines”) accompanied by recommendations intended to facilitate interoperability. The target audience for the Guidelines is the developer community, which will hopefully follow the Guidelines in order to enable the “use cases” that the Guidelines are intended to facilitate.

The Guidelines take advantage of multiple existing technical approaches, rather than mandating a single approach. For example, there are three described methods to connect and print, each using a pair of already existing, deployed standards deemed to be most appropriate: Bluetooth using BPP, USB using PictBridge, and memory cards using DPOF.
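
As a sketch of the third method, the hypothetical Python fragment below writes a DPOF-style print order to a memory card. The field layout follows the commonly described DPOF convention (an ASCII AUTPRINT.MRK file in the card's MISC directory); the details here are an approximation for illustration, and the actual Guidelines and the DPOF specification remain the normative references.

    # Hedged sketch: writing a DPOF-style print order to a memory card,
    # the third connect-and-print method described above. Field names
    # approximate the commonly described DPOF layout; consult the
    # specification for the normative format.
    import os

    def write_print_order(card_root, images, copies=1):
        """Write a minimal DPOF-style order file listing each image."""
        lines = ["[HDR]", "GEN REV = 01.10", 'GEN CRT = "Example Terminal"']
        for job_id, image_path in enumerate(images, start=1):
            lines += [
                "",
                "[JOB]",
                "PRT PID = %03d" % job_id,   # print job identifier
                "PRT TYP = STD",             # standard print type
                "PRT QTY = %03d" % copies,   # copies of this image
                '<IMG SRC = "%s">' % image_path,
            ]
        misc_dir = os.path.join(card_root, "MISC")
        os.makedirs(misc_dir, exist_ok=True)
        with open(os.path.join(misc_dir, "AUTPRINT.MRK"), "w") as f:
            f.write("\n".join(lines) + "\n")

    # A DPOF-aware printer reading the card would then print two copies.
    write_print_order("/media/card", ["../DCIM/100MIPC0/PICT0001.JPG"], copies=2)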

Like WS-I, MIPC’s activities are promotional as well as technical, in order to encourage wide usage of the Guidelines by developers. To further encourage uptake, MIPC also stages “Plugfests” at regular intervals around the world, at which printer and mobile terminal vendors can test the interoperability of their devices and address any issues that may be discovered.

Network Centric Operations Industry Consortium: The ambitions of NCOIC dwarf even those of WS-I. While (like WS-I) the technical goals of NCOIC are generic, they are (like MIPC) specific to a given objective: making the United States Defense Department's vision of the military of the future possible. Or, as stated at the NCOIC website, to help “accelerate the achievement of increased levels of interoperability in a network centric environment within, and amongst, all levels of government of the United States and its allies involved in Joint, Interagency and Multinational operations.”

At the macro level, this envisions an enormous, interoperable network accessed by hundreds of thousands of simultaneous users. At the micro level, it would mean that the data from a single battlefield sensor would be immediately known to, and available to, all those on the network who have appropriate access rights, from the Humvee driver approaching the same sensor, to the Chairman of the Joint Chiefs of Staff thousands of miles away. The name given to this vision is “Network Centric Operations” (NCO), and the architectural challenges that must be addressed to achieve it are considerable.

The NCOIC was formed in August of 2004 by 28 companies, including many of the largest defense contractors in the United States (e.g., Boeing, General Dynamics, Lockheed Martin, Northrop Grumman, and Raytheon), as well as a broad range of the largest hardware and software companies (e.g., Cisco, EMC, HP, IBM, Oracle and Sun).

The planned deliverables of NCOIC fall into several categories:

  • An analysis of “pertinent government agency architectures, capability needs and mandated open standards” to identify what is working, what isn’t, and how the situation can be improved.
  • An evaluation of NCO architectural work already in process, involving initiatives such as GIG (Global Information Grid), NCOW (Net-Centric Operations and Warfare Reference Model), ForceNet (a Navy initiative focusing on the acquisition, sharing and usage of information superiority to “generate transformational combat effectiveness”), and LandWarNet (the Army’s portion of the GIG), among others.
  • Developing and defining an NCO Reference Model.
  • Assisting government agencies in developing a “secure information management overarching architectural framework/reference model”; identifying appropriate open standards and how they are currently being used; assessing available interoperability techniques in the NCO context; supporting “reusable long-term solution models that can be scaled and/or replicated, rapidly and cost effectively, for every enterprise”; and identifying the “widest possible community of open standards-based product types” that can be used to rapidly achieve a conversion to NCO.
  • Developing standards to fill gaps through NCOIC Technical Working Groups.
  • Promoting increased awareness, adoption and use of identified open standards, and accelerating the move towards NCO.

The schedule for completing these deliverables is ambitious, and multi-day, face-to-face meetings are held at frequent intervals. Not surprisingly, the NCOIC expects more from its members than do most consortia that actually set standards as their primary focus. The highest level of NCOIC membership bears not only a dues obligation of $150,000 per year, but also the requirement of providing the services of multiple “full time equivalent” (FTE) employees. The next level of membership costs $75,000, and also entails an FTE commitment. A third tier of membership, with reduced privileges and commitments, is available to corporations for dues ranging from $3,000 to $25,000 (depending on member revenues) and to academic and non-profit members for only $1,000.

Conclusions: MetaStandard consortia provide a highly targeted solution to the inability of the current standard setting infrastructure to meet the complex challenges of a modern, networked world. At the same time, however, the need to form such organizations necessarily results in delay in meeting market needs, as well as increased cost for those that must form and participate in them.

The rise of this new type of organization mirrors the evolution of the standard setting consortium some twenty years ago, which also resulted from the perceived inadequacy of the existing infrastructure to meet the standard setting needs of the information technology industry in a pre-networked world. Now, with the advent of the Internet and the Web, the consortium infrastructure itself is proving to be insufficient to meet the needs of those that initially created it.

The recognition that SSOs, like their corporate members, can become trapped in technology “stovepipes” is an important one, and the creativity demonstrated by those that have developed the concept of a MetaStandard Consortium to address that issue is to be commended. Experimentation will doubtless continue, with each organization taking somewhat different approaches as “best practices” evolve, either generally or for particular types of circumstances. Creating guidelines and profiles of standards that already exist (as do MIPC and NCOIC) is pragmatic and sufficient where the standards building blocks are already in place, while conceiving and then farming out specifications (as have the core members of WS-I) may be necessary where goals are set before standards efforts have begun elsewhere.

At the same time, the need to develop and deploy this new type of organization highlights the fact that the technology world has changed, but the infrastructure that creates its standards has not. There is today, due to convergence and the ability of the Internet to link everything to everything, a need for greater collaboration among those that set standards.

While creating more MetaStandard Consortia will help to address this problem on an interim basis, by definition the technique does so on a situational rather than a systemic basis: for every situation where a group of companies takes the initiative to form a MetaStandard consortium, there will be others that will be addressed in a less holistic fashion through a network of liaison relationships.

We think that the rise of the MetaStandard consortium is only a first step in the evolutionary changes that will be needed to fulfill the promises of a networked world. Unless more attention is paid not only to the promise that the future holds, but also to the limitations of the existing standard setting infrastructure, our enjoyment of that promise will be delayed.

Comments? Email:

Copyright 2005 Andrew Updegrove

* MIPC and NCOIC are clients of the author and his law firm, as is OASIS, one of the key consortia setting the Web services standards discussed above. For reasons of confidentiality, all information contained in this article relating to these three organizations is based on information available at the public pages of their respective websites.

For further background, see:

The May 2003 issue of the CSB was dedicated to the creation of Web services standards, and disagreements at the time over who should be setting them. See:

Who Should Set the Standards for Web Services?
http://www.consortiuminfo.org/bulletins/may03.php#editorial

The Role of Web Services Standards Bodies: In Their Own Words (Interview with WS-I, W3C and OASIS): http://www.consortiuminfo.org/bulletins/may03.php#featured

New Wine - Old Bottles: WS-I Brings a New Dimension to the Art of Making Standards Succeed: http://www.consortiuminfo.org/bulletins/may03.php#trends

MIPC Developer FAQ: http://www.mobileprinting.org/developers/faq

NCOIC FAQ: http://www.ncoic.org/htm_library/library.htm



FROM THE STANDARDS BLOG:

#28 All Social Standards are Local (or are they?)

While social standards have imprecise parameters, they are no less real (and important to society) than technical standards. Unlike their moral cousins, religious standards, social standards have no written reference point. In consequence, while religious standards of conduct are understood in more or less the same way by people the world over throughout long periods of time, social standards tend to be in a state of constant evolution.

Because social standards involve how those with whom we interact view us – favorably or otherwise -- they are intrinsically local. For example, behavior that would be viewed favorably in a “get ahead” society like America might be seen as unacceptable in a more traditional, class-based society.

Social standards tend to stabilize society, whether they instantiate moral values or not, since they regulate behavior without the need for enforcement by formal authorities. One reason they work so effectively is because social reaction follows social action so directly and (often) decisively, creating an ongoing feedback loop: act one way, and people we know think we are “good,” and choose to associate with us. Act another way, and people we know think that we are “bad”, and disassociate themselves from us. If we care more about the company of others than we do about gaining the monetary or other rewards that “bad” behavior may reap, then we act in the way that will be viewed as “good”.

The result is that one of the most powerful types of standards that regulates human behavior is, by nature, powered by local perceptions. If our actions will only have a negative impact at a great and anonymous distance, then a different type of regulating force is needed to control our actions: moral conviction (we do what is right because we would respect ourselves less if we didn’t); religious concerns (Someone else would think less of us); or the force of law (we might not only be fined or go to jail, but those around us would learn of our bad conduct, subjecting us again to the force of social standards).

Local social forces can (or at least used to) have a significant impact on how corporations behave. For a time, those forces encouraged corporations to become better and better social citizens. In recent years, however, almost all of the evolutionary changes in modern commerce have acted to neutralize this effect.

Let us see how this process has played out, and whether other types of social standards have evolved to compensate for the loss.

A hundred years ago, most businesses of every type in the United States (as elsewhere) were owned and operated locally. Most service, retail and manufacturing businesses were still relatively small, and typically employed anywhere from a handful to a few hundred persons. Manufacturing concerns were largely family owned or controlled and, even as they employed an ever-larger percentage of the workforce, still usually operated out of a single, local, manufacturing facility.

Those who owned such businesses in the years before income and inheritance taxes became significant could accumulate significant wealth, and became the pillars of the communities in which they lived. In that role, they were expected to support their communities by taking leadership positions on charitable, educational and local bank boards, and to generally exhibit a concern for the welfare of the community.

Business owners that raised capital were even more closely tied to local opinion. In those pre-SEC times, stock was often sold locally, face to face, much as founders seek out angel investors today. The founder of the Fort Howard Paper Company actually sold the initial shares of his company door-to-door. Those that bought into the new venture were fortunate indeed to have been at home when the founder rang the bell. In that era, those who purchased stock looked forward not to public offerings or the eventual sale of the company, but to active economic participation in the business through receiving dividends.

Sadly, social standards did not much benefit those on the shop floor, since those that owned the companies, sat on the bank and charitable boards, and lived in the big houses on the hill did not mix socially with “the lower classes.” Because owners socialized with others who were well to do, a different force – unionization – was needed to upgrade the working conditions of those that supported the owners. But eventually, the values of society in general changed, and a business owner might find himself judged by his social peers in part based upon how he treated those that he employed.

By the 1960s, corporations achieved what might be considered the high point of domestic social responsibility, with some glaring exceptions involving practices such as polluting the environment that had not, as yet, been identified as examples of bad corporate citizenship. True, profit was important, but so were pension plans, reliable dividends, and supporting the local community. A significant percentage of the productive capacity of America was still closely held, and therefore management decisions could be shaped as much by social forces as by profit motives, if the owner so chose. A local owner would not be likely to fire 10% of the breadwinners in his hometown to relocate his factory abroad, unless he planned to relocate his family as well.

In the 1970s, corporate pensions were a major and increasing source of support to those in retirement. Blue- and white-collar workers alike assumed that they could not only work for a strong company for life, but that the same company would support them through retirement as well. Likewise, widows and orphans could own a portfolio of blue chip stocks and bonds, and expect to hold that portfolio over time, focusing on current return rather than on speculative increases in value. And a Boeing or a First National Bank of Boston was expected to be a major supporter of the community in which it was based, generation after generation.

Since then, of course, the basis for each of these assumptions has changed dramatically. New companies create 401(k) programs that carry no permanent funding obligations, and their managements not only plan on expanding and contracting their work forces in synch with economic conditions, but hope for a certain rate of turnover so that they can re-hire at the bottom of the pay scale.

Stock is now a poker chip rather than a long-term hold, and few new companies intend to ever pay a dividend. Rather than aspiring to one day achieve a listing on the Big Board, new companies see NASDAQ as the place to be. And with consolidation, one corporate headquarters after another has become just another branch office of a conglomerate, with far-reduced incentives to support local institutions.

The same consolidation has weakened social conduct towards workers as well. Wal-Mart can purchase its goods abroad and pay the lowest possible wages at home in part because there is no local accountability for its actions. Those who make the decisions are not part of the community that is affected. Ironically, those who most need the low prices that a Wal-Mart can offer are those under-educated, laid off, formerly well paid manufacturing workers whose market value has suffered most from the business model that makes the same low prices possible.

What all this means is that in order for social concerns to still have an effect on corporate behavior, they need to be expressed through direct action, rather than through the force of unspoken social standards of conduct. Since affecting the actions of large corporations is not easily accomplished, a significant number of people must agree on what conduct is unsatisfactory before the necessary consensus can form from which meaningful action can flow. In other words, while the social standards of a village can still affect behavior, that village must now be national, regional, or even global before sufficient weight accumulates to actually affect corporate conduct. And rather than a snub at the country club, an impact on the bottom line is now necessary to do the job.

Today, this is accomplished at two primary points: at the ownership level, through selectivity in stock purchases and through shareholder initiatives, and at the customer level, through selective shopping and the occasional public call for a boycott.

To date, the latter point of pressure has been the more successful of the two. Even without organized boycotts, the success of the first hybrid cars to be offered to the public is already affecting design decisions in Detroit. The sneaker buying habits of socially conscious, upscale purchasers have also had an impact on how manufacturers treat their direct and indirect employees in Third World countries.

Still, if socially conscious investment funds become more popular, they may also have as significant an effect, especially if a tipping point in mutual fund investment choices is passed at which their decisions will propagate through the marketplace. Once social investing becomes significant enough to affect stock prices (if, indeed, that does occur), then even those that are only financially motivated will also demand ethical behavior on the part of public companies.

But how does one determine what the standard of proper corporate conduct is, and who is entitled to make that determination? Absent a standard, will corporations be conforming to “social standards” in the traditional sense, or simply economic market forces, with buyers of stock and shoes simply designating different types of product production methods (for example) as being desirable in addition to color and style?

It is at this point that the worlds of social standards and technical standards meet. Last year, ISO, the International Organization for Standardization, formed its first committee to set a non-technical standard. The subject matter? Corporate responsibility. In the future, corporate board members will be able to decide whether or not to adopt standards of corporate conduct that are more comprehensive and consistent, rather than reacting simply to the headlines of the day.

But will this really be a return to social standards in the sense of “doing right because I want to be seen as good”, or simply a new way to lower the volatility and increase the earnings multiples of a company’s stock?

Probably the latter. At the end of the day, the impact is the same, even if an old and wholesome dynamic in the marketplace fades from view, perhaps forever.

Comments? Email:

Copyright 2005 Andrew Updegrove

The opinions expressed in the Standards Blog are those of the author alone, and not necessarily those of
Gesmer Updegrove LLP

Postings are made to the Standards Blog on a regular basis. Bookmark:

THE REST OF THE NEWS

For up to date news every day, bookmark the ConsortiumInfo.org
Standards News Section


Or take advantage of our RSS Feed

New Initiatives

Legally, we have to move very carefully because the Americans have a tendency to sue anybody for anything. [May 11, 2005]
 
Deepak Phatak, announcing an initiative to develop the "Knowledge Public License"

Outsourcing GNU? India is not only using open source seriously, but thinking about it as well, as its far-flung engineering schools seek to foster innovative thinking as well as raw coding proficiency. The following article reports on an initiative to fine-tune the open source licensing model to foster for-profit innovation as well as collaborative contributions (and all without getting sued by Americans, if at all possible).

India eyes own open-source license
CNETnews.com, May 11, 2005 --
Deepak Phatak of the Indian Institute of Technology has kicked off an effort to create the Knowledge Public License, or KPL, a licensing program that will let programmers share ideas with one another while at the same time allowing them to retain the rights to their own software modifications. The license will likely function much like the Berkeley Software Distribution or the MIT License programs, he added. The idea is to create an environment where developers can take advantage of the collaborative power of the open-source movement while giving individuals the ability to exploit their own twists. Ideally, such a program could also help ease the raging tensions between the open-source software movement and proprietary software companies. ...Full Story

Back to the Past: Some of the very first types of standards in human history were agreed-upon methods of measurement – length, weight, and purity. And while standard setting has gone on to address all manner of different challenges, each relies upon being able to describe what is to be done and how to test whether the results are successful. This means that the right measurement tools must exist before new standards tools can be created. NIST is on top of that, as indicated by the first story below. And in a modern example of what may be the oldest of all types of standards – symbolic representations – the second article reports on the latest efforts to utilize standardized pictures to convey a thousand words, in this case about video content.

NIST Launches Initiative to Take Pulse
NIST Press Release, Washington, DC, May 11, 2005 -- An initiative to "roadmap" the nation's future measurement needs was announced today by the Commerce Department's National Institute of Standards and Technology (NIST). Necessary advances in measurement capabilities are basic to technological innovation, U.S. industrial competitiveness, safety and security, and quality of life. "The nation's measurement system is a vital element of our innovation infrastructure," NIST Acting Director Hratch Semerjian said during testimony before the House Subcommittee on Environment, Technology, and Standards. "The goal of this very important initiative, which will be undertaken in close cooperation with the private sector and other agencies, is to ensure that the nation's highest-priority measurement needs are identified and met. We need to be certain that the U.S. measurement system is robust so that it can sustain America's economy and citizens at world-class levels in the 21st century." ...Full Story

ETSI Human Factors designing access symbols to indicate special services for disabled users of ICT equipment
ETSI Press Release, Sophia Antipolis, May 11, 2005 -- ETSI has set up a new task force, Strategic Task Force (STF 286), to design, test and standardise a new set of five international symbols that are intended to be used to show the availability of special access services for users. The work will be complete by the end of 2006, with the publication of an ETSI Standard entitled: "Human Factors (HF); Access symbols for use with video content and ICT devices". The symbols can be used, for example, to indicate that a film or television programme is provided with sub-titles or audio description. ...Full Story

What did I do with my glasses? In the modern world, if something is wireless and has a screen, however small, we’ll want to surf the web with it. Not surprisingly, the same tools that take a 17” monitor and the horsepower of a desktop system for granted don’t necessarily work on a PDA with a 2” screen and no hard drive. The W3C is going to do something about that, to help the 1.4 billion people on earth that carry a cell phone.

W3C Launches "Mobile Web Initiative"
W3C Press Release, Chiba, Japan, May 11, 2005 -- Today, at the WWW2005 Conference, the World Wide Web Consortium (W3C) announced the launch of the Mobile Web Initiative (MWI) - an endeavor to make Web access from a mobile device as simple, easy, and convenient as Web access from a desktop device. "Mobile access to the Web has been a second class experience for far too long," explained Tim Berners-Lee, W3C Director. "MWI recognizes the mobile device as a first class participant, and will produce materials to help developers make the mobile Web experience worthwhile." Many of today's mobile devices already feature Web browsers and the demand for mobile devices continues to grow. ...Full Story

New Initiatives

This isn’t determined by us, so from here on it’s going to get kind of random. [April 21, 2005]
 
Jon Bosak, Chair of the OASIS UBL TC, announcing that the next translation of UBL will be into Danish

What did you say? One of the problems with standards is that there is no universal language of technology, as there once was a common language of science (Latin, during the Enlightenment, at least). Given that standard setting is by its nature a low-budget, resource-constrained process, reliably high-quality translations are in short supply. At OASIS, a start on addressing this deficit is being made with respect to one standard, at least: UBL.

OASIS Approves First International Dictionary for UBL
By: John K. Waters
ADTmag.com, April 21, 2005 -- The Universal Business Language (UBL) is on its way to becoming truer to its name. The English-only standard for XML business documents in B2B applications, approved last November by the OASIS standards consortium, has been translated into four new languages. OASIS last week approved the first edition of the UBL 1.0 International Data Dictionary (IDD), which comprises over 600 business data definitions from the UBL 1.0 schema, combined with translations of the definitions into Chinese, Japanese, Korean and Spanish….Full Story

Reading, writing, and metadata: The world of academia, education and research continues to benefit from a series of standards efforts that are ongoing in a number of consortia and accredited standards bodies. The following articles report on an interesting collection of efforts in three different locations that address the same goal: facilitating on-line research. The first and second articles relate to efforts that will allow the best and most appropriate content to be found, while the third describes an initiative to provide more content to find.

New British Standard for online learning is published
PublicTechnology.net, May 11, 2005 -- A new British Standard (BS 8419 Interoperability between metadata systems used for learning, education and training) seeks to provide guidance to developers and managers of learning resources in the UK on the use of metadata. By applying this standard, users will benefit from improved choice and quality for learners, effective cataloguing, searching and retrieving of learning materials and integration between systems to economise on content costs among other advantages. The Working Group took the standard from the DTI's approval for the original business case through to publication. ...Full Story

OpenURL Now a National Standard
NISO Press Release, May 4, 2005 -- The OpenURL Framework for Context-Sensitive Services (ANSI/NISO Z39.88-2004), which defines an architecture for creating a context-sensitive networked service environment, has received approval as an American National Standard. The standard had been in trial use since June 2003 and is now deployed in Google Scholar. The OpenURL standard allows for the emergence of many different web-based service environments in which the context of the user is taken into account. For example, if the OpenURL is used in a service environment supporting the scholarly information community, a researcher or student searching for a scholarly information resource can get immediate access to the most appropriate copy of that item. In this case "appropriateness" reflects the user's context, such as location, cost of the item, and the contractual or license agreements in place with the information suppliers the university library does business with. ...Full Story
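For readers wondering what such a link actually looks like: an OpenURL is an ordinary URL whose query string carries a standardized description of the item sought, addressed to the user's own institutional "link resolver," which then selects the most appropriate copy. The example below is hypothetical – the resolver host, ISSN and page number are invented – but the key names follow the standard's KEV (key/encoded-value) format, wrapped here across lines for readability:

    http://resolver.example.edu/menu?url_ver=Z39.88-2004
        &rft_val_fmt=info:ofi/fmt:kev:mtx:journal
        &rft.genre=article&rft.issn=1234-5678
        &rft.volume=12&rft.issue=3&rft.spage=45

Because the link describes the wanted article rather than pointing at one fixed location, two users clicking the same reference can be routed to two different copies, each the "most appropriate" for its own context.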

Dutch academics declare research free-for-all
By: Jan Libbenga
The Register, May 11, 2005 -- Scientists from all major Dutch universities officially launched a website on Tuesday where all their research material can be accessed for free. Interested parties can get hold of a total of 47,000 digital documents from 16 institutions through the Digital Academic Repositories. No other nation in the world offers such easy access to its complete academic research output in digital form, the researchers claim. ...Full Story

Intellectual Property

Software is the combination of an original work of one or more algorithms, that is to say, a set of mathematical formulae. As Albert Einstein has said, a mathematical formula is not patentable. It is by nature an idea, like a book, a set of words, or a chord in music. [April 23, 2005]
 
European Parliament legal affairs rapporteur Michel Rocard

Ad in! The intellectual property rights tennis match being staged in the EU Parliament continues into extra games, with the latest point going to the foes of European software patents. The following article describes the Parliamentary duel in detail – a battle that makes approving a conservative appeals court judge in the U.S. Senate seem like a romp in the park by comparison.

Dramatic Changes Proposed for EU Patent Proposal
By : Matthew Broersma
eWeek, May 18, 2005 -- European parliamentarians have put forward a list of more than 200 amendments to the European Union's proposed legislation on IT patents, which, if approved, would dramatically change the character of the controversial proposal. At stake is whether the EU will bring in more permissive rules on software patents, bringing it into line with patent practices in the United States and Japan. Currently, patents on pure software and business processes are not enforceable, making it impossible for large companies to bring their patent arsenals into play in the region. ...Full Story

Open Source

First you have to start with some dialogue. We are now interested in it, and we'd like to do this. [April 29, 2005]
 
Microsoft General Counsel Brad Smith, saying to the open source community for the first time, "We need to talk”

Come again? In one of those through-the-looking-glass moments that seem to happen more frequently these days with Microsoft (remember the public love fest between unlikely stage-mates Scott McNealy and Steve Ballmer?), Microsoft General Counsel Brad Smith publicly stated that it was time for “bridge building” between the proprietary software icon and the open source community. What that means, exactly, remains to be seen. But it’s a start. Meanwhile, the other half of the WinTel proprietary team was busy forming its own internal open source group, as reported in the second article following.

Microsoft Reaches Out to Open-Source Community
By: Darryl K. Taft
eWeek, April 29, 2005 -- Microsoft Corp. has extended an olive branch to the open-source community, calling for a sit-down to discuss how the software giant can better work with the open-source world. But don't expect to see an open-sourced version of Windows any time soon. Microsoft is making nice with its open-source adversaries, while continuing to defend its rights to hold and use its arsenal of software patents. At a recent conference sponsored by the Association for Competitive Technology (ACT) in Cambridge, Md., Brad Smith, Microsoft's general counsel, called for bridge building between Microsoft, its competitors and the open-source community. ...Full Story

Intel Forms Internal Open-Source Group
By : Stephen Shankland
CNETnews.com, May 10, 2005 -- Intel spokesman Michael Houlihan confirmed the creation of the Open Source Program Office and said on Tuesday that Jon Bork, formerly general manager of the home product group, was named its leader on Thursday. The group parallels a similar one that handles Microsoft relations and operations, Houlihan said. Bork will lead Intel's engagements with Linux sellers and other open-source technology suppliers. Intel has long been a supporter of Linux, which runs chiefly on x86 processors such as Intel's Pentium and Advanced Micro Devices' Opteron. Intel is working more actively to boost the operating system now, however. ...Full Story

Walking the Walk: Last issue, we reported that IBM had adopted open source methodology for internal development. This month, as noted in the following article, it’s welcoming Firefox onto corporate desktops. The [Linux] penguin, of course, is already a core part of the IBM strategy. The second article following notes another area in which IBM is leading the open way: proposing an articulate strategy that embraces and employs both open standards and open source (not to mention liberal access to its vast patent portfolio). The rest of the industry lags in both regards.

IBM backs Firefox in-house
By: Martin LaMonica
CNETnews.com, May 12, 2005 -- IBM is encouraging its employees to use Firefox, aiding the open-source Web browser's quest to chip away at Microsoft's Internet Explorer. Firefox is already used by about 10 percent of IBM's staff, or about 30,000 people. Starting Friday, IBM workers can download the browser from internal servers and get support from the company's help desk staff. ...Full Story

IBM Launches Open Plan for Vertical Markets
By: Darryl K. Taft
eWeek, May 9, 2005 -- IBM has added key weapons in its arsenal for attacking vertical markets: its expertise with open source and open standards and its vast intellectual property reserves that can be released at a moment's notice. The Armonk, N.Y., systems maker plans to bundle open-source technology with open standards and release IBM patents, when necessary, in an effort to go after verticals and enhance its services opportunities, company officials said. ...Full Story

Who's Doing What to Whom

We do not view sending experts to international meetings on telecom issues to be a partisan matter. We would welcome clarification from the White House. [May 2, 2005]
 
Nokia V.P. Bill Plummer, after four U.S. delegates to a telecom standards conference are dropped because they contributed to John Kerry's campaign

Politics as usual: One doesn’t usually think of standard setting as a political exercise (at least, not in the Beltway sense of the word). When standards relate to international trade, the two worlds can intersect, as when standards are used to erect technical barriers to trade. Still…

Any Kerry Supporters On The Line?
By: Viveca Novak and John Dickerson
Time Magazine, May 2, 2005 -- The Inter-American Telecommunication Commission meets three times a year in various cities across the Americas to discuss such dry but important issues as telecommunications standards and spectrum regulations. But for this week's meeting in Guatemala City, politics has barged onto the agenda. At least four of the two dozen or so U.S. delegates selected for the meeting, sources tell TIME, have been bumped by the White House because they supported John Kerry's 2004 campaign....The White House admits as much: "We wanted people who would represent the Administration positively, and--call us nutty--it seemed like those who wanted to kick this Administration out of town last November would have some difficulty doing that," says White House spokesman Trent Duffy....One nixed participant, who has been to many of these telecom meetings and who wants to remain anonymous, gave just $250 to the Democratic Party. ...Full Story

Story Updates

Chinese companies must develop high technology and we must make our own technical standards if we hope to change from 'made in China' to 'made by China'. [May 14, 2005]
 
Liu Qingtao, Lenovo Group

A billion potential customers focuses the mind: We dedicated our last issue to China, and the many standards-based facets of the emergence of that enormous country as a powerhouse on the world economic stage. Much of the current action involves the roll out of the third generation of cell phone technology – and who will make the profits from that technology in a market that already owns more cell phones than there are people in the United States. The following selection of articles summarizes the key themes at play: the resentment of China over the disadvantages it has historically suffered at the hands of the countries that already own the key technology patents, the upcoming decision by Beijing over who will get the licenses to roll out the technology (and what that technology will be – foreign or domestic), concerns in Washington over how standards will be incorporated into China’s trade policy, and, finally, the jockeying going on among Chinese and foreign manufacturers that want to be sure that they get a substantial piece of the action, no matter what standards are selected.

Nation strives for more say in technical standards
By: Zhao Xiaohui
China View, Beijing, May 14, 2005 -- After paying "patent rent" for many years, Chinese high-tech companies are faced with more and more intellectual property rights disputes and trade barriers. It has become an urgent task for China to strive for more say in making technical standards. "The formation of technical standards has shifted from 'market decides standards' in the past to today's 'standards lead markets'," said Zhang Qin, deputy director of the State Intellectual Property Office. ...Full Story

China to let market decide 3G standards and timeframe - MII
Forbes.com, Beijing, May 18, 2005 --
China will issue policy governing third generation mobile telecommunications (3G) at the 'proper time' for the market, China's Ministry of Information Industry (MII) said. Speaking at the Global Fortune Forum in Beijing, the head of the MII, Wang Xudong, did not give a timeframe on when 3G licenses will be issued or which standard China will adopt, only saying that China will use the technology at a 'proper time'. 'In short, we believe that the 3G technology promotion, at the end of day, depends on the maturity of the technology and the development of the market (in China) for us to make a decision,' Wang said. ...Full Story

Congress seeks to head off U.S.-China 'standards wars'
By: George Leopold
EETimes, May 12, 2005 -- Congress has jumped into the fray over whether technology standards are being used to erect trade barriers designed to protect emerging Asian industries. Exhibit A during a hearing this week (May 11) was China's attempt last year to establish a wireless standard known as the Wireless Authentication and Privacy Infrastructure. The WAPI security scheme would have required U.S. companies to manufacture two sets of chips, one for the Chinese market and another for the rest of the world. China ultimately backed down under U.S. pressure, but one lawmaker predicted that "China will continue to attempt to use standards to favor Chinese manufacturers." Added Rep. Vern Ehlers, R-Mich., chairman of the House Science technology and standards subcommittee: "U.S. companies and standards setting organizations are concerned that our trading partners are using technical standards as trade barriers to U.S. ...Full Story

ZTE Leads TD-SCDMA 3G Revolution With Widest Product Portfolio
webitpr, May 16, 2005 --
ZTE Corporation (Shenzhen: 000063, Hong Kong: 0763), China's largest listed telecommunications manufacturer and leading wireless solutions provider, has unveiled a full range of TD-SCDMA commercial equipment, making it the leading vendor for this 3G technology. Following successful MTNET tests, ZTE's TD-SCDMA equipment has demonstrated diversified 3G multimedia services including voice phone, streaming media and data download. The portfolio comprises core networks (ZXWN, ZXTR-RNC and ZXTR-NODEB) and service servers and was shown at last month's TD-SCDMA International Conference in Beijing. ...Full Story

Ericsson to develop TD-SCDMA technology for China
Reuters.com, Stockholm, May 10, 2005 --
Ericsson will develop technology to support new TD-SCDMA high speed mobile phone networks in China, the company's technology chief said on Tuesday. Chinese operators are now evaluating different 3G technologies, which allow users to transmit pictures and use e-mail through their phone. Ericsson currently has no technology to support TD-SCDMA standards, expected to be one of several mobile network platforms which may be deployed. Ericsson will either develop the technology itself or in partnership with other firms, Chief Technology Officer Hakan Eriksson told Reuters on the sidelines of the company's capital markets day. ...Full Story

China's 3G wins more favor from foreign telecommunication operators
By: An Bei
China View, Beijing, May 17, 2005 -- China's first homegrown third generation (3G) wireless telecom standard, the TD-SCDMA (time division synchronous code division multiple access) technology, is winning growing favor from foreign investors at the brink of being put into commercial operation this June, according to the Economic Daily on Tuesday. At the 2005 International TD-SCDMA Summit held in April in Beijing, Alcatel Shanghai Bell Co., Ltd. and Datang Mobile jointly demonstrated the solution to TD-SCDMA. "It is a sign of significant progress in China's TD-SCDMA industry to show that the solution is reliable, and that is the basis for it to be put into commercial operation," said Gerard Dega, president of Alcatel Shanghai Bell. ...Full Story

Standards and Your Business

No one has mobile WiMAX standard equipment. [May 17, 2005]
 
Alvarion CEO Zvi Slonimsky, commenting on standards "gun jumping" claims of other vendors

Gun jumping redux: The wireless space seems particularly prone to the practice of “gun jumping”: announcing what are purported to be standards-compliant products before the compliance tests have been developed and distributed to testing centers so that vendors can find out whether their customers will actually be able to use them as promised. When the hype around a new standards-based technology is strong enough, vendors will sometimes not even wait until the standards are done, as reported in the following article. The results are usually unhappy, as initial customers are disappointed, and those vendors that wait may find that the market has become cautious and less ready to trust them when their (actually compliant) products are put onto the market.

Ready for mobile WiMAX
Without a standard there's no equipment
By : Hadass Geyfman
Globes Online, May 17, 2005 -- The WiMAX market has not even been launched yet, but activity surrounding it is gathering pace as the estimated launch date for WiMAX equipment draws closer. One of the questions currently exercising the minds of operators and analysts alike is whether it will be possible to upgrade existing wireless-wireline telecommunications equipment to fixed WiMAX standard equipment, or whether operators will have to replace equipment. The fixed WiMAX standard has already been set, but it is not possible to supply operators with standard compliant equipment because there is no compatibility yet between the equipment of all the different manufacturers. ...Full Story

Regulators meet IT directors: With the increasing movement of government to adopt IT-based tools and solutions has come a need for companies to get on the same bandwagon – or else. The following selection of articles from around the world focuses on the issue of regulatory compliance, using standards-based (and mandated) systems.

Banking Regulators to Launch XBRL-Powered Call Report Database
By: Ivan Schneider
InformationWeek, May 10, 2005 -- The Federal Financial Institution Examination Council (FFIEC) will soon launch a project that will enable federal banking regulators and the public to access a common pool of information about the banks under their supervision. The initiative, known as the "Call Report Modernization Project," revolves around a Central Data Repository containing the quarterly regulatory filings of over 8,400 financial institutions. All of the information within will be "tagged" using eXtensible Business Reporting Language (XBRL), a cross-industry standard for representing financial data. ...Full Story
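For the uninitiated, "tagging" in XBRL means wrapping each reported figure in an XML element drawn from a shared taxonomy, together with the context (which institution, which period) and unit to which it applies, so that software at the regulator can consume filings without human re-keying. The fragment below is a simplified, hypothetical illustration – the "call" taxonomy prefix, element name, identifier scheme and figures are all invented – although the overall instance structure follows the XBRL 2.1 specification:

    <xbrl xmlns="http://www.xbrl.org/2003/instance"
          xmlns:iso4217="http://www.xbrl.org/2003/iso4217"
          xmlns:call="http://example.org/call-report">
      <context id="Q1-2005">
        <entity>
          <identifier scheme="http://example.org/bank-ids">0000123</identifier>
        </entity>
        <period><instant>2005-03-31</instant></period>
      </context>
      <unit id="USD"><measure>iso4217:USD</measure></unit>
      <!-- One tagged fact: total deposits at quarter end, in whole dollars -->
      <call:TotalDeposits contextRef="Q1-2005" unitRef="USD" decimals="0">125000000</call:TotalDeposits>
    </xbrl>

Because every figure in every filing is expressed this way, a regulator (or the public) can query a single fact across all 8,400 institutions without parsing 8,400 differently formatted reports.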

OMG Forms Two Groups to Address Complex Regulatory Compliance Issues
OMG Press Release, Needham, MA, May 10, 2005 -- The Object Management Group(tm) (OMG(tm)), today announced an initiative to help Chief Compliance Officers, IT directors, CFOs and Legal executives address the IT impact of the many regulatory compliance requirements. Global firms are constrained by hundreds of government regulations, ranging from broad governance rules such as Sarbanes-Oxley and the Gramm-Leach-Bliley Act to local privacy laws. Identifying all the appropriate laws for all geographic operating markets - and dealing with inconsistencies - is a daunting task for IT. Minimizing risk while maximizing the ROI on compliance projects requires a standards-based integrated approach to compliance management. "We used to make a big distinction between regulated and unregulated industries, but today, virtually every enterprise is constrained by a combination of governance, privacy, security and environmental regulations. ...Full Story

IT decision-makers not complying with governance and standards is a high risk strategy for UK business
Principia, May 3, 2005 --
According to the latest survey from the National Computing Centre, the 'Benchmark of IT Strategy 2005', 44% of IT decision-makers surveyed admitted not being fully aware of IT standards and legal requirements. The survey, which was conducted amongst 300 IT decision-makers, shows a lack of awareness of the requirement for the IT function and infrastructure to comply with IT standards and legal requirements. Of the 44%, half were only partly aware of IT standards and legal requirements, whilst the other half were neither aware of such requirements nor aware of the impact on IT. Stefan Foster, Managing Director of NCC said, 'This is an alarming figure, indicating significant lapses in compliance and poor adoption of best practice. What is more shocking is the potential impact this will have on businesses who don't comply. The public sector will soon demand compliance as part of the tendering processes through initiatives such as e-Gif (e-Government Interoperability Framework). Larger corporates will also insist on compliance to standards so as to minimise risk in their supply chains, so non-complying IT functions beware ... you could affect the fundamentals of your business.' ...Full Story

Xinhua Finance to Pioneer XBRL Financial Reporting Technology in China
PRNewswire, Hong Kong, May 12, 2005 --
Xinhua Finance (Tokyo Mothers: 6399), China's premier financial services and media company, is set to play a major role in the expansion of the international business reporting language XBRL (eXtensible Business Reporting Language) to the China market through its U.S. subsidiary Mergent, Inc. Xinhua Finance has become the first XBRL Direct Participant Member in China and will be working with regulatory authorities and major financial institutions in the development of Chinese XBRL rules. It will benefit from Mergent's expertise in XBRL, which is already used throughout Mergent's databases and services. ...Full Story

Standards and Society

We want interests and needs of a larger community to be represented in the work that we do. The World Wide Web requires input from the whole world.   [April 20, 2005]
 
W3C representative Ian Jacobs, announcing W3C membership fee discounts for developing country participation

One world, many issues: The potential value of the Internet and the Web to third world countries is undeniable, and many are working in diverse ways to help realize that potential. In the first article below, the W3C has announced that it will make it more financially feasible for organizations representing the interests of developing nations to participate in the development of the Web. The second reports on the results of a conference on how to address issues of multilingualism on the Internet. And the third, less happily, focuses on non-technical impediments to free speech and freedom of access to information: the conviction of a lawyer in Tunis – the site of the upcoming meeting of the World Summit on the Information Society.

W3C cuts member fees to help developing countries
By: Paul Festa
CNETNews.com, April 20, 2005 -- Claiming that its existing fee schedule discouraged such groups from joining, the Web's main standards body on Wednesday said it had given small companies and nonprofits in lower-income countries a 15 percent to 60 percent fee reduction. "We want interests and needs of a larger community to be represented in the work that we do," said W3C representative Ian Jacobs, who added that the fee reductions were part of a larger international outreach effort. "The World Wide Web requires input from the whole world." The W3C's new fee schedule, effective April 1, groups nations into four categories established by the World Bank: high-income, upper-middle income, lower-middle income and low-income. ...Full Story

Multilingualism in Cyberspace Conference Concluded in Bamako
WebWorld, May 11, 2005 -- Essential steps to ensure that a language, that is not yet represented on the Internet, is included in cyberspace, were identified at the conference on "Multilingualism for Cultural Diversity and Participation of All in Cyberspace" that UNESCO and partners organized in Mali's capital Bamako last week. The over 130 participants from 25 countries concluded that there is a need for written national language policies that must address the issue of language in cyberspace. They stressed that standards are crucial to create, access, disseminate and preserve multilingual content in cyberspace, particularly in endangered and lesser-spoken languages. Participants also pointed out that local content is critical to foster a multilingual cyberspace and to ensure that members of all communities can share in the benefits of cyberspace. ...Full Story

Lawyer's trial slated as a "mockery"
Reporters Without Borders, April 29, 2005 -- Reporters Without Borders condemned a "mockery of a trial" in which a lawyer was found guilty of posting "false news" on the Internet and urged democratic countries to boycott the World Summit on the Information Society (WSIS) in Tunis in November 2005 unless Tunisia ended its Internet crackdown and released him. Mohammed Abbu was sentenced overnight on 28-29 April to three years and six months in prison "at the end of a trial that trampled on the most elementary rules of law," said the worldwide press freedom organisation. "The charges against him were baseless. He was really punished for having used the Internet to criticise government corruption," it said. "In a cruel irony, he will be in prison when the WSIS opens in Tunis, in November 2005 - a conference on the circulation of news and information on the Net." ...Full Story

Standards in Action

SCP [Support Center Practices] Certification raises the image of our Beijing, China call center throughout the industry and is an excellent marketing differentiation tool. [May 3, 2005]
 
Dr. Baumin Lee, CIO of 95Info, Inc., predicting that your next tech support call may go to China

Support Center Practices (SCP) Certification Program Extends Global Reach in Asia Pacific
Yahoo.com, San Diego, CA, May 3, 2005 -- Service Strategies Corporation, administrator of the Support Center Practices (SCP) Certification program, today announced that organizations in China, Taiwan, India, Malaysia, Singapore, and Japan have adopted the SCP standard and have achieved SCP Certification over the last twelve months. The world's leading service and support providers use the Support Center Practices (SCP) Certification program as a roadmap for service excellence. "SCP Certification raises the image of our Beijing, China call center throughout the industry and is an excellent marketing differentiation tool," states Dr. Baumin Lee, CIO of 95Info, Inc. "By achieving SCP Certification, our customers are assured that we are committed to maintaining high service levels and quickly and efficiently solving issues." ...Full Story

Standards Are Serious (Aren’t They?)

If we don't know where we are, how will we know when we improve? [May 10, 2005]
 
JSR Genetics' Brian Edwards on the need for "a workable national [artificial pig insemination] standard that would meet the needs of both the studs and producers"

Keeping your eye on the ball: What can we say: when a nation has lost confidence in its supply of pig semen, it’s time to call in the standards experts. And kudos must also go to the farmers that are willing to work with porcine studs to set those standards. But in the absence of adequate sow recording, it appears that challenges face this worthy effort. Stay tuned.

Britain to get 'national standard' for sow productivity
ThePigSite.com, UK, May 10, 2005 -- There was a groundswell of feeling among producers last year that some [swine] infertility problems might be caused by variable-quality purchased semen - so the decision was taken by NPA to work with studs to produce a national AI [artificial insemination] protocol. The intention of this project is to restore confidence in the use of purchased AI and to use this as a springboard to help the industry start to improve sow productivity generally, including farrowing percentage, litter size and litters-per-sow-per-year. The breeding companies suggested an independent expert be employed to help draw up a workable national standard that would meet the needs of both the studs and producers, and as a result BPEX approached respected Dutch specialist Hanneka Feitsma....When the standard is agreed it will be auditable, so producers who buy semen will know they are getting a consistent product....Whilst the breeding companies are keen to work together to ensure producers receive a consistent product, they are also clear that many producers will need to sharpen up their AI standards....The work of the AI national standards team is hampered to a degree by the lack of sow recording these days in the national herd. "If we don't know where we are, how will we know when we improve?" wonders Brian.... ...Full Story

 

 