Consortiuminfo.org Consortium Standards Bulletin - May 2006
May 2006
Vol V, No. 5

BRIDGING THE OPEN SOURCE - OPEN STANDARDS DIVIDE

EDITOR'S NOTE: EVERYBODY'S TALKING (BUT WHAT ARE THEY SAYING?)
 
EDITORIAL: MEETING IN THE MIDDLE
It's easier to focus on what separates open source and open standards rather than on what they have in common. But it's more useful to talk about how the two processes could work together more productively.
   
FEATURE ARTICLE:

THE FREE STANDARDS GROUP: SQUARING THE OPEN SOURCE/OPEN STANDARDS CIRCLE

Open standards must be fixed in order to be useful, but open source software is constantly evolving. And while open standards can prevent vendor lock-in, open source licensing terms guarantee vendors the ability to achieve that end. Is there a way to have the best of both worlds? The answer is yes, and the Free Standards Group is proving it.
   
STANDARDS BLOG: LET THE READER BEWARE AS ODF NEWS COVERAGE INCREASES
As the battle heats up between the OpenDocument Format (ODF) standard and the Microsoft XML Reference Schema (now called "Open XML" by Ecma), there's more and more press coverage. And that's not necessarily a good thing.
   
CONSIDER THIS: THINKING ABOUT STANDARDS INSIDE OF THE BOX

Fifty years ago a humble, standardized commodity product first went into action - and rapidly changed almost everything about global shipping.

 
ANNOUNCING:       

OPEN FORUM FOR STANDARDS DEVELOPERS

Updated information on this follow-up to last year's landmark meeting between representatives of the consortium and accredited standards development communities.
   
NEWS SHORTS:  
ODF Now an ISO/IEC Standard; ODF Journalists Decline to Debunk Disinformation; Sun Says "Yes, But" to Open Sourcing Java While Motorola Open Sources Supporting Tools for Java on Mobile Devices; Semantic Web Play Attracts VC Financing; OASIS Approves Standard to Cut Down on "Techno Babble;" New Consortia Spring Up Everywhere (as do New Initiatives and New Standards); and, as always, much more.



 

EDITOR'S NOTE:

EVERYBODY'S TALKING, BUT WHAT ARE THEY SAYING?

IT vendors these days are in love with the phrase "open source and open standards" (as in "you should buy our open source/open standards-based solutions"). 

If you haven't already noticed that fact, here's a little experiment to try that will make the point:  Google the phrase "open source" AND "open standards."  I just did, and got 4,700,000 hits, with 284 leading to recent news articles.  Odds are, though, that few of those hits would take me to a knowledgeable discussion of how open source and open standards can and should work together.

In fact, while open source and open standards each work well in isolation, those who create them haven't always played very well together when they find themselves in the same sandbox.  As those 4,700,000 hits suggest, however, there's a great need for both sides to learn how to optimize the relationship between open standards and open source.  And that's the theme of this issue of the CSB, revisiting a topic I last covered in detail in the March 2005 issue.  That issue was called What Does "Open" Mean? and my editorial then was titled A Call for Communication Between Communities.

I think that there's been some progress since then on the communication front, with many on each side now recognizing the need to coordinate their separate activities.  But the logistics of that coordination remain challenging, given the differences between these two very different methods of addressing what are nevertheless in some respects the same problems.

In this month's Editorial, I therefore explore some of the technical and social reasons why open source and open standards can sometimes be a difficult fit, but point out that both the open source and the open standards communities – not to mention end-users – are becoming increasingly interdependent.  The result is the need for each camp to adopt a spirit of partnership with the other, to work out the type of solutions that will benefit all.

The Feature Article this month spotlights just such a partnership between the worlds of open source and open standards, formed to address an issue of great importance to all IT users: the potential for Linux to fragment, leading to a replay of the "Unix Wars" of the past.  That partnership is manifested by the Free Standards Group (FSG), a consortium that brings together representatives of the open source community, vendors and others to set standards that enable multiple, compliant Linux distributions to interoperate with applications, and therefore allow end-users to avoid lock-in to a single Linux distribution. 

That article includes a lengthy interview with Jim Zemlin, the Executive Director of FSG, and Ian Murdock, its CTO and the creator of Debian, a leading free Linux distribution.  By way of disclosure, I'm proud to be a director of FSG, which is also a client of my law firm.

My Standards Blog selection for this month focuses on a different sandbox situation: the ongoing competition between the proponents of the OpenDocument Format (ODF) specification (now an ISO adopted standard), and the Microsoft XML Reference Schema, now called Open XML by Ecma, a European standards group that is fast-tracking it for submission to ISO as well.  As this story has grown increasingly prominent, it has received increasing attention in the press, but not all of the stories that have resulted have been well reported.  In some cases, the reporting has been merely careless, while in others it has served to help spread disinformation, resulting in my admonition to Let the Reader Beware as ODF Coverage Increases.

My Consider This… piece for May covers a happier topic, celebrating the 50th anniversary of a little-appreciated, commoditized product: the humble yet ubiquitous shipping container, whose usually battered image belies the fact that its standardized features have enabled the transformation of the global shipping industry.

The issue closes, as usual, with the Rest of the News – a collection of what I thought were some of the most significant and interesting stories of the past month, selected from those that I posted on a daily basis at the Consortiuminfo.org Standards News Portal.

Finally, please note the updated information regarding a June 20-21 Conference in New York City that will bring together representatives of all types of standard setting organizations to discuss matters of mutual concern.  The list of confirmed attendees is already large and growing, but there's still room for you to participate as well, so consider registering to do so today.

As always, I hope you enjoy this issue. 
    Best Regards,
  Andrew Updegrove
  Editor and Publisher
   

2005 ANSI President’s
Award for Journalism


 

EDITORIAL

MEETING IN THE MIDDLE

Andrew Updegrove

Few phrases appear together more often in technology news today than "open standards" and "open source."  As often as not, these words are used by vendors and service providers in materials promoting their wares.  In general, that's a good thing, because it indicates that the marketplace is associating value with open standards and open source software – a perception of value that vendors and service providers wish to borrow upon when they associate these phrases with their offerings.

But it's also a bad thing, for several reasons.  One is that these phrases are too often used to describe tools and environments that are not actually "open," or that do not in fact achieve interoperability.  The obvious danger arising from such loose usage is that confidence in open architectures and systems may be undermined.

But there is a second, and more difficult challenge to be addressed before the full potential of systems based upon open standards and open source software can be realized.  That challenge is the fact that the relationship between open standards and open source software is still being negotiated, and I use that term advisedly.  For example, many "open" interoperability standards are subject to the right of patent holders to require implementers to pay royalties and sign non-transferable licenses, thus rendering such standards unusable in open source settings.  In consequence, standards subject to such restrictions are very much "not open" in the eyes of the open source community. 

On the other hand, the open source community has not yet taken advantage of the value that open standards can provide for its own work product, where the absence of licensing terms that restrict rights to make derivative works enables the kind of "forking" of open source software that may greatly decrease its usefulness.  The proliferation of such multiple variations of the same software may eventually lead to the capture of crucial open source software by proprietary vendors that create "sub-brand" distributions.  This can occur if such a distribution requires that other software needed by the end-user be adapted to the requirements of the sub-brand.  Once this occurs, that software may not be interoperable with other distributions, requiring costly relicensing or patching if the customer wishes to later move to a different distribution. 

The result of this kind of capture would be the type of "lock in" of customers on specific versions of open source software that already exists in the realm of proprietary software.  Linux itself may be at risk of suffering just such a fate, given the fact that vendors are free to differentiate the distributions that they build around the same Linux kernel, and thereby encourage potential customers to buy their support services and add-on software.  If end-users lose the option of making easy migrations from one Linux distribution to another, open source Linux software begins to look "not open" to those that create and use open standards.  Ultimately, this could lead to a destructive replay of the "Unix Wars" of the recent past.

In an ideal world, every customer should be able to assemble an IT environment comprising whatever mix of open source and proprietary software and hardware it finds most suited to its needs, and open standards would ensure that all of these elements would work together harmoniously.  Moreover, that customer could swap new elements in and out of its systems without concern or costly patching, and ISVs could avoid costly porting to multiple platforms. Such an "open architecture" would dramatically decrease both the costs as well as the risks of ownership. 

Today, much of the potential value of such open architectures remains both unrealized and at risk, although there are enough pieces in place to demonstrate the value of achieving such an interoperable nirvana.  Ways need to be found to close the gaps that remain, so that customers can gain the cost, flexibility and other benefits of open source software while enjoying the range of product options and pervasive interoperability benefits that open standards can offer.  Finding a way across the divide that in many ways separates the proponents of open source software from those that advocate open standards, however, will be challenging.

At the heart of the problem lie a number of seemingly insurmountable differences between open standards and open source.  The first is that standards describe certain attributes of things, whereas open source software is the thing itself.  Another is the reality that open source is not only a thing, but also a set of strict licensing requirements, and these requirements are both more rigid as well as more demanding regarding the intellectual property rights (IPR) of technology developers than are those of traditional open standards.  Yet another difficulty arises from the fact that one required open source licensing term guarantees the licensee the right to change the software in question in any way that the licensee wishes, while the value of open standards relies on requiring that the standardized aspects of the subject of the standard do not change at all.

As disparate, and even mutually exclusive, as some of these differences seem to be, there are ways to bridge the gap, if both the open source as well as the open standards communities are willing to work together.  An example of such a successful collaboration can be found in the Free Standards Group, a standards development consortium that works in real-time with the Linux community to create standards to prevent the forking of Linux and other open source software.  Each side voluntarily concedes a bit of freedom in order to achieve mutual goals, to the benefit of all.

In order to achieve the reality of open architectures, then, the following commitments are needed from both the open source and the open standards communities:

1.  Each community needs to take the "must have" requirements of the other community's regime as a given.

2.  Vendors must conclude that the value of the sales opportunities that they will gain from the proliferation of open architectures exceeds the economic value of the IPR that they agree not to assert.

3.  Most importantly, both communities must enter into a spirit of partnership, based upon the realization that only through working together can the reality and promise of pervasive open architectures be achieved.

The benefits of such a partnership are clear.  It's time that proactive, visionary leaders on both sides of the open source – open standards divide begin building bridges across that gulf, so that together they can create the open architectures of the future.



Copyright 2006 Andrew Updegrove


 

FEATURE ARTICLE

THE FREE STANDARDS GROUP:
SQUARING THE OPEN SOURCE/OPEN STANDARDS CIRCLE

Andrew Updegrove

Abstract: The creation of open standards for software and the development of open source software seem as different as night and day, due to the great differences between the two end products: the former describe features of, and interfaces between, software, while the latter is the software itself. Again, the utility of the former relies upon mandating that corresponding elements of compliant products remain unchanged, while a feature of the latter (guaranteed by licensing terms) is that it may evolve on a constant basis. But achieving the full value of open standards and open source software in the modern information technology world requires that open source software support open standards, and therefore that these very different tools be developed in a coordinated fashion – a seemingly insoluble dilemma. The following interview with the Free Standards Group, a voluntary, consensus-based non-profit organization formed to create standards for Linux and other key software in the open source development "stack," demonstrates how this difficult goal can be achieved, based upon novel techniques and the agreement of both constituencies to coordinate their activities in a real-time, collaborative fashion to the mutual benefit of those involved, as well as to the end-user community.

Introduction: Open source software (OSS) and open standards share tantalizing similarities and frustrating differences. The similarities include the promise (for end users) of a greater range of less expensive products, and (for vendors) reduced research and development costs, lower risks, and larger and faster-emerging markets. Unfortunately, the differences include historically incompatible licensing regimes and profoundly different attitudes towards the dynamism of software.

During the early years of the open source revolution, these differences caused few problems.  This was because the first OSS projects created discrete, stand-alone programs, and few software standards had historically been encumbered by licensing terms that were incompatible with open source licenses.  In recent years both of these situations have changed: increasing numbers of OSS products have been developed, and more and more patents have been asserted against key standards as well as OSS.

The most prominent example of pervasively deployed OSS is the "LAMP" Web server stack, which comprises the Linux operating system, Apache Web server, MySQL database management system, and the Perl, PHP and Python scripting/programming languages.  The wide adoption of the LAMP stack by major enterprise customers has helped credential the commercial viability of community developed software, and accelerated the development and uptake of other open source programs.  The increased credibility of such OSS has also inspired major vendors, such as IBM and (more recently) Sun, to base more of their strategic direction on the provision of services based in part on OSS, rather than on the sale of their historically proprietary products.

As already noted, however, there have been increasing patent challenges to both OSS and open standards.  The ongoing litigation by SCO, which claims (among other things) that elements of its proprietary software have been wrongly included in Linux, cast a temporary shadow over that operating system and OSS in general.  Additionally, so-called "trolls" have asserted their patents against implementations of a variety of standards after those standards became widely deployed.

The result has been a heightened sensitivity to intellectual property rights (IPR) on the part of both IPR owners as well as users of OSS and open standards.  This has resulted in a rude shock for the open source community, which has increasing needs to utilize open standards in order to integrate OSS interoperably into mixed end-user environments.  Because many of those open standards are created under traditional "RAND" terms (i.e., "reasonable and non-discriminatory terms"), this may be impossible, because RAND terms allow IPR owners both to charge royalties and to forbid sublicensing of the rights conveyed.

There is a second challenge to OSS, however, that has been less recognized.  That challenge arises from the dynamism of OSS itself.  At the community level, OSS is under constant, real-time evolution.  And at the user level, OSS licensing terms permit the making of whatever changes are desired, whenever the user so desires.  Open standards, in contrast, only work when certain elements of a software program are kept constant, or are only allowed to change in ways that would not jeopardize interoperability. 

Of course, an open source community can make it a priority to maintain interoperability, and simply avoid making changes that would break compliance with a desirable open standard, and an end-user can make the same decision.  But what if the community is not so motivated, or puts a higher value on innovation in some area than maintaining compliance with the standard?  And similarly, what if the developer of the standard decides to permit inclusion of IPR in its next release of the standard, and the owner of that IPR insists on requiring a license that is incompatible with open source licensing requirements?

The tolerance of open source licenses for the creation of derivative works that vary from the original causes a second problem that can undercut the value of open source software itself.  This can arise when multiple distributions of the same OSS are each changed in ways that require the developers of other software to tailor their own wares to the unique requirements of each distribution.  The result is that the owner of one such OSS distribution can no longer easily migrate to another nominally equivalent distribution without relicensing or patching its other software.  In short, the end-user has become just as "locked in" by the vendor of its OSS distribution as it might have been if it had licensed a proprietary alternative.

This is already happening in the Linux environment, where the distributions of Linux continue to multiply, each optimized in one way or another for a particular use, or to increase its appeal to commercial users.  If left unchecked, the result would be a replay of the fragmentation experienced by Unix in the past, and the loss once again of the benefits to end users of a marketplace based upon multiple versions of a common operating system, none of which requires the end-user to lock itself in to the software and services of a single vendor.

The way to avoid this result is once again through the development of standards.  Using Linux as an example, the standards create a layer between the operating system OSS and the applications that run on top of it.  If each distribution vendor agrees to support the standard, then an end-user may change distribution vendors at will, with little additional cost associated with making the change.

In principle, the creation of such standards sounds simple.  But in practice, there are challenges, because the open source community that is creating the OSS must buy in to the value of the open standards.  If it does not, then it can evolve the OSS in ways that break backwards compatibility.  Similarly, the developer of the standards must commit to continuing to track the development of the OSS wherever it goes, and its members must agree to forego royalties and restrictive licensing terms with respect to any patent claims that might be infringed by the developer's standards.

In short, things are not so simple, since both the OSS community and the standards developer must be aware of the constraints and goals of the other in order for the combination of the OSS and the OSS standards to have optimal value at minimum compromise. 

Moreover, the role of the standards developer and the software developer are reversed in time.  Rather than developers creating software to implement a completed standard, the developer of the standard is creating a standard to keep pace with the ongoing development of the OSS – an altogether different and more challenging proposition.

The only way to achieve this result is for the two processes to proceed in parallel, and with a degree of communication that would not be necessary in traditional standards development.  A partnership of sorts is therefore required, in which each participant recognizes the value of the other, and agrees to work in cooperation to achieve results of mutual benefit.

Today, there is a single example of such a novel partnership, and happily that partnership is succeeding.  Not surprisingly, it was created to first address the danger of Linux fragmentation, and its name is the Free Standards Group (FSG).  Given the fact that the risk of forking and lock-in is common to all OSS, the FSG example is worthy of study, and of emulation in any situations that the FSG does not itself plan to address.

The following interview is intended to showcase how the FSG has creatively bridged the historical divide between OSS and open standards and is succeeding in guarding Linux from suffering the fate of Unix.  [Disclosure:  the author is a director of the FSG, which is also a client of his law firm.]

The interview:  This interview was conducted in May of 2006 by email with Jim Zemlin, the Executive Director of the FSG, and Ian Murdock, FSG CTO, chair of the FSG Linux Standards Base workgroup, and creator of the Debian GNU/Linux free Linux distribution.

I.   Overview Questions

1.  How did the FSG come about, and what was its original goal?

The Free Standards Group was formed in 1998 to promote open source software through standards. Specifically, it was formed to prevent the fragmentation of Linux. The participants in the Linux ecosystem realized early on that an open standard delivering application portability was absolutely necessary to prevent the fragmentation of Linux. Out of this concern, the community banded together to form the Free Standards Group (FSG), a standards body tasked with developing open, international standards that would deliver on the vision of binary portability within a competitive Linux distribution ecosystem.  Basically, we are trying to do for Linux what Unix tried to do and failed, resulting in Solaris, AIX, HP-UX, Novell, etc.

2.  How has that goal changed over time, and how would you phrase the mission of FSG today?

The goal has not changed, nor has the mission of the FSG. As the market for Linux has expanded and its importance to the industry has risen, so has the importance of the work of the FSG. 

3.  Please describe the LSB – what it standardizes and why these particular elements are important.

The Linux Standard Base (LSB) is an application binary interface (ABI) for Linux and Linux-compatible platforms. The LSB draws on the IEEE POSIX standards and The Open Group's Single UNIX Specification as source standards for many of its behavioral interface definitions. Some interfaces are not included in the LSB, since they are outside the remit of a binary runtime environment; typically these are development interfaces or user-level tools. The LSB also extends the source standards in other areas (such as graphics) and includes the necessary details, such as the binary execution file formats, to support a high-volume binary application platform. It includes a written binary interface specification, a set of test suites for both distributions and applications written to the standard, and a sample implementation for testing purposes.

Since it is a binary specification, the LSB is divided into both general and processor-specific components.

The LSB Specifies:

Common Packaging and Install Guidelines

Common Shared Libraries and their Selection

Configuration Files

File Placement (FHS)

System Commands

ABIs for System Interfaces (both application and platform level)

We expand our platform standard over time as the nature of the use of that platform changes.  This happens as new runtimes above the OS are introduced, or as new performance enhancements are made to the underlying architecture.  The FSG concentrates its efforts in the following areas:

  1. Developing and improving existing standards
  2. Developing and implementing testing and certification programs in support of its standards
  3. Conducting outreach and education campaigns to encourage ISVs to target the Linux platform, providing technical support and resources
  4. Enforcing the LSB brand with compliant distributions and applications
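
[Editor's note: to make the foregoing concrete, here is a minimal sketch of what "targeting the LSB" looks like from a developer's chair. The build command assumes the compiler wrapper distributed with the FSG's LSB development kit; treat the wrapper name and its exact behavior as assumptions for illustration, not as part of the specification itself.]

    /* hello_lsb.c - a minimal application confined to interfaces the
     * LSB specifies (here, only the stdio portion of the core C
     * library ABI).
     *
     * Assumed build command, using the LSB SDK's compiler wrapper,
     * which links against LSB stub libraries so that any reference
     * to a non-LSB symbol is flagged at build time:
     *
     *     lsbcc -o hello_lsb hello_lsb.c
     */
    #include <stdio.h>

    int main(void)
    {
        /* printf() is part of the standardized runtime, so a binary
         * built this way should run unmodified on any distribution
         * certified to the corresponding LSB version. */
        printf("Hello from an LSB-conforming binary\n");
        return 0;
    }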

4.  To whom is the success of the LSB important, and why?

The success of the LSB is important to the entire Linux ecosystem, including end users, ISVs, distribution vendors, systems vendors, and open source developers.

  • End Users: The LSB is important to end users because it preserves choice by achieving interoperability at the Linux OS platform layer.  End users have always wanted to "make it as easy to get out of a particular platform as it is to get into one if I am not getting the support or innovation I require."  As with any standardized component of a technical solution, a choice among compatible implementations reduces cost, allows for easier support, and provides for a long-term support ecosystem that can be counted on.  Specific users of Linux include government agencies who mandate choice in their procurement policies, large financial services companies who use cutting-edge applications on Linux, and small to mid-market users who want to look to a "Linux Standard" mark to indicate compatibility rather than resorting to guesswork around what runs on which version of Linux.
  • ISVs: Making it easier for applications to target the Linux platform is very important for the entire Linux community; this includes Linux distribution companies, hardware vendors and software vendors. The success of the LSB is also important for end users who would like to see a healthy set of applications for the Linux platform without being locked in to a single Linux distribution. Without a widely supported binary standard for Linux, a single-vendor, de facto standard will emerge, effectively removing choice and locking end users in. 
  • It is important to understand platform market forces in order to appreciate the importance of our standard.  An operating system is only as strong as the applications that run on top of it. While Linux presents unique challenges to software developers (including multiple distribution targets), it also affords them a tremendous market opportunity as the Linux systems vendors grow a massive market based on the shared invention of the open source platform. The Linux Standard Base (LSB) was created to eliminate much of the heavy lifting required of software developers (ISVs) targeting multiple platforms today. In other words, the LSB enables ISVs to cost-effectively target the Linux platform, reducing their porting, support and testing costs while achieving a global market for their applications.
  • Distribution Vendors: Organizations such as Red Hat, Novell, Debian, Ubuntu, Red Flag Linux in China, Turbolinux in Japan and others.  Linux distribution vendors benefit from the LSB in the following manner: First, it enables them to collectively grow the ISV ecosystem for their platform. Second, it allows them to fulfill a promise of "choice" to their customers – something not taken lightly in the value proposition of the Linux brand.  Third, it allows them to share the burden of developing and supporting a mission-critical OS across a community while preserving interoperability and backward compatibility.  The Linux distribution community has found multiple ways to compete around this standard without "commoditizing" their business.  In fact, by profiting from support and services rather than locking customers into a single solution, the Linux vendor community has made substantial inroads in the market against Microsoft and Unix.
  • Systems vendors:  Companies such as IBM, HP, Intel, AMD, Dell and others are looking to provide their customers with a choice of solutions at the highest price/performance level they can offer.  While these organizations vary in business models and often have units that cross the spectrum of technical solutions, they are increasingly becoming high-level solution providers and trusted advisors to their clients.  In this world, shared support and development for a key underlying component of their business offerings is a highly valuable proposition.  The existence of a standard which enables them to partner with a choice of providers of Linux systems is essential to preserving this choice.
  • The open source community:  The open source community is deeply concerned with the concept of freedom of choice in computing.  To them, this freedom comes from access to the source code.  However, the community is becoming increasingly savvy to the idea that source code alone will not deliver the freedom it desires: a standard that enables islands of code to interact, and in turn grows the network of users of that technology, will eventually increase the number of collaborators in the shared development model. 

5.  What is the current status of LSB adoption, implementation and certification?

LSB adoption can be characterized as "at the beginning of a great adoption curve."  Development of and compliance with the core standard is complete, and through the efforts of the FSG, the Linux Standard Base has made great strides in the past 24 months, including the following key accomplishments:

  1. LSB 3.0 launched in 2005, resolving key issues between major distribution vendors (including key C++ libraries) and resulting in the support of all major distribution vendors
  2. LSB 3.1 launched in 2006, including support for portable desktop applications for the first time. All major distributions (including Asianux, Debian, Novell/SUSE, Red Hat, and Ubuntu) have committed to certifying to LSB 3.1.
  3. The LSB workgroup now has a steering committee containing key stakeholders of the community, including senior engineering resources at Red Hat, Novell, Ubuntu and major ISVs. This directly translates into alignment of the LSB with the major distributions’ release schedules.
  4. The Free Standards Group board of directors now reflects key stakeholders of the Linux ecosystem with senior members from such companies as HP, IBM, Fujitsu, Intel and Novell
  5. For the first time, major ISVs such as CA, Veritas, Oracle, MySQL, BakBone and others have either joined the FSG or given their public support to the standard
  6. We have engaged a short list of "masthead" ISVs with the goal of getting their applications certified to the LSB, and RealNetworks has already agreed to certify RealPlayer, its widely used media player.
  7. New memberships have increased by 70 percent over the last year, including the addition of over a dozen ISV members where there was previously no ISV participation at all
  8. Funding has nearly tripled over the last 24 months, allowing the organization to hire a professional management team
  9. The Chinese Government has become a certification authority for the LSB and is using the LSB as the base of its emerging national standard for Linux; efforts in India are underway
  10. The Department of Defense is now mandating LSB compliance in its procurement contracts; other end users are planning to do the same
  11. The FSG has experienced dramatically increased visibility and awareness of the standard through new marketing efforts, including features in The Wall Street Journal, Business Week, USA Today, eWeek and the Associated Press
  12. The organization is now staffed with an experienced management team, including Jim Zemlin as executive director and Ian Murdock, creator of Debian, as CTO and workgroup chair of the LSB

6.  What other standards will be important to complete in addition to the LSB?

The FSG currently works with other development groups like freedesktop.org to incorporate their technical work into the official standard for Linux.

7.  Is the FSG unique, or are there other open source standards groups in existence?

The FSG is the only standards body devoted to open source. The unique nature of open source standards (standardizing the downstream implementation of disparate upstream projects) lends itself to a unique standards body which can effectively work with all members of the open source community.

8.  The LSB was recently adopted as an ISO standard.  What will that mean for the LSB?

The LSB was approved as a Publicly Available Specification (PAS) by ISO/IEC (the International Organization for Standardization and the International Electrotechnical Commission), and is published as International Standard ISO/IEC 23360.

ISO approval shows the world that Linux is a serious, mainstream operating system and a worthy companion to POSIX systems. It provides a benchmark between procurement and vendor, preserving healthy competition without allowing fragmentation of the market. Standards have been shown to contribute more to economic growth than patents and licenses combined, and the LSB will open the door to Linux as a requirement in large scale (e.g. Government) procurements. The approval of the LSB also makes it easier for individuals, companies and governments to concentrate their efforts on one unified program.

II.  Structure

1.  What interest groups are represented in the FSG today, and why is their active participation important?

There are four main groups who are represented in the FSG today.

  • Distribution vendors. The distribution vendors are the enablers of the standard. Without their participation, the standard cannot achieve any success. And without their participation in the creation of the LSB, their support for it would be unlikely.  As of April 2006, all major distribution vendors have pledged to certify on LSB 3.1, and senior representatives from each of them are part of the steering committee of the LSB workgroup. They will be attending the LSB Summit in June 2006 in Boston and helping to shape the roadmap of the LSB.
  • ISVs. They will implement the specification in their development efforts and use the support provided by the FSG, eventually certifying to the standard.  Not surprisingly, the Free Standards Group has also made good progress in obtaining early support from many key ISVs: Oracle, BakBone, VERITAS, IBM Software, Novell, Levanta, RealNetworks, MySQL, Hyperic and many others have recently engaged with the LSB in various forms (participation in the steering committee and LSB summit, certification, roadmap input, etc.)

The Free Standards Group recognizes that different issues face ISVs of different sizes and is structuring its messaging and ISV outreach efforts accordingly. While larger ISVs have the clout to mandate which Linux platforms their customers must use, the situation is often the reverse for the smaller ISVs, with the larger customers mandating which Linux platforms the ISV must support (i.e., "we've standardized on Fedora Core 2, so you'll need to support that if you want our business"). Of course, those platforms will invariably differ between customers. Ironically, then, the ISVs that are least in the position to support a multitude of Linux distributions are the very ISVs that have little choice but to do so. For smaller ISVs, then, the LSB provides substantial benefit, because it allows them to support a wide variety of distributions with a single build; and while the LSB may not provide 100 percent assurance of portability, less than 100 percent is often sufficient for the small ISV (not to mention better than they would be able to do on their own for any given distribution, given their often limited resources).

For larger ISVs, the value proposition of the LSB is substantially different, as the larger ISVs will never see validating to anything less than a full implementation as viable. However, given the diversity of the Linux ecosystem and the disproportionately large potential upside available to the larger ISVs, these larger players are often challenged to find a way to support regional- or market-specific Linux variants. They still would like to broaden the market for their Linux products while minimizing the cost (and risk) associated with supporting additional platforms, which is necessarily much greater than for the small ISVs. For example, while Oracle formally supports only Red Hat, Novell/SUSE, and Asianux, it makes clear that its products work on Debian too (because many regional variants are Debian-based), though use on Debian is officially unsupported. Here, the LSB provides the larger ISVs with a clearer way of offering some assurance of functionality with the "long tail" of distributions but also makes clear that this assurance is less than full support.

  • End Users. End users are the trailing adopters of the standard, yet they can assert significant influence on both distribution vendors and ISVs. End users need to see the LSB as a form of risk management in their Linux strategy; theirs is the demand that fulfills the promise of choice. There is proof that acceptance has begun with early adopters: a handful of large Fortune 500 companies, as well as the Department of Defense, have stipulated LSB compliance in their procurement policies, license and support contracts.  More need to follow.
  • Open Source Community. The open source community represents an amalgamation of software projects which are integrated into a single computing solution. A fundamental flaw in the open source development model is the lack of coordination across projects: there is good internal coordination within projects, but not across them. Because a Linux distribution is made up of a broad set of disparate projects, those projects need to be coordinated effectively to achieve interoperability and backward compatibility across distributions. It is important that the maintainers of these projects remain involved in the LSB process, track the downstream use of their software, and give feedback on the roadmap of the downstream standard. It is also important that they are aware of existing computing standards such as the LSB so they can work in a cooperative fashion to accelerate the adoption of their technology. The FSG, through the LSB and its related activities, is the best type of organization to coordinate these efforts. Currently, key members of the community are involved in the LSB workgroup and will be attending the LSB summit.

2.  How is the Board structured, and why?

Currently the Free Standards Group has three classes of membership. Financial commitments vary, according to the type and level of membership:

  1. Individual Membership: Open to any person who is interested in supporting free and open source software.
  2. Nonprofit: Open to any registered nonprofit organization that is supporting free software and open source software. This class of membership is also open to educational institutions.
  3. Corporate: Open to any commercial entity engaged in the production, manufacturing, support, development, or sale of products supporting free and open source software.

In many ways, the Free Standards Group is structured much like consortia that traditionally create open standards. It has one major distinction, however, since it has to balance the needs of the open source community instead of purely being structured to advance corporate interests. Board members are elected by members and represent each of the three classes of membership. Directors are elected for a two-year term. 

3.  In what ways is the structure of FSG different from a typical open source project?

The task put before the Free Standards Group is a complicated one: it must bring “cathedral” structure to the open source “bazaar.” In order to be relevant to the distribution vendors and to ISVs, deadlines must be met and the specification must be current, timely and up-to-date. The FSG is closer in many ways to a commercial company than to an open source project. An open source project can iterate its software early and often, constantly updating and adding features as implemented. A standards body, however, must have predictable, publishable release cycles that are in synch with the distribution and software vendors who implement the standards. In order to meet these demands it also cannot be staffed 100 percent with volunteers. Key technical leadership and project management/consensus building roles must be full time FSG staff.

4.  How many paid staff are there, and what are their roles?

There is an executive director (Jim Zemlin) responsible for the overall direction of the FSG, and a CTO (Ian Murdock) who is responsible for the technical direction of the FSG and who also chairs the LSB workgroup.  The organization has a director of marketing/communication (Amanda McPherson) responsible for marketing, communication and outreach activities. The organization currently employs one engineer and a staff in India, and plans on adding more for core LSB work.

5.  Approximately how many individuals are involved at the technical level, and what do they do? 

There are currently over a dozen individuals devoted full time to the technical work of the FSG, with many more involved on a part time basis. These people are employed by such member companies as Intel, IBM, and HP, and work on the specification, test suites and certification programs.

III.  FSG – Linux Interface

1.  Which open source projects does FSG actively engage with?

Primarily the Linux distributions but also many of the constituent projects, particularly if those projects provide a platform that developers can target that could benefit from better integration with the broader Linux platform. Good examples here include the GNOME and KDE desktop environments. Each of these desktop environments is a platform in its own right, but a desktop isn't much use unless it is well integrated with the operating system underneath. Furthermore, ISVs targeting the Linux desktop ideally want to provide a single application that integrates well regardless of which environment happens to be in use.

2.  How does FSG work with the Linux development team and the Linux process?

Actually, the LSB doesn't specify the kernel--it only specifies the user-level runtime, such as the core system libraries and compiler toolchain. Ironically, then, the _Linux_ Standard Base isn't Linux specific at all--it would be entirely possible (and probably not altogether hard) for Solaris to be made LSB compliant. The LSB is entirely concerned with the application environment, and the kernel is usually pretty well hidden at the application level.
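
[Editor's note: a small illustration of the point that the LSB operates at the application level. The program below never deals with the kernel directly; it asks the runtime linker for a shared library by the kind of standardized soname the LSB fixes. Which libraries are covered varies by LSB version, so the specific soname here is an assumption for the sake of the sketch.]

    /* soname_demo.c - a binary standard such as the LSB pins down the
     * sonames of the shared libraries an application may depend on,
     * so loading a library by that name should work across compliant
     * distributions regardless of the kernel underneath.
     *
     * Assumed build command:  cc -o soname_demo soname_demo.c -ldl
     */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        /* "libz.so.1" is the soname form a binary standard fixes;
         * whether zlib is within the standard's coverage depends on
         * the LSB version in use (an assumption here). */
        void *handle = dlopen("libz.so.1", RTLD_LAZY);
        if (handle == NULL) {
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }
        printf("Loaded zlib by its standardized soname\n");
        dlclose(handle);
        return 0;
    }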

3.  Does the Linux community participate in FSG as well?

Yes, though most participation comes from engineers that work for the various companies that have an interest in Linux (Intel, IBM, Novell, HP, Ubuntu, etc.). However, there's nothing particularly unusual about that. Most open source development these days is done by commercial interests, not by college students working out of their dorm rooms, which seems to be the common perception. (Of course, a lot of it starts there, but the best developers eventually figure out how to get paid to do it.) Whether you're interacting with paid engineers or unpaid volunteers, though, a key to success in the open source community is getting the right people to buy in to what you're doing and, ideally, getting them to participate. In general, the FSG mission resonates well with the open source community, and we have little difficulty getting that buy in and participation.

IV.  FSG – Linux Dynamics

1.  I've heard you describe the relationship of the open source and open standards processes in "upstream" and "downstream" terms.  Given that open source development is "real time" and ongoing-release, while standards have traditionally operated on a fixed basis, with nothing changing for a period of time, how do you make this work?

One way to understand this is to look at the attributes of a successful open source project.  Success is relative to the number of developers and users of a particular set of code.  Apache is a good example.  As the community iterates code with similar functionality, for example a web server or a C compiler, the participants end up aligning themselves around one or, in some cases, two projects.  Smaller projects tend to die.  The ones that succeed then join the many other packages that are integrated into a platform such as Linux.

The trick in standardizing, then, is to decide which snapshot in time - which interfaces from those packages at that point - will guarantee interoperability.  By coordinating with these disparate upstream projects on which versions of their code are likely to be broadly adopted downstream by the distro vendors, we provide a framework for those working both upstream and downstream.  In the case of the Linux distros, we help them cooperate in order to bring meaning to the term "Linux" in terms of the type of interoperability that is commonly expected of an operating system platform such as Windows or Mac OS.

This effort requires ongoing awareness of the spec development process itself, both upstream and downstream, and a rapid feedback framework for all parties.  It also requires a coordinated parceling of the testing efforts to the appropriate sub-projects.  In other words, we are applying the bazaar method of open source coding to the development of standards.  That is how the community plays, and we are a part of that community.
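
[Editor's note: the "snapshot of interfaces" idea can be illustrated with a toy conformance probe. A real LSB test suite is vastly more thorough; the three symbols and the glibc-specific soname below are illustrative assumptions only.]

    /* iface_probe.c - a sketch of freezing a snapshot of interfaces:
     * the standard enumerates symbols that must resolve at run time,
     * and a probe verifies that the platform's C library still
     * provides every one of them.
     *
     * Assumed build command:  cc -o iface_probe iface_probe.c -ldl
     */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        /* An illustrative subset; a real standard lists thousands of
         * interfaces.  "libc.so.6" is a glibc-specific soname assumed
         * here for the sake of the sketch. */
        const char *required[] = { "fopen", "printf", "qsort" };
        void *libc = dlopen("libc.so.6", RTLD_LAZY);
        unsigned i;
        int missing = 0;

        if (libc == NULL) {
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }
        for (i = 0; i < sizeof(required) / sizeof(required[0]); i++) {
            if (dlsym(libc, required[i]) == NULL) {
                fprintf(stderr, "missing interface: %s\n", required[i]);
                missing = 1;
            }
        }
        dlclose(libc);
        return missing;
    }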

2.  At the process level, what other aspects of open source development are most problematic for standard setting, and vice versa?

Before answering that question, there's one very important thing to understand about the FSG, and that's that we don't define standards in the same way that a traditional standards body defines standards. And that's just the nature of the beast: The open source community is vast, complex, amorphous, and continually in motion. It's also an integral part of what we do. So, the FSG by nature isn't just a well-defined consortium of technology vendors that can define things unilaterally. It's a well-defined consortium of vendors, certainly, but it's also more than that, in that the vast, complex, amorphous, continually moving open source community needs to be represented at the table. In a lot of ways, what we're doing at the FSG, namely bringing together open standards and open source, is unprecedented.

Clearly, our interactions with the open source community affect the processes we use to build the LSB and our other standards. We can't just say "this is the way things are" the way we'd be able to do if our constituency was smaller and more self-contained. Instead, the way we define standards is far more about consensus building and observation--we watch what's happening in the open source community and industry and track what's emerging as a "best practice" through natural market forces and competition.

One of the challenges of the LSB project, then, is understanding what technologies have become or are becoming best practice, so that we can begin the process of incorporating those technologies. Another challenge is dealing with a moving target--after all, although the process of defining the standard is different, at the end of the day, the standard has to be every bit as precise as, say, a plumbing specification, or it won't guarantee interoperability. Fortunately, we already have a model to follow here, namely the Linux distributions, which perform the analogous task at the technology level by assembling the various open source components into a cohesive whole.

So, our task essentially boils down to tracking the technologies that ship in the majority of Linux distributions, and in building a layer of abstraction, a metaplatform of sorts, above the multiplicity of distributions so that application developers can target a single, generic notion of Linux rather than each distribution individually.

We also work to increase participation in the ongoing development of the  standard and to facilitate collaboration among the key stakeholders to more rapidly reach consensus around the best practices. The goal here is to capture in the LSB roadmap not just what exists in the current generation of the major distributions, but what's coming in the next as well. After all, ISVs developing Linux applications today will often see the next generation as a primary target.

3.  What compromises (technically and process-wise) have the Linux and FSG communities had to make in order for the LSB to be practical, while not impeding the work of either side?

The biggest challenge in what we do is probably no different than in any other standardization effort: Balancing the need for standards with the need for vendors to differentiate from each other. However, in the open source world, this tension is probably more pronounced due to the speed at which development occurs. I'd say the biggest compromise the open source community makes is understanding the importance of standards, backward compatibility, and all the sorts of things that tend not to be "fun" but which are vital to commercial acceptance--and being committed to doing what needs to be done. On the FSG side, the biggest compromise is being fairly hands off and leaving it to the marketplace to determine which of many alternatives is the best practice. The real key is making sure interoperability problems don't crop up in the process, and the key to making sure that doesn't happen is ensuring all the parties are in a constant dialogue to make sure the right balance is struck.  We see that as one of the roles of the FSG--providing a neutral forum for these kinds of conversations between the key stakeholders.

V.  Looking to the Future

1.  Where else are organizations modeled on the FSG needed?

I wouldn't frame it as "where else is an FSG needed?" but rather as "where should the FSG go from here?" At the end of the day, the LSB is a development platform standard. Some developers target the operating system in C or C++; others target middleware platforms like Java or LAMP; others are moving further up the stack to the web, where applications span site and even organizational boundaries (think of the various "mashups" that are happening around the so-called "Web 2.0" applications like Google Maps). Today, we cover the C/C++ level pretty well, but we need to move up the stack to cover the other development environments as well. The ultimate goal is to provide an open standard developers can target at any layer of the stack that's independent of any single vendor.

So, the short answer is that we aspire to provide a complete open standards based platform (“metaplatform” is actually a more accurate way to describe it), and Linux is obviously just one part of such a platform. We need to move up the stack along with the developers to incorporate the higher level platforms like Java and LAMP. We need to extend the coverage of the operating system platform too, as we've done in LSB 3.1 with the addition of desktop functionality and are doing around printing, multimedia, accessibility, internationalization, and other areas in LSB 3.2. Even at the operating system level, there's nothing inherently Linux specific about the LSB, so there's nothing preventing us from encompassing other open platform operating systems, such as the BSDs or Solaris. In the end, it's about all open platforms vs. closed platforms, where the closed platform du jour is Windows.

So, the real question is, how can the open metaplatform better compete against Windows? For one, Windows has .NET. Linux (and the other open platform operating systems) have Java, but it's not as well integrated, and it's not as well integrated because of the Java licensing. Sun has indicated they're going to open source Java as soon as they address the compatibility concerns. We have a lot of experience in that area, so perhaps we can help. In the end, it all comes down to a strong brand and tying compatibility testing to the use of that brand, which is the approach we take with the LSB. There's no reason a similar approach couldn't work for Java, and the benefit of an integrated Java with the open metaplatform would be enormous.

Obviously, doing all of that is an enormous amount of work, undoubtedly an impossible task for any single organization to accomplish on its own. Then again, so is building a complete operating system, and a lot of little companies (the Linux distribution vendors) managed to do it by taking preexisting pieces and fitting them together into integrated products. And, as it turned out, the whole was a lot more valuable than the sum of its parts.

We take the same approach on a few levels. First of all, the LSB is an open process, so the best way to get something into the standard (assuming it's a best practice, i.e., shipping in the major Linux distributions) is to step up and do the work (i.e., write the  conformance tests, etc.). In other words, we leverage the community the same way an open source software project would. Second, there are a lot of open standards efforts tackling pieces of the overall problem, and we seek to incorporate their work. In that sense, we're essentially an integrator of standards, a hub of sorts, much as the Linux distributors are essentially integrators of technology. We don't have to solve the total problem ourselves, just provide an open framework in which the relevant pieces can be fitted together.

2.  In the long term, should the standardization process and the open source process merge?  In other words, is there a benefit to there being an independent FSG, or in the future would it be better if the open source community incorporated this role into its own work?

Currently, there is no better way to balance the needs of a competitive distribution community with application interoperability. An independent standards provider bridges the gap between the open source community and the distributions implementing their software by allowing best practices of the latter to be standardized, thus making it easier for ISVs and end users to actually use the platform. The open source community does not want to concern itself with standardization, nor should it. An independent consortium can drive consensus while being politically sensitive to the needs of its constituents.

3.  What is the single thing that open source advocates most need to "get" about standards, and need to work harder to accommodate?  Same question in reverse?

It would be helpful if those in some of the upstream projects participated more closely in our standards efforts. They are already doing this, but there is always room for more participation. Close tracking of the upstream projects in the standard (or even just in a database) would provide a great deal of service to ISVs and the distribution vendors; we plan on offering this service.

In the other direction, standards bodies need to recognize that open source development is fundamentally different from traditional software development. When working with the open source community, participation and buy-in are critical—you can't just declare something to be so and expect the open source community to follow suit—as is the ability to move quickly. For the FSG's part, we understand all of this very well—after all, we grew out of the open source community—but it's an observation other standards efforts would do well to keep in mind as open source and open standards increasingly intersect.

VI.  Open Ended

What haven't I asked that I should have to complete this picture?

Andy, I think it is worth noting that Linux is different from Unix or Microsoft Windows, where the brand has a rigorous testing and certification process behind it.  The Linux brand is owned by Linus Torvalds, who does not choose to enforce that brand through any kind of technical compliance mechanism.  This is creating an oncoming crisis for Linux as differing groups seek to define the brand.

The computing industry has made an enormous bet on Linux. By many accounts the total investment in research, support, development and marketing of the Linux brand exceeds many billions of dollars. This investment is spread predominantly over a few “computing giants” who stand to profit from the array of products and services sold in support of the Linux operating system. By all accounts, the rise of Linux has been precipitous, creating wealth for the ecosystem while spreading the risks among those who shoulder the burden of development and marketing.

While there has been an enormous investment in the Linux brand, the unenforceability of the Linux trademark introduces a great amount of risk for those who have made financial bets on the positive impact of the name. As Linux evolves from the world of the data center to the world of the desktop and mid-market, customer expectations will also evolve. Data center customers expect software to require customization to work in their environment. Mid-market, small business and end users, however, have different expectations. They want their Linux application to run on their Linux operating system. If it doesn’t work “as promised” by the industry giants who have created the Linux brand, customers will experience Linux as broken or poorly performing. A negative halo effect will ensue for those vendors who have supported or sold products on that system. Who will benefit the most in this scenario? The makers of the Solaris and Windows operating systems.

Many would say there is a solution. A Linux distribution (or sub-brand) can guarantee that applications run on its version, and thus make those guarantees to customers (and ISVs). As long as the systems vendors and ISVs work with the right sub-brand (distribution), everything will work. While in many ways this has worked so far in the history of Linux, there are associated risks for the Linux ecosystem.

A successful sub-brand subsumes the Linux brand for itself. As long as others in the ecosystem remain committed to the sub-brand, all is well. But as the power of the sub-brand rises, its ability to hold its business partners hostage with unreasonable licensing, business or financial demands does as well. There is a long history in the computing industry of sub-brands (especially software companies) accomplishing this feat.

There is also risk that the sub-brand will evolve up the software stack to compete directly with its business partners. Or, as has been discussed widely in recent media coverage, a large competitor could acquire the sub-brand and change the Linux ecosystem overnight. For an industry with a multi-billion-dollar investment in Linux, this risk is unacceptable. At best this risk is something that should be eliminated, but at a minimum it is something that should be managed -- especially when the cost associated with that risk management is relatively low. A well-supported open standard – the Linux Standard Base – and attendant developer outreach and education are the best way to manage that risk.


Comments? Email:

Copyright 2006 Andrew Updegrove


 

FROM THE STANDARDS BLOG:

LET THE READER BEWARE
AS ODF NEWS COVERAGE INCREASES

Tuesday, May 16 2006 @ 06:20 AM EDT

There have been a number of stories published on-line in recent days that warrant both comment and qualification.  The good news is that more and more journalists are being attracted to the OpenDocument Format (ODF) story, largely because of the increasing credibility of the threat to Microsoft Office that ODF poses.  A measure of the appeal of that story line is the fact that it is beginning to surface in articles appearing in the mainstream press (look for a story in Fortune magazine this week, for example). 

The bad news is that some of these articles have been poorly researched and/or reported.  The result is that more care is now needed when reading the news than was required a short while ago, when only a small number of reporters were covering the story, each of whom had taken the time to acquire a good understanding of what was involved, and had the chronology of events and the facts in focus.  Freelancer John K. Waters and ComputerWorld's Carol Sliwa, in particular, have impressed me with the quality of their coverage. 

In this entry, I'll look at some of the significant news that has broken in the past week, and highlight the ways in which it has, and hasn't, been accurately reported on-line.

Let's start with one of the big news stories that emerged yesterday: the first public, pre-release demonstration by IBM of some of the ODF-supporting features of its new "Hannover" release of Lotus Notes.  The news that the next release of the Notes client would support ODF is not itself new (IBM had announced last January that its Workplace Managed Client (WMC) software would support ODF), but the demo offered a media opportunity to showcase the fact that progress was proceeding apace. 

Before turning to that article, let's see what we might want to know about this story itself.  IBM says that there are currently 125 million users of Lotus Notes.  If the ODF features of the next release of Notes are available to all users (and not just those that buy into the WMC architecture), then ODF support by Notes would be big news indeed, akin to Microsoft bundling its Internet Explorer browser along with Windows.  The reason is that these users won't have to convert to anything new - OpenOffice, or StarOffice, for example - they simply have to upgrade their copy of Notes in the ordinary course.  So - if this is the case, then some very significant percentage of those 125 million people will gain the option of producing documents in an ODF-compliant form.  That would represent not only a powerful value to those users, but also to non-Notes users who know that if they make a switch to OpenOffice or StarOffice (or Notes), they will have greater freedom of document exchange outside of the Microsoft Office environment.  In short, they won't be buying the "first telephone," but will already have lots of people to "call" after they make the switch.

Will that be the case?  It looks like the answer to that question is "yes," based on the IBM press release issued in connection with the demonstration.  As a result, whether or not IBM is successful in selling its WMC concept to the marketplace, it appears that there will be many, many Notes users worldwide that will be capable of saving documents in ODF form in the future.  Given that major users (such as governments) are increasingly concerned with archiving documents that are not formal desktop documents (such as email), this capability is doubly important.  So there is indeed a real story here to be reported.

Now let's look at the first article filed by someone who attended the Deutsche Notes Users Group conference in Karlsruhe, Germany, where the demonstration was given.  You can find it here.  Judging from the press release, he has the facts of the demonstration down accurately, but his story line is decidedly peculiar, painting IBM's support of ODF as "the completion of a nearly decade-long change of heart at IBM, which in 1996 openly eschewed ODF as something nobody really wanted."  At a minimum, that statement is a puzzler, since in 1996 a German company called StarDivision owned the software that would eventually provide the template for ODF, and that software was proprietary.  In short, I'm not aware that ODF was a gleam in anyone's eye at that point, much less dismissed by IBM.  Further, IBM was a long-term member of the OASIS Technical Committee that created the ODF specification.

The article closes with another head shaker:  "If Office and Notes truly do make this a real format war next year, then it's likely that Microsoft's reasons for having joined the OASIS body responsible for drafting OpenDocument, could be called into question."  My only guess on that one is that he is referring to the fact that Microsoft joined the INCITS subcommittee dealing with the submission of ODF to ISO.  But who really knows?

Another major item that drew a lot of attention in ODF circles yesterday was the fact that Gartner released a report stating that it is now "highly unlikely" that ISO will approve the Microsoft Open XML specification now being processed in Ecma. In support of that conclusion, it states:  "ISO will not approve multiple XML document formats (0.7 probability)."  The parenthetical indicates that the authors assign a 70% probability of accuracy to their conclusion.  The entire document comprises only a few paragraphs of conclusions and recommendations, with no supporting analysis. 

While supporters of ODF will be delighted if Gartner proves to be correct with its ISO prediction, I am aware of no reason at this time to conclude so firmly that "ISO will not approve multiple XML document formats."   If you want a detailed analysis instead of a few paragraphs of conclusions, give Steve Walli's carefully considered blog entry on the Gartner assessment a read.

How about this article at TechWorld, from May 11?  It includes the following introductory statement:

Massachusetts, the US state administration leading the charge for open-source document formats, has approved a third-party plug-in that could keep Microsoft Office on its desktops.

Sorry.  All Massachusetts has done so far is to issue a Request for Information (not even a Request For Proposal yet).  It still has to evaluate what comes back and decide which plugin(s), if any, will meet its needs.  [ComputerWorld later got in touch with me to say that TechWorld came up with that line all on its own, and that they would ask TechWorld to remove it (TechWorld later did). Eric Lai's original article in ComputerWorld can be found here.]

And then there's this headline, from an article in Information Week:  Trade Group Blasts Massachusetts Call for Office Plug-In.  The problem with that title is that it just doesn't square well with these excerpts from the Initiative for Software Choice (ISC) press release upon which it is based:

The RFP [sic] reflects a wise use of market dynamics....Importantly, the RFP, as well as a plug-in's purported existence, will allow Massachusetts to meet its stated goals in a cost-effective, market-friendly way….We applaud these and subsequent market-based developments.

Hmmm.  Not very blasty, from my perspective, although in truth the press release does roast Massachusetts for its original ODF decision.  But that would require a different title, wouldn't it?

My final example, and in some ways the most troubling, is an article that appeared recently at FCW.com, called Mass. relaxes open-format mandate, written by Aliya Sternstein.  That article includes the following statements:

Massachusetts is apparently loosening a mandate that all agencies replace Microsoft Office software with products that support open formats just as a major international standards body is endorsing the first open format for archiving government records.... Massachusetts, the state government that had been leading the shift from proprietary to open formats, relaxed its mandate that all agencies replace Microsoft Office by 2007....[State CIO Louis Gutierrez's] statement is a reversal of the state’s earlier stance, several software industry officials said.

Problem is, Massachusetts hasn't "loosened the mandate" at all – it's simply looking into the availability of interim tools that could further ease its transition to an ODF-only environment.  This article could be considered simply a failure to understand the recent RFI issued by Massachusetts to find plugins that might ease the timely transition to ODF-compliant software, but for one fact:  the author actually interviewed Louis Gutierrez, the Massachusetts CIO, who would have made the distinction perfectly clear.

Moreover, the author is also guilty of the faux-objectivity that I decried a few days ago in this blog entry.  In that piece, I wrote:

[S]adly, under today's journalistic styles, press releases not only generate stories, but also the obligation to quote from them in order to be "even handed," no matter how outrageous the statements quoted may be.  The result is that false statements are broadcast around the globe without being contradicted in the story itself.  Some inevitably take root, especially when promoted by large marketing budgets.

It's ironic and unfortunate that what is supposed to be even-handed journalism serves so well to help to spread the Big Lies, and to help them flourish.

There is a particularly egregious example of this practice in the FCW.com story (as well as a number of conclusions that appear off-base).  It goes like this:

“We prefer the marketplace to choose the open-source formats,” said Michael Wendy, a spokesman for the Initiative for Software Choice, a coalition of software companies. “We don’t have anything against open source. Our rub is when you have a government mandate saying, ‘Thou shalt only use open source to meet government procurement needs.’ If these products are truly better, they’re going to win out.”

Last week, Gutierrez said his state’s mandate never excluded proprietary software and was never meant to appear anti-Microsoft.

The fact is that the Massachusetts policy does not now, and never has, mandated the use of only open source software.  So why give credence to this assertion at all?  Instead, the two positions are presented in a "she said, he said, you decide" format, leaving the reader unsure what the facts of the matter are, and therefore as likely to believe the former (false) statement as the latter (accurate) one.

As you can see, the increase in coverage of ODF is not without issues.  The moral of this story, as with any other story in the news, is to Let the Reader Beware.  Consider the sources of what you read, test their conclusions against facts you already know, and give greater credence to those authors who have followed the story for a long time, and who have demonstrated both a willingness to get to the bottom of things and objectivity in their reporting.

In short, if you have been following the ODF saga closely and read something that just sounds wrong, it probably is, unless you've come to trust the source.

For further blog entries on ODF, click here

Bookmark the Standards Blog at http://www.consortiuminfo.org/newsblog/ or
set up an RSS feed at: http://www.consortiuminfo.org/rss/

Comments? Email:

Copyright 2006 Andrew Updegrove



 

CONSIDER THIS:

May 24, 2006

#39 Thinking About Standards Inside of the Box

One of the most interesting things about standards is their power to change the world.  If that seems like an impossible statement to substantiate, consider this: fifty years ago, one man launched a de facto standard that within a few decades became universally adopted, made many of the largest ports in the world obsolete while elevating others to the first rank, decimated the ranks of a powerful union, dramatically increased the speed and economy of cargo handling, allowed ships to triple (or more) in size, and completely transformed virtually every other aspect of global trade as well.  And all of this was accomplished without any governmental action, through voluntary adoption of that standard.

Intrigued?  Here's the story.

From time immemorial, bulk cargoes have been shipped in barrels, boxes and other containers of a size that could be manhandled into position below decks on ships.  Divers today still regularly discover wrecks of Phoenician, Greek and Roman ships laden with amphorae that once held olive oil, wine and other commodities.

Such "break bulk" cargo containers had many shortcomings: they were heavy, required individual handling, and were subject to tampering and accidental damage.  But up until 1956, they remained the state of the art for moving hundreds of millions of tons of cargo around the world each year.

That began to change fifty years ago last month, when Malcolm McLean loaded 58 huge metal containers in Newark, New Jersey on a ship bound for Houston, Texas.  With that first shipment, the age of containerized shipping began, and little about global cargo transport has been the same again.  Here are some representative facts to back that statement up, taken from a recent article in the San Francisco Chronicle:

  • In 1959, the industry average for loading and unloading cargo was just over a half ton per man-hour. By 1976, it had risen to 4,234 tons per man-hour
  • During the same period, the average turnaround time for a ship in port decreased from three weeks to 18 hours
  • In 1950, the average cargo vessel carried 10,000 tons at a speed of 16 knots. After container shipping enabled the efficient handling of larger cargoes, the average had risen to 40,000 tons at a speed of 23 knots.  Today, the average container ship carries 6,600 20-foot containers, and delivers 77,000 tons at a speed of up to 24.8 knots

Ports that embraced the new concept flourished, while many that didn't failed.  For example, San Francisco virtually ceased to be a port, while across the bay Oakland became one of the busiest ports in the world.  New York suffered the same fate, as the Port of New Jersey, endowed with the ample room for expansion that Manhattan lacked, embraced the technology required for fast loading and unloading of containers.

Today, virtually all cargoes are shipped in containers, except for those that are susceptible to one of the other modern techniques that provide equally rapid loading and unloading: pumping, in the case of liquid cargoes like oil and natural gas, and mechanical conveyance, in the case of loose materials, such as gypsum and cement.

One of the beauties of the containerization concept is that it provides something for nearly everyone in the value chain:  manufacturers can load containers securely, packing them to their liking to prevent damage in shipment; rail owners can load them quickly and securely on flatbed rolling stock; ship owners can reduce expensive dockage expenses and downtime in port by orders of magnitude; and customs officials can seal containers at point of departure and confirm that those seals are unbroken at point of arrival.

Of course, as with any transformation, there have been a few losers in addition to the many winners: the ranks of longshoremen have plummeted, as mano-a-barrel cargo handlers gave way to highly trained technicians perched high in the spidery gantries that now dot huge waterfront container facilities.  And the age of the "tramp steamer" also came to an end: no longer could retired college professors and other budget-conscious travelers book one of a handful of rooms on a cargo ship bound to exotic ports, reclining in deck chairs between stops, and knocking about bazaars during the lengthy loading and unloading process.  Today, ships are far too large to bother with such incidental cargo, and in any case, with time in port reduced to only a few hours, there is too little payoff for travelers after spending so much time at sea.

So complete has been the conversion to containerization that even the standard of measurement for cargoes has changed - from the ton to the TEU, an acronym derived from "20 foot equivalent unit" - which is to say a container box 20 feet long, 8 feet wide, and 8 ½ feet high.  The fixing of these dimensions and the other elements of the specification for the standard shipping container is at the heart of the transformation of global shipping, because without these necessary elements the end-to-end shipping system that utilizes containers could not have come into existence.  The beauty of the container, after all, is to dramatically increase the size and weight of the unit that requires handling, and such handling must therefore necessarily be mechanical.  Once mechanical, it should also be as rapid as possible, in order to maximize the potential benefit. 

The result is that a number of sophisticated and expensive mechanical handling mechanisms are required to move these massive shipping units, one for each point in the shipment process, from factory, to truck, to ship, to rail, and so on.  Such mechanisms would be cheaper to build and faster to operate if containers became standardized in every respect – not just dimensionally, but in points of attachment, maximum weight and ability to withstand abuse.  Once these and other attributes became fixed, then the handling devices could in turn be standardized, and mated to the same elements.  The result was the rapid development and deployment of interoperable shipping systems, all based upon these same standardized elements.

All in all, a powerful concept indeed, and one that swept the world with astonishing rapidity, notwithstanding the very substantial capital investments required to convert to container handling.  Two years after McLean commenced containerized shipping in the Atlantic, the Matson Navigation Co. launched similar operations in the Pacific.  The rest, as they say, is history – enough history, in fact, to fill no fewer than three new books occasioned by the fiftieth anniversary of the shipping container: one by Marc Levinson, an economist, called The Box:  How the Shipping Container Made the World Smaller and the World Economy Bigger; one by Joseph Bonney (editor of the shipping industry's Journal of Commerce) and Arthur Donovan, a maritime historian, called The Box that Changed the World; and yet another by Brian J. Cudahy, with a similarly breathless title:  Box Boats: How Container Ships Changed the World.

In short, a dramatic example of the power of standards to incentivize competitors in diverse industries around the world to abruptly change the way they do business, to their benefit and to ours – all by voluntarily agreeing, in their own self-interest, to think inside the same standardized box.

Comments? Email:

Read more Consider This… entries at: http://www.consortiuminfo.org/blog/

Copyright 2006 Andrew Updegrove


 

FEATURED MEETING:

OPEN INVITATION: AN OPEN FORUM
FOR STANDARDS DEVELOPERS

Last month's CSB announced a meeting to be held in New York City on June 20-21, following up on a landmark meeting held last year in Boston between representatives of the consortium and accredited standards development communities.  Like last year's meeting, the upcoming meeting has been organized and is sponsored by the American National Standards Institute (ANSI). 

More details are now available at the ANSI Website, including the full agenda of the meeting and additional information on registration, lodging and attendees.

Representatives of more than 30 consortia and accredited standards development organizations have already registered, as well as representatives of government agencies and major vendors.  There is still a limited amount of space available, so if you plan to attend, be sure to register today.


THE REST OF THE NEWS

For up to date news every day, bookmark the ConsortiumInfo.org
Standards News Section


Or take advantage of our RSS Feed

OpenDocument Update

Gee, thanks for making that all clear:  As you can see from the paired quotes below, emotions are running high in the struggle between the ODF standard, now adopted by ISO, and the Microsoft XML Reference Schema that was contributed to Ecma (the format is now referred to as Open XML), which is also headed for consideration by ISO.  The result has been an ongoing flood of not just opinions, but also disinformation, much of which has been dutifully reported in the press.  All too frequently, even the most egregious and easily exposed disinformation has continued to be repeated in articles in the on-line press, usually with no effort to make it clear what the true story is, and what is purely an attempt to spread FUD (Fear, Uncertainty and Doubt).  In each case below, I've supplied what I understand to be the facts of the matter.  For further (and more egregious) examples, see the Standards Blog entry in this issue, as well as On the Art (?) of Disinformation:  Telling the Big Lie and the several Standards Blog entries that followed over the next two weeks.  Some real news from the past month also follows.

 

It is now unlikely that ISO will adopt Microsoft's Open XML document format [May 15, 2006]

 

Gartner research report, opining that ISO will not approve two standards intended to do the same thing ...Full Story

   
Ecma made all standards for DVD — five competing rewriteable/recordable formats. They all do the same thing [May 6, 2006]
Ecma Secretary General Jan van den Beld...Full Story
   

The straight scoop:   ISO/IEC can and does approve multiple standards to accomplish closely related tasks. If individual members choose to do so, however, they may vote against a standard on the basis that it is duplicative of an existing standard. The balloting on Open XML, when it eventually reaches ISO, will therefore be watched closely.


I understand that Massachusetts is under the gun to migrate, and that this might make it easier to fulfill their mandate [but] I see anything that extends the life of Microsoft Office as problematic [May 11, 2006]

 

Louis Suarez-Potts, community manager for OpenOffice.org on Office - ODF converters...Full Story

   
Yes, I want to see OpenOffice on every desktop. But I think in many ways you are right to say that, yes, we are extending the usefulness of Microsoft Office [May 11, 2006]
OpenDocument Fellowship's Gary Edwards, on its recently announced Office - ODF converter...Full Story
   

The straight scoop:   Just about everyone has an opinion about whether plug-ins that provide easy conversions between Office and ODF-compliant products are good or bad for ODF or Office. Microsoft has consistently stated since late last summer that it expects such tools will be created, indicating that it perhaps thinks they will be more good than bad for Office. The Massachusetts ITD, which has issued an RFI to locate eligible examples, clearly thinks that they will be helpful in transitioning to an all-ODF environment.


Q: Microsoft has said that Massachusetts was out to penalize their products. Are they right?

A: Absolutely not. The state came up with a standard, not a procurement policy. [May 2, 2006]
 

Former MA CIO Peter Quinn, being interviewed by Governing.com's Ellen Perlman...Full Story

   

Q: Could Microsoft conform to the OpenDocument Format if it chose to?

A: Microsoft is not designed to output this format. The policy is very explicitly to exclude Microsoft technology. [May 2, 2006]
  Microsoft's Alan Yates, being interviewed by Governing.com's Ellen Perlman...Full Story
   

The straight scoop:   In a recent (but not yet posted) interview, I'll quote the then Secretary of Finance and Administration of Massachusetts to the effect that Microsoft stated last year that supporting ODF would require a trivial amount of work. At no time has the Massachusetts policy been "anti-Microsoft," although it would be fair to say that Microsoft is "anti-ODF."


Screwed again [May 12, 2006]

 

John Winske, Disability Policy Consortium, reacting last year to news of ODF adoption in MA...Full Story

   

We have to support where our user base is, and like it or not, that's the Microsoft operating system, applications and browsers [May 12, 2006]

  Ben Weiss, CEO at accessibility tools developer Algorithmic Implementations Squared...Full Story
   

The straight scoop:   Accessibility tools come from fairly small and poorly funded independent software vendors, resulting in a "chicken and egg" situation: why build tools that people aren't yet ready to buy? The situation is being addressed through multiple initiatives within ODF-supporting organizations, such as OASIS, and by individual vendors, such as Sun and IBM.


It is highly probable that we will strongly recommend the use of open document formats to public administrations. On the other hand, it is unlikely that we will make a specific recommendation, in case we will have two ISO standards at a later point in time [May 8, 2006]

 

EC IDABC source, on whether the IDABC will still recommend ODF for adoption by EC agencies, now that Microsoft's Open XML will be submitted to ISO

   

We have our standard, now let us use it [May 8, 2006]

  ZDNet.UK Editorial, advising EU member states to get off the pot, and adopt ODF...Full Story
   

The straight scoop:   There has been no reported decision to reverse an announcement made last October by the IDABC to support ODF when it was approved by ISO; the first report is based on an interview with an unnamed source. Bottom line: wait and see on this one.


OpenForum Europe Welcomes Announcement from ISO
Press Release
OpenForum Europe May 6, 2006 OpenForum Europe strongly welcomes the news that ISO has formally voted for Open Document format (ODF) to be recognised as an international standard....OFE calls upon every European National Government to act now to formally adopt ODF as the recognised open standard for data formats and to take practical action to implement this. OFE has in the past been critical of Europe in lagging behind the rest of the world in adoption of ODF, for example the Commonwealth of Massachusetts, and will be seeking assurance from the European Commission on the steps it will now be taking to provide leadership to Member States. ...Full Story


ISO Approves OpenDocument Format
W. David Gardner
TechWeb.com/InformationWeek May 4, 2006 The International Standards Organization (ISO) has approved the OpenDocument Format (ODF), giving a boost to firms and organizations opposing Microsoft's proprietary office software.... "Given the ongoing unhappiness in Europe with Microsoft over what the EU regards as unacceptable bundling and other practices, this may be particularly significant, especially when taken with the desire of many European and other purchasers to use open source products whenever possible," said Andrew Updegrove, a partner in the Boston law firm of Gesmer Updegrove. ...Full Story


Open Source

LT:  [M]ost of it really is about the social side when you go to conferences and you will find people sitting at the same table with laptops and they will send each other emails

CNN: So the face to face thing is a little bit overrated?
LT: I think so
[May 19, 2006]

 

Linus Torvalds, explaining the social side of open source to a CNN interviewer...Full Story

   

Where to begin? Sampling open source news these days is a bit like using an eyedropper to take a sample from a fully-employed fire hose.  As a result, the following sampling is more eclectic than comprehensive.  The first item is a regrettably short (because rare) interview with the legendary and reclusive Linus Torvalds.  The second refers to a seasonal announcement by Sun that yes, it will open source Java – in some yet-to-be-determined way that will prevent forking (Note to Jonathan Schwartz: see this month's Feature Article for the way), while the third indicates that others are more than happy to throw their code into the open pot without the same hesitations.  The final item from this month's news is taken from the UK's irreverent ("Biting the Hand That Feeds IT") TheRegister.com, which suggests that the US is falling behind in something of an Open Source Arms Race (did you know there was one?).


Reclusive Linux founder opens up
Kristie Lu Stout
CNN.com May 20, 2006 Usually media shy, the 36-year-old Finn invited Kristie Lu Stout and the Global Office team into his home for an insight into life at the helm of the operating system that is giving Microsoft some serious headaches. ...Full Story


Sun promises to open source Java (again)
Ryan Paul
ArsTechnica.com May 18, 2006 Sun's executive vice president of software, Richard Green, reiterated the company's intentions to open source Java at the JavaOne conference yesterday in San Francisco. Green claims that, although the company is dedicated to following through on its promise to open the Java source code, it still isn't sure how it will do so....Sun needs to follow through with its promise, or stop talking about it. Without a clear timeline, these perpetual reassurances only reinforce the perception that Sun is still trying to create an image of openness without having to liberate any actual Java source code. ...Full Story


Motorola goes opensource for mobile phones
Richard Wilson
ElectronicsWeekly.com May 17, 2006 Motorola says it wants to encourage greater unification of Java technology within the mobile phone industry. To take a lead the company will open source its Java test framework and sample test cases and will develop the reference implementation and compliance tests for Motorola-lead JSRs such as the Mobile Information Device Profiles (MIDP) 3.0 specification. According to the mobile phone firm, the aim is to “create a more common, open environment for mobile Java platforms through an open source project”. ...Full Story


US in open source backlash
Ashlee Vance
TheRegister.com May 5, 2006 The US has fallen way behind other nations with regard to its embrace of open source technology, and the situation may only get worse. Open source coders face their grandest test to date as organizations place more and more scrutiny on the origins and value of FOSS (free and open source software) products. That's the word that came down today from an august panel here at the World Congress on Information Technology (WCIT). Some members of the panel reckoned that countries in Europe, Asia and South America have a greater appreciation for the open source lifestyle. ...Full Story


China Update

In order to break the monopolization of international standards for mobile storage in China's market, the country begin to research and develop its own for the sector [May 18, 2006]

 

Chen Fangqing, head of China's Mobile Storage Standard Working Group (MSSW)…Full Story

   

Meanwhile, in a country [not really] far away:  In the world of technology, of course, nothing is really far away, which is one reason to pay close attention to what is going on in the world's largest country by population (and perhaps, some day in the future, by economy). This month's news includes another "home grown" standards area, a reminder that open source software development is alive and well in the People's Republic, and finally a comparative indication of how large the stakes are regarding those home-grown standards: one of the standards that China has developed internally to avoid paying huge royalties to foreign vendors is called TD-SCDMA - a 3G telephone standard that would compete with foreign alternatives. As the last item below indicates, for a country with a population many times the size of Korea's, the stakes are huge.


China to Release Part Contents of Mobile Storage Standard
TMC.net May 20, 2006 (SinoCast Via Thomson Dialog NewsEdge)BEIJING--In order to break the monopolization of international standards for mobile storage in China's market, the country begin to research and develop its own for the sector. Now, the work has been fully activated and part contents of the standard are expected to be released in the latter half of this year. ...Full Story


China calls on open-source community for advice
Sumner Lemon
IDG News Service May 28, 2006 China is counting on senior members of the open-source community to help formulate policy ideas to promote open-source software, according to a local software executive. The China Open-Source Software Promotion Union (COPU), a government-backed industry group, has established a think tank comprised of 19 prominent open-source executives from overseas to develop a framework for better international cooperation. ...Full Story


South Korea's CDMA royalties top $2.6 billion
Sean Shim
EE Times.com May 13, 2006 SEOUL, South Korea — Royalty payments by South Korean mobile phone makers using Qualcomm Inc.'s CDMA technology totaled $2.63 billion over the past decade, according to a report by the Ministry of Information and Communication. The report said domestic handset makers paid a total of 3.03 trillion won from 1995 to 2005 to use the wireless phone technology developed and patented by the San Diego-based company. The CDMA standard is the core technology for South Korea’s mobile telecommunications networks. Korean cellphone makers are known to pay royalties totaling 5.25 percent of factory prices for domestic sales and 5.75 percent for overseas shipments. ...Full Story


Semantic and NextGen Web

CERF is built on a state of the art semantic web technology infrastructure that enables extensible management of rich research content [May 8, 2006]

 

Press release of Rescentris, announcing investment by VCs in a Semantic Web based technology...Full Story

   

Just follow the money:  The first and last items below are the usual fare that we've been seeing in the news for years about the Semantic Web: articles about, or interviews with, Tim Berners-Lee, and musings about What it All Means. All of which has lulled many into thinking that the Semantic Web has been sputtering. In fact, it appears that it's beginning to take hold, even in the flinty eyes of venture capitalists. This is the first time that I've included a funding press release, and there's a reason for my breaking with tradition by including the Rescentris notice below: there are a variety of ways to measure uptake. One such measure of credibility is when venture capitalists start to believe in the future of a new technology or technical approach - as they evidently have in this case: the Rescentris platform, according to the press release, "is built on a state of the art semantic web technology infrastructure that enables extensible management of rich research content."


Web Inventor Sees His Brainchild Ready for Big Leap
Lucas van Grinsven
eWeek.com May 21, 2006 AMSTERDAM (Reuters)—The World Wide Web is on the cusp of making its next big leap to become an open environment for collaboration and its inventor said he has not been so optimistic in years. Still, Tim Berners-Lee, the Briton who invented and then gave away the World Wide Web, warns that Internet crime and anti-competitive behavior need to be fought tooth and nail. ...Full Story


Rescentris Raises Series A Financing to Fuel Corporate Growth; Successful ELN for Biology Attracts Oversubscribed Funding Round
Press Release
Business Wire May 8, 2006 COLUMBUS, Ohio--Rescentris, Ltd., a leading provider of electronic laboratory notebook (ELN) software for the life sciences industry, today announced the completion of Series A financing....Rescentris' flagship software product, the Collaborative Electronic Research Framework(TM) (CERF), integrates scientific content management with a full featured electronic lab notebook...CERF is built on a state of the art semantic web technology infrastructure that enables extensible management of rich research content. ...Full Story


Semantically speaking
Davey Winder
PCPro.com May 7, 2006 It's generally accepted that Hanns Oertel in his 'Lectures on the study of language' was the first to clearly distinguish between the formal and the semantic definitions of a word - the latter relates to its significance or meaning. That was back in 1901, and exactly one-hundred years later I first read about 'The Semantic Web' in a Scientific American article that included a notable name among its three authors: Tim Berners-Lee. ...Full Story


Wireless

If you're shopping for Wi-Fi gear, should you care?...I have to join an increasing chorus of pundits saying no [May 2, 2006]

 

PCWorld reviewer Yardena Arar, evaluating pre-approval 802.11n products ...Full Story

   

When the hurly-burly's done:  There's no end of rough and tumble still going on in the wireless space, with many battles yet to be lost and won. What follows are examples of activity of all kinds, in all commercial settings, using multiple standards in various states of development, approval and deployment.


ZIGBEE AND BACNET LINK UP TO EXTEND STANDARDS AND INTEROPERABILITY TO WIRELESS BUILDING CONTROL APPLICATIONS
Press release
HomeToys.com May 18, 2006 San Ramon, Calif. – The ZigBee™ Alliance, a global ecosystem of companies creating wireless solutions for use in home, commercial and industrial applications, today announced a new collaboration with BACnet, a protocol for wired commercial building automation, establishing interoperability between the two technologies. This agreement will allow building operators relying on existing wired BACnet infrastructure to add wireless devices to their existing Building Control systems by using ZigBee technology to increase safety, security and convenience while saving money on utilities. ...Full Story


Wave of commentary delays new WLAN standard
Robert W. Smith
HeiseOnline.de May 18, 2006 The Interim Meeting of the Working Group (WG) 802.11 is currently taking place in Jacksonville, Florida. As has now emerged the last letter ballot of the Task Group N (TGn), which was set up to devise an extension for radio data rates beyond 100 Mbps to the 802.11 family of WLAN standards, has resulted in a failure....in the wake of the ballot TGn received more than 12,000 items of comment on the first draft of 802.11n. According to the IEEE's standardization procedure rules every one of these items of comment now has to be considered carefully. ...Full Story


New RFID Standard Groups at EPCglobal
RFIUpdate.com May 10, 2006 Standards body EPCglobal last week announced the formation of two new standards development working groups: the High Frequency (HF) Air Interface Working Group and the Ultrahigh Frequency (UHF) Air Interface Working Group. The former will focus on creating a Gen2 standard of HF, and the latter will work to extend the existing Gen2 capabilities to support the unique security features required by item-level tagging. RFID Update spoke with EPCglobal director of product management Sue Hutchinson about the new groups. ...Full Story


PC World Analysis: Deconstructing the Draft 802.11n Wi-Fi Hype
Yardena Arar
PCWorld May 3, 2006 You'll soon see a raft of Wi-Fi products on store shelves that say they are compliant with a new draft of the 802.11n wireless standard. Wireless networking is often confusing, and draft standards are even more so. If you're shopping for Wi-Fi gear, should you care?...I have to join an increasing chorus of pundits saying no. ...Full Story


Standards and Your Business

BCM gives business people the choice to think in business terms - not in 'techno-babble' [May 3, 2006]

 

Business process management expert Peter Fingar, commenting on what BCM 1.0 can do for you...Full Story

   

We need to talk:  Technology is great, but not always as people-friendly as many would wish. The following two items report on how standards can help bridge the gap at work, and between businesses and their customers.


VoiceXML 2.1 boosts functionality
Sanjeev Sawai
Network World.com May 12, 2006 VoiceXML is quickly becoming the standard language used for developing interactive voice response and speech-enabled self-service applications. Applications that were previously deployed only on the Web are now easily made available via the phone, giving customers a consistent, convenient method for interacting with retailers, banks and utility providers via the Web or telephone. The latest version, VoiceXML 2.1, takes a significant step toward improving the responsiveness and adaptability of speech-enabled approaches. This can be the difference between customers who are happy with a company's speech-enabled self-service options and those who take their business elsewhere. ...Full Story


Business-Centric Methodology (BCM) Ratified as OASIS Standard
Press Release
OASIS.org May 5, 2006 Boston, MA, USA - The OASIS international standards consortium today announced that its members have approved the Business-Centric Methodology (BCM) version 1.0 as an OASIS Standard, a status that signifies the highest level of ratification. BCM is a set of layered methods for acquiring interoperable e-business information within communities of interest [that enables]...organizations to identify and exploit business success factors in a technology-neutral manner, based on open standards...."BCM gives business people the choice to think in business terms - not in 'techno-babble'," said Peter Fingar, industry expert on business process management and author of the newly released book, Extreme Competition. ...Full Story


Standards and Society

There should be no secret RFID tags or readers [May 2, 2006]

 

Center for Democracy and Technology Working Group Report on RFID privacy...Full Story

   

First of all, do no harm:  That admonition has been a prime medical directive for millennia, and it's one that could well be remembered in multiple disciplines. As noted below, advances in RFID technology need to be mindful of privacy risks, while technology innovations that are not adapted to the needs of the disabled can serve to leave the societally and economically disadvantaged even farther behind. In each case, standards are part of the answer.


IT Vendors, Privacy Groups Release RFID Standard
CIO.com May 3, 2006 A set of best practices designed to help assuage consumers’ concerns about radio frequency identification (RFID) tags was released on Monday by a group of technology vendors, RFID users and consumer groups.... "There should be no secret RFID tags or readers," the report says. "Use of RFID technology should be as transparent as possible, and consumers should know about the implementation and use of any RFID technology ... as they engage in any transaction that utilizes an RFID system. At the same time, it is important to recognize that notice alone does not mitigate all concerns about privacy." ...Full Story


Last Call: Web Content Accessibility Guidelines (WCAG) 2.0
W3C.org May 2, 2006 Web Content Accessibility Guidelines 2.0 (WCAG 2.0) covers a wide range of issues and recommendations for making Web content more accessible. This document contains principles, guidelines, and success criteria that define and explain the requirements for making Web-based information and applications accessible. "Accessible" means usable to a wide range of people with disabilities, including blindness and low vision, deafness and hearing loss, learning difficulties, cognitive limitations, limited movement, speech difficulties, photosensitivity and combinations of these. ...Full Story


Deadline Looms for Hearing Aid Compatibility
Carrie Printz
Wireless Week May 1, 2006 Most wireless carriers and manufacturers are confident they can meet the upcoming deadlines for new FCC requirements on hearing aid compatibility, including those dealing with both RF emissions and telecoil coupling....Its members include representatives from the wireless and hearing aid industries as well as the hearing-disabled community. The group has been working on revisions to an American National Standards Institute (ANSI) standard used to gauge hearing aid compatibility. ...Full Story


New Consortia

The OMC group seems set on using "conversation" as its key mechanism, which is a very open source thing to do, but we wonder how far that will carry them [May 11, 2006]

 

Ashlee Vance, writing in TheRegister.com about the newly formed Open Management Consortium...Full Story

   

It's springtime, and everything's coming up consortia:  There's a bumper crop of new organizations this month, covering a wide and diverse range of technical areas, constituencies and techniques – and an equally interesting diversity of strategic aims and game plans.


OpenAjax Alliance formally opens for business
Tony Baer
CBROnline.com May 21, 2006 OpenAjax has renamed itself the OpenAjax Alliance and has finalized its first roadmap, according to members of the group at the JavaOne developers' conference in San Francisco. The group has decided that it will not be a standards body, but will instead operate in a manner similar to WS-I, the Web Services Interoperability Organization. ...Full Story


AIIM Announces the Formation of the Interoperable Enterprise Content Management (iECM) Consortium
Press Release
AIIM May 16, 2006 AIIM -- The Enterprise Content Management (ECM) standards developer, announces the creation of the iECM Consortium based on the work of individuals from over 50 major companies and government agencies. Based on work done over the past year, the formation of the iECM Consortium provides the foundation for the development of the standards needed for the creation of interoperable ECM systems....The iECM concept identifies three major areas that must be standardized in order to achieve interoperability: Services, Information/Metadata Models, and Component Descriptions. ...Full Story


Open source gang forms to battle IBM, BMC and CA
Ashlee Vance
TheRegister.com May 11, 2006 A group of open source software vendors have teamed up with the hopes of ending the systems management dominance enjoyed by the likes of IBM, HP, BMC and CA. The companies today have formed the Open Management Consortium (OMC), hoping to get out the message that open source management tools have matured to the point where they can compete with proprietary packages. The names of the OMC founding members could have come straight out of J.R.R. Tolkien's brain. ...Full Story


Intel, Hynix, Micron, Sony form NAND group
Mark LaPedus
EETimes.com May 10, 2006 SAN JOSE, Calif. — Seeking to accelerate the time-to-market for NAND-based flash memories in the marketplace, Hynix, Intel, Micron, Phison and Sony are among the founding companies that on Tuesday (May 9) announced the formation of a new and long-awaited working group in the arena. The organization — dubbed the Open NAND Flash Interface (ONFI) Working Group — appears to be missing some key vendors. NAND flash leaders Samsung Electronics Co. Ltd. of South Korea and Toshiba Corp. of Japan were conspicuously absent in the formation of the group. ...Full Story


New group to standardize digital music data
Candace Lombardi
ZDNet.com May 8, 2006 The purpose of DDEX (pronounced "dee-dex") is to establish standards for the meta-data in digital music files, mainly for sales and rights-tracking purposes. Founding members of the consortium include artists' rights group ASCAP, Sony BMG Music Entertainment, Warner Music Group, Universal Music Group and EMI Music, as well as several other music rights societies and agencies from the U.S., U.K. and Europe. Apple Computer, Microsoft and RealNetworks, which all provide digital music services, are also charter members. ...Full Story


SOA Vendors Link For Interoperability
Clint Boulton
InternetNews.com May 2, 2006 Companies that tout distributed computing banded together today with a plan to make their software work together.... JBoss, Infravio, AmberPoint and several other vendors have created SOA Link to promote the interoperability of their various service-oriented architecture (SOA) products, software that allows Web services to communicate with one another. Participants will jointly develop integration at the data, control and user interface to allow products for SOA governance to interoperate.... Software developed or adjusted under the aegis of SOA Link could include policy repositories and authoring systems, run-time enforcement systems, or business process utilities. ...Full Story


New Initiatives

And this is just the first draft [May 18, 2006]

 

Microsoft spokesperson, commenting on the fact that Open XML has already doubled in length (to over 4,000 pages)...Full Story

   

…and new initiatives: The April showers have also brought an equally varied number of new initiatives, launched for a wide variety of purposes in many different venues. The following is only a sampling.


W3C Launches WebCGM Working Group
Staff
W3C.org May 21, 2006 W3C has announced the launch of a Web CGM Working Group with Lofton Henderson as the Working Group Chair. Computer Graphics Metafile, or CGM, is an ISO standard for tree-structured, binary graphics format that has been adopted especially by the technical industries (defense, aviation, transportation, etc) for technical illustration in electronic documents. The new Working group is chartered to develop a W3C Recommendation for WebCGM 2.0, starting with the WebCGM 2.0 Submission from OASIS. ...Full Story


ISA-SP100 Committee Announces Formation of Working Groups
Press Release
TMC.net May 20, 2006 RESEARCH TRIANGLE PARK, N.C. --(Business Wire)-- May 18, 2006 -- At its recent committee meeting, ISA's Wireless Systems for Automation standards committee (ISA-SP100) agreed to form two new standards working groups, SP100.14 and SP100.11. The SP100.14 working group will define wireless connectivity standards optimized for the unique performance and cost needs of a wide range of industrial monitoring, logging and alerting applications. The SP100.11 work group will define wireless connectivity standards addressing a wide range of applications optimized but not restricted to the unique performance needs of control applications ranging from closed loop regulatory control through open loop manual control. ...Full Story


Bluetooth Working Group Tackling Healthcare Interoperability
Jim Barthold
TelecommunicationsOnline.com May 17, 2006 The increasing number of healthcare devices using Bluetooth wireless transmission technology has led to the creation of the Medical Devices Working Group within the Bluetooth Special Interest Group to specifically address interoperability issues. The new working group will draft interoperability specifications this year and make a new profile available for devices in early 2007 that will run on the upcoming Bluetooth Lisbon (2.1) release and future high-speed versions of the wireless technology that could also include devices compatible with the Ultra-Wideband (UWB) transmission scheme. ...Full Story

spacer

Dell, HP and Lenovo Announce Joint Support for DisplayPort
Tuan Nguyen
DailyTech.com May 7, 2006 DVI-I, DVI-D, UDI, HDMI -- a confusing group of abbreviation for many. Interestingly, all of them do similar things and the two later ones attempt to address the same issues including backwards compatibility while being different themselves. As far as standards go, computer and digital displays have pretty much been using one big standard, DVI. However, industry supporters say that connectivity is too confusing, and in fact, will now launch a newer standard, called DisplayPort. DisplayPort, designed by the VESA group, attempts to do one thing: unify digital display connection interfaces. ...Full Story


WS-I Announces New Profile Work for 2006; Web Services Interoperability Organization Initiates Work on Three New Profiles: Basic Profile 1.2, Basic Profile 2.0 and Reliable Secure Profile 1.0
Press Release
WS-I.org May 2, 2006 WAKEFIELD, Mass.--(BUSINESS WIRE)--May 1, 2006--The Web Services Interoperability Organization (WS-I) today announced that the WS-I Board of Directors has approved two new working group charters, which will result in the development of three new WS-I profiles in 2006: the Basic Profile 1.2, Basic Profile 2.0 and the Reliable Secure Profile 1.0. WS-I is a global industry organization that promotes consistent and reliable interoperability among Web services across platforms, applications and programming languages. More information about WS-I can be found at www.ws-i.org. ...Full Story


New Standards

"In general American enterprises don't care all that much about whether something is the de jure standard. They do care whether something is the de facto standard." [May 10, 2006]

Gordon Haff, senior analyst at Nashua, N.H.-based Illuminata Inc. ...Full Story

Meanwhile, at the other end of the funnel: work completed this month includes brand-new technical challenges – such as Web content delivered over mobile devices and standards to augment emergency response; maturing efforts – such as Web services; and standards from what now seems like another era – like a revised version of the standard for the venerable Ada programming language.


W3C Dials up mobile web improvements
Robert Jaques
VNUNet.com May 18, 2006 The World Wide Web Consortium has introduced the first draft of its Device Independent Authoring Language (DIAL), designed to improve mobile content authoring. The organisation explained that, with thousands of multifunction mobile devices in use today, people have come to expect the same quality of information on the move that they find on desktop PCs. This diversity poses "significant challenges" to web designers and mobile operators required to create content for upwards of 2,500 different kinds of mobile device. ...Full Story


Web Services Addressing 1.0 is Now a W3C Recommendation
Press Release
W3C.org May 11, 2006 The World Wide Web Consortium (W3C) announced today that Web Services Addressing 1.0 - consisting of the Core specification and the SOAP Binding - is a W3C Recommendation. Industry now has a reliable, proven interoperable standard to address Web services messages. "Web Services Addressing 1.0 provides a mechanism to developers on how to address objects for Web services applications," explained Philippe Le Hegaret, W3C Architecture Domain Leader. "It extends the capabilities of Web services by enabling asynchronous message exchanges, and allowing more than two services to interact." ...Full Story
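
To make that mechanism concrete, here is a minimal sketch in Python (not taken from the Recommendation itself) that assembles a SOAP 1.2 envelope carrying WS-Addressing 1.0 headers. The service and callback URLs and the Action URI are invented for illustration; the wsa:* header names and the namespace URIs are those defined by the standard.

    # Minimal WS-Addressing 1.0 sketch: the endpoint URLs and Action
    # URI are hypothetical; header names and namespaces come from the
    # W3C Recommendation.
    import uuid
    import xml.etree.ElementTree as ET

    SOAP = "http://www.w3.org/2003/05/soap-envelope"
    WSA = "http://www.w3.org/2005/08/addressing"
    ET.register_namespace("soap", SOAP)
    ET.register_namespace("wsa", WSA)

    envelope = ET.Element("{%s}Envelope" % SOAP)
    header = ET.SubElement(envelope, "{%s}Header" % SOAP)

    # wsa:To names the destination in the message itself, rather than
    # leaving it implicit in the transport connection.
    ET.SubElement(header, "{%s}To" % WSA).text = "http://example.org/quotes"

    # wsa:ReplyTo can point at any endpoint, which is what permits
    # asynchronous replies and interactions among more than two services.
    reply_to = ET.SubElement(header, "{%s}ReplyTo" % WSA)
    ET.SubElement(reply_to, "{%s}Address" % WSA).text = (
        "http://example.org/client/callback")

    # wsa:MessageID lets the eventual reply refer back to this request
    # via wsa:RelatesTo.
    ET.SubElement(header, "{%s}MessageID" % WSA).text = (
        "urn:uuid:%s" % uuid.uuid4())
    ET.SubElement(header, "{%s}Action" % WSA).text = (
        "http://example.org/GetQuote")

    ET.SubElement(envelope, "{%s}Body" % SOAP)  # payload omitted here
    print(ET.tostring(envelope, encoding="unicode"))

Because the reply address travels inside the message, the response can arrive later, over a different connection, or at a third service entirely; that is the asynchronous, multi-party messaging the Recommendation enables.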


EDXL-DE Becomes Newest OASIS Standard
OASIS May 4, 2006 Consortium members voted to ratify the Emergency Data Exchange Language Distribution Element (EDXL-DE), v1.0 as an OASIS Standard. EDXL-DE provides an integrated framework that enables information exchange to advance incident preparedness and response to emergency situations. Congratulations to members of the OASIS Emergency Management TC, and thanks to all who participated in the review and balloting process. ...Full Story
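
For a sense of what the standard looks like in practice, the following is a minimal sketch in Python of the EDXL-DE routing wrapper. The distribution ID, sender, and payload description are invented; the element names and namespace URN follow the OASIS EDXL-DE 1.0 specification as published.

    # Minimal EDXL-DE 1.0 sketch: the ID, sender, and description are
    # hypothetical; element names and the namespace URN follow the
    # OASIS EDXL-DE 1.0 specification.
    from datetime import datetime, timezone
    import xml.etree.ElementTree as ET

    DE_NS = "urn:oasis:names:tc:emergency:EDXL:DE:1.0"
    ET.register_namespace("", DE_NS)

    dist = ET.Element("{%s}EDXLDistribution" % DE_NS)
    ET.SubElement(dist, "{%s}distributionID" % DE_NS).text = "example-2006-0001"
    ET.SubElement(dist, "{%s}senderID" % DE_NS).text = "dispatch@example.org"
    ET.SubElement(dist, "{%s}dateTimeSent" % DE_NS).text = (
        datetime.now(timezone.utc).isoformat())
    ET.SubElement(dist, "{%s}distributionStatus" % DE_NS).text = "Test"
    ET.SubElement(dist, "{%s}distributionType" % DE_NS).text = "Report"

    # The actual emergency payload (a CAP alert, for example) rides
    # inside a contentObject; routers need only read the wrapper.
    content = ET.SubElement(dist, "{%s}contentObject" % DE_NS)
    ET.SubElement(content, "{%s}contentDescription" % DE_NS).text = (
        "Situation report (payload omitted in this sketch)")

    print(ET.tostring(dist, encoding="unicode"))

The design point is that routing and handling information lives in a uniform wrapper, so intermediaries can forward a message by status, type, or recipient without having to parse whatever payload format rides inside.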


Ada 2005 Standard Receives Technical Approval; Formal Standardization by International Organization for Standardization (ISO) Anticipated Soon
Press Release
Business Wire May 3, 2006 SALT LAKE CITY--Today the Ada Resource Association announced the accomplishment of a major milestone in the development of the new Ada ISO standard. ISO's Ada Working Group (WG 9) has unanimously accepted the proposed amendment to the language and has forwarded it to the parent organization for an official ballot. Formal approval by ISO is expected some time later this year. The new amendment to the language, commonly referred to as Ada 2005, culminates a collaborative international effort to enhance the 1995 version of the Ada language. ...Full Story
