Consortiuminfo.org Consortium Standards Bulletin - July 2006
July 2006
Vol V, No. 7

CERTIFICATION AND BRANDING

EDITOR'S NOTE: CERTIFICATION AND YOU
Whether you know it or not, you're not only surrounded by standards, but by certified standards-compliant products as well.  Which is as it should be.
 
EDITORIAL: THE POWER OF CERTIFICATION
More and more standards of all types (technical, professional, ethical and environmental) are supported by voluntary certification programs. These programs not only provide a nimble and cost-effective alternative to government regulation, but also offer an increasingly important means to confront global challenges such as global warming, environmental degradation, and the sustainable use of renewable resources.
   
FEATURE ARTICLE:

STANDARDS COMPLIANCE CERTIFICATION AND BRANDING IN THE INFORMATION AND COMMUNICATIONS TECHNOLOGY SECTOR

The ICT sector is particularly dependent on achieving interoperability through compliance with appropriate standards, and on maintaining end-user trust in compliance.  But developing robust compliance tests is expensive, and the number of products to test is usually too small to permit third party certification companies to recover their development costs.  The result has been the evolution of a range of situation-specific, variably rigorous alternatives to meet the need.
   
STANDARDS BLOG: THE MICROSOFT CONVERTER, NEWS SHOPPING AND TECTONIC SHIFTS
Microsoft has long refused to support ODF in its Office productivity suite, but this month it announced that it would support an open source project to develop an Office to ODF converter. Just about everyone has an opinion about what it all means. Including me.
   
CONSIDER THIS: LIVE'N THE WIFI LIFESTYLE: THE IPOD BOWS TO THE ROUTER

Would it surprise you to learn that 8 out of 10 Americans would give up their iPod before they would sacrifice their WiFi router? It shouldn't. The iPod/iTunes system is proprietary and limited to what Apple wants to give you. But the WiFi standard is open, and is being implemented everywhere, by everyone, and on every device imaginable. The result? We expect Internet access everywhere, all the time, and we'd even give up our iPods to have it.

 
NEWS SHORTS:  
Everyone Has Their Own Theory on the Microsoft ODF Converter; Defense Department Calls for Greater Use of Open Source and Open Systems; EC Says No (Again) to Software Patents; Lucky Us - HD-DVD & Blu-Ray are Here; Time for a Terrorist Target Markup Language? Senator Stevens' Excellent Internet Adventure; and, as always, much more.



 

EDITOR'S NOTE:

CERTIFICATION AND YOU

If you're reading this issue online, there's a fair chance that you're viewing it on the screen of a WiFi enabled laptop.  And as someone who is interested in standards, I expect that you're well aware that WiFi is a standard.  But you might not know that the chipset that your laptop is using to access your wireless router (or the one at Starbucks, as the case may be) has almost certainly been certified by the WiFi Alliance, even though this fact is what underlies your assumption that you will be able to log on to another router almost everywhere you go.

The process of certification – testing and verifying compliance with a standard – is vital to the credibility and utility of standards, although we are typically not aware of how pervasive such programs in fact are.  As a simple proof of that observation, glance at the bottom of your laptop's power source, or at the label on the bottom of the laptop itself, and you will find a scattershot pattern of small and inscrutable seals – each of which is the safety-related certification mark of a separate testing body (I count 21 on my Dell power source).

Our topic this month, appropriately enough, is therefore the important and perhaps underappreciated role that certification, and the process of building brand awareness in some certification programs, plays in the world of standards.

In my Editorial for July, I introduce the topic by highlighting both the breadth and the adaptability of standards and certification programs, as well as the important role that these tools are playing in addressing new and intimidating environmental challenges, such as global warming and ensuring the sustainable use of natural resources.

In this month's Feature Article, I provide an overview of the techniques and role of certification and branding in the information and communications technology (ICT) sector, an area in which the resources needed to create robust, third party-administered testing tools have often been lacking, requiring the creation of a spectrum of techniques that are not common in many other industry sectors.

As the Standards Blog entry for this month, I've selected a posting on a topic that I have covered in great detail for a year now: the expanding adoption of the OpenDocument Format (ODF) and the adaptation of the marketplace to a multiple document format environment.  This entry focuses on an initiative announced by Microsoft early this month to fund and support the creation of a converter to facilitate the translation of documents created using its own format into documents that can be opened using ODF compliant software.

My Consider This essay returns to the certification theme, contrasting the adoption of a standardized, certified product family – WiFi enabled devices and routers – with another popular but proprietary high tech offering: the Apple iPod and iTunes environment.  Each of these successful systems has generated a "lifestyle" impact, but the open, standards-based wireless lifestyle that is supported by the ingenuity (and marketing budgets) of hundreds of companies is exploding at a rate an order of magnitude greater than that of its proprietary analogue in the music world.

As usual, the issue ends with The Rest of the News, being a selection of what I thought were the most interesting and important stories of the last month, accompanied by a few observations on why I found them to be of interest.

A final note:  The CSB is issued ten times a year, with August and December being my months to catch my breath, and spend some time on other interests.  As a result, I'll visit with you next in September.

As always, I hope you enjoy this issue. 
   
Andrew Updegrove
Editor and Publisher
2005 ANSI President’s
Award for Journalism
 
     
The complete series of Consortium Standards Bulletins is available on-line at http://www.consortiuminfo.org/bulletins/.  It can also be found in libraries around the world as part of the EBSCO Publishing bibliographic and research databases.

 

EDITORIAL

THE POWER OF CERTIFICATION

Andrew Updegrove

The practice of certifying compliance with standards is almost as old as the creation of standards themselves.  This should come as no surprise, because the vast majority of standards are created for the benefit of multiple stakeholders, rather than as pure design tools for vendors.  Human nature (on both sides of a transaction) being what it is, there is no reason for a vendor to expect a customer to believe an uncorroborated assertion of compliance.  But third party certification can provide a means to fill this gap in trust.

In consequence, the testing and certification of compliance with all manner of requirements, both mandatory under laws and voluntary under many thousands of consensus-based standards, has been an ever-expanding and adaptable practice since the nineteenth century.  The result is that we enjoy a world today that is more commercially trustworthy than that which existed only a short time ago.

It is interesting to note in this regard that an increasing number of the certifications that benefit us today fall into the voluntary rather than the mandatory category.  True, compliance with thousands upon thousands of health and safety-related standards has been incorporated into governmental regulations.  And those regulations continue to be policed by armies of local building inspectors and federal employees of agencies such as the U.S. FDA and OSHA.  But many thousands more standards are ones that vendors, service providers and professionals voluntarily comply with in order to increase the commercial attractiveness of their goods and services.

Why should vendors and service providers not only constrain their design freedom and professional conduct to standards, but also pay others to confirm such compliance?  The exact reasons vary, but most come down to assuring consumers that their purchasing expectations will be met, and/or that those expectations may be justifiably higher with respect to certified than non-certified alternatives.  In practice, these expectations can relate to interoperability (yes, this will plug and play with that), safety (I see the Underwriters Laboratories seal), training, professionalism and trustworthiness (professional certifications of all types) and compliance with ethical, environmental or other societal values (this vendor engages in fair practices, both at home and abroad).

The mere existence of certification programs arguably raises the bar even for those that do not choose to participate.  The reason is that consumer expectations can rise in response to the promotional campaigns that are often launched to support a certification program.  If the consumer comes to associate value with a certified product or service, all competitors are put to the challenge of justifying the value proposition of their own offerings, whether through lower prices, superior service, or simply a larger marketing budget.  Ultimately, it may become a less expensive and more certain alternative for a non-participating vendor to simply meet the same tests that those who certify have met, rather than to seek to persuade the buying public that an unknown quantity provides a superior alternative – especially if the certified product is now commanding a premium price.

Such market-based self-regulation can provide a very attractive alternative to government regulation, avoiding the greater bureaucracy, waste, and expense that might otherwise be brought to bear to address the same issues.  Even assuming parity of process and efficiency in public and private endeavors, it is difficult to imagine the degree to which government payrolls would need to expand, were the public sector to assume responsibility for assuring compliance with the hundreds of thousands of voluntary consensus standards in existence today.

Recently, broad awareness of the threats presented by global warming and dependency on foreign energy sources has risen dramatically in many countries (including, finally, even the United States).  But the political will to address these challenges effectively is still often weak (especially in the United States), despite the fact that public opinion is swinging in favor of responsible action.

In the face of growing consumer interest in environmentally and ethically responsible conduct by industry, a variety of private sector organizations have been launched.  Such private sector initiatives can be more nimble and responsive, and less likely to be subverted by special interests, than efforts to achieve the same ends through the legislative process.  This has proven to be true not only in the case of initiatives launched by "green" advocates, but also by major companies in some extractive industries that have (in the words of the author Jared Diamond) grown concerned that their "social license" to operate may be revoked if they do not (literally) clean up their acts. 

Some of the efforts that are in process now have adapted traditional standard setting and certification concepts to address important new global needs.  A splendid example (and there are others) is the Forest Stewardship Council (FSC), which is headquartered in Germany, and has offices in over 40 nations.  This organization has created rigorous standards for the sustainable harvesting of timberland, and certifies independent inspection companies that forest owners can hire to assess their compliance in the field with FSC standards.  Only after compliance with these standards has been confirmed can the FSC certification mark be applied to raw lumber, and to those finished goods created from materials that can be tracked back through the supply chain to a certified source.  Certified timberlands remain subject to annual, unannounced inspections to assure continuing compliance – all at the cost of the for-profit, not-for-profit, and governmental forest owners that seek certification.

The validity, value and extensibility of the concept of standards are amply demonstrated by such new and innovative efforts.  Today, we are faced with ever more daunting challenges, such as global warming, dwindling natural resources and an increasingly ravaged environment.  If there is any reason to hope that we will be able to cope with these crises, it may lie in our ability to create such new kinds of voluntary consensus standards – and in the deployment of effective certification programs to back them up.


Copyright 2006 Andrew Updegrove


 

FEATURE ARTICLE

STANDARDS COMPLIANCE CERTIFICATION AND BRANDING
IN THE INFORMATION AND COMMUNICATIONS TECHNOLOGY SECTOR

Andrew Updegrove

Abstract:  While a standard can provide value to a vendor through facilitating the design and production process, its greatest benefit arises when multiple stakeholders are made aware that a product or service complies with that standard.  In order for such a benefit to be secured, however, the assertion of compliance must be trusted, and that trust must be validated by actual performance in the marketplace.  In some circumstances, awareness of compliance is needed only on a business-to-business basis, while in others consumers must be made aware of – or by experience find that they can take for granted – the fact that compliance goals have been achieved.  However, the creation of tests to demonstrate compliance, and the performance of such tests, can be expensive, and not all standard setting situations generate the desire, investment and infrastructure needed to fund neutral third party testing and certification.  This is particularly true in the information and communications technology (ICT) industry, in which interoperability among the products of diverse manufacturers is nonetheless an essential requirement.  As a result, a variety of techniques have evolved in the trenches to address this need in a situation-specific manner, from self-assertion of compliance with standards, to industry-wide certification programs that support expensive consumer brand-awareness building campaigns.  This article surveys the principal certification and branding needs, realities and practices that can be found in the ICT industry today.

Introduction: Standards have value to many constituencies, but their most obvious beneficiaries are those that utilize standards in the production of goods and the delivery of services, and those that consume those deliverables.  While vendors often benefit from using standards simply as production tools (e.g., to achieve interoperability between products from the same vendor), the greatest value of a standard to vendors and consumers alike can arise from simply knowing that a product or service complies with a standard. 

For example, performance standards (e.g., how many watts of energy a light bulb will draw, and how many lumens of light it will produce) permit vendors to provide products that customers can easily evaluate, and allow customers to compare prices between competing products.  Similarly, vendors increasingly adopt and implement interoperability standards that allow their products to access networks of all types in order to make those products more useful and desirable, and customers rely on built-in "plug and play" interoperability in order to mix and match components of everything from music systems to home wireless networks.

In order for such benefits to exist, however, customers need to be able to rely on, and therefore trust, the fact that a product that purports to comply with a standard actually does.  Such trust can be based upon any of a number of means, including vendor assertions, if the vendor has earned a reputation for trustworthiness, or on third party testing and confirmation of compliance.  With respect to that subset of standards that is created by governments (laws and regulations), the assurance of compliance may result from government inspections, licensures and enforcement.

Private sector assertions of compliance are often loosely referred to as "certification," in the sense that someone (whether the vendor or a third party) is promising that the good or service complies with the standard.  More properly, the word "certification" is usually used only where a neutral third party is providing the assurance of conformance.  Regardless of who is making the guarantee, however, the value is roughly the same, if the claim of compliance is accepted and relied upon in the marketplace.

That value can be augmented by focusing customer attention on the benefits of purchasing products that comply with a standard.  Such building of "brand awareness" in a standard can be just as useful as building customer awareness of an individual vendor's trademarked products.  Moreover, the costs of building brand awareness in a standard can be shared among many vendors, thus lowering the per-vendor cost of a promotional campaign by leveraging the efforts of the many campaign participants.

In this article, I will survey the principal means by which information and communications technology (ICT) industry compliance testing tools are created, the most common types of programs employed to perform and certify successful passage of compliance tests, and the ways that vendors build compliance brand awareness through the promotion of certification programs.

I.  Overview

Why test for compliance?  As with so many other aspects of standards and standard setting, the concept and practice of certification extends back into the dim reaches of antiquity. The first known examples of certification relate to weights and measures, as evidenced by metal ingots stamped with royal seals that attest to purity and weight. The evolution of coinage systems in many societies was a manifestation of the same certification concept, using the impressed (and sometimes idealized) likeness of a ruler on each coin to attest to the exact value (also sometimes idealized) of the precious metal comprising the coin.

Certification of compliance with standards relating to safety, on the other hand, has roots in the private as well as the public sector.  For example, the development of standards to ensure the safe design and building of steam boilers arose not from a government effort to prevent boiler explosions, but from a private vendor initiative launched to reassure both the public and insurance underwriters that installing boilers would not lead to disaster.  Over time, government regulators came to incorporate by reference the fruits of such private initiatives into the regulations they create in an effort to maintain public safety.

By 1984, for example, voluntary compliance standards created by the American Society of Mechanical Engineers (ASME) to ensure that heat sources would automatically shut down before boilers could run dry (and sometimes explode) had been adopted into law by 46 states and all ten Canadian provinces. 1    And in the private sector, the ubiquitous Underwriters Laboratories "UL in a circle" mark (and its many related marks) is well recognized by U.S. consumers as a trusted indication that products bearing the mark have been designed to criteria that the UL believes to be conducive to safe usage. 2

In more recent years, certifications of all types have become omnipresent – attesting to the weight, quality, purity, safety and other significant attributes of goods as diverse as building materials, drugs, foods, appliances, elevators, services of all types, and more recently, advanced technology products and sustainable forest management. These certifications attest to compliance with the standards promulgated by a wide variety of bodies – Federal, State, and more recently, regional (e.g., the European Union) governments and agencies, safety testing organizations, accredited private sector standard setting organizations, and unaccredited consortia, trade associations, environmental foundations, and other fora. Within some of these broad categories there may be hundreds of individual standard setting bodies, some of which develop and maintain many, and even thousands, of standards. One site, http://www.nssn.org, tracks the status of some 270,000 current standards worldwide.

Standards themselves can be of several types, permitting varying ways to comply, as well as different processes to verify compliance.  For example, performance standards define required outcomes, but not the design elements required to achieve those outcomes.  As a result, they permit a vendor to design a product using a variety of techniques (patented and otherwise), so long as the resulting product meets the established performance measures.  The techniques used to certify compliance with such standards must therefore accommodate the different types of product designs so utilized.

Products built to design standards, on the other hand, must conform to more detailed and exacting specifications, so that all electrical plugs of a given type (for example) will fit into all electrical sockets intended to accommodate them. Compliance testing techniques for this type of standard can therefore be as simple as measurements of physical dimensions.  A given standard can incorporate both performance and design elements as well as diverse criteria, including the composition of component materials, physical dimensions, minimum outputs and maximum tolerances.

Interoperability standards, from the compliance testing point of view, can be another type of amalgam, in that design elements are specified, but their compliance (as in software) may need to be inferred from performance tests that prove or disprove success in achieving compliance.

Why brand?  Compliance testing of products is very widely used by vendors to ensure that their products will perform as expected, will meet regulatory requirements, and/or will be safe to use.  Certification of compliance also bears an important role in international trade, where the importation of products may be preconditioned on proof of compliance with applicable standards. 3   This extra effort taken to demonstrate compliance with standards is usually invisible and unknown to purchasers, or taken for granted by consumers in the case of safety standards in well-regulated societies. 

The reason that vendors do not go to greater lengths to publicize their efforts to achieve compliance or formal certification is that the means required are expensive, and compliance may not be sufficiently important to a consumer to warrant the extra marketing and promotional resources required to raise customer awareness.

        Competitive formats:  In today's interconnected world, however, there is increasing demand for certification mechanisms that can assure consumers as well as vendors that their expectations will be met when they make a purchase, whether they are aware that that expectation relates to a standard or not.  Visible evidence of such certification can be useful in the consumer realm to convey the message that a product will perform and be usable as desired.  This is particularly true where new products are being introduced that rely on interoperability to provide value, and where consumers are aware that multiple, incompatible types of products are being sold that are visually indistinguishable, but for a distinctive logo or label text. 

In the late 1970s and early 1980s, for example, it was essential for video vendors to indicate, and for buyers and renters of videotapes to carefully look for the label stating, whether a given title conformed to the VHS or the Betamax video format.  The same is true today as the next generation of DVD players and discs is now being introduced into the marketplace.  Just as before, two competing (and incompatible) formats are once again being promoted, one called HD-DVD, and the second Blu-Ray.  Initially, the vendors of each camp will seek to persuade consumers that their technology is superior, and to build brand awareness around their format mark. 4   After a given consumer buys a player that conforms with one format or the other, however, the principal value of the format label will not be as a brand, but as a conformance mark on a DVD, in order to allow the consumer to avoid buying or renting a disc that proves to be unreadable on the particular player she now owns.

        New networks:  Other common examples of such visible certifications, promoted as actual brands but of value to consumers for more utilitarian reasons, include the logos that appear on ATMs, informing a user whether a given terminal will accept a credit, debit or bank card from the network (e.g., Star or Cirrus) with which that card is registered.  Today, most bank ATMs are compatible with the cards of a wide variety of issuers, and arrangements have been made between banks and those issuers to reconcile accounts behind the scenes for most customers.  But initially, these networks were more limited, and the marks displayed on cash machines therefore had a higher value to individuals on the lookout for an ATM that could satisfy their need for instant liquidity.

Whether or not branding as well as certification makes sense to vendors and service providers – and to what degree – therefore depends on market circumstances. In the ATM example, there is no tolerance for error, because the results are binary: either the card can or can't be read, from the technical perspective, and the transaction will or will not be accepted, at the commercial level.  When someone sees a Cirrus logo on an ATM, they expect their card to be honored, even if the user has no knowledge of what the "Cirrus" network is, who designed it, or how it operates.  The value that the consumer does appreciate is that there are hundreds of thousands of ATMs worldwide that bear the Cirrus logo, and into which the holder of (for example) a MasterCard can insert that card in order to obtain cash.

        Vendor needs:  In the world of non-consumer goods, the standards-based goals of commercial vendors may vary widely. In some circumstances, standards and credible certification mechanisms may make it easier for a new market to develop, because each vendor will have a greater degree of confidence that the products reaching the marketplace will indeed be interoperable. Similarly, the existence of certification options may make it more worth a vendor's while to create products that comply with one standard rather than another, not only because the certification option has independent value, but because it knows that other vendors will be more likely to choose the standard supported by certification.  Since a standard only becomes useful through wide adoption, implementing the standard supported by certification therefore becomes the safer, as well as the higher value, decision (all other things being equal).

In this type of case, there is no incentive to create public brand awareness at all.  Instead, a much more targeted, but no less important, campaign is needed to educate the vendors in a given product space that not only a standard, but a supporting certification mechanism is available to reward them for adoption.  In many cases, the existence of compliance tests, even without a formal certification program, will still be attractive, because the compliance tests will be useful as tools to assist a vendor in discovering those changes in its product design that are required to achieve compatibility.

        Product identification:  The value of certification and branding can also fall somewhere in between.  This is because standards "brand awareness" is more common than most consumers might suspect, with much of the public being unaware that a heavily promoted brand utilized by multiple vendors actually relates to a standard.  A current example is the explosive use of "WiFi" enabled equipment, from laptops to home network routers, all of which achieve their unique value through compliance with one or more of the IEEE 802.11 family of wireless connectivity standards. 

In this case, the WiFi Alliance, an unaccredited consortium, acts as an auxiliary to IEEE, an American National Standards Institute (ANSI) accredited, global standards development organization (SDO).  The WiFi Alliance rapidly creates test suites for each WiFi standard as it is completed, and then offers certification testing to permit vendors to refine their designs to achieve interoperability, and then advertise their compliance through use of WiFi trademarks licensed from the Alliance after their products pass the required tests.

The result is akin to the "Intel Inside" branding campaign, but with important differences.  In the Intel case, Intel customers are able to borrow on the reputation of the best-known semiconductor manufacturer, and Intel benefits from the increased advertising – but Intel remains in sole control of the design of its chips, and of the use and ownership of the "Intel Inside" trademark.  With WiFi, the 250 members of the WiFi Alliance control the process of test suite creation, certification testing and brand promotion.  To the customer, however, the result is much the same: greater assurance that expectations will be satisfied when a purchase is made.  Even if they don't really know why.

II.  Certification Processes

Except in certain government-regulated areas where determination of compliance must be confirmed on-site (e.g., in the case of building codes and food preparation), certification tests and test facilities must usually be created by the private sector, either under the auspices of an existing standards development organization, by for-profit companies, or by means of a new entity created for a specific purpose.

In recent decades, more and more ICT standards have been created not by accredited standards development organizations (SDOs), but by unaccredited consortia.  However, while a standard setting organization (SSO) of either type may be quite able to fund and manage the development of a standard, SSOs in general, and consortia in particular, are most frequently low-budget operations. Moreover, in the world of SDOs, there is a history of separation between the standards creation process and the compliance testing function, each of which is conducted by a separate organization.

This can lead to a lack of certification options, especially in the information technology (IT) industry, which is typified not only by high research and development costs, but also by briefer product lifespans than are common in many other industries.  Because development of a robust test suite implemented in software (as compared to a set of detailed questions attesting to internal design compliance and self-testing) can be quite expensive, that cost is likely to exceed the financial resources of the organization that has created the standard in question, even though the actual process of certification testing might be self-funding once the test itself has been developed.

Because the number of vendors building products to a given standard may be low relative to the cost of creating a test suite to confirm compliance, it is also usually the case that a private testing service would be unable to recover its development costs to create the test suite needed before it could offer certification services.   Consequently, where robust test suites are developed at all, they are often funded by consortium dues or by a government or other grant, or the test suite is developed and contributed by the same member that initiated the creation of the standard to which the test suite relates.

The same challenges that stand in the way of test suite creation also arise in the context of certification testing.  In the case of actual interoperability or software-driven testing, expensive test equipment, facilities and personnel may be required, as well as administrative support.  Once again, such costs exceed the budgets, staff and physical resources of many SSOs.  On occasion, however, a third party can be found to provide testing services once the development of the test suite itself has been funded or arranged through the consortium's own devices of one type or another.

As a result of the financial challenges of instituting a formal third-party certification testing program, compliance programs and processes are far from uniform.  Certification programs in the ICT space have evolved along a spectrum of increasing cost and credibility, ranging from very low-budget (and therefore low-trust) self-assertion programs to costly third-party programs that may provide much higher credibility and value.

One-on-one systems:  The following are representative (although not exhaustive) of the levels of compliance testing and certification that can be found in the ICT industry today where the parties to the process are the vendor and the SSO or a third party verifier.

        Self-Assertion without a Test Suite:  At the most modest end of the scale is self-assertion, which is not a certification process at all, in any true sense of the word. In this model, the vendor simply asserts that its product conforms to a given standard, and there is no third party verification of either the result, or the means by which the vendor reaches its conclusion. Where this is the best that can be done, it is important for a consortium to make it clear that only limited credibility should be given to such assertions, and to ensure that the marketplace understands that no formal certification process is in place.

As a result, the term "certification" should not be used in connection with a self-assertion program. Rather, the implementers of standards in this setting should only be permitted to assert "compliance with", or "conformance to," a standard or specification. 5 Self-assertion programs are quite common for primarily informational purposes, notwithstanding the limited level of credibility that they are likely to offer.  One reason is that, unlike failures of safety features in consumer products, interoperability failures in ICT products do not typically lead to dire consequences, and the government therefore has not to date found it necessary to focus on this area.  Further, vendors can acquire an individual reputation over time for being trusted (or not) when they self-assert compliance, since customers will swiftly learn whether or not the product in question is truly interoperable with other equipment or software believed to comply with the same standard.

Second, a wide range of factors (besides cost) may preclude the ability or interest of an SSO to create a test suite and/or engage a third party testing service.  For example, the commercial value of compliance may not be high enough, or the standard itself may not be sufficiently robust to achieve a conclusive result, in which case compliance with the standard alone would not imply a result that has significant public commercial value.  Where cost is the true reason, however, the achievements of the affected organization may be more modest than those of another group that is capable of supporting a full certification program, especially where reliable interoperability is highly important to the end user.

        Self-Asserted Compliance (or Self-Certification):  In this model, some type of test suite exists (although it may be a "paper test" that states required results of one sort or another), but the vendor performs the test itself and asserts success. In some cases, there may be little effort to publicize the fact that a product meets the test, because the test suite has been created primarily as a tool for vendors to use in order to achieve interoperability or another goal at a lower cost. In other cases, credibility is an important goal, but the consortium has not been motivated, or able, to arrange for verification.  As a result, only a very modest increase in trust may be gained over self-assertion of compliance, since only one leg (thoroughness or rigor of test) has been strengthened, but not the other (independent verification).

        Self-Certification with Verification:  If a higher degree of credibility for the certification program is deemed to be desirable, the vendor is required to return some type of evidence of satisfactory test completion to the SSO (or a third party) for verification.  The deliverable typically will be a paper or electronic record of the test results, with the credibility of the program relying in part on how stringent and conclusive the test suite provided may be.  Again, depending on the consortium's resources and the degree to which vendors are willing to pay certification fees, the report may either simply be filed away to create what is essentially a record of self-assertion, or may be examined for completeness and consistency, but not otherwise directly confirmed by an independent test of the product.  Hence, an element of unsupported trust is still involved, and the credibility of the certification is therefore still qualified.

        Third Party Testing:  This is the highest standard of formal testing, since the vendor must submit its product to a third party for testing.  However, the efficacy of testing may vary widely, being limited in part by the sophistication of the standard to which the test applies (some standards are very detailed and comprehensive, while others are less so), and the effectiveness of the test itself.   Hence, a product built to one standard which successfully passes certification testing may indeed "plug and play" with another compliant product, while a product built to another, less comprehensive standard may require further refinements in order to reliably interoperate.  The degree to which a standard is capable of enabling full interoperability is also affected by factors other than technical challenges, including political compromises (such as permitting alternate ways to implement a single element of a standard) among members that are, after all, usually competitors.

With third party testing, the final results are often submitted to the SSO, which will then issue the actual certification, along with a license to use its trademarks in connection with assertions of satisfactorily passing a certification test.

Other Processes:  There are other mechanisms besides certification testing that a consortium may employ to increase the credibility of its standards and/or assist its members and other companies in achieving a high degree of compliance.

        Interoperability Testing:  In some cases, a third party testing service may be engaged to run submitted products directly against other compliant products, in addition to (or instead of) running them against the test suite. In others, a consortium may set up such an "interoperability center" itself (usually at a member site or at a trade show) to which members may (or in some cases are required to) bring their hardware and software products and run them against each other, in order to work out final interoperability issues not able to be resolved by means of a test suite.

The purpose of such testing can be either very secret or very public.  In the former case, stringent confidentiality agreements may be utilized, particularly where the testing being conducted relates to products that are not yet announced in the marketplace, and/or where the failure of a product to demonstrate interoperability could have a negative impact on sales.  In this case, the purpose of the exercise is all about compliance confirmation and not at all about branding.

At the opposite extreme is the very public "plug fest" at a trade show, where multiple vendors demonstrate the interoperability of their products.  In this case, the purpose is entirely brand-related, since no vendor would wish to publicly demonstrate the non-compliance of its products, and interoperability is therefore usually confirmed by testing in advance.

In each case, although the activity in question may not be part of the formal certification testing process, it provides another example of the way in which an SSO may initiate and coordinate activities in order to lower costs and improve outcomes for its members in support of the standard that it has developed.

        Reference Software: In some cases, an SSO will provide actual software instantiating a specification. The software is often made available in both source code as well as object code form, and is commonly referred to as a reference implementation.  Where such software is available (sometimes only to members, and at others as a free download from the consortium's website), an implementer is spared the expense of developing its code to comply with the standard.

One common reason for the existence and use of reference software is that a member may have already created it for its own purposes, and is willing to make it freely available to all in order to reap some greater benefit from wide adoption of the standard.  Another reason may be that a standard has been created in a patent-rich environment, and there is a common benefit to be gained from the availability of an implementation of the standard that is not believed to infringe upon known intellectual property rights of (at least) members.  While the primary motivation may therefore not be to save compliance testing time and expense, those indirect benefits automatically follow.

III. Trademarks and Branding
While discussions of intellectual property concerns in standard setting almost always focus on patent and copyright issues, trademarks play an essential role as well.  The reason is that while patent law may control what can be in a standard, and copyright law protects the text of the standard itself, only trademark law provides the means to control whether or not a vendor is entitled to claim that its products actually comply with a standard. 6

Using Trademarks to Enforce Quality Control: As noted earlier, standards need to be credible in order to have value.  This is because standards are only useful to a customer to motivate a purchase, or to a vendor to secure market advantage, when the promise they make is valid (e.g., a brand request to "buy this because it will work with that" only works if in fact "this" really does work with "that").  Moreover, if a vendor asserts compliance where compliance does not exist, an end user may be unable to tell whether the fault lies with a non-compliant product or with an inadequate standard.  As a result, not only does the vendor that failed to comply lose credibility, but the products of all vendors that assert compliance with the same standard lose credibility as well, and the goals of the SSO that created the standard, and of its members, will be defeated.

False claims of compliance are therefore of great concern to SSOs and to end users alike.  Where an SSO gives a name to a standard and the public knows the standard by that name alone, then the SSO may prevent false claims of compliance from being made by withholding the legal right of the offending vendor to refer to the standard in connection with a non-compliant product. 7

While it is not legally necessary to obtain a formal trademark registration in the United States on the name of a standard in order to own all rights to its usage, it is prudent to do so, since the cost is modest in comparison to the benefit of putting the world on notice that the SSO owns the trademark.  Because it has become very simple to perform an on-line search of issued trademarks, obtaining a trademark registration will also make it far less likely that someone else will begin to use the same, or a confusingly similar, name for its product or service.  As a result, there will be less potential that someone else's actions will dilute the value and effectiveness of the SSO's mark, or that the SSO will be put to the trouble and expense of asserting or defending its trademark.

However, since ICT standards are usually intended for global adoption, it is important to undertake an analysis in order to settle upon a cost-effective strategy for protecting a mark, because a commercial-scale, global trademark program would invariably be prohibitively expensive.  Fortunately, a trademark convention in Europe now permits a single filing to secure rights in multiple countries, and a very large proportion of sales of certified products are usually expected to occur in a comparatively small number of first world countries.  The result is that it is possible to achieve a very meaningful degree of protection by obtaining trademark protection in just the United States, Europe and selected Pacific Rim countries.  Such a measured program of trademark registration can be completed within the budget of most SSOs.

        Using Trademarks to Associate Value with Products: The term "branding" usually connotes a use of trademarks that is broader than simply policing compliance.  Rather, it seeks to associate value with compliant products in the mind of the buying public that relates to the purpose for which the standard was created, rather than simply to compliance with the technical elements of the standard itself.  For example, the right to include the familiar "Dolby" brand logo on a product, indicating the use of patented Dolby noise-suppression technology, was a valuable product differentiator in the early days of tape decks. More recently, the earlier noted "Intel Inside" ad campaign provides an example of a brand usage that is intended to promote the goodwill of Intel as much as, if not more than, that of the vendor of the product in which the chip finds a home.  In sum, Intel is seeking to create a market perception that its technology represents a "standard of excellence and innovation" with which consumers should associate added value.

Where a branding campaign is to be launched in connection with a certification program, however, a much larger budget is required.  To be effective, such an initiative also requires the active cooperation of SSO members, who should place certification logos on their compliant products, packaging and advertising in order for the program to be truly successful.  Often, engaging the cooperation of the marketing departments of large corporate members proves to be an insurmountable hurdle, even though the same companies may have invested heavily in creating the standards to which the certification and branding would apply. 8

Nonetheless, as the video format and ATM examples discussed above illustrate, branding may be vital in persuading the marketplace to buy (or, in the case of the credit card, to buy into) new classes of products and services, and the costs of brand creation may therefore prove to be wise, or even unavoidable, investments.  Absent such a program in the video example, many consumers might have shied away from purchasing or renting any products at all while the vendor community engaged in its standards war.

The costs of brand maintenance in such an example may also be finite.  After a single standard "wins", or after interoperability issues are resolved between competing standards, the brand may be allowed to languish, as an end-user comes to take interoperability for granted, and expects that all products, regardless of the technology upon which they are based, will be usable in connection with all other logically related products.

For example, today the user of an ATM is not likely to look for, or even notice, the multiple acceptance network logos on an ATM, because such a high degree of technical interoperability and business reciprocity has been achieved that virtually every ATM will now accept almost any and every card, regardless of the issuer.  Similarly, after the VHS format vanquished Betamax, video rental and consumer electronics stores discontinued stocking Betamax products entirely, making the use of the mark "VHS" no longer meaningful in anything other than an historical sense.  At that point in each example, the brand had already done its job, although the certification process continued to live on unnoticed by consumers in order to confirm actual compliance with the VHS standard for the benefit of manufacturers.

Summary:  Notwithstanding the costs and constraints associated with developing, administering and participating in standards certification programs, vendors and service providers voluntarily implement and comply with hundreds of thousands of standards, because they believe that the benefits of compliance outweigh the costs.  Since one of the anticipated benefits of complying with standards is increased sales through customer awareness of product compliance, vendors are often willing to make promotional investments in standards-based brand awareness campaigns as well.

When the certification process works best, larger markets for goods and services are created more quickly, and end users are better served by the greater likelihood that their purchase expectations will be fulfilled.  While conclusive certification testing is not necessary in every market situation, cost constraints would often render that goal infeasible in any event.  In response, the marketplace has evolved multiple levels of compliance assertion and testing that can provide both cost-effective and meaningful comfort in a variety of different situations, to the ultimate benefit of vendors and end users alike.

End Notes


1. Department of Philosophy and Department of Mechanical Engineering, Texas A&M University, "Engineering Ethics," summarizing ASME v. Hydrolevel Corp., at <http://ethics.tamu.edu/ethics/asme/asme1.htm> (accessed July 23, 2006).

2. Like virtually all compliance testing organizations, Underwriters Laboratories does not test and certify every individual product.  Instead, it tests samples, and then allows its marks to be displayed on products that the manufacturer attests are consistent with the tested sample.  Current UL marks can be viewed at this page of the UL Website: http://www.ul.com/marks_labels/mark/art.htm#ul (accessed July 29, 2006).

3. Prohibiting the use of compliance testing to favor domestic industry by making it difficult, expensive or impossible for foreign goods to be imported is a goal of the World Trade Organization's Agreement on Technical Barriers to Trade.  Before the adoption of such agreements, countries would frequently require local compliance testing of goods that had already been tested elsewhere, instead of respecting the certification already granted by a neutral, but non-domestic, testing service.

4. While each format has its own differentiating features, these features tend to be of greatest interest to distinct stakeholders (e.g., content owners, hardware vendors, software vendors, and so on) rather than to all stakeholders.  As a result, if one format is "better" for the consumer, it will only be likely to win the current standards battle by coincidence.  For an example of the hundreds – if not thousands – of articles that have been written over the past several years assessing the advantages and chances of one format over another at any particular point in time (a search of "HD-DVD vs. Blu-Ray features" at Google yields 2,450,000 hits), see:  Perenson, Melissa J., More from the Blu-Ray vs. HD-DVD Front.  PC World.com (November 15, 2005), at <http://www.pcworld.com/news/article/0,aid,123491,00.asp>.  As of this writing, it is uncertain which format – if either – will ultimately prevail.  For an example of a current analysis on that question, see:  Belcher, James, Blu-ray and HD-DVD: Only One Winner? Or Two Losers? (July 26, 2006) EMarketer.com, at <http://www.emarketer.com/Article.aspx?1004082>.

5. While there is consensus on not using the word "certification" in connection with self-assertions of compliance, there is no general agreement on whether, or how, to use words such as "compliant" and "conformant" across SSOs.  As a result, it is important for an SSO to define with precision which word(s) may be used in connection with the performance of what types of tests in connection with its standards, so that the marketplace understands what a vendor is saying when it uses a permitted term.

6. While the use of trademarks in certification and branding programs has many similarities to the usage of the same tools in connection with building brand awareness in support of proprietary products, there are also important differences, not all of which are immediately obvious.  For example, while marks designated as "certification marks" can be registered in some (but not all) countries, it may be appropriate to use trademarks, service marks or certification marks (and sometimes all three) in support of a given standards effort, depending upon the goals and circumstances in a given case.  A detailed review of this topic is beyond the scope of this article.

7. Exercising a sufficient degree of "quality control" over a widely used trademark is a sensitive issue for SSOs, which commonly do not have the resources needed to police the usage of their marks to the same extent as commercial entities.  As a result, it is essential for an SSO to institute good practices with respect to each standard as soon as it is complete, to prevent members and others from taking actions (such as incorporating the name of a standard into a product name) that could result in the mark becoming generic.

8. One of the first consortia that the author represented was formed to initiate a very ambitious certification-based branding program.  The most interested members paid hundreds of thousands of dollars in annual dues to fund the development of sophisticated hardware and software certification suites and the founding and staffing of a sophisticated interoperability testing center.  However, few – if any – members actually branded their products with certification marks after their products had been proven to be compatible.  The author has ensured that every consortium he has helped structure since then has included a marketing committee that is co-equal with the technical committee from the date of formation, in order to make it more likely that both the marketing and the technical management of member companies would be committed to achieving the goals for which the consortium was founded.


Copyright 2006 Andrew Updegrove


 

FROM THE STANDARDS BLOG:

THE MICROSOFT CONVERTER, NEWS SHOPPING AND TECTONIC SHIFTS

Wednesday, July 12 2006 @ 09:57 AM EDT

It's been a week now since Microsoft announced its ODF/Office open source converter project – time enough for at least 183 on-line stories to be written, as well as hundreds of blog entries (one expects) and untold numbers of appended comments.  Lest all that virtual ink fade silently into obscurity, it seems like a good time to look back and try to figure out What it All Means. 

There are two ways to go about that task.  One is the "have it your way" news channel technique (simply pick the channel that serves up your daily news just the way you like it, whatever that may be – liberal, conservative or just plain snarky).  Nothing better than the Internet for that, where you can go shopping in the great marketplace of interpretation (as well as willful misinterpretation), and find more flavors than you could ever possibly imagine.  If you do want to test that premise, you won't be disappointed with the myriad ways in which people have examined the entrails of the converter story to divine (or dictate) what's up.

For example, there is the metaphorical religious conversion theory, from Martin LaMonica:

Redmond has "road to Damascus" open source conversion

As well as differences of opinion about whether ODF supporters are jumping for joy or expecting the worst:

OpenOffice developers rejoice at Microsoft's OpenDocument Support 

ODF guardedly welcomes Microsoft's Office XML move 

And, of course, there are plenty of theories about what Microsoft may really be up to.  Here's a sampling:

Vaughn-Nichols:  Microsoft not telling the whole truth about ODF translators  

Pamela Jones:  MS:  OK.  OK, so we'll set up an "OS" project to build an ODF killer.  Er, we mean translator

If you want to read just one analysis, though, by all means make it the acute, thorough and balanced report by Redmonk's Steve O'Grady.  His thoughts, as always, are a "must read."  You can find it, together with links to additional reactions to the announcement, here, and a follow-up entry posted by Steve the next day here.  

The second way is to go directly to the sources and make up your own mind, which is what I'll do here to provide my take on what led to Microsoft's decision, and what it's likely to mean.

Although I've read many different interpretations of this story by knowledgeable parties, I'm going to focus today on just one source:  Microsoft itself, and more specifically, its July 6 press release, together with a conversation I had with Jason Matusow, Microsoft's Director of Standards Affairs.  I don't rank as high in the blogging food chain as Steve O'Grady (he was one of two analysts that Microsoft briefed in advance), but Jason was good enough to call me the morning after the press release was issued, to answer any questions that I might have.

Let's start with the press release, which is useful for two purposes: first, to learn the basic facts, and second (and more intriguingly), to find the messages that Microsoft wants to deliver.  Those messages relate both to the facts at hand and to a bigger and ever-evolving strategic picture, because every press release provides an opportunity to insert another piece into the mosaic that is the public image that the issuer wishes to reinforce.

Press releases are especially useful in interpreting what underlies a vendor's desired public image and strategy, because they are extensively worked over and reviewed by multiple parties, and therefore are as authoritative as their authors can make them.  Hence, while a press release is hardly the most objective source in the world, it is highly indicative of what the issuer wants the market to think at that point in time.  Just as intriguingly, and due to the same process by which they are written, press releases are also highly indicative of the bathwater that the issuer itself is drinking.

As I read the press release, Microsoft wants the following points to sink in regarding its new converter project:

1.  The converters (one each, serially, for Word, Excel and PowerPoint) are being developed at the request of government customers. 

2.  The converters will be created within an open source project, for maximum transparency.

3.  OpenXML and ODF were created for two very different purposes, and OpenXML is far superior to ODF.  This will unavoidably result in some deficiencies in how well the converters will work.

4.  This announcement is further evidence of Microsoft's new commitment to "interoperability by design," a four-pronged approach (only one prong of which involves an open process – standards). 

Here's how I see these messages fitting into the big picture:

1.  At the government customer's request:  I have heard this phrase explained by two Microsoft sources as follows: "if even one citizen wants to send a document to a government in ODF form, they have to be able to deal with it."  The net desired impression, then, is that the need to accommodate ODF is minimal (so don't take this as an admission that ODF is taking off), but when the customer asks, Microsoft listens.

2.  Open source project:  Microsoft deserves points on this one.  They aren't monkeying around, but are putting the code out front and largely in the hands of others, while still paying the bills.  Is it perfect?  Of course not.  But neither is OpenOffice.org, where Sun pays the bills and supplies most of the programmers to write the code, and largely selects what code will be written.  It's only fair to be consistent in how we judge competitors.

3.  Different formats:  Indeed the two formats were created for two different purposes, and I expect that ODF documents will likely be unable to replicate, for example, all 200 Microsoft Word borders back through 1993.  But I assume that there won't be (or at least won't need to be) any such problems in the other direction.  The main difference between the two approaches is that OpenXML is a format standard created to serve a single product line, while ODF was developed to enable the creation of multiple competing products, which is already occurring.  Losing a few borders along the way is a pretty easy tradeoff if your goal is the latter rather than the former, because the anticipated rewards are very different.  In fact, there is a place for both standards, and they should not be directly compared to each other any more than, say, a telephone and an intercom should be directly compared, although you can talk into each of them and they share some of the same technology.

4.  Interoperability by design:  Microsoft has realized that standards are not going to go away, and that customer demand for standards in general, and interoperability in particular, will rise rather than fall.  It has taken a thorough approach to creating a new internal standards structure (interestingly, it has many lawyers, as opposed to just technical and business people, in key positions in its standards department), and has constructed its four-point program to address that need. 

It is important to note that Microsoft calls this program "Interoperability by Design," rather than "Interoperability by Collaboration."  The salient difference between these two designations is that only one of the four roads to the interoperability goal in the Microsoft program (standards) involves an open process.  The others leave Microsoft in the senior, or at minimum the parity, power position in negotiating the means of achieving interoperability – how, and with whom, it pleases. 

The official way that Microsoft phrases this "commitment to interoperability" can be found in the same press release (as well as in many other press releases, statements and documents), and reads as follows:

Ongoing Commitment to Interoperability

As demonstrated by the recent announcement of the Interoperability Customer Executive Council and the significant industry contributions to the Open XML file formats from leading institutions like the British Library and Apple Computer Inc. at Ecma International, Microsoft is broadening its long-term investments in and attention to interoperability across industries and platforms through such avenues as product design, collaboration agreements with other companies, standards and the effective licensing of its intellectual property. Additional information about Microsoft's customer-focused interoperability commitment, including an open letter titled "A Foundation for the New World of Documents" by Chris Capossela, corporate vice president of the Microsoft Business Division Product Management Group at Microsoft, may be found online at http://www.microsoft.com/interop. (emphasis added)

Jason Matusow and I have debated what this means in several blog posts, the latest one of which is here, and you can read a wide variety of other opinions in the 274 comments on and off topic about the same piece that appear at Slashdot.

Now to my conversation with Jason, which was pretty far ranging and candid.  Jason said, and I believe him, that the real motivation behind the conversion project is the need to serve government users, and especially those in countries with strong commitments to use and honor ISO standards (ODF, of course, is now ISO/IEC 26300).  That's a credible reason, and if converters are going to be built anyway, as they are, Microsoft might as well be seen to be facilitating their development rather than holding back, and having at least some say in how the process evolves. 

I also believe that placing the project in an open source venue was a smart move, and an honest effort to be seen as not trying to play games.  As Jason said – and who can question the statement – everything that Microsoft does is going to be questioned and attacked, so they decided to initiate the project in a way that would leave as little to question as possible.  Of course, one can still poke at different aspects of how things are set up, but that's inevitable, given that certain decisions have to be made, and when they are, they have to come out one way or another, each with intended as well as unavoidable potential implications.  The choice of the BSD open source license is a good example of this, and you can find quite a bit of discussion on line about whether this was a good choice or a bad one, and what the motivations might be for so choosing.  Jason answers a few questions on this topic in the comment thread at his blog.

On a related note, I asked Jason why there was no mention of the converter project in the May 19, 2006 Microsoft response to the Massachusetts converter RFI, given that the concept had obviously been kicking around for some time.  He responded that final plans for the project had only come together in a detailed fashion in recent weeks, and that Microsoft did not want to be accused of making a "vaporware" (my choice of words, not Jason's) announcement that could be suspected of being an effort to chill independent development efforts without a real intention of delivering on the promise.  Again, that's a reasonable enough explanation, even if other considerations might have been involved as well.

More intriguingly, Jason also noted that a decision like this is still difficult to reach within Microsoft, with some constituencies hewing to the historical, proprietary way of looking at things, while others argue for a more adaptive, open approach. 

I expect that this is accurate as well, and have heard the same observation from various people I know inside Microsoft over the past year, at each step along the way as Microsoft has loosened up in the ODF saga: first on licensing terms, then on issuing its covenant not to sue, next on the submission to Ecma, and so on.  It is logical to assume that just such a process of cultural shift would be required, that it would be difficult and slow, and that change agents would need real issues in the customer base to point to in order to carry the day.

So my personal take is that we are observing a fairly consistent, significant and, well, fascinating evolution in the strategic thinking of one of the most powerful players in the IT industry.  Placing this progression in tectonic terms, there have been some major shifts – earthquakes, if you will – at Microsoft in the past year, such as the organization of the new open standards (and even open source) structures within Microsoft, and the genesis and articulation of the Interoperability by Design public message.  We have also seen smaller tremors and aftershocks, of which each concession in the ODF story is an example.

Obviously, there is still much unrelieved tension between the Interoperability by Design message and the real world of technology and customer expectations, as respects both open standards and open source software.  Perhaps it is as accurate to call the Interoperability by Design program an articulation of an internal "belief system" as it is to see it purely as a public marketing message, since I expect that there is passion behind maintaining this halfway house position between a proprietary world and an open environment.  Corporate belief systems can be almost as strong as religious convictions, and no conversion is easy or succeeds on a uniform basis at the level of the individual.   

Perhaps Martin LaMonica's Road to Damascus metaphor is not so inappropriate after all, although I doubt that the establishment of the open source converter project will prove to have been the particular step along the way at which the defining revelation in Microsoft's future was delivered.  But some day, I think that the remaining tension between Microsoft and the marketplace will need to be released.  Whether that will be through the gelling of a new corporate gestalt that serves Microsoft and its customers well, or through an earthquake (a catastrophic antitrust penalty?  The rout of Microsoft products by Linux/Firefox/ODF and other open source challengers to come?) remains to be seen.  

It will be fascinating to see whether the transition from Bill Gates to Ray Ozzie in the master architect's chair will prove to be an opportunity to provide a smooth and easy release of this tectonic tension, or simply a ratification of the ancien regime that sets up an ultimate catastrophe.

Only time will tell.

Bookmark the Standards Blog at http://www.consortiuminfo.org/newsblog/ or
set up an RSS feed at: http://www.consortiuminfo.org/rss/


Copyright 2006 Andrew Updegrove



 

CONSIDER THIS:

July 31, 2006

#41 Live'n the WiFi Lifestyle: the iPod Bows to the Router

Here’s an interesting bit of data from the wild:  8 out of 10 folks who own both an iPod and a wireless router would give up their cool music tool before they'd do without their boring, clunky router.  The same percentage of those sampled would also give up their home phone before they'd sacrifice their ability to surf the Web from their favorite couch.  The data come from a survey conducted by Kelson Research for the WiFi Alliance, the consortium that promotes the IEEE 802.11 (WiFi) standards and, more importantly, certifies compliance with them as well.

Surprised?  Don't be, because the iPod/iTunes system comprises a closed, proprietary environment, while WiFi products are based on a continuously evolving family of open standards, and that makes a far bigger difference than you might imagine. 

If this sounds like too simplistic an explanation, consider this: 

Let's look at the numbers first (gross sales and rate of change).  What we see is that there are many, many more wireless-enabled devices in the field than there are iPods.  According to research firm In-Stat, wireless chipset sales hit 140 million last year, and should reach 430 million per year in 2009, by which time there should already be over a billion chipsets in active use.  Some 40 million of the chipsets sold last year found their way into home and small office/home office (SOHO) routers, and another 45 million into laptops and other mobile PCs.  That leaves roughly 55 million more that were incorporated into phones and other mobile devices.  Moreover, that 140 million number was up 50% from the year before.

In contrast, another analyst (UBS Investment Research) expects iPod sales to come in about a million units under projection this year, with about 39.8 million new iPods being bought in 2006, and a flattening in iPod sales growth after rapid expansion in prior years. 
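For readers who like to check the arithmetic, here is a quick back-of-the-envelope sketch (in Python, purely as an illustration) that does nothing more than reproduce the figures cited above.  The inputs are In-Stat's and UBS's numbers as quoted; the "phones and other devices" figure is simply what is left over after subtraction.

# Back-of-the-envelope check of the market figures cited above.
# All inputs are the analyst numbers quoted in the text; nothing here is new data.

wifi_chipsets_2005 = 140_000_000   # In-Stat: WiFi chipsets sold last year
routers_2005 = 40_000_000          # ...of which went into home/SOHO routers
laptops_2005 = 45_000_000          # ...of which went into laptops and other mobile PCs
wifi_chipsets_2009 = 430_000_000   # In-Stat projection for annual sales in 2009
ipods_2006 = 39_800_000            # UBS estimate of new iPods bought in 2006

other_devices_2005 = wifi_chipsets_2005 - routers_2005 - laptops_2005
prior_year_sales = wifi_chipsets_2005 / 1.5            # implied by "up 50% from the year before"
extra_annual_volume_by_2009 = wifi_chipsets_2009 - wifi_chipsets_2005

print(f"Chipsets left for phones and other devices: {other_devices_2005:,}")           # ~55 million
print(f"Implied chipset sales the year before:      {prior_year_sales:,.0f}")          # ~93 million
print(f"Additional annual chipset volume by 2009:   {extra_annual_volume_by_2009:,}")  # ~290 million
print(f"Chipsets per iPod (2005 chipsets vs. 2006 iPods): {wifi_chipsets_2005 / ipods_2006:.1f}")

Nothing fancy, but it makes the "many, many more" claim concrete: roughly three and a half WiFi chipsets shipped for every new iPod, with the gap projected to widen sharply by 2009.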

While music players and mobile Internet access points are not a perfectly fair comparison, the ability of WiFi to achieve such dramatically larger sales numbers is still instructive, since few new capabilities of any kind enjoy such explosive growth.  When they do, though, it's often because they are based on open standards, and because of two resulting, related effects:  the ability and likelihood of multiple vendors to build new products, because the standard upon which the new products are based is open, and the tempting size of the market demand that can rapidly evolve because of the rich selection of competing products.  The result is sometimes referred to as a "virtuous cycle" of incentives and rewards on both sides of the sales equation.

With that as an introduction, let's take a look at the WiFi marketplace, which is shared by many competitors, and the portable digital music player niche, which is dominated by the iPod and iTunes, a commercial combo that has delivered Apple the highest quarterly earnings in its history.

First, it is worth noting that there are also very successful players in the WiFi space (such as Linksys), a result made possible in part by the fact that multiple competitors have developed diverse WiFi-based products and services, allowing more than a single company to achieve success.  We can assume that one reason this is true is that the WiFi market is based on open standards, while the mobile music market includes several controlled formats - the most popular of which is not available for license. 

As one measure of comparison to drive home this point, the WiFi Alliance has over 250 members, including hardware, software, silicon, consumer electronics, and other vendors (you can see a list here, but be prepared for a long scroll).  While it's true that the dynamics of most IEEE 802.11 working groups are highly competitive (to put it mildly), the standards these committees set out to develop have thus far all been finally approved (unlike the abandoned UWB project), after which everyone works hard to get them widely adopted.

Of course, in the case of the iPod, "everyone" (other than Apple) is a competitor at the format as well as the product level, unless they're making iPod accessories.  As a result, the products, services, advertising, promotion and ingenuity dedicated to making the iPod more attractive and more useful are limited, while there are hundreds of companies, from the largest to the smallest in many industries, that are all working to tout the WiFi value proposition.

Next, it must be noted that not everyone is willing to buy into a proprietary system that traps the customer more thoroughly (just as Apple intends) with every iTune she buys.  And while Apple is incredibly creative in what it designs, an iPod owner's ability to satisfy her appetite for new techno delights is still limited to the candy that Apple decides to offer her.  If the iTunes format was not proprietary, other vendors could challenge Apple more effectively on price, features, and design – and anyone could play songs purchased from iTunes (or elsewhere) on those other wares as well.

But most of all, the rewards of buying a WiFi-enabled system continue to multiply exponentially, while the value of buying into the iTunes system can, at best, increase arithmetically as the stock of iTunes is expanded.  Why?  Because you can still only listen to an iTune on an iPod, or on another Apple product.  With WiFi, the standard is available to anyone, and therefore everyone is making use of it.  Not only are multiple chipset makers churning out price-competitive chips, but hardware makers automatically include those chipsets in almost all (90%) of the laptops they ship. 
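To put rough numbers on that "exponentially versus arithmetically" intuition, here is a minimal sketch.  It assumes a Metcalfe-style rule of thumb for the open WiFi world (a network's potential value grows with the number of possible pairwise connections) and simple linear growth for a closed music library; both functions and all of the numbers are illustrative assumptions on my part, not measurements of either market.

# Toy illustration of the growth argument above (assumptions, not data).
# Open-network value is modeled with a Metcalfe-style rule of thumb: proportional
# to the number of possible pairwise connections among participants.  A closed
# library's value is modeled as growing linearly with the number of items bought.

def network_value(n_participants: int) -> int:
    """Possible pairwise connections among n participants (Metcalfe-style)."""
    return n_participants * (n_participants - 1) // 2

def library_value(n_items: int) -> int:
    """Linear ("arithmetic") growth: each purchased item adds the same value."""
    return n_items

for n in (10, 100, 1_000, 10_000):
    print(f"participants/items: {n:>6,}  "
          f"possible connections: {network_value(n):>12,}  "
          f"library items: {library_value(n):>6,}")

The asymmetry is the point: double the number of participants in the open network and the possible connections roughly quadruple, while doubling the size of your song library merely doubles it.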

Similarly, Starbucks offers WiFi access in order to sell more lattes, and entire cities, like Boston, plan on providing free, universal WiFi access for the benefit of their citizens, in order to polish the city's image and thereby boost the local economy by competing more effectively for employers and talent.  In short, the popularity of WiFi encourages multiple constituencies to invest in providing access, and to reap the indirect benefits that such an investment can provide.  You can't do anything comparable to that with an iTune, nor would you want to (why invest in what you cannot control?).

Still, this is just the beginning.  The home is on the verge of becoming pervasively enabled with wireless capabilities, and wireless mobile devices of all types continue to proliferate.  That's where those extra 300 million chipsets per year will be going by 2009.

There is another lesson to be drawn from the wireless example that helps to explain why someone could be more strongly attached to their humble router than to their sexy iPod.  Let's call it the rise of the "WiFi Lifestyle."  True, you won't find any striking ads on billboards of wildly gyrating silhouettes holding laptops, and it's doubtful that a Dell would be the best accessory to take to a trendy club in any event.  But just as white ear buds score high on the teenager index of cool, freedom of access to the Internet has huge appeal to all ages when it comes to how they want to live their lives today.  More and more, we want, and expect to have, ready access to an exploding range of information, services, games and more from the Internet, wherever we may be.

As a result, the perceived value of having always-on Internet capability becomes greater on a daily basis, while an iPod remains just an attractively designed, not very durable, rather expensive, and regularly obsolete music box.  In short, a music box that will never provide more value to you in the future than it delivered on the day that you bought it.

That's the more obviously germane part of my last point. The more subtle (and to me interesting) part of my lifestyle point is this:  the actual value of any single WiFi access point to us is not, in fact, all that great.  We could easily live without Internet access at any particular Starbucks, or even lose it at home entirely (after all, we could always pull our chair closer to the cable jack).  But we are placing an increasingly high value on being surrounded by wireless access wherever we may be.  Wherever we are, we want it, and that's it – because we've bought into the WiFi Lifestyle.

While it's doubtful that many of the 8 out of 10 respondents in the survey realize that their affection for their home router is based more on a lifestyle decision than on a fondness for surfing the Web from the couch, I'm pretty confident that this is what explains the Kelson Research survey results.

All of which provides yet another splendid example of how an open standard makes participation in the creation of a new network attractive and profitable, thereby enabling an exponential increase in innovation, implementation, value and customer appeal.  This "network effect" has been recognized at least since the advent of the railroads, and it is becoming a bigger and bigger reality in our world today, because networks of all sorts are becoming essential to virtually everything that we do.

What does this say about the future of the newly renascent Apple Computer?  I think it's possible that the iPod may represent the high-water mark of that company's proprietary design strategy.  With the market's ever-increasing expectations for interoperability, and even governments (such as in France) threatening to restrict the sale of music in the proprietary iTunes format, Steve Jobs may find himself on the verge of being forced to compete on design alone, even as his company enjoys historical highs in its sales.   

The good news for Steve is that, given Apple's chops in the design department, I'd guess that Apple's future will be far rosier when, as and if he ever gets through that knothole.

 


Read more Consider This… entries at: http://www.consortiuminfo.org/blog/

Copyright 2006 Andrew Updegrove


THE REST OF THE NEWS

For up to date news every day, bookmark the ConsortiumInfo.org
Standards News Portal


Or take advantage of our RSS Feed

OpenDocument Format (ODF)

 

The principles of open standards may offer the benefits of decreased costs and interoperability of documents, but the ITD did not pursue the policy in an open, collaborative or lawful manner  [June 29, 2006]

 

MA Senator Marc Pacheco, at a press conference announcing the release of a report critical of the ODF adoption process...Full Story

   
Senator Pacheco is wrong on the facts and wrong on the law. We are committed to an open-standards approach that fully takes into account all accessibility, cost and statutory requirements  [June 29, 2006]
 

Response by Romney administration spokesman Felix Browne...Full Story

   

Our next action is to do what we are doing right now, which is working toward the goal. We believe in the utility of open standards  [July 5, 2006]

  MA State CIO Louis Gutierrez, following release of the Pacheco Report...Full Story
   


Opinion Noted:  A long-awaited report was issued at the end of June by the Massachusetts Senate Post Audit and Oversight Committee, chaired by State Senator Marc Pacheco.  The report follows up on a public hearing held last October 31, at which then-State CIO Peter Quinn testified.  The report is highly critical of the process whereby the plan that includes endorsement of ODF was adopted.  The Massachusetts administration, on the other hand, flatly denies many of the allegations in the report, and continues to steam ahead, all as reported in the two entries below.  You can read more about the substance of the report in this Blog entry.

Opponents to ODF strike back in Massachusetts
Eric Lai
ComputerWorld.com July 3, 2006 A Massachusetts Senate committee released a report yesterday criticizing the state’s Information Technology Division (ITD), claiming that its “unilateral” plan to move all state employees to use software that reads and writes files in the OpenDocument format was poorly planned, ignored the needs of handicapped workers and violated state law. Also yesterday, the Danish government said it will launch a four-month pilot program in September to use the OpenDocument format (ODF), another part of the Scandinavian country’s broad endorsement of open computing standards. The program will start with Denmark’s finance and science ministries and possibly others, said Adam Lebech, head of the IT governance division within the Ministry of Science, Technology and Innovation. ...Full Story


Mass. holding tight to OpenDocument
Martin LaMonica
CNET News.com July 5, 2006 Massachusetts is sticking to its plan to adopt OpenDocument, despite a critical report calling for a delay to the high-profile move. Louis Gutierrez, Massachusetts' chief information officer, said in an interview with CNET News.com that the Information Technology Division (ITD) is forging ahead with its project to make OpenDocument the default document format for executive branch agencies by January next year. ...Full Story


We really believe that Open XML is the full-featured format  [July 15, 2006]

 

Jean Paoli, Microsoft's general manager of interoperability and XML...Full Story

   
That's nonsense  [July 15, 2006]
  Laurent Lachal, Ovum senior analyst in charge of open-source research...Full Story
   


Pick a number:  When Microsoft announced early this month that it would support an open source project to create a converter for swapping documents between users of its Office productivity suite and those using applications that support the OpenDocument Format (see this month's Standards Blog selection), everyone had an opinion on what it meant, underscoring the importance that observers are attaching to the challenge to Office presented by ODF.  A representative sampling appears below.

A Game of Zendo
Rob Weir
An Antic Disposition (blog) July 18, 2006 ...There is a game called Zendo, where a player, called the Master, forms in his mind a secret rule which governs the selection and arrangement of objects (often small colored blocks or shapes). Arrangements which conform to the secret rule are said to have 'Buddha nature". The other players take turns trying to select and arrange their own blocks to conform to what they think the secret rule is, to which the Master will acknowledge success or failure....Microsoft is playing Zendo with the Office XML specification. The Master has formed a secret rule. He calls it, "backwards compatibility with billions of office documents". But since the file format documentation for the proprietary legacy binary formats has not been made public, the rule might as well just been called "Buddha nature"….Full Story


Microsoft-XenSource: Choosing a Side of the Fence
David Marshall
InfoWorld.com July 27, 2006 When Microsoft and XenSource made the announcement that the two companies would be cooperating on the development of technology that would provide interoperability between Xen-enabled Linux and the new Microsoft Windows hypervisor technology-based Windows Server virtualization, people immediately took to the Internet to try and describe their feelings as to what just took place. In reading many of these articles, I could almost imagine the scene in Empire Strikes Back when Darth Vader asks Luke to join the dark side. Was the announcement really that bad? Did it cause a disturbance in the force? ...Full Story


OpenDocument skirmish ends in truce
Jeremy Kirk
PCAdvisor July 15, 2006 Microsoft's decision to allow its Office software to handle the increasingly popular ODF (OpenDocument Format) was a belated acknowledgement that the company could lose customers if it didn't, analysts said this week....Jean Paoli, Microsoft's general manager of interoperability and XML architecture, said in an interview that Open XML is backed by 4,000 pages of documentation of features, while ODF has around 700 pages. "We really believe that Open XML is the full-featured format," Paoli said. But Lachal disputed the comparison, saying ODF is a developed technology. The two formats can't be compared by the length of their documentation. "That's nonsense," he said. ...Full Story


More details on the Microsoft Office ODF translator
Brad Grimes and Joab Jackson
GCN.com July 12, 2006 Last week, Microsoft Corp., of Redmond, Wash., announced the Open XML Translator project, an effort to build a series of plug-ins to convert Microsoft Office documents to the Open Document Format. And, as with any debate as spirited as the one between Microsoft and the ODF community, the story was distorted in all sorts of ways by an overeager press. ...Full Story


Microsoft's Office software 'translator' praised, faintly
W. David Gardner
ITNews.com Australia July 8, 2006 Microsoft's decision to offer free translation software to enable its Office software to operate easily with OpenDocument Format (ODF) software was greeted Thursday with elation in some quarters, but with cautious optimism in other quarters. Hailing the move was Melanie Wyne, executive director of the Initiative For Software Choice (ISC), who said: "We continue to prefer these developments to heavy-handed, and often clumsy government regulation."...Andrew Updegrove, editor of consortiuminfo.org and an ODF supporter, had faint praise for the Microsoft move while noting that there wasn't really much new to the announcement because Microsoft's chief software architect Ray Ozzie had said Microsoft was working on a translator - also variously called a converter and a plug-in - last fall. Updegrove, a partner in the Boston law firm of Gesmer Updegrove, called Microsoft's action a "concession (that) clearly makes it easier for governments and other users to feel safe in making the switch from Office to ODF-supporting software, since Microsoft will be collaborating to make document exchanges smooth and effortless." ...Full Story


MS: OK. OK, we'll set up an "OS" project to build an ODF killer. Er, we mean translator
Pamela Jones
Groklaw.net July 8, 2006 Now that others have built a translator for ODF/Open XML interoperability after the Commonwealth of Massachusetts put out a call for one, Microsoft announces it would like to sponsor an "Open Source" project to build one of its own. What need is this filling? I'd say Microsoft's need to stay in the game. Can there be any other reason to duplicate work that has already been done? ...Full Story


Some countries aren't ready for [a serious ODF] discussion. For example, ones that are currently going through elections or a war  [July 20, 2006]

 

Bob Sutor, IBM's VP for Standards and Open Source, on countries on (and not on) the "hot 100" 2006 ODF adoption list...Full Story

   


At last - some tea leaves to read: It's been months since Google swallowed up Writely, an intriguing Web-based word processing package that supports ODF. That could be a very significant development for ODF, except that the typically gnomic Google gave no signals as to whether it was going to take advantage of Writely's support of ODF, ignore it, or even decommit from that support. Now, as reported in the following piece, Google is going public with its support for ODF, although the depth or direction of that support remains unknown.  Meanwhile, as indicated in the second item below, support for ODF continues to grow globally.

Google joins ODF lobby to turn up heat on Microsoft
Antony Savvas
ComputerWeekly.com July 13, 2006 Google has put its considerable lobbying and financial weight behind the Washington-based ODF Alliance, as the battle lines between Google and Microsoft draw closer....The support for the ODF Alliance is a natural fit for Google, however, as it already distributes the OpenOffice.org open source productivity suite to users. This suite integrates with the ODF standard. Google also recently acquired the Writely on-line word-processing program, which allows multiple users to create and edit the same document in real-time through their web browsers. This program works to the ODF standard too. ...Full Story


OpenDocument camp in full-court press with '100 or so' countries?
David Berlind
ZDNet July 20, 2006 Late yesterday, IBM's vice president of standards and open source Bob Sutor published a blog that points to Malaysia's potential adoption of the OpenDocument (ODF) file format....In terms of a checklist of nations, while Sutor said there's no formal list, it's clear that IBM and others have prioritized the countries that are "more likely to adopt ODF next" or ones that appear ready to "fundamentally revise their IT strategies around open standards." Sutor mentioned Thailand and Japan as two countries that it became much easier to have discussions with once ODF was ratified as an international standard by the International Organisation of Standardization (a process that I've found to be dubious at best). ...Full Story


Open Standards/Open Source

I wonder what Peter Quinn would say? It's only seven months since he resigned as CIO of Massachusetts, besieged by opponents of open source software and open standards. Since then, there's been a great deal of progress made in providing cover for government employees (including state CIOs) who would like to take leadership positions on upgrading government IT infrastructure in the same ways that their corporate brethren are advancing their own systems. One such advancement was the creation of the ODF Alliance, which was formed to educate legislators and other public officials on the value of adopting open formats. Another is simply the greater visibility and credibility of open source and open standards, in part through public debate of ODF, but more because of the sheer momentum and prominence of the move towards open systems generally. The first article below provides an update on that trend, while the second reports on an endorsement by the U.S. Department of Defense of open source and open standards.

Report: Government Entities Quick to Adopt Open Source
Jay Lyman
TechNewsWorld/LinuxInsider.com July 27, 2006 Governments are now more willing to disclose and discuss their open technology plans and products because they are less fearful of agency infighting or pressure from vendors of proprietary software, said Open Source and Industry Alliance Director of Public Policy Will Rodger..."There's no question open source, and open technologies in general, are really taking off like wildfire in the government sector right now," he said. "It's doing so in large part because of what's become a critical mass of activity over the last five years." ...Full Story


DoD report recommends move to open software and standards
Matthew Aslett
Computer Business Review Online July 13, 2006
 The use of open source and open standards at the US Department of Defense is in the national interest and the interest of national security, according to a report from the DoD's Advanced Systems and Concepts office. The 79-page report recommended the adoption of open source software and development methodologies, as well as open standards, in order to make the most efficient use of internal resources through collaboration and code sharing....While it does not mandate the use of open source licensed software, the OTD approach does recommend that wherever possible use should be made of existing open source code, rather than funding the development of proprietary alternatives. ...Full Story


The open divide:  One of the great unresolved debates between the open standards and the open source communities involves agreeing upon a common definition of what "open" means.  It's not just a theoretical question, because at issue is whether an "open standard" should automatically mean a standard that can be implemented in open source software – and many current standards cannot.  Now, the Open Source Initiative, the non-profit organization that determines whether a given license does or does not meet the requirements laid out by it to qualify as an "open source license," has weighed in with its own definition, and an offer to use its trademarks to indicate whether a standard does or does not meet that definition.

Open Source Initiative gets into the game of defining 'open'
David Berlind
ZDNet/Between the Lines July 29, 2006 Ever since the Commonwealth of Massachusetts' definition of an open standard was thrust into the spotlight as the state's IT department looked to establish the OpenDocument Format as the standard file format for electronically saving and retrieving state documents, the definition of what it means for something (a standard, source code, etc.) to be "open" has been a hot topic....Now, with very few neutral "institutions" in the role of defining open standards, the Open Source Initiative appears as though its stepping up to the plate. ...Full Story


Open Source

People are simply realizing they can share information without the fear of retribution they felt a few years ago  [July 27, 2006]

 

Will Rodger, Director of Public Policy, Open Source and Industry Alliance...Full Story

   


An uneasy alliance:  It has always seemed to me to be somewhat remarkable that the open source community and major IT corporations have been able to coexist, and even cooperate, as effectively as they have to date.  Still, there are differences in goals that manifest themselves on a regular basis, sometimes causing tension or abandonment of a project.  The following two articles focus on the intersection of major IT corporations with open source communities.  In the first case, LinuxToday editor Brian Proffit gives his thoughts on lessons to be learned from the discontinuance of an Apple-sponsored project, while the second reports on Microsoft's tentative steps towards greater involvement in open source software – a move that is viewed with skepticism by some in the open source community.

Editor's Note: Beware of Suits Bearing Code
Brian Proffit
LinuxToday.com July 29, 2006 While all the hoopla was taking place out in Portland at the Eight Annual O'Reilly Open Source Convention (OSCON) this week, some of us noted the ever-so-quiet death of what should have been a vibrant open source project: OpenDarwin....This eventually comes back to Linux, because it raises an object lesson for all of these companies who open their code and invite community participation....for those few companies who are looking to make a quick buck with some cheap help, think again. The Linux community notoriously does not suffer fools and doesn't fall for a lot of marketing hype. They talk, they react, and they are vocal. Don't follow Apple's example. Be a real community member, and reap the benefits of what you sow. ...Full Story


Microsoft executive lauds open source
Paul Krill
InfoWorld July 21, 2006 Microsoft is not viewed as an open source proponent, but a key executive said Wednesday the company recognized its benefit and was becoming more open itself. David Kaefer, director of Business Development, Intellectual Property and Licensing at Microsoft, said open source had bolstered innovation in a distributed fashion, and he called the open source software movement a "very powerful force in the industry." ...Full Story


Europe

Open standards (like Open Document Format) and the use of free software contribute to the independence, quality and effectiveness of public agencies and local communities. Developments funded by public authorities for their own needs should, as a general rule, be free  [July 6, 2006]

 

French Presidential candidate Ségolène Royal and Richard Stallman, in a joint statement on all things open...Full Story

   


Whither Belgium? The news that Belgium has come over to the ODF camp is certainly significant, but as further facts emerge, it appears that we may be headed for a rerun of the Massachusetts experience. The Belgian plan calls for a trial of ODF as a first step, followed by full implementation the following year, if all goes well. But, as Matthew Broersma points out in the article below, "The decision leaves plenty of room for the Belgian government to change its mind....[and, citing an ODF advocate] it remains to be seen what will come of the decision, since many significant details remain to be worked out, and because the move is likely to meet heavy opposition behind the scenes."

Belgium adopts OpenDocument
Matthew Broersma
Techworld.com July 4, 2006 Belgium may become the first national government to mandate the use of the Open Document Format (ODF), with a full-scale trial to begin next year....Every federal government department must be able to read ODF documents beginning in September 2007, and ODF will become the standard for external document exchange a year later, if analysis by Fedict, the Belgian e-government service, shows the trial to have been a success. ...Full Story


I said "No!" and I mean "No!"  In news reminiscent of an old Saturday Night Live news skit about a dictator who had long insisted on rallying each time he seemed to be on the verge of death ("Francisco Franco Still Dead at 82"), the European Commission said "no" to permitting software patents in Europe.  If you think this sounds familiar, it's because it is familiar: permitting the patenting of software has been brought up over and over in the EU, and voted down in various venues as well.  One suspects that this story will not be the last.

Europe: No patents for software
Ingrid Marson
CNET.News.com June 28, 2006 The Commission said last week that computer programs will be excluded from patentability in the upcoming Community Patent legislation and that the European Patent Office will be bound by this law. "The EPO would...apply and be bound by a new unitary Community law with respect to Community patents," the Commission said in a statement. "The draft Community Patent regulation confirms in its Article 28.1(a) that patents granted for a subject matter (such as computer programs), which is excluded from patentability pursuant to Article 52 EPC, may be invalidated in a relevant court proceeding." ...Full Story


Security

Preserving one's identity:  Times were better when that concern was more existential than pragmatic.  Sadly, that was then, and this is the age of information technology – a blessing in many ways, but a curse when it comes to proving who you are (see first item below) as well as who you aren't (second item following).

SafeNet Announces Formation of Consortium to Support HSPD-12 Needs of Government Agencies and System Integrators
Press Release
TMCNet/BusinessWire July 6, 2006 SafeNet, Inc. today announced the formation of the HSPD-12 Interoperability Consortium. This is a partnering of industry leading vendors committed to providing an interoperable solution to government agencies and system integrators which addresses the challenges and opportunities of complying with the White House issued Homeland Security Presidential Directive (HSPD) 12 - Policy for a Common Identification Standard for Federal Employees and Contractors. ...Full Story


New Standards Panel to Coordinate Identity Theft Protection Standards Activities
ANSI.org July 2, 2006 The American National Standards Institute (ANSI), in partnership with the Council of Better Business Bureaus (CBBB), has announced its intent to establish a new standards panel to address identity theft prevention and identity management standards. Once formed, the Identity Theft Prevention and Identity Management Standards Panel would serve as a cross-sector coordinating body, applying use-case practices to promote and harmonize the timely development of voluntary consensus standards to minimize the scope and scale of identity theft. ...Full Story


Semantic and NextGen Web

Can you afford to ignore site semantics? Not if you intend to be around tomorrow  [July 18, 2006]

 

Web designer Frederick Townes, writing at Searchnewz.com...Full Story

   


A rare Semantic Web update: It's been five years since Tim Berners-Lee and two co-authors gave a definitive and comprehensive description of his vision for the Semantic Web in the pages of Scientific American. In May of this year, Berners-Lee (this time with co-authors Nigel Shadbolt and Wendy Hall, of the University of Southampton) contributed an update on the future of the Semantic Web to a special issue of IEEE Intelligent Systems on the state of artificial intelligence development. Until last week, the article was available only on a paid basis, but it has now been posted on an open Website. Most of the abstract appears below, along with a link to the full text. For a very detailed interview with Berners-Lee on the future of the Semantic Web, see the June 2006 issue of the Consortium Standards Bulletin.  And for an indication of whether the Semantic Web is picking up support, see the second article below.


The Semantic Web Revisited
Nigel Shadbolt, Wendy Hall, Tim Berners-Lee
IEEE Intelligent Systems July 26, 2006 The original Scientific American article on the Semantic Web appeared in 2001. It described the evolution of a Web that consisted largely of documents for humans to read to one that included data and information for computers to manipulate. The Semantic Web is a Web of actionable information—information derived from data through a semantic theory for interpreting the symbols. This simple idea, however, remains largely unrealized. Shopbots and auction bots abound on the Web, but these are essentially handcrafted for particular tasks; they have little ability to interact with heterogeneous data and information types. Because we haven't yet delivered large-scale, agent-based mediation, some commentators argue that the Semantic Web has failed to deliver. We argue that agents can only flourish when standards are well established and that the Web standards for expressing shared meaning have progressed steadily over the past five years.... ...Full Story


SEO And Semantics: Index Content Not Keywords
Frederick Townes
Searchnewz.com July 18, 2006 Despite the awesome drawing power of the www and its ability to sell products and broadcast messages, as site designers and owners, we've only begun to harness the true power of a fully-compliant semantic web....If your site designer isn't up to speed on w3 semantics, find another designer. Yes, new and improved semantics tools are in development but the big SEs are constantly improving their ability to match context and relationships instead of simply identifying literal matches of character strings. And, if your site isn't semantically optimized, it's also not optimized for conversions, so you're missing the best SEOpportunity to come along since the advent of search engines. ...Full Story


Wireless

Profiles and Gap Fillers: Although it may seem counterintuitive, with convergence comes not completeness, but gaps. The reason is that when devices and applications lived in silos, one or a few SSOs could identify and create all of the standards needed in a reasonably coordinated fashion. But with convergence it becomes essential to try and fit all of these pieces together, and why would that be easy, if each piece had been developed in a silo? What those who need the standards are doing in some cases is to select and compile lists of standards ("Profiles") from various SSOs that can work together to do specific tasks ("Use Cases"), and in other cases to assess the available standards and then create new ones to fill the gaps, in order to address other use cases (a toy sketch of this profile-and-gap approach appears below, after the press release item). The following press release extract (no link yet) describes the release of a "gap filler" consortium's first specification, intended to enable the Home Gateway. For more on the promise and requirements of the digital home, see the February 2006 issue of the CSB, on The Emergence of the Digital Home.

Home Gateway Initiative release 1.0 specifications published
Press Release
Home Gateway Initiative July 6, 2006 The Home Gateway Initiative is pleased to announce the official approval and release of its set of specifications for release 1.0. The document encompasses the complete architecture of a broadband Home Gateway (access network agnostic) supporting triple play services from an end to end perspective with an emphasis on the Home Gateway hardware architecture, the IP connectivity, the Quality of Service control, the remote management capabilities, the home and the access network interfaces, the service support capabilities and the security mechanisms. [The specification can be viewed from the HGI home page] ...Full Story
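To make the Profile-and-gap idea introduced above a bit more concrete, here is a toy example in Python.  The use case, the capability labels and the pairings of standards to capabilities are loose illustrations of the general technique, not an actual Home Gateway profile (the real HGI specification is, of course, far more detailed).

# Hypothetical sketch of the "Profile" and "gap filler" idea described above.
# The use case, capabilities and standard-to-capability pairings are illustrative only.

# A use case is expressed as the set of capabilities that must be covered end to end.
home_gateway_use_case = {
    "wireless LAN", "broadband access", "quality of service",
    "remote management", "device discovery", "media streaming",
}

# A profile compiles standards from several SSOs, each covering some capabilities.
profile = {
    "IEEE 802.11g": {"wireless LAN"},
    "DSL Forum TR-069": {"remote management"},
    "UPnP Device Architecture": {"device discovery"},
    "DLNA Guidelines": {"media streaming"},
}

covered = set().union(*profile.values())
gaps = home_gateway_use_case - covered

print("Capabilities covered by the profile:", sorted(covered))
print("Gaps a new specification must fill: ", sorted(gaps))
# -> Gaps: ['broadband access', 'quality of service']

The value of the exercise is the last line: whatever capabilities the compiled standards do not cover are exactly what a "gap filler" specification has to address.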


Here, there and everywhere:  The drive to connect everything with everything continues apace, with more initiatives being launched all the time to achieve that goal through the use of standards-based technologies best suited to each discrete need.  The following three articles focus on the (sometimes twisting) roads being traveled to satisfy three such needs, each in a highly competitive market setting: the first covers the competing efforts to launch two rival short-range standards, following the failure of an IEEE working group to reach consensus on a single way forward, while the second provides an update on Intel's efforts to ensure that the long-distance WiMax wireless standard achieves broad deployment in competition with other long-distance alternatives.  The third and final article focuses on yet a third technology, and on yet another competitive setting.

UWB and Bluetooth roadmaps gain clarity
Chris Everett
Wireless Asia July 7, 2006 Bang! Last January with the IEEE's 802.15.3a task group members voting to get out of the high data rate UWB standards business, the two special interest groups - the UWB Forum and the WiMedia Alliance - were off to the races. The prize being market acceptance of a single radio standard for high data-rate, wireless, personal area networks. Winner takes all. ...Full Story


In Depth: Intel's Chip Plans Give WiMax A Mighty Push Forward
Elena Malykhina and J. Nicholas Hoover
InformationWeek July 5, 2006 We all want the same thing when it comes to a wireless Internet connection: coverage everywhere, superfast speeds, not too pricey. What we don't know is when we're going to get it. On that front, Intel last week pushed the zoom-ahead button, disclosing plans to deliver by year's end a new chipset called Rosedale 2 that should make it easier to access WiMax from mobile computers. By early next year, Intel predicts some PC makers will be building those chips into laptops, and that may be the jump start the industry needs. ...Full Story


Government support for open source wireless mesh
Dana Blankenhorn
ZDNet Blogs/Open Source July 25, 2006 The National Science Foundation has announced a grant to the Champaign-Urbana Community Wireless Network (CUCWN) for help in developing open source standards in wireless mesh. The area is home to the University of Illinois (go Illini) and I hope it's not a coincidence that UI is also where the Mosaic browser was developed. UI is a partner in the grant....While most open source projects target software companies like Microsoft or Oracle, the game in this case is much bigger — the Bells and cable operators. A wireless mesh, based on open standards, could reach competitive fiber and easily bypass these gatekeepers, guaranteeing competitive broadband. ...Full Story


Who's Doing What to Whom

Here we go again  [July 26, 2006]

 

eMarketer Senior Analyst James Belcher on the long-awaited (and by some long-dreaded) introduction of rival, next-generation format DVD players...Full Story


At last - now you can be taken for a ride! For years, the rival HD-DVD and Blu-Ray next-gen DVD format camps have been at war, forming their alliances, wooing analysts and building their prototype units. Now, they're finally ready to try to persuade you - yes, you - that it's time to roll the dice on one format or the other, in a replay of the Betamax-VHS crapshoot. By the laws of the marketplace, it's 99% certain that only one of the two formats will be around in the long run. All of which makes you wonder - just who is it that's going to fork over US$3,450 for one of these first, overpriced machines, thereby maximizing their chance of calling it wrong?

Surprise! Another delay on the next-gen optical front
Eric Bangeman
ArsTechnica.com July 17, 2006 The latest in what has become a series of delays in rolling out next-generation optical devices has hit Toshiba's high-definition HD DVD recorder....Expect to drop some serious yen for such a device: Toshiba's MSRP for the RD-A1 is ¥398,000 (currently just under US$3,450).... While we wait for the RD-A1 to hit North American shores, we can be on the lookout for a marketing blitz (PDF) from the North American HD DVD Promotion Group. The new consortium, funded by HD DVD backers Microsoft, Toshiba, HP, and Intel, has a US$150 million advertising and marketing budget, and they're ready to start spending. ...Full Story


Blu-ray and HD-DVD: Only One Winner? Or Two Losers?
James Belcher
eMarketing.com July 26, 2006 ...So neither format is likely to trounce the other. The bigger concern is permanent niche status. If both formats are pricey, or combo players fail to take off, they are likely to end up like laserdisc. This high-end alternative to VHS never reached a mass adoption thanks to high pricing and a feeling by most consumers that VHS was good enough. Should Blu-ray and HD-DVD follow the same path, consumers may just decide that DVD is good enough. ...Full Story


Standards and Society

This isn't a far-fetched suggestion  [July 14, 2006]

 

Bob Glushko, proposing the "Terrorist Target Markup Language" in the wake of the release of the phone-book length National Asset Database...Full Story

   


The universal standard: If ever there was a standard for everyone, it would certainly have to be XML. This versatile tool has been adapted for use in connection with everything from human resources data, to financial reporting information, to sports statistics, to advertising copy, to, oh yes, transcript data, as reported below in this press release from the Postsecondary Electronic Standards Council.  And if that doesn't persuade you, how about a Terrorist Target Markup Language (second item below), or, yes, Virginia, there may soon even be an "Emotion Annotation and Representation Language," as highlighted in the last item below.

XML High School Transcript Standard Released
Press Release
xmlCoverPages.org July 10, 2006 The Board of Directors and Steering Committee of the Postsecondary Electronic Standards Council (PESC) are pleased to announce the release of the XML High School Transcript Standard as a PESC Member-Approved National Education Community Standard. Version 1.0 and all supporting documentation are available at www.PESC.org. This effort marks a significant milestone and achievement for the education community, the Standardization of Postsecondary Education Electronic Data Exchange (SPEEDE) Committee of the American Association of Collegiate Registrars and Admissions Officers (AACRA), and for PESC. ...Full Story


Needed: Terrorist Target Markup Language
Bob Glushko
Doc or Die (blog) July 14, 2006 The Office of Inspector General for the US Department of Homeland Security has just issued a scathing criticism of the National Asset Database....But suppose that the DHS had encoded these narrative specifications in an XML vocabulary called "Terrorist Target Markup Language" and required all asset submissions to conform to it. TTML would have made it possible to detect most of these problems immediately when they were submitted, and the standard organization and format of the data would have enabled additional data mining to detect anomalous information. This isn't a far-fetched suggestion. There are numerous XML standards activities underway in the homeland security domain, including biometric data exchange, common alerting protocols, and emergency response. ...Full Story


An XML Language for Emotions?
Andy Updegrove
The Standards Blog July 27, 2006 The W3C announced the launch of an intriguing new "Incubator Activity" earlier this week that should test the limits to which XML, the lingua franca of all things IT, can be put.  The new initiative is called the "Emotion Incubator Group," and its purpose is to take us beyond the narrow range of the emoticon.  According to the group's Charter:  The mission of the Emotion Incubator Group, part of the Incubator Activity, is to investigate the prospects of defining a general-purpose Emotion annotation and representation language, which should be usable in a large variety of technological contexts where emotions need to be represented….Full Story


Standards relativity: In January, I wrote a piece on the supersizing of Americans called Body Type Standards, Crash Test Dummies, and Sleeping with Big Agnes (sorry, you'll have to read it to find out who Big Agnes is), and noted that the rag trade has seen fit to play fast and loose with clothes size "standards" to pander to the vanity of customers in denial. It appears that the syndrome is not unique to the northern hemisphere, as indicated by the following article on clothes sizes Down Under. Luckily, SCALE is coming to the rescue: the Sizing Consortium of Australia Landmark Evaluation, a consortium made up of standards groups, retailers' associations, and others. Just as I noted in my essay, the hope is that the remeasuring of the Aussies will have safety design benefits as well as fashion benefits. And, interestingly enough, it turns out that the two nations have more in common than simply expanding waistlines and a fondness for suds: the existing (and out-of-date) Australian standard is based in part on a 1950s U.S. Department of Commerce standard.

Does my bum look big in this? Yes and no
Rachel Wells
TheAge.com.au July 5, 2006 What woman doesn't dream of squeezing into a smaller dress size? Now, it seems you can do it without even shedding a kilo. An increasing number of retailers and fashion designers have been accused of making their clothes sizes "more generous" in a bid to deceive customers into believing they are smaller than they really are. ...Full Story


We need to talk:  As the world becomes smaller, the need to exchange data becomes more critical.  Often, that data is domain-specific, both as to its content and as to the circumstances in which it is created and the stakeholders that need to create, share and protect it.  The following two articles provide examples of how standards help address this need.

IUCN and OASIS join forces to develop open standards for conservation
OASIS.org July 5, 2006 Gland, Switzerland and Boston, MA, USA - The World Conservation Union (IUCN) and OASIS, the Organization for the Advancement of Structured Information Standards, formalized an agreement of joint activity to define and promote international standardization within the conservation sector. This new alliance will result in the creation of open standards for the electronic exchange of conservation data. ...Full Story


Panel Recommends Initial Standards to Support Nationwide Health Information Network
ANSI.org July 3, 2006 The Healthcare Information Technology Standards Panel (HITSP) has identified for the U.S. Department of Health and Human Services an initial set of standards to facilitate the secure exchange of patient data in a new nationwide health information network (NHIN) for the United States. President George W. Bush called for development of the NHIN by 2014. ...Full Story


WSIS and Internet Governance

Countdown to September 30: As I've noted before, there has been strangely little press attention so far to the impending expiration of ICANN's MOU with the US Department of Commerce, which has just over two months left to run. Not long ago, the US National Telecommunications and Information Administration (NTIA), the DOC agency that directly supervises ICANN, asked for public input on how well ICANN has been fulfilling its mission. The result of the NTIA's review could, at one extreme, be to conclude that ICANN is ready (as envisioned at the time that the MOU was signed) to become completely independent of US supervision, as demanded by many countries around the world, and, at the other, to take the root directories away from ICANN entirely and authorize another entity to be their steward. The first item below is one non-profit's evaluation of ICANN, with links to its related submission to the NTIA, and together the three articles below provide three different viewpoints on the issues at hand.  For more on the controversy, see the November 2005 issue of the Consortium Standards Bulletin.

Internet Governance Debate Poses Unique Global Challenges
Center for Democracy & Technology July 24, 2006 As the Internet becomes increasingly essential to politics, commerce and daily life, the debate over Internet governance has evolved from a niche discussion among technologists into a global controversy over who should set the rules for one of the world's most vital resources. As the US Government plots its path forward and reconsiders its special role in overseeing the Internet's addressing system, it is important to determine how the shifting global environment is likely to affect the outcome of any US decision about the future of Internet governance.
(1) Internet Governance Debate Poses Unique Global Challenges
(2) ICANN Has Made Progress, but Falls Short of Goals
(3) New Milestones for ICANN Autonomy May be Needed
...Full Story


UN Economic and Social Council asked to guide information technology issues
UN News Centre July 16, 2006 The United Nations should continue to play a leading role in expanding information and communications technologies to promote development, participants have told the world body’s Economic and Social Council (ECOSOC), currently meeting in Geneva. The calls during Monday’s session came as delegates discussed follow-up to the World Summit on the Information Society (WSIS), which was held in 2003 and 2005 and produced a global strategy to harness the power of the Internet and information and communications technologies in the fight against poverty. ...Full Story


Swiss ponder future role of the internet
Matthew Allen
SwissInfo.org July 14, 2006 Swiss website experts have outlined proposals to present to the UN-sponsored Internet Governance Forum (IGF) that will discuss the future shape of the medium. Some observers fear the forum lacks the teeth to make any meaningful changes or that it will be hijacked by politically motivated groups when it meets for the first time in October. ...Full Story


Legislation and Advocacy

[A]n Internet was sent by my staff at 10 o'clock in the morning on Friday and I got it yesterday. Why? Because it got tangled up with all these things going on the Internet commercially  [July 8, 2006]

 

Senator Ted Stevens (R-AK), during the Senate debate on "Net Neutrality." The bill lost by a tie vote...Full Story

   


Dept. of what did he say (and what can you say?):  For those who care passionately about the Internet and what it can mean to, oh, you know, humanity, creativity and global equality, it was with guarded hope that Americans watched their senators in action, debating a bill intended to guarantee "Net Neutrality," by which proponents mean that all sites would have access to equal bandwidth at equal cost. Absent such a bill, they fear, telcos will charge sites more to deliver high-bandwidth content (such as video) than plain-vanilla data, leading to commercial interests monopolizing such content. Sadly, the bill lost on a tie vote in the Senate Commerce, Science and Transportation Committee. One of those voting against the bill was Committee Chairman Ted Stevens (R-AK). Clearly, as the quote below indicates, the future of the Net was in good and knowledgeable hands. Needless to say, the bloggers of the world had a field day.

Sen. Ted Stevens Makes it all Clear to Us
Bloggers United
Google search: July 8, 2006 [The Internet] is not a big truck. It is a series of tubes. [A]n Internet was sent by my staff at 10 o'clock in the morning on Friday and I got it yesterday. Why? Because it got tangled up with all these things going on the Internet commercially ...Full Story


The Funnel

This month's crop:  As usual, I close with a sampling of the wide variety of new consortia, new initiatives, and new standards that were launched since the last issue of the CSB.  More, of course, were also noted above.

I. New Consortia

U.S., U.K. Tag RFID For Scrutiny, Regulation
Laurie Sullivan
TechWeb.com July 17, 2006 In the U.S., an RFID caucus of government and industry representatives was launched today, while on the other side of the Atlantic, 31 global organizations have formed an RFID consortium and secured more than $7.5 million in funding from an EU agency....In related RFID news, EPCglobal Inc. said the International Standards Organization (ISO) has incorporated into an ISO/IEC 18000-6 standard the specs for its ultra high frequency (UHF) Generation 2 protocol ...Full Story


HSPD-12 interoperability group launched
Press Release
SecurityDocumentWorld.com July 8, 2006 A new industry association aimed at complying with the Homeland Security Presidential Directive (HSPD) 12 policy for a common identification standard for federal employees and contractors has been launched in the US. Known as the HSPD-12 Interoperability Consortium, the group comprises nine founding information security companies. The consortium's goals are to provide a clear industry perspective to government agencies and system integrators; to offer an end-to-end interoperable HSPD-12 solution; to build a testing lab to demonstrate the preconfigured and pre-tested solution; and to ensure the solution is flexible enough to accommodate the future security needs of customers. ...Full Story


Consortium focuses on IC debug
Harry Yeates
ElectronicsWeekly.com July 7, 2006 An industry group aimed at developing standards to aid a ‘design-for-debug’ methodology had its inaugural meeting here at DAC. Members of the Design-for-Debug (DfD) Consortium are of the opinion that putting standards in place will enable people to use familiar debug techniques for silicon, which is currently approached in something of an ad hoc manner. ...Full Story


SafeNet Announces Formation of Consortium to Support HSPD-12 Needs of Government Agencies and System Integrators
Press Release
TMCNet/BusinessWire July 6, 2006 SafeNet, Inc. today announced the formation of the HSPD-12 Interoperability Consortium. This is a partnering of industry-leading vendors committed to providing an interoperable solution to government agencies and system integrators that addresses the challenges and opportunities of complying with the White House-issued Homeland Security Presidential Directive (HSPD) 12 - Policy for a Common Identification Standard for Federal Employees and Contractors. ...Full Story


II. New Initiatives

IEEE Forms Higher Speed Study Group to Explore the Next Generation of Ethernet Technology
Press Release
IEEE/Ethernet Alliance July 28, 2006 The Ethernet Alliance, an industry group dedicated to the continued success and expansion of Ethernet technology, today announced that the Institute of Electrical and Electronics Engineers (IEEE) 802.3 working group has formed the Higher Speed Study Group (HSSG) to evaluate the requirements for the next generation of Ethernet technology....Individuals who are interested in participating in the HSSG should contact John D’Ambrosia, components technology scientist at Force10 Networks, at jdambrosia@force10networks.com. ...Full Story


Firms partner on standard statistical analysis library format
EE Times July 28, 2006 SAN FRANCISCO — Six companies, including Cadence Design Systems Inc. and ARM Holdings plc, have banded together with the aim of accelerating the creation of a standard statistical analysis library format under the Open Modeling Coalition of the Silicon Integration Initiative (Si2). According to the group, which also includes Magma Design Automation Inc., Extreme DA, Virage Logic Corp. and Altos Design Automation, the open statistical library format will be based on Liberty and current source models, building on an Effective Current Source Modeling 2.1 standard announced by the Silicon Integration Initiative (Si2) Monday (July 24). ...Full Story


OMG, OCEG Announce Alliance for C-GRID Project
Press Release
GridToday.com July 8, 2006 The Object Management Group (OMG), a software consortium responsible for establishing distributed computing specifications, and ORCA, the OMG Regulatory Compliance Alliance, have announced a strategic alliance with The Open Compliance & Ethics Group (OCEG), a non-profit organization with a mission to help organizations align their governance, risk and compliance (GRC) management activities to drive business performance and promote integrity. ...Full Story




New Standards Panel to Coordinate Identity Theft Protection Standards Activities
ANSI.org July 2, 2006 The American National Standards Institute (ANSI), in partnership with the Council of Better Business Bureaus (CBBB), has announced its intent to establish a new standards panel to address identity theft prevention and identity management standards. Once formed, the Identity Theft Prevention and Identity Management Standards Panel would serve as a cross-sector coordinating body, applying use-case practices to promote and harmonize the timely development of voluntary consensus standards to minimize the scope and scale of identity theft. ...Full Story


III. New Standards

PCI Express: Ever-faster graphics pipe serves many masters
David L Fair
EDN.com July 24, 2006 The new PCI (Peripheral Component Interconnect) Express spec provides the biggest improvement in more than a decade in I/O performance for computation systems, significantly improving graphics in desktop PCs and workstations. Intel initially launched the spec in its chip sets in mid-2004, and the technology has become mainstream in high-end systems. But PCI Express is far more than an avenue to better games or video. As have many other PC innovations, PCI Express will enable significant applications, such as medical imaging, and serve in industrial control and many other embedded-system roles. ...Full Story


Unicode Releases Common Locale Data Repository, Version 1.4
Press Release
MarketWire.com July 19, 2006 MOUNTAIN VIEW, CA -- The Unicode® Consortium announced today the release of the new version of the Unicode Common Locale Data Repository (CLDR 1.4), providing key building blocks for software to support the world's languages. CLDR is by far the largest and most extensive standard repository of locale data. This data is used by a wide spectrum of companies for their software internationalization and localization: adapting software to the conventions of different languages for such common software tasks as formatting of dates, times, time zones, numbers, and currency values; sorting text; choosing languages or countries by name; and many others. ...Full Story
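For a concrete sense of what that locale data enables, here is a minimal Python sketch using the Babel library, which bundles CLDR data; the library choice, locales and sample values are my own illustrative assumptions rather than anything drawn from the announcement.

# Illustration only: Babel ("pip install Babel") ships CLDR data and is an
# assumption here, not something taken from the Unicode announcement above.
from datetime import date
from babel.dates import format_date
from babel.numbers import format_currency, format_decimal

sample_date = date(2006, 7, 31)
sample_number = 1234567.89

for locale in ("en_US", "de_DE", "ja_JP"):
    # The same values come out formatted per each locale's CLDR conventions.
    print(locale)
    print("  date:    ", format_date(sample_date, format="long", locale=locale))
    print("  number:  ", format_decimal(sample_number, locale=locale))
    print("  currency:", format_currency(sample_number, "EUR", locale=locale))

The point of the exercise is that none of the formatting rules live in the application: the date patterns, digit grouping and currency placement all come from the shared CLDR repository, which is why so many vendors contribute to and consume it.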


OMG Adopts Systems Modeling Language
Darryl K. Taft
eWeek.com July 10, 2006 The Object Management Group has announced the adoption of the OMG Systems Modeling Language as a standard. OMG and the International Council on Systems Engineering worked together to extend the OMG's UML (Unified Modeling Language) specification to come up with SysML,...a general-purpose graphical modeling language for specifying, analyzing, designing and verifying complex systems that may include hardware, software, information, personnel, procedures and facilities. As a subset of UML 2.0, SysML provides systems engineers with graphical representation and semantic foundation for system requirements. ...Full Story


 

 

 

 

 

 
