Consortiuminfo.org Consortium Standards Bulletin - November 2003
NOVEMBER 2003
Vol 2, No. 11

STANDARDS AND PATENT ISSUES

Editorial:  Do IT Patents Work?
From the days of VisiCalc until today, software -- and software patents -- have come a long way. The patent system itself, on the other hand, is still where it was before the PC was invented. It's time for a change.

News Cluster:  Patents: Too Easy to Get, Too Hard to Challenge?
When Eolas defeated Microsoft in a suit that involved HTML, even hard-core Microsoft critics found themselves rallying around their opponent. The W3C appealed for relief, and the PTO agreed to review the offending patent. Less noticed was the release of a major report by the Federal Trade Commission, in which the FTC recommends major patent reforms.

Standards Blog:  What a Difference a Decade Makes (Or Does It?)
In the last ten years, the IT world has made major strides moving from proprietary systems to open standards. But has this change to the competitive landscape actually reduced the ability of coalitions of companies to manipulate commercial outcomes?

The Rest of the News:  Standards Bring "Fatter" Fiber Optic Lines; Instant Messaging Finds Standards Support; News Giants Create Standards to Share On-Line Content; Denmark Joins the Open Source Camp; New Developments in the Right to Violate Copyrights to Achieve Interoperability; and much more.

Featured Meeting: In the wake of a string of election fiascos, NIST hosts a symposium on how new election technologies can build "trust and confidence in voting systems."


 



EDITORIAL:

Do IT Patents Work?

Andrew Updegrove


Ask anyone whether they are satisfied with the current state of patents in the American IT world, and you are not likely to find many fans of the status quo. And how surprising can that be, given that the authority to grant patents was written into the Constitution by the Founding Fathers, two centuries before the first software was ever sold?

Those who have some grey hair will be aware that the Patent and Trademark Office (PTO) has found the advent of high technology -- and particularly the unique nature of software -- challenging. The original spreadsheet program was launched, after all, at a time when software was not regarded as being patentable, which is in part why we are more familiar today with the brand name Excel than VisiCalc.

Once software was deemed to be worthy of patent protection, a backlog in filings built up as the PTO struggled to add staff that could understand software architecture. Many felt at that time that software patents were far too hard to obtain. Later, the long delays melted away, and the first allegations arose that software patents were being granted too liberally, amid a feeling that examiners were still ill-equipped to evaluate them.

As the pace of technology quickened, several other cracks became evident in the patent system as applied to software. The very concept of a government-granted monopoly for a finite time period seemed absurd when the period of exclusivity spanned 20 years. True, such a period remained sensible in the world of biotechnology, where drugs could have perpetual utility, and the R&D and approval process could take a decade. But in the world of software, obsolescence could occur before a patent was even granted.

Later, new trends emerged: the free software and open source movements brought an almost theological challenge to the concept of exclusive ownership rights, while the granting of "business methods" patents on processes that people found to be obvious outraged others. Eventually, the abuse of the standards process by some participants who failed to disclose patents was added to the list. Today, many feel simply that patents are too easily obtained, and too difficult to challenge.

Where can all this lead? Unfortunately, patents are a creation of government, and therefore inertia and process are serious issues. But even as new crises arise, there is some hope for progress.

In our November News Cluster, we focus on two current events: the well-publicized Eolas victory over Microsoft, followed by the W3C's successful quest for a review by the PTO, and the barely noticed release by the Federal Trade Commission of a comprehensive report suggesting meaningful reforms to the patent system. Whether the combination of the outcry over the Eolas patent and the FTC's good work will be sufficient to spark a change remains to be seen.

But those who have a serious stake in the game would be wise to seize upon this opportunity to speak out in favor of a serious effort to bring the patent system up to date. Perhaps the Founding Fathers would even be proud of us if we did.

Comments? Email:

Copyright 2003 Andrew Updegrove



NOVEMBER NEWS CLUSTER:

PATENTS: TOO EASY TO GET, TOO HARD TO CHALLENGE?

Andrew Updegrove

 

What Price Owners' Rights in a Modern World? For some time there has been unhappiness over the ready availability of IT patent protection. Many have thought that the Patent and Trademark Office (PTO) grants patents too easily, especially in the area of software. At the same time, those unlucky enough to become embroiled in patent infringement litigation typically face legal costs of over $1 million (and several times that, in the case of biotech patent actions). That's bad enough for someone knowingly challenging a patent, but truly punishing for someone surprised to find herself on the receiving end of an infringement suit.

Opinions on this subject have become more emotional in light of several other factors. First, it hasn't helped that many seemingly elementary (or, in patent terms, "obvious") patents have been granted in the area of "business methods" involving ecommerce. Second, the Free Software and Open Source movements are championing fee-free accessibility to important software, such as operating systems (e.g., Linux) and core applications (e.g., Open Office). And, in fact, the open source model has broken out of the realm of true believers, and is gaining significant traction in the commercial marketplace. Finally, in the ever more connected world in which we live, it is becoming increasingly difficult to develop standards that can be implemented without incurring the obligation to pay royalties or other fees. All in all, there are more people feeling hostile to IT patent rights now than ever before.

But to return to the question of whether patents are too easy to obtain: In this author's experience, it is almost unheard of for a patent asserted against a standard not to be viewed by engineers (at least) as having been conclusively anticipated by one or more pre-existing discoveries, research papers or patent claims (so-called "prior art", under patent law). Since inventions that have been "anticipated by prior art" are not entitled to patent protection, there is an immediate expression of outrage in such situations.

In truth, a given example of prior art can have a different significance under the complex legal analysis of patent law than it does for an engineer from a technical point of view. But nonetheless, more and more people have come to believe that something is out of balance, and needs to be corrected. A number of developments occurring this fall highlight the situation -- and hold out some hope of reform.

Eolas v. Microsoft: Compton's Redux. The first development is the ongoing outcry over the patent litigation victory of a one-man software company called Eolas over industry behemoth Microsoft. Recently, Eolas won a $521 million judgment against Microsoft. The suit involves a patent that describes a system that launches an application within a Web page. The verdict upset a far wider audience than Microsoft, however, since the patent would represent the first royalty cloud to shadow an important element of normal Web usage. In response, the W3C issued strong protests, as well as an appeal to the Patent and Trademark Office seeking a reexamination of the Eolas patent. Not long afterwards, the PTO granted the request. Hearings are now pending.

For those with long memories, all of this will have a familiar ring. It was at Comdex in 1993 that a company called Compton's New Media made an announcement that landed like a bombshell on the nascent multimedia industry. At the show, Compton's revealed that it had just been granted a patent that would allow it to levy a royalty on a broad swath of multimedia applications, including those based on CDs, which were then coming into their own.

At that time, multimedia was being hailed as the Next Big Thing, and with modem speeds under 2500 baud, this new technology was wholly dependent on CDs for content delivery and access. As a result, the industry besieged the PTO, demanding a review of the newly awarded patent, and citing prior art to support its case. The PTO agreed to conduct a review, and ultimately the patent was rescinded after public hearings were held. Many were left with a somewhat queasy feeling that the PTO had caved to public pressure (could the patent, in fact, have been valid?). Smarting from the outcome, the PTO vowed to reform the system, in order to prevent future such fiascos.

Would a Rescission of the Eolas Patent be Good News or Bad? How should we feel about the rapid response of the PTO to the W3C's call for review of the Eolas patent? The good news for Web surfers is that a royalty-free resolution to the issue may be on the horizon. But at the same time, is this any way to run a patent system? After all, Microsoft (no slouch in the legal department) failed to successfully challenge the patent in court. So will justice be served by the PTO intervening? And would those involved in a dispute over a less public patent have been able to obtain a review by the PTO?

Of course, if the PTO rescinds the patent, the legal expense and the uncertainty that the case has unleashed on the marketplace will have been considerable. Many companies and the W3C will have experienced a great diversion of resources and distraction from other business as a result of what would prove to have been an inappropriately granted patent. In that light, a rescission by the PTO can scarcely be considered to be unmitigated good news.

The FTC Weighs In. While the Eolas litigation was playing out, a different process was unfolding in Washington, D.C. During the first half of 2002, the Federal Trade Commission (FTC) and Department of Justice (DOJ) held extensive joint hearings on antitrust and other aspects of intellectual property, including the patent system and the process of standard setting. The hearings extended over 24 days, during which more than 300 commercial, legal, technical and academic experts (including this author) gave testimony.

Now, the first of two reports has been issued by the FTC (the FTC and the DOJ plan to also issue a joint report on antitrust issues). The FTC report focuses almost exclusively on the reform of the patent system, and is entitled "To Promote Innovation: The Proper Balance of Competition and Patent Law and Policy". Fortunately for those with a less than avid appetite for the subject, the 315-page report is accompanied by a more accessible, 18-page Executive Summary. Included in the FTC's findings are conclusions such as this: "Questionable patents are a significant competitive concern and can harm innovation."

In its report, the FTC offers a broad range of specific recommendations for reform. Several are dramatic, and are based upon systemic flaws that the FTC finds in the PTO. For example, the report notes that examiners on average spend only 8 to 25 hours on each patent application, thereby limiting the amount of time an examiner can spend seeking and evaluating prior art. This should not be a surprise, given that the PTO receives over 1,000 new applications a day, and is under the same budget pressures as any other Federal agency. Further stacking the deck in favor of patent approval, however, is the fact that court cases have held that examiners are to look for reasons why a patent should not issue, rather than establishing the qualifications that would entitle an application to be approved at all (in other words, an application is presumed to be valid unless a reason can be found by the examiner to the contrary).

Of course, even with more time and different standards, mistakes would still be made by the PTO. The FTC accordingly based some of its recommendations on the recognition that "The PTO works under a number of disadvantages that can impede its ability to reduce the issuance of questionable patents." As a result, the FTC also focused on the question of how difficult it should be to challenge a patent once it has been issued.

While some of the FTC's recommendations (15 in all, counting "sub-recommendations") involve finer points of the patent process, some are fundamental, including the following:

  • That legislation be enacted creating a new administrative procedure to allow post-grant review of, and opposition to, patents.
  • That the legal standard required to overturn a patent be reduced from "clear and convincing evidence" to a "preponderance of the evidence" (from a legal point of view, this reduction in the burden of proof is significant, and would make it easier to defend against infringement charges when a dubious patent is asserted).
  • That the legal standards used to evaluate whether a patent is "obvious" be tightened.
  • That the PTO's funding be increased.
  • That patent examiners be given greater authority to query applicants more extensively, including as regards prior art.

What Next? Of course, most of the reforms recommended by the FTC are directly or indirectly dependent on Congressional action, whether through legislation or through an increase in the PTO's budget. It is hardly likely that Congress will place the FTC recommendations at the top of its agenda, or that ready, willing and able sponsors for the necessary legislation will easily be found. The FTC report is therefore best seen as a first step towards raising, and legitimating, the dialogue on the topic of patent reform. Hopefully, the FTC's worthwhile efforts will succeed in doing so.

Comments? Email:

Copyright 2003 Andrew Updegrove

On-line resources:

I. THE EOLAS SUIT

A. The Eolas-Microsoft suit began in 1999 (Eolas Press Release):

EOLAS SUES MICROSOFT FOR INFRINGEMENT OF PATENT FOR FUNDAMENTAL WEB BROWSER TECHNOLOGY THAT MAKES "PLUG-INS" AND "APPLETS" POSSIBLE

Chicago, IL, (February 2, 1999) -- Eolas® Technologies Incorporated, an Internet technology firm based in downtown Chicago, today announced that it has filed suit here in Federal court against Redmond, Washington-based Microsoft Corporation (NASDAQ: MSFT) for infringement of Eolas' patent on fundamental Web browser technology that makes "plug-ins" and "applets" possible.

For the full story see: http://www.eolas.com/zmapress.htm

B. The press begins to pay attention:

Microsoft, Eolas in court over patent dispute
By James Evans

InfoWorld, Boston, October 25, 2000 -- MICROSOFT was in court again on Wednesday, this time in Chicago for a little-known patent infringement case filed early last year by a small research and development company.

For the full story see: http://archive.infoworld.com/articles/hn/xml/00/10/25/001025hnpatentdispute.xml

C. The court's decision is announced, and emotions are mixed:

Microsoft loses $521 million browser lawsuit
Chalk up another one for the little guys
By Eric Smith

Geek.com, August 13, 2003 -- The old saying says that crime doesn't pay, but Microsoft is finding out the hard way (again) that crime can pay--just not in the direction Microsoft wants.

For the full story see: http://www.geek.com/news/geeknews/2003Aug/gee20030813021295.htm

Patent Politics: Rivalries set aside in defense of Internet Explorer
By Paul Festa

CNet News.com, September 25, 2003 -- ...What a difference a patent suit makes. With one staggering loss at the hands of a federal court jury in Chicago, Microsoft has won the support--if not the sympathy--of nearly the entire software industry, from standards organizations to corporate rivals that are rushing to defend the company's Internet Explorer browser

For the full story see: http://news.com.com/2009-1023-5082004.html

D. Microsoft reacts:

Microsoft press release:

Microsoft Announces Steps to Address Eolas Patent Ruling

REDMOND, Wash. -- Oct. 6, 2003 -- Microsoft Corp. today announced how it will respond to the August jury decision in the Eolas patent lawsuit. The steps include modest changes to Microsoft® Windows® and Internet Explorer as well as measures that Web developers and others who use Internet Explorer technology can take to ameliorate or eliminate the impact of the ruling.

For the full story see: http://www.microsoft.com/presspass/press/2003/oct03/10-06EOLASPR.asp

Microsoft technical page:

Microsoft Information for Developers about Changes to Internet Explorer

Issue

This change is a result of an adverse verdict against Microsoft in a patent infringement lawsuit brought by the University of California and Eolas Technologies...Eolas has asserted that its patent covers one specific mechanism used by Web page authors to embed and automatically invoke certain interactive programs. We made this change to IE to respond to this ruling after considering many factors, including impact on the customer and impact on developers.

For the full story see: http://msdn.microsoft.com/ieupdate/default.asp

E. The W3C speaks out:

W3C press release:

W3C Director Tim Berners-Lee urges USPTO Director to review prior art, take action

W3C.org, 29 October 2003 -- The World Wide Web Consortium (W3C), the global standard-setting body for the Web, has presented the United States Patent and Trademark Office with prior art establishing that US Patent No. 5,838,906 (the '906 patent) is invalid and should therefore be re-examined in order to eliminate this unjustified impediment to the operation of the Web. The W3C is urging US Under Secretary of Commerce for Intellectual Property James E. Rogan to initiate a re-examination of the patent because the critical prior art was neither considered at the time the patent was initially examined and granted, nor during recent patent infringement litigation.

For the full story see: http://www.w3.org/2003/10/28-906-briefing

W3C FAQ sheet on the impact of Eolas and W3C's response:

FAQ on US Patent 5,838,906 and the W3C

For the full story see: http://www.w3.org/2003/09/public-faq.html

F. The PTO Responds:

PTO Director Orders Re-Exam for '906 Patent
By Dale Dougherty

O'Reilly Network, November 11, 2003 -- In what could be good news for the Web, the Director of the US Patent and Trademark Office has ordered a re-examination of the '906 patent, which was the subject of a patent infringement lawsuit this summer brought by Eolas against Microsoft.

For the full story see: http://www.oreillynet.com/lpt/wlg/3969

II. The FTC Report

FTC press release:

FTC Issues Report on How to Promote Innovation Through Balancing Competition with Patent Law and Policy
Washington, October 28, 2003 -- The Federal Trade Commission today issued its report on how to promote innovation by finding the proper balance of competition and patent law and policy.... Although questionable patents can harm competition and innovation, valid patents work well with competition to promote innovation. This Report analyzes and makes recommendations for the patent system to maintain the proper balance with competition. http://www.ftc.gov/opa/2003/10/cpreport.htm

The Executive Summary: http://www.ftc.gov/os/2003/10/innovationrptsummary.pdf

The Full Report: http://www.ftc.gov/os/2003/10/innovationrpt.pdf

 


 

 

FROM THE STANDARDS BLOG

WHAT A DIFFERENCE A DECADE MAKES (OR DOES IT?)

Andrew Updegrove

November 11, 2003 - In the course of doing research for a Friend of the Court brief in support of Infineon's (unsuccessful) Supreme Court bid, I came across an article that appeared in The Economist some ten years ago. As I read it, I was struck by the degree to which the world has changed since the bad old days of proprietary systems -- when everyone talked about open standards (but no one really meant it) -- to the present time, where we utilize a host of standards that truly permit devices of diverse manufacture to interoperate. But how different are things today, really?

Ten years ago, a business was a DEC shop, or a Data General shop, or a Thinking Machines shop (obviously, a number of other things have changed along the way as well). Today, while we have not achieved true platform freedom, we do have examples of vast wide-area interconnectivity, courtesy of the Internet and telecommunications. And new standards-based initiatives, such as the storm of activity in the area of Web services, promise to bring us closer still to the goal of pervasive cross-platform interoperability.

But have things really changed, or are today's standards efforts, like Clausewitz's definition of diplomacy, simply the pursuit of war by other means? And if so, are we better off, or just treading water?

For some insight into those questions, let's go back ten years to that ancient article in The Economist, which stated:

Every firm wants a monopoly--and every firm wants to call it an open standard. The noisiest of...competitive battles will be about standards...[I]n the computer industry, new standards can be the source of enormous wealth, or the death of corporate empires. With so much at stake, standards arouse violent passions. Much of the propaganda pumped out by individual firms is aimed at convincing customers and other firms that their product has become a "standard".

Well, we've largely moved on from that specific type of competitive behavior. Most vendors now realize that becoming the next Microsoft is simply not in the cards. But that doesn't mean that everyone is willing to hop into the same sandbox and create open standards in a pure spirit of cooperation. Not even with adult supervision.

No, the search for competitive advantage has simply switched from trying to establish one's own proprietary technology as "the standard" to trying to control the process whereby the standard is set. But, as with diplomacy, controlling the process of standard setting requires more art and less blunt force than either war or establishing proprietary systems as de facto standards.

Happily, trying to blatantly control the creation of a standard is comparatively difficult to pull off in the context of a collaborative process. After all, as observed in the same article in The Economist, "...most multi-firm efforts have failed for the simple reason that the participating firms cannot trust each other." But today, many new business opportunities simply can't exist without standards, so that lack of trust means that everyone is looking at everyone else very closely. Even the press has found story lines in the standards games that companies play, to the point where standards development has gone from the status of chloroform in print to the subject matter of investigative journalism.

So let's return one last time to The Economist to see how much the world has really changed. In February of 1993, the authors used Unix as an example of how agreeing on truly open standards was an inherently lost cause:

There are now many rival versions of Unix sponsored by various firms from IBM to Sun Microsystems, all of which are, to a significant degree, incompatible with one another, although all are promoted as open....In fact, widespread adoption of a single firm's product is the only way truly open standards have been established in the new computer industry....In December Novell bought Unix Systems Laboratories from AT&T and 11 minority shareholders, with the obvious intent of making Unix an alternative standard to whatever is offered by Microsoft.

Where to begin to bring that one up to date? Unix is back with a vengeance -- but no longer in a proprietary uniform, with Novell owning the One True Version. Instead, its new face is the innocent penguin of Linux, an operating system that is not only open, but created by an entirely new standard-setting process, involving independent, individual programmers (at least initially), rather than corporate giants.

So it is that a confederacy of Davids, and not Novell, is finally threatening the Goliath of Redmond. Novell itself is hoping to ride the open coattails of the operating system that it once owned, long after abandoning its effort to turn that same operating system into its next NetWare-type monopoly. With Novell's acquisitions of Ximian and SuSE, it is basing its future on the success of an orphaned operating system that made good on the streets, after surviving life in a series of corporate foster homes.

Almost the stuff of a made-for-TV movie. Perhaps a decade can make a big difference after all.

Comments? Email:

Copyright 2003 Andrew Updegrove

# # #

Useful Links and Information:

The Economist, Do It My Way , Vol. 326, Issue 7800, 11 (Feb. 27, 1993) [not available on-line]
Famous quotes, like paper, "will hold still for anything", and are thus kidnapped by myriad authors to serve their own selfish purposes. The quote of Carl von Clausewitz that I have used for my own purposes above, perhaps his most famous, is subject to some variance of interpretation even in its original context. The controversy revolves around his use over 170 years ago of the word "Politik", simplistically translated by some as "politics", and by others more particularly as "diplomacy". One author explains Clausewitz's intention as follows:

In fact, Clausewitz's varied usage of Politik and the historical context within which he wrote indicate that he meant three things by the term. First, Clausewitz did intend Politik to mean policy, the extension of the will of the state, the decision to pursue a goal, political or otherwise. Second, Politik also meant politics as an external state of affairs, the strengths and weaknesses provided to a state by its geo-political position, its resources, alliances and treaties, and as an ongoing process of internal interaction between a state's key decision-making institutions and the personalities of its policy makers. Lastly, Clausewitz used Politik as an historically causative force, providing an explanatory pattern or framework for coherently viewing war's various manifestations over time.

See: Echevarria, Antulio J. II. War and Politics: The Revolution in Military Affairs and the Continued Relevance of Clausewitz: http://library.thinkquest.org/C004488/Ess2.html?tqskip1=1&tqtime=1111

 

 


 

THE REST OF THE NEWS

Every day, we scan the web for all of the news and press releases that relate to standards, and aggregate that content at the News Section of ConsortiumInfo.org. For up-to-date information, bookmark our News page, or take advantage of our RSS feed: http://www.consortiuminfo.org/news/rss/
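
For readers who would rather script their news intake than visit the page, any RSS library can consume the feed above. Here is a minimal sketch, assuming Python with the third-party Universal Feed Parser module (feedparser) installed:

    import feedparser  # third-party "Universal Feed Parser" package

    # Fetch and parse the ConsortiumInfo.org news feed.
    feed = feedparser.parse("http://www.consortiuminfo.org/news/rss/")

    # Print the most recent headlines and their links.
    for entry in feed.entries[:10]:
        print(entry.title, "-", entry.link)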

The following are just a few of the many stories from the past month that you can find digested at ConsortiumInfo.org.

New Standards/Specifications

Better than a lifetime cell phone number: The hot news in the public press these days is about the citizenry securing the right to lifetime ownership of cell phone numbers. But a more complex and interesting challenge is being tackled by the Liberty Alliance Project, which is addressing the sophisticated issues of "federated identity". As its standards are completed and implemented, users of Web services will enjoy greater convenience, security and flexibility.

Liberty Alliance Finalizes Phase 2 Specifications and Privacy Guidelines
for Federated Identity

Madrid, Spain -- November 12, 2003 -- The Liberty Alliance, a consortium formed to develop open, interoperable federated identity standards, today announced approval and publication of its Phase 2 specifications which round out the existing Liberty Federation Framework and cement the foundation for the Liberty Identity Web Services Framework. The final Liberty Phase 2 specifications are now available for download to be used for Liberty-enabled product and service development. The Alliance also announced today initial member implementation plans for the Phase 2 specifications, a best practices “owner's manual” to help Liberty implementers use the specifications in a privacy-compliant manner, and the formation of a new group, the Services Group, to develop service interface specifications that exploit the Liberty Identity Web Services Framework.

For the full story see: http://www.projectliberty.org/press/releases/2003-11-12.html

To GIF or not to GIF? While almost everyone is familiar with (and uses) the GIF format for creating and sharing images, the W3C has been at work for years on an independent format, due in part to the fact that the GIF format is patent-encumbered. But now the patents on the GIF format are expiring, leaving many wondering if this will spell the end for the W3C offering. In the announcement below, the W3C announces the completion of the second edition of the PNG specification, and notes the functionality that it provides that is not available with GIF.

PNG Second Edition Is a W3C Recommendation

November 10, 2003 -- The World Wide Web Consortium released the "Portable Network Graphics (PNG) Specification (Second Edition)" as a W3C Recommendation. The document has also become an International Standard, ISO/IEC 15948:2003. ...[PNG is], an extensible file format for the lossless, portable, well-compressed storage of raster images. PNG provides a patent-free replacement for GIF and can also replace many common uses of TIFF. Indexed-color, grayscale, and truecolor images are supported, plus an optional alpha channel. Sample depths range from 1 to 16 bits. PNG is designed to work well in online viewing applications, such as the World Wide Web, so it is fully streamable with a progressive display option. PNG is robust, providing both full file integrity checking and simple detection of common transmission errors. Also, PNG can store gamma and chromaticity data for improved color matching on heterogeneous platforms. Read more about the Graphics Activity .

For the full story see: http://www.w3.org/TR/2003/REC-PNG-20031110/
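
For those weighing the switch from GIF in practice, conversion is a one-liner in most imaging toolkits. A minimal sketch, assuming Python with the Pillow imaging library installed (the file names are hypothetical):

    from PIL import Image

    # Open a palette-based GIF and convert it to RGBA, so that any
    # transparency is carried in a true PNG alpha channel.
    image = Image.open("logo.gif").convert("RGBA")

    # PNG compression is lossless, so no image data is sacrificed.
    image.save("logo.png")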

"Future-proofing" the transmission lines: Overinvestment in fiber optic lines provided one of the great train wrecks in the general telecommunications meltdown of the Internet bubble years, leading to vastly overbuilt capacity and the collapse of many companies with huge market caps. Nonetheless, the pace of standards development in this area continues. The good news is that the "fat pipes" that these standards are enabling will be able to accommodate any conceivable load of data for decades to come, ensuring that continuing investments in telecommunications infrastructure will reap rewards for many years to come. The next two items relate to the announcement of two new standards that will help to achieve this end.

ITU Standard quadruples fibre optic transmission capacity and lowers cost

Geneva, 7 November 2003 - The International Telecommunication Union (ITU) has reached agreement on a new global standard that quadruples the capacity of the optical transmission systems which link the nodes of telecommunication networks. The new standard, which allows a transmission speed of 40 Gbit/s, has been developed for carriers to be able to bring down the cost per bit (of data carried) and the costs of network maintenance and management. The standard - ITU-T Recommendation G.959.1 - increases the capacity for optical interfaces from the present maximum of 10 Gbit/s to 40 Gbit/s. The completed work goes hand-in-hand with other work by ITU in optical transport networks, which encourages a fair market for manufacturers and operators, and ultimately encourages better service for consumers. It is already finding its way into optical interfaces developed to exploit the demand for high capacity Internet routers. The standard follows extensive field trials between a number of service providers and manufacturers.

For the complete story, see: http://www.itu.int/newsroom

New ITU standards make fat pipes fatter

Geneva, 5 November 2003 -- ITU delegates from government and industry have agreed on a new global standard that will allow network operators to increase the capacity of optical fibre. The standard, created in response to industry needs, was developed under ITU's fast track approval process — AAP (Alternative Approval Process). The standard — ITU-T Recommendation G.695 — applies to a technology called Coarse Wave Division Multiplexing (CWDM), used most often in metropolitan networks. In today's cost-conscious telecommunications market CWDM is seen as a cheaper and simpler alternative to DWDM (Dense Wavelength Division Multiplexing). Less expensive uncooled lasers may be used in CWDM products because of wide channel spacing. These lasers require less precise wavelength control, as well as lower-cost passive components. Experts estimate that carriers with sufficient deployed fibre could make savings of up to 30 per cent deploying a CWDM solution compared with the DWDM alternative.

For the complete story, see: http://www.itu.int/newsroom/press_releases/2003/28.html

It's not just for kids anymore: Whether systems administrators like it or not, instant messaging has infiltrated the workplace. At the same time, new work is being done to integrate IM into the workplace as a tool, rather than a distraction and a security risk. The IETF is facilitating that progression with the announcement that it has approved five new specifications to permit seamless exchange of IM between disparate systems.

IETF Instant Messaging and Presence Protocol Specifications Approved.

Cover Pages, October 31, 2003 -- A public message from the Internet Engineering Steering Group announces the approval of five proposed standard IETF Working Drafts from the Instant Messaging and Presence Protocol (IMPP) WG. The documents include "Presence Information Data Format (PIDF)", "Common Profile for Instant Messaging (CPIM)", "Common Presence and Instant Messaging: Message Format", "Address Resolution for Instant Messaging and Presence", and "Common Profile for Presence (CPP)". The IMPP WG specifications form the basis for a mechanism by which multiple distinct Instant Messaging applications may pass messages among the different systems while retaining the ability to use end-to-end encryption, integrity protection, and a shared framework for presence information.

For the full story, see: http://xml.coverpages.org/ni2003-10-31-a.html
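
Of the five documents, PIDF is the easiest to picture: it is a small XML vocabulary for stating whether an entity is available. The following sketch uses Python's standard xml.etree module to build a minimal presence document of the general shape the specification describes (the entity address and tuple id are hypothetical):

    import xml.etree.ElementTree as ET

    PIDF_NS = "urn:ietf:params:xml:ns:pidf"
    ET.register_namespace("", PIDF_NS)  # serialize PIDF as the default namespace

    # A presence document describes a single entity...
    presence = ET.Element("{%s}presence" % PIDF_NS,
                          {"entity": "pres:alice@example.com"})

    # ...through one or more "tuples", each carrying a basic open/closed status.
    tup = ET.SubElement(presence, "{%s}tuple" % PIDF_NS, {"id": "t1"})
    status = ET.SubElement(tup, "{%s}status" % PIDF_NS)
    ET.SubElement(status, "{%s}basic" % PIDF_NS).text = "open"

    print(ET.tostring(presence, encoding="unicode"))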

So much news, so many formats: With the ability to share information electronically, news agencies and outlets of all types now have the ability to exchange vast quantities of news in real time. Ideally, such data flows can be dropped seamlessly into diverse print and web layouts - if the right standards are created and adopted. The following press release addresses a new specification that will allow those in the news industry to do just that.


Industry Milestone: New Standard for Online Content Ready for Prime Time

Alexandria, VA, (October 27, 2003) - Publishers, aggregators, syndicators and other content companies are now ready to exchange content for secondary licensing using a standardized XML format - the newly released PRISM specification, the PRISM Aggregator DTD (Document Type Definition) Version 1.0....By providing the industry with a standardized vocabulary and rules for defining content electronically, the PRISM Aggregator DTD enables aggregators to lower their costs of bringing new sources of information online. It also enables them to publish the content online more quickly, making it more valuable both to them and to the owners of that content.

For the full story see: http://www.idealliance.org/prismtemp/news/2003/1024.asp

New Initiatives

Web Services Standards Everywhere: When a new IT standards working group is formed these days, there's an excellent chance that it will be in the area of Web services. A month is not likely to go by without one or several (or even many) new initiatives being launched by the various bodies active in this area. For an overview of who is doing what, see The Role of Web Services Bodies: In Their Own Words http://www.consortiuminfo.org/bulletins/may03.php#featured, and for the latest announcement, see the following.

OASIS Members Advance Protocol for Monitoring and Controlling Asynchronous Web Services

Boston, MA, USA; 10 November 2003 -- Members of the OASIS international standards consortium have begun work on a specification to enable the control and monitoring of asynchronous or long-running Web services. The OASIS Asynchronous Service Access Protocol (ASAP) Technical Committee is developing an extension of the World Wide Web Consortium's Simple Object Access Protocol (SOAP) that will accommodate latency between the request for a resource or service and its actual return. ASAP is applicable for areas as diverse as workflow, business process management, e-commerce, data mining, and mobile wireless devices.

OASIS ASAP Technical Committee http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=asap
Cover Pages Technology Report: Asynchronous Transactions and Web Services http://xml.coverpages.org/async.html
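
The protocol details live in the OASIS drafts linked above, but the problem ASAP addresses is easy to state: a caller should not have to hold a connection open while a long-running service does its work. The rough Python sketch below illustrates the request-then-poll pattern that such protocols generalize; the client object and its methods are hypothetical stand-ins, not the actual ASAP SOAP operations:

    import time

    def run_async(client, request, interval=5.0, timeout=3600.0):
        """Start a long-running service and poll until it completes.

        `client` is a hypothetical stand-in for an ASAP-style endpoint;
        the real protocol defines SOAP messages for creating a service
        instance, querying its state, and receiving notifications.
        """
        handle = client.create_instance(request)  # returns immediately
        deadline = time.time() + timeout
        while time.time() < deadline:
            state = client.get_state(handle)      # lightweight status query
            if state == "completed":
                return client.get_result(handle)
            time.sleep(interval)                  # latency is expected here
        raise TimeoutError("service did not complete within the timeout")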

Financial standards proliferate: An increasing number of standards bodies are active in setting standards that permit financial transactions to be executed in diverse settings, involving everything from smartcards and wireless devices to public company financial reporting. The following press release relates to a new initiative directed at ensuring that these various efforts are executed in a useful and coordinated fashion.

IFX FORUM TO DEVELOP STANDARDIZED XML PAYMENT MESSAGING WITH LEADING BANKS, OTHER STANDARDS GROUPS

ALEXANDRIA, Va. -- Nov. 10, 2003 -- The Interactive Financial eXchange (IFX) Forum expects to be a key player in the International Standards Team Harmonization (ISTH) initiative that was announced on Friday by Gartner, Inc.  The IFX Forum, comprised of representatives of leaders in the financial services industry, has a strong track record in XML development and has already delivered robust, extensible financial messaging standards that are rich in payment content and designed for interoperability....[t]he formation of the ISTH group [was announced], with its goal to create a single Core Payment XML “Kernel” that can be used globally by any corporation and any servicing bank, and the signing of a Memorandum of Understanding by the four standards groups to cooperate and coordinate their message standards' content and use of a core payment kernel XML transaction.  RosettaNet has also endorsed the ISTH's direction....The IFX standard, first published in April 2000 and now in its fifth release, is designed specifically for interoperability of systems seeking to exchange financial information internally and externally.

For the full story, see: http://www.ifxforum.org/ifxforum.org/index.cfm

Standards and productivity: One of the drivers of the current jobless recovery is a dramatic rise in productivity. One enabler of increased productivity that is not often mentioned is the development and deployment of standards that address manufacturing and other supply-chain processes. The following new work group being formed by OASIS addresses communication between shop floor management and enterprise management, permitting real-time adjustments to changing circumstances.

The OASIS PPS TC Is Being Formed

OASIS, October 24, 2003 -- The OASIS Production Planning and Scheduling (PPS) TC has been proposed, and OASIS has issued a Call for Participation. In the current manufacturing environment, shop floor management must collaborate with enterprise management in order to stay current with customer needs that cannot be predicted and that change frequently. In this situation, production planning software at the enterprise level and production scheduling software at the shop floor level need to be integrated. The purpose of the OASIS PPS TC is to develop common object models and corresponding XML schemas for production planning and scheduling software. The OASIS PPS TC will conduct its business in Japanese. The first meeting will take place on 18 December in Tokyo.

For the full story see: http://lists.oasis-open.org/archives/members/200310/msg00012.html

Open Source

Drumbeat for open source continues: In the last issue of the Consortium Standards Bulletin, we reflected on the pernicious effects of "computer monocultures" (Alpha Predators and CyberInsecurity), as highlighted in the recently released report, "CyberInsecurity: The Cost of Monopoly". The following report, commissioned by the Danish government, highlights other types of adverse effects that lack of competition causes, and recommends government action to break monopolies that become entrenched in the marketplace.

Denmark urges government support for open source

ZDNet News, October 24, 2003 -- Open source software and open standards are vital for any attempt at e-government, argues a new report from Denmark. Open source software represents a serious alternative to proprietary products, and should be used as a tool to open up software markets to more competition, according to a report carried out under the auspices of the Danish government. The report, which stirred up controversy when it was published in Denmark earlier this month, was released in English this week by the Danish Board of Technology. While a number of governments in Europe and elsewhere are eyeing open source software as a way of cutting costs and stimulating localised software development, the Danish study goes a step further, arguing that public sector support for open-source and open standards may be necessary for there to be any real competition in the software market....The study recommended that governments take an active role in promoting standardised file formats and alternatives to dominant proprietary applications in order to help break a "de facto monopoly". "The ordinary market conditions for standard software will tend towards a very small number of suppliers or a monopoly," the Board of Technology stated in the report. "It will only be possible to achieve competition in such a situation by taking political decisions that assist new market participants in entering the market."

For the full story see: http://www.zdnet.com.au/newstech/os/story/0,2000048630,20280102,00.htm

Story Updates

Another link in the chain: We have been following RFID tag specifications and market testing throughout 2003, and news in this area continues to arrive rapidly. The following two stories show both sides of the buy/sell equation in this rapidly evolving area: a new standard from the Auto-ID Center has been released to enable the supply side, and another 900-pound gorilla is joining Wal-Mart on the demand side: the Department of Defense.

Physical Markup Language (PML) Core Specification Version 1.0 for EPC Objects.

The Cover Pages, November 10, 2003 -- A "PML Core Specification Version 1.0" has been published as an Auto-ID Center Recommendation, documenting the core portion of the Physical Markup Language (PML Core). The PML vocabularies provide the XML definitions for data exchanged between components in the EPC Network system. EPC is an enabling technology designed to transform the global supply chain through a new, open global standard for real-time, automatic identification of traded items.

Feds, Wal-Mart Drive RFID Adoption

eWEEK, October 28, 2003 -- While still in its infancy, radio frequency identification technology is gaining momentum in business and now with the federal government. The Department of Defense last week instituted a policy to require its suppliers to install radio frequency identification (RFID) tags on individual parts and pallets by 2005, a federal stamp of approval on the technology. The tags will enable an operator to wirelessly scan a package for asset management and tracking data. This move by the military follows on the heels of the world's largest retailer, Wal-Mart Stores Inc., which recently decided to require RFID from its suppliers by January 2005. Currently, the military mandate requires that radio frequency (RF) tags be installed at both the crate and pallet level, not on individual items, according to DOD spokeswoman Marcia Klein. The agency will work with its almost 24,000 suppliers to deploy the individual tags and the systems to monitor and analyze the data. See also PML for Radio Frequency Identification

For the full story see: http://www.eweek.com/article2/0,4149,1365701,00.asp

Legislation/Regulation/Advocacy

Two-Way Street: One feature of the Federal government is its enormous buying power. As a result of this economic influence, requirements adopted by the government can affect the uptake of standards in the private sector as well (see the "Story Updates" above for another example of this dynamic). On the other hand, one advantage to private sector standards bodies is that they generally can work faster than legislative bodies. Sometimes Congress will take advantage of a voluntary standard created by a consensus process, using it as the basis for a law or regulation. The stories below illustrate each of these phenomena.

Draft Federal Guidelines Issued for Computer Security

NIST, November 3, 2003 -- Computer scientists at the National Institute of Standards and Technology (NIST) released on Nov. 3 an initial public draft of Recommended Security Controls for Federal Information Systems (NIST SP 800-53). The publication, which details controls that will become mandatory for most federal systems in 2005, is expected to have a wide audience beyond the federal government. NIST invites public comments on the new draft guidelines for three months. The agency will hold an open, public workshop in March 2004 to share comments and discuss possible revisions to the draft.

For the full story, see: http://www.nist.gov/public_affairs/releases/compsecurityguide.htm

Standards Key to Passage of Electronic Check Clearing Legislation

ANSI, October 29, 2003 -- In the age of instantaneous communication and automated teller machines, consumers often find themselves perplexed to be waiting days for a check to clear. Banks are currently required by law to physically transmit original paper checks between financial institutions for processing....This cumbersome delay was answered when Congress recently passed a Conference Report on HR 1474, otherwise known as the “Check 21 Act,” which was signed into law by President Bush on October 28, 2003. HR 1474, now Public Law #108-100, will permit electronic checks to legally substitute their original paper copies in order to expedite processing. Rather than waiting for a check to be physically transported to the original institution and back, the information would be transmitted electronically...The electronic check clearing standards originated with the ANSI Accredited X9 Committee. X9 develops and publishes voluntary, consensus technical standards for the financial services industry, in the areas of check processing, electronic check exchange, PIN management and security, financial industry use of data encryption, and wholesale funds transfer, among others.

For the full story see: http://www.ansi.org/news_publications/news_story.aspx?menuid=7&articleid=542

Intellectual Property Issues

Interoperability and ownership rights: To some, it may come as a surprise that achieving interoperability may provide a right to violate another owner's copyright. But that right is narrow and evolving. The first article below highlights a situation where a court found that the right was not available, while the second describes a Copyright Office ruling that could undercut the same court's ruling. The third item is a link to an announcement by the Librarian of Congress, further defining the types of copying to achieve interoperability that are permissible under U.S. copyright law.

The DMCA and Interoperability: A Troubling
Legal Strategy in the Aftermarket Industries
By Peter Moldave

Technology Law Bulletin -- The Digital Millennium Copyright Act (DMCA) prohibits technological devices that assist in the circumvention of copyright access controls. This “anti-circumvention” prohibition was enacted to prevent the manufacture of devices that could be used to circumvent “digital locks” on copyrighted materials such as books, films and music that are sold in digital form. However, a case decided early this year raises significant concerns about the ability of companies to use the DMCA not just to prevent the copying of music or other copyrighted media, but to use technology to lock-in consumers in order to prevent aftermarket competition.

For the full story, see: http://www.lgu.com/newsletter/articles/dmcainterop.shtml

DMCA grounds shaken in Lexmark cartridge case

IDG News Service, October 30, 2003 -- A ruling this week from the U.S. Copyright Office could strengthen manufacturer Static Control Components Inc.'s (SCC's) defense against a copyright infringement lawsuit by Lexmark International Inc. that is expected to have a broad impact on the market for low-cost, third-party printer cartridges. SCC, in Sanford, North Carolina, makes computer chips that allow manufacturers to create clones of toner cartridges used in Lexmark printers. Lexmark sued SCC last year, charging that SCC's chips include copyright Lexmark computer code and violate the Digital Millennium Copyright Act's (DMCA) ban on circumventing digital technology that protects copyright material. Without taking a position on whether SCC's chips illegally incorporate Lexmark code, the Copyright Office ruled that the DMCA does not block software developers from using reverse engineering to circumvent digital protection of copyright material if they do so to achieve interoperability with an independently created computer program.

For the full story, click here.

Rulemaking on Exemptions from Prohibition on Circumvention of Technological Measures that Control Access to Copyrighted Works

On October 28, 2003, the Librarian of Congress, on the recommendation of the Register of Copyrights, announced the classes of works subject to the exemption from the prohibition against circumvention of technological measures that control access to copyrighted works...

For the full story see: http://www.copyright.gov/1201/

 


FEATURED MEETING

Hanging with the chads: The Florida presidential election controversy, followed by the more recent on-again/off-again gubernatorial recall race in California, has kept attention focused on improving voting technology. It's no surprise, therefore, that NIST is sponsoring a meeting in furtherance of the 2002 legislation enacted to improve the state of the voting art.

1st SYMPOSIUM ON BUILDING TRUST AND CONFIDENCE
IN VOTING SYSTEMS - DEC 10-11, 2003

As part of its responsibilities under the Help America Vote Act of 2002 (HAVA), the Commerce Department's National Institute of Standards and Technology (NIST) will hold a symposium on building trust and confidence in voting systems at the agency's Gaithersburg, Md., headquarters on Dec. 10-11, 2003. The two-day event will bring together a host of people with an interest in election technology, including federal, state and local election officials; university researchers; independent testing laboratories; election law experts; hardware and software vendors; and others concerned about or involved with the latest developments in voting systems.

Audience: Federal, State and Local Election Officials, Academic Researchers, Voting Systems Hardware and Software Vendors, Disability Advocates, Independent Testing Authorities, Election Lawyers, Voting Rights Activists

Enacted by Congress in October 2002, the HAVA legislation gave NIST a key role in helping realize nationwide improvements in voting systems by January 2006.

For more about the symposium, see: http://vote.nist.gov/overview.html

For more about NIST and HAVA, see: http://vote.nist.gov/