February – March 2008
Vol VII No 2
Recognizing "Civil ICT Rights" And Civil ICT Standards
What Just Happened?
The "Fast Tracking" of OOXML is over. Now it's time to clean up.
In the last issue's editorial, I predicted that the confidential OOXML Ballot Resolution Meeting would fail to achieve its objectives. I was both right and wrong: there was only time to discuss and, as needed, revise a small percentage of the c. 900 substantive comments registered last year — but as of this writing, it appears that OOXML may have been adopted anyway. Whether or not the final vote count indicates that OOXML has won, the credibility and integrity of the formal standard setting system have certainly lost.
A Proposal to Recognize the Special Status of "Civil ICT Standards"
The history of humanity demonstrates an ongoing evolution in the balancing of the rights of the individual with those of society. Only in modern times have many of the civil rights we hold most dear become recognized and protected by law. With many of these rights now being exercised virtually, through the use of information and communications technology (ICT), rather than in person, care must be taken to ensure that these rights are not diminished or endangered. In order to protect such "Civil ICT Rights," we will need to take special care in developing "Civil ICT Standards."
Tracking the Man with the Gavel: Alex Brown on the BRM
In advance of the Ballot Resolution Meeting in Geneva charged with resolving open issues with OOXML, pundits speculated about how Convenor Alex Brown would grapple with the seemingly impossible task. Alex gave some indications of his strategy at his blog, and also offered this prediction: "This will be no love-in." He was right.
WAR OF THE WORDS:
Chapter 4: Eric Kriss, Peter Quinn, and the ETRM
In this chapter, the architects of the decision to adopt ODF — and not Microsoft’s OOXML — begin to plan the visionary "Enterprise Technical Reference Model" that would eventually incorporate that decision.
Steve Jobs' Endangered Second Act
It is a truism of American life that you only get one chance to Have it All. Steve Jobs' first chance at dominating a computer platform died at the hands of Bill Gates in the 1980s, in large part because Gates was willing to license his technology to clone makers while Jobs was not. Now comes Apple's wildly popular iPhone: will Steve Jobs enjoy a second act with this new platform even more successful than the first — or only a replay, and for the same reasons?
Download PDF of this issue
What Just Happened?
In our last issue, OOXML (a/k/a Ecma 376, a/k/a DIS 29500) was packing its bags to head for Geneva, there to be pored over behind closed doors by c. 120 standards professionals from around the world. With some 1,100 comments to consider (c. 900 substantive, and the remainder non-controversial editorial clean-ups), they knew that their task would not be easy. It wasn't.
As I write these words, the votes are being tallied from the thirty-day final voting period on OOXML, which closed at midnight, Geneva time, on Saturday, March 29. More properly stated, what are being tallied are the votes that changed, since re-voting is not required. If votes have changed with a sufficiently positive net effect, OOXML will have been adopted by ISO/IEC JTC1. At this moment, someone in Geneva knows whether OOXML has been approved, but the public does not. On April 2, however, ISO plans to announce the vote, at which point the rest of the world will learn the result as well.
As of now, I have been able to determine, through public and private sources, that enough votes have changed to approve OOXML — unless enough as yet undisclosed votes have shifted in the other direction (my running tally can be found here; as of this writing, 22 of the 87 eligible National Bodies are recorded there, and OOXML has a margin of victory of three votes under the more difficult of the two tests used to determine success or failure). Already, however, that margin is eroding, as the Norwegian Ministry of Trade has already filed a formal protest with ISO — asking that its own vote be disregarded, pending the results of an internal investigation.
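For readers unfamiliar with the "two tests" mentioned above: under the JTC1 Directives, a DIS ballot succeeds only if at least two-thirds of the votes cast by P-members are approvals, and no more than one-quarter of all National Body votes cast are disapprovals (abstentions are not counted as votes cast). The arithmetic can be sketched as follows — the function and the tallies fed to it are my own illustration, not official figures:

```python
# Illustrative sketch of the two JTC1 approval tests for a DIS ballot.
# Test 1: at least 2/3 of votes cast by P-members must be approvals.
# Test 2: no more than 1/4 of all National Body votes cast may be
# disapprovals. Abstentions are excluded; both tests must pass.

def jtc1_ballot_passes(p_approve, p_disapprove, total_approve, total_disapprove):
    """Return (test1, test2, passed) for a JTC1 DIS ballot."""
    p_cast = p_approve + p_disapprove
    total_cast = total_approve + total_disapprove
    test1 = p_cast > 0 and 3 * p_approve >= 2 * p_cast              # >= 2/3 P-member approval
    test2 = total_cast > 0 and 4 * total_disapprove <= total_cast   # <= 1/4 total disapproval
    return test1, test2, test1 and test2

# Hypothetical tallies, for illustration only:
assert jtc1_ballot_passes(24, 10, 40, 10) == (True, True, True)     # clears both bars
assert jtc1_ballot_passes(17, 15, 51, 18) == (False, False, False)  # fails both bars
```

Because the two thresholds are computed on different denominators, a ballot can clear one test while failing the other — which is why one of the two is "more difficult" in any given tally.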
But the vote itself is only one of two bottom line stories. The other is "what does it all mean?" And that's what this issue is all about.
In my Editorial, I focus on the collateral damage that the formal standards development and adoption system has suffered during the Fast Track adoption of OOXML — a specification of more than 6,000 pages. That process has been marred by many allegations of improprieties (most frequently "stacking" of committees, but also fun and games with rules, and one confirmed case of financial incentives being offered to business partners to help ensure the desired vote in Sweden), undue pressure applied by vendors, and more.
In the penultimate step, the Ballot Resolution Meeting in Geneva predictably had time to actively debate and, as necessary, revise only a small percentage of all proposed resolutions. Accounts of the meeting varied wildly, as did the claims relating to its success or failure. Since the meeting was held behind closed doors and memorialized by only skeletal minutes, there is no way for the public to know exactly what happened, who to believe, or what it all means. It may come as no surprise, therefore, that my Editorial calls for a thorough review by a neutral committee, and for that committee to recommend appropriate rules to ensure a better result the next time such a contentious situation arises, as it surely will.
In the Feature Article for this issue, I focus on the accelerating transition from in-person to on-line exercise of civil rights, and note that to date we have given insufficient attention to ensuring that these precious rights are not compromised as their exercise grows increasingly dependent on information and communications technology (ICT). I suggest a new term to recognize this new reality, referring to these rights as "Civil ICT Rights." I also note that in many important respects — as is the case with document format standards — these rights may only be protected by standards, which I logically refer to as "Civil ICT Standards." Using the just-completed OOXML process as a case in point, I argue that the development of such Civil ICT Standards requires greater attention and protection, so that our Civil ICT Rights can also be protected and preserved.
My Standards Blog selection seems, from this remove, to be almost a nostalgic, Edenic, "Before the Fall" reminiscence: written before the Ballot Resolution Meeting, it describes the daunting task that BRM Convenor Alex Brown would face in Geneva, and how he planned to deal with that challenge.
I also include another sample chapter from War of the Words, my ongoing eBook project chronicling the ODF-OOXML contest from its early days. The included chapter tells the story of how Massachusetts moved towards its decision to adopt ODF, but not OOXML, for the exclusive use of its Executive Agencies.
I close, as usual, with my Consider This essay, which departs from the document format theme to reflect upon opportunities seized and, perhaps, lost in another closely contested and enormously strategic race: the global contest to dominate the "smartphone" platform. Will Steve Jobs not only take the early lead this time, but also continue to dominate the smartphone marketplace? Or will he once again lose his advantage by asserting too great a degree of proprietary control? In this essay, I give my view. Only time, of course, will tell.
Meanwhile, it's an awesomely cool tool.
As always, I hope you enjoy this issue.
Editor and Publisher
2005 ANSI President's Award for Journalism
The complete series of Consortium Standards Bulletins can be accessed on-line at http://www.consortiuminfo.org/bulletins/. It can also be found in libraries around the world as part of the EBSCO Publishing bibliographic and research databases.
Sign up for a free subscription to Standards Today.
The last issue of Standards Today was titled ODF vs. OOXML on the Eve of the BRM. That issue focused on the Ballot Resolution Meeting (BRM) about to be held in Geneva, Switzerland as the penultimate act in the Fast Track approval process of DIS 29500, the specification submitted by Ecma and based upon Microsoft's Office Open XML document formats (OOXML).
My editorial in that issue was prophetically titled The Overwhelming of ISO/IEC JTC1, because only one week had been allocated to resolve the more than 1,100 separate comments (some 900 of them substantive) registered by National Bodies from around the world during the initial balloting period, which closed in mid-2007 without approving OOXML.
Without exception, every fear that I raised in that editorial was realized, and worse. Here is a sampling:
Prior concern: "Due to the 6,000 page length of OOXML, not all problems are likely to have been identified during the formal review period. But any deficiencies in OOXML discovered after September 2, according to the JTC1 Directives as cited by Brown, are "out of scope," and may not be addressed at the BRM. Instead, they must await resolution in the next review cycle (i.e., years in the future)."
Reality: Far from worrying about addressing new concerns, there was (as expected) insufficient time to interactively discuss and, as necessary, revise the vast majority of the old comments. One consequence was that even many of the concerns submitted in 2007 were deferred for resolution during a future "maintenance phase" of the specification.
Prior concern: "It does not appear at this time as if the resolutions proposed by Ecma will be made available at a public Web site before the BRM, if ever. Consequently, the 500 million users of Office and the legions of independent software vendors whose software must be used in conjunction with Office will have no opportunity to convey their opinions to the delegates that will nominally represent their interests at the BRM."
Reality: Not only did those who were not involved have no opportunity to access the proposed resolutions for review, but one official delegation complained that it had not been able to consider any proposed resolutions other than those offered in response to its own comments.
Prior Concern: "The final vote on OOXML will follow the conclusion of the BRM, whether or not all comment resolutions have been resolved. It appears that if the vote is in favor of adoption, unresolved comments will not be dealt with, if ever, until the next review cycle."
Reality: Only a small percentage of the c. 900 substantive resolutions were interactively discussed and, as necessary, revised. The remainder was disposed of in a process that allowed each delegation to vote upon each resolution that it wished to weigh in on, and to make a blanket choice of "approve," "disapprove" or "abstain" as to the balance, if desired. The time permitted for voting on all c. 900 proposed resolutions (comprising well over 1,000 pages of text) was less than 24 hours.
Confronted with this impossible task, only six delegations chose to approve and four chose to disapprove. Of the remainder, eighteen chose "abstain" — and four chose not to register a position at all. As one delegate stated in the meeting, "If this was all that would be permitted, I would have preferred to have stayed at home and had two weeks to consider how to vote." Despite the fact that in the ordinary course all resolutions would have been discussed, over as many meetings as needed, OOXML proponents announced that the BRM "was an unqualified success."
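The blanket-default ballot described above amounts to a simple lookup: a delegation's effective position on any given resolution is its explicit vote if it cast one, and its declared default otherwise. A minimal sketch of that mechanism — the data shapes and resolution identifiers below are invented for illustration, not drawn from the BRM record:

```python
# Sketch of the BRM paper-ballot mechanism described above: a delegation
# could vote individually on any proposed resolution it wished, and
# assign one blanket default position to all the rest.

def effective_vote(ballot, resolution_id):
    """Return a delegation's effective position ("approve", "disapprove",
    or "abstain") on a single proposed resolution."""
    return ballot["explicit"].get(resolution_id, ballot["default"])

# Hypothetical delegation ballot:
ballot = {
    "default": "abstain",                 # blanket position for the balance
    "explicit": {
        "R-042": "disapprove",            # individually cast votes
        "R-101": "approve",
    },
}

assert effective_vote(ballot, "R-042") == "disapprove"
assert effective_vote(ballot, "R-777") == "abstain"   # falls back to the default
```

Under this scheme, a delegation that declared a default of "approve" effectively endorsed hundreds of resolutions it never had time to read — which is precisely the objection voiced by the delegate quoted above.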
Prior Concern: "No outsiders will be allowed to attend the BRM, nor will any transcript be prepared and made available."
Reality: Only a skeletal summary of the actions discussed and resolutions adopted was made available. Moreover, those in attendance were requested not to discuss anything that transpired during the BRM with anyone outside the meeting, either during the course of the meeting or afterwards. Not surprisingly, the result is that widely differing accounts were posted even by those who had attended the BRM, ranging from one delegate's pronouncement that the BRM had been "complete, utter, unadulterated bullshit" to another's statement that "The process really worked (it was very cool)." (I have provided links and excerpts from the accounts of nine delegates from seven countries here, and much more original source material here, so that those who are interested can form their own judgment.)
In the absence of a detailed official record or the admission of the press or any other neutral third party, those around the world whose lives will be impacted by the final result can only scratch their heads and wonder what just happened, and who to believe.
While every first-hand account applauded the efforts of Convenor Alex Brown and of the delegates to make the best of the situation and achieve the greatest degree of improvement in DIS 29500 possible, the result by anyone's account was the submission for final voting of a specification that had received less attention and collaborative effort to improve its quality than would typically be the case under any other circumstance.
And there was more to come. During the thirty-day voting period that immediately followed, accusations of abuse of process at the National Body level once again abounded. And once again, accounts of what actually happened varied widely. Here is a sample, posted by Geir Isene at his blog on March 30. In it, he gives his version of what happened when the appropriate committee met in Norway to decide whether or not to change its vote on OOXML:
March 28th: Meeting in the Norwegian Standards Institute (Standard Norge).
Purpose: To decide the final vote for Norway on whether the document format OOXML should become an international standard.
The meeting: 27 people in the room, 4 of which were administrative staff from Standard Norge.
The outcome: Of the 24 members attending, 19 disapproved, 5 approved.
The result: The administrative staff decided that Norway wants to approve OOXML as an ISO standard.
Their justification: "Standard Norge puts emphasis on that if this [OOXML] becomes an ISO/IEC standard, it will be improved to better accommodate the users' needs."
This translates to: "Yes, we know the standard is broken, 79% of our technical committee have told us. But we hope that it someday will be repaired by someone. And we'll be happy to help if someone can give us the resources."
Alright, the Norwegian Standards Institute is moving away from adopting quality standards to promoting a repair shop philosophy.
Needless to say, such accounts do not inspire confidence in those who must live with the decisions made by those with the authority to make them. Nor did the news on March 31 that the Chairman of the same committee had just sent a formal protest to ISO, which included the following language:
Because of this irregularity, a call has been made for an investigation by the Norwegian Ministry of Trade and Industry with a view to changing the vote.
I hereby request that the Norwegian decision be suspended pending the results of this investigation.
The denouement of this ongoing drama is that at the end of the thirty-day voting period, a sufficient number of National Bodies — including Norway — appear to have changed their votes to secure the final adoption of OOXML (the formal announcement may not be made until just after this issue is delivered).
Without assigning blame to either the proponents or opponents of OOXML, two questions that demand answers must be posed: Is this any way to conduct the process whereby the global standards upon which governments and society rely are developed and adopted? And if not, what will be done about it?
It is impossible to avoid the conclusion that the credibility and integrity of the formal standards development process have suffered serious damage as a result of what has just transpired. While that process may serve perfectly well under less contentious circumstances, reforms are clearly required to address those exceptional situations in which greater protections are needed.
In order for the credibility of the traditional system to be restored, a thorough review of the just completed DIS 29500 Fast Track process should be immediately commissioned. That review should include recommendations for reform that would include, but not be limited to, suggesting revisions to the rules relating to Fast Track and PAS submissions, new National Body and ISO/IEC JTC1 rules relating to transparency and conflicts of interest, and providing for circuit breakers and corrective actions that could be invoked the next time such a process has clearly run off the rails.
Copyright 2008 Andrew Updegrove
A Proposal to Recognize the Special Status of
"Civil ICT Standards"
Abstract: In modern times, civil rights have enjoyed increasing protection, although the exact balance between the rights of the individual and those of society as a whole has been struck at different points at different times, and in different societies. Advances in technology, beginning with the printing press, have played a role in this process. Today, civil rights such as freedom of speech, freedom of assembly, and the ability to fully interact with government are increasingly being exercised through the use of information and communications technology (ICT), rather than in person. As this process accelerates, attention must be paid to how such "Civil ICT Rights" can be exercised, so that they are not compromised, diminished, or only selectively, rather than universally, available. Special attention will need to be paid to the development and adoption of technical standards as well, because they will play an essential role in protecting Civil ICT Rights. In this article, I make the case for recognizing the existence and importance of what I call "Civil ICT Standards," and argue that the development of Civil ICT Standards, in contrast to purely technical standards, requires more stringent rules and processes, so that our Civil ICT Rights can be protected and preserved.
Introduction: The rise of civil society has been enabled by developments in diverse disciplines, from the mastery of agriculture (which made life in more complex, settled communities possible) to the evolution of systems of weights and measures, to facilitate business exchanges. Many of these innovations were permanent, in the sense that once their benefits were realized, further development came in the form of increasing refinement and expansion of the original concepts, in processes that continue to this day.
But other, equally important concepts were based upon assumptions that were philosophical, or class based, or otherwise susceptible to subjective forces. Chief among them, from the standpoint of importance to the stability of society, were systems of laws and of governance, because while each was essential, its implementation could be, and has been, highly variable.
For example: inherent in both laws and governmental systems is the concept of balancing the rights of individuals with the needs of society as a whole. With the rise of more hierarchical societies, consolidation of valuable property in fewer hands, and the development of governance systems based upon kingship, the rights assigned to the classes at the bottom of the pyramid radically declined from those that they had enjoyed in tribal societies led by consensus-acknowledged leaders. The laws created by the elites, not surprisingly, also reflected this reality, assigning more rights to the classes that created the laws and had the power to enforce them.
Other societies (notably the Greeks and, derivatively, the Romans) evolved legal systems that acknowledged greater rights in the individual — or at least those individuals that had the good luck to be born into the ranks of citizens of the state. Only in very recent times have concepts such as universal human rights and the democratic selection of leaders by all, regardless of gender, education, or social status, achieved the same level of acceptance that they enjoyed in many hunter-gatherer societies.
Striking the desired (and subjective) balance between the rights of the individual (to do as she pleases) and the rights of the state (to ensure the welfare of all individuals as a whole), however, has been highly variable from nation to nation. This balancing has been periodically adjusted and recorded in what have commonly come to be referred to as "Bills of Rights." These watershed documents include (in the Western tradition) such important agreements as the English Magna Carta (1215), the French Declaration of the Rights of Man and of the Citizen (1789), and the United States Bill of Rights (which amended the United States Constitution when ratified in 1791). Each of these documents augmented the rights of the individual at the expense of the state, while remaining aware of the importance of protecting society as well.
While this trend has been largely linear across all societies over sufficiently meaningful periods of time, it has not been uniform among them. The balancing of the rights of the individual against those of society has of course been artificially skewed towards the state (a more proper reference than "society," in this context) in authoritarian and totalitarian societies. But even demonstrably "free" societies display significant variation in law today, based upon their individual values and philosophies. Those differences are manifested in various ways, such as the level of individual income taxation deemed to be acceptable, the legal test adopted to constitute libel of public officials in the press, whether or not an individual can display evidence of religious association, and what an individual can and cannot do on their own private property.
The ability to enjoy individual rights in modern times has also benefited from, as well as become dependent upon, technological innovation. The concept of freedom of speech, for example, gained far greater scope with the invention of the printing press, because ideas and positions could be much more widely disseminated by persons other than governmental or religious authorities. Similarly, the right of assembly became ever more meaningful as transportation systems became more sophisticated, inexpensive and available.
In more modern times, information technology, and then the Internet, have reshaped the exercise of many civil rights, from the right to vote (via various electronic, and not always foolproof, scanning and tabulating devices), to the right to petition government (via the Internet), to the freedoms of speech, assembly, and religious expression — sometimes all exercised at once — via an ever-expanding variety of information and communications technology (ICT) based channels.
This extension of the exercise of human rights to ICT based platforms has occurred with a swiftness that is without precedent. Indeed, a greater and greater percentage of the exercise of civil rights is accomplished via such means on (literally) a daily basis in developed nations, and the process may be even more dramatic in developing nations as "smartphones" and other inexpensive mobile devices are deployed by the hundreds of millions over just the next few years.
But just as a newspaper could historically be silenced by a court order or seizure of its offices by government action, or its voice muffled by overly restrictive laws, so also can a Web site be blocked, or the ability of an individual to interact with her government be restricted to only such technical means as that government chooses to support.
As government and private interests each move increasingly to ICT platforms, the traditional means of exercising valued civil rights will gradually be eliminated. Perhaps as a result of the speed with which this process is occurring, however, very little attention has been paid (other than in the context of voting) to the deficiencies that new virtual platforms, business models and governmental portals may have, as compared to their historical in-person and paper based analogues. Only recently has the realization begun to dawn that, just as the lack of handicapped access to a polling station can deny the vote to someone with a physical handicap, so also can a government Web portal that does not support accessibility standards.
Governments are now becoming more aware of such concerns, and beginning to look for ways in which the benefits of ICT can be adopted without sacrificing, or compromising, civil rights. But they are also realizing that the new bottles into which this old wine is being poured are quite different in important but technically subtle respects, presenting new issues, and demanding competence in technology areas that are new and unfamiliar to them. It is also leading them to examine whether activities that have to date been primarily within the domain of private industry may require government attention, oversight, or even regulation.
This examination by government will once again involve rebalancing the rights of individuals with those of society. More importantly, it will also require a three-way rebalancing of the rights of commercial interests with those of individuals and society as a whole. To the extent that this process does not occur organically in the private sector, governments will need to act to bring about behavior deemed to be desirable, either through direct action (i.e., through new regulations), or less coercively, through actions such as adopting preferences in government procurement.
In this article, I will explore the impact of our increasing transformation from a society in which civil rights are exercised in person, to one in which those same rights can only be fully exercised electronically. I suggest that this transition leads to the need to recognize a new concept that I will call "Civil ICT Rights." More specifically, I will describe the increasingly crucial role that certain ICT technical standards will play in determining whether or not we are able to fully exercise our Civil ICT Rights. I will also identify this subset of standards, which I will refer to as "Civil ICT Standards." Finally, I will seek to demonstrate why I believe that the current standard setting infrastructure is inadequate to reliably create Civil ICT Standards of a quality and openness that I believe are essential to protect our increasingly important Civil ICT Rights.
I The Digitization of Civil Rights
We are entering an era in which information technology is to society what earlier, very different modalities were to human rights. In this new interconnected world, virtually every civic, commercial, and expressive human activity will be fully or partially exercisable only via the Internet, the Web and the applications that are resident on, or interface with, these resources. And in the Third World, the ability to accelerate one's progress to true equality of opportunity will be mightily dependent on whether one has the financial and technical means to lay hold of this great equalizer.
Not surprisingly, with these new and wonderful technical possibilities come real risks and responsibilities. In order to avoid the former and assume the latter, questions of social policy enter the picture, because where the unconstrained forces of the market place will lead may not be where the best interests of society will lie.
In the dawn of the computer age, when only isolated mainframes lived in major corporations and research labs, such a concern barely existed, if at all. But as the world becomes more interconnected, more virtual, and more dependent on ICT, public policy relating to ICT will become as important as, if not more important than, existing policies that relate to freedom of travel (often now being replaced by virtual experiences), freedom of speech (increasingly expressed on line), freedom of access (affordable broadband or otherwise, and suited to the needs of those with physical disabilities), and freedom to create (open versus closed systems, the ability to create mashups under Creative Commons licenses, and so on).
This is where standards enter the picture, because standards are where policy and technology touch at the most intimate level.
The emergence of Civil ICT Rights: Much as a constitution or bill of rights establishes and balances the basic rights of an individual in civil society, standards codify the points where proprietary technologies touch each other, and where the passage of information is negotiated.
In this way, standards can protect — or not — the rights of the individual to fully participate in the highly technical environment into which the world is now evolving. Among other rights, standards can guarantee:
- That any citizen can use any product or service, proprietary or open, that she desires when interacting with her government.
- That any citizen can use any product or service when interacting with any other citizen, and in exercising every civil right.
- That any entrepreneur can have equal access to marketplace opportunities at the technical, standards-mediated level, independent of the market power of existing incumbents.
- That any person, advantaged or disadvantaged, and anywhere in the world, can have equal access to the Internet and the Web in the most available and inexpensive method possible.
- That any owner of data can have the freedom to create, store, and move that data anywhere, any time, throughout her lifetime, without risk of capture, abandonment or loss due to dependence upon a single vendor.
We can, therefore, aptly refer to such technology-enabled — and therefore also technologically vulnerable — rights as Civil ICT Rights. Having recognized this vulnerability, we must also pause a moment to ask: what will life be like in the future if Civil ICT Rights are not protected, as paper and other fixed media disappear, as information becomes available exclusively on line, and as history itself becomes hostage to technology?
II The Vulnerability of Civil ICT Rights
The document format test case:1 On March 29, the final step in the process whereby a document format, designated DIS 29500, was considered for approval by ISO/IEC JTC1, the de jure standards committee within which such technology standards are evaluated, came to a close. DIS 29500 is based upon the Office Open XML (OOXML) formats developed by Microsoft for implementation in Office 2007, the significantly updated version of its flagship office productivity software package. The specification for these formats had earlier been submitted by Microsoft to a standards body called Ecma, which in turn revised, approved, and submitted the resulting standard (now called Ecma 376) to JTC1 under what is referred to as the "Fast Track" process.
What ensued has been the most hotly contested standards battle in recent memory, but for more than just the usual commercial reasons. At the factual core of the issue is the fact that another document format, earlier developed by the Organization for the Advancement of Structured Information Standards (OASIS), and popularly known as the OpenDocument Format (ODF), had already been submitted to, and approved by, the same joint committee of ISO/IEC.
The contest between ODF and OOXML has been extensively reported in countless articles, interviews and blog posts from around the world. Supporters of OOXML point to the fact that Microsoft's historically closed product architecture is now more open, and that developers can now more easily, and on a more level playing field, develop products that interoperate with Office. They can also develop new stand-alone products that utilize OOXML independently.
Opponents of OOXML point to the fact that ODF had already been approved by ISO/IEC JTC1 as ISO/IEC 26300, that ODF is already implemented in multiple proprietary as well as free, open source office suites, and that OOXML is not fully implemented in even one product (Office 2007 will need to be revised to comply with the final version of DIS 29500). They also contend that OOXML's primary use will be to perpetuate the dominance of a single vendor, that Ecma 376 was deeply flawed when submitted to JTC1, that the Fast Track process was inadequate to address the many flaws identified in the comments submitted, and that the consideration and voting processes that followed in many National Bodies were marred by alleged misconduct intended to sway national votes in favor of approval (Microsoft has said that it believes that opponents of OOXML, and in particular IBM, engaged in similar conduct).
At minimum, the following facts would be agreed upon by both sides. As initially submitted to JTC1, Ecma 376 comprised over 6,000 pages. Objections raised during an initial one month "contradictions" period were deemed not to require action, while in the five month examination and voting period that followed, some 1,100 separate comments were submitted by many of the 87 countries that participated in the evaluation of OOXML. Irregularities were also alleged in many countries, including "stacking" of various committees with employees or business partners of individual companies. In one case (Sweden), Microsoft admitted that an employee had offered to compensate companies indirectly (through marketing incentives) for the cost of joining the committee entitled to vote on OOXML.2
At the end of the five month voting period, OOXML failed to receive sufficient votes to be approved. Under established rules, Ecma then created a document that proposed resolutions to the comments submitted during the voting period, either by finding them unnecessary to address, or by proposing a suggested remedy. That document ran to over 2,300 pages. A one-week "Ballot Resolution Meeting" (BRM) was held to consider the c. 1,100 comments (less than 200 of which were non-controversial typographical corrections and the like) in Geneva, Switzerland, from February 25 — 29, 2008. At that meeting, only a small percentage of the substantive, proposed resolutions were fully discussed and voted upon. For lack of time, the remainder were addressed by allowing each participating delegation to vote to approve, disapprove or abstain on any individual resolution, while assigning a default position (choosing among the same positions) for any that it did not specifically address.
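The default-position mechanism described above can be sketched in miniature. The following is a purely illustrative model — the delegation names, ballots, and tallying logic are hypothetical, and are not a reproduction of the actual BRM paperwork or its decision rules:

```python
# Illustrative tally of a BRM-style paper ballot: each delegation may vote
# on a resolution explicitly, or fall back on its declared default position.
# All delegation names and votes below are hypothetical.

def tally(resolution, ballots):
    """Count approve/disapprove/abstain for one resolution, applying each
    delegation's default position where no explicit vote was cast."""
    counts = {"approve": 0, "disapprove": 0, "abstain": 0}
    for delegation, (explicit, default) in ballots.items():
        position = explicit.get(resolution, default)
        counts[position] += 1
    return counts

ballots = {
    # delegation: ({resolution_id: explicit vote}, default position)
    "Delegation A": ({"R17": "disapprove"}, "approve"),
    "Delegation B": ({},                    "abstain"),
    "Delegation C": ({"R17": "approve"},    "approve"),
}

print(tally("R17", ballots))  # explicit votes dominate: 1 approve, 1 disapprove, 1 abstain
print(tally("R18", ballots))  # no explicit votes, defaults apply: 2 approve, 1 abstain
```

The point of the sketch is simply that a delegation's single default choice silently determines its position on every resolution it never actually discussed — which is how hundreds of substantive proposals could be "addressed" in a week.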
Once more, there was serious disagreement over the results. Microsoft and other OOXML supporters pointed to the fact that OOXML was now significantly improved from the original specification, and that any interested party could participate in the further evolution of the originally proprietary specification. Opponents contended that OOXML still contained many serious flaws, that the Fast Track process, and particularly the BRM, had not applied the same level of quality assurance that had historically been followed applied in ISO/IEC JTC1, and that irregularities had once again occurred when the National Bodies considered whether or not to change their original votes.
While the final vote has not been announced as of this writing, a sufficient number of National Bodies are known to have changed their votes for OOXML to be approved — provided that a sufficient number of thus far-unannounced votes have not switched in the other direction. To complicate matters, it appears likely that multiple challenges will be brought to the vote in individual National Bodies (Norway has already requested that its own vote be disregarded until an internal investigation is complete), that a review of the entire process by ISO/IEC will be demanded by some, and that many have been left with the opinion that the historically collegial de jure process proved inadequate to the very significant commercial pressures that were brought to bear upon it, from beginning to end of the Fast Track process.
Why open document formats matter: Had this been a simple standards war in the grand tradition (and there have been many bitter contests in the past), the news coverage would have been far more limited, and the conflict could be expected to drop quickly from public sight. But a number of factors distinguish this still ongoing standards war from its predecessors. Within those factors can be found the attributes that distinguish an important Civil ICT Standard from its purely technical brethren.
Those factors include the following:
- Access: Early on, it was realized that electronic documents are far more at risk of loss over time than paper records, and that these risks include the rapid passing of a format into obsolescence, just as audio formats (eight track, cassette, CD and so on) have passed out of common use with increasing rapidity. Given the need for easy access to text records over extremely long periods of time, governments have come to grasp the importance of adopting a document format standard that would prove to be widely adopted and be easy to maintain (and therefore likely to be maintained) over similarly long periods of time.
- Accessibility: Public awareness of the open document format contest began in August of 2005, when the Information Technology Department (ITD) of the Commonwealth of Massachusetts announced that it would adopt procurement guidelines that called for purchasing only ODF-compliant office productivity products. Because Microsoft had announced that it would not support ODF, this would mean replacing as many as 50,000 copies of Office by an initially planned deadline of January 1, 2007. The ITD's announcement brought to light the question of whether ODF-compliant products were as accessible to those with disabilities as Office, which is supported by a variety of third-party software products that augment its accessibility. This was particularly important, because governments have gone to greater lengths than most private businesses to accommodate, and hire, those with disabilities. In fact, it was found that ODF products were not as accessible at that time. In response, both OASIS and many of the proprietary and open source developers of ODF compliant products accelerated their efforts to eliminate the gap.
- Competitive concerns: Microsoft Office is hugely dominant in the marketplace today. Adopting a format that provided the basis for wide adoption in a way that did not provide a special advantage to Microsoft, and then directing government purchasing power towards products implementing that standard, could introduce incentives to the marketplace that could reintroduce (and indeed already have reintroduced) competitiveness into an important market niche where it has been notably absent for almost twenty years.
- Convergence and broadening of "openness" concepts: The number of people supporting the concept of free, open source software (FOSS, or FLOSS, for free/libre open source software) has grown to be very large, and continues to grow rapidly, embracing not only software developers, but also others (technically sophisticated and otherwise) who believe in the freedoms that FLOSS can enable. At the same time, popular concepts and methods of achieving "openness" have already broadened, most significantly to involve content as well. As one example of this trend, the amount of text, photo, video and audio content now being made available under the various licenses created (in many languages) by CreativeCommons.org is increasing geometrically. And now, with the great debate over ODF and OOXML, public consciousness of the vital role that open standards can play (and therefore the importance of the process by which they are created) has risen greatly as well.
- The right to choose: As more and more citizens become technically sophisticated, their desire to obtain and use the technical tools of their choosing (including FLOSS) has increased dramatically, a trend that can be expected to increase. More and more of these citizens do not wish to be told which tools they must purchase in order to interact with their own government.
The result of this convergence is that a significant number of legislators and citizens came to recognize, at least in this instance, that whatever standard becomes widely adopted will have a different sort of impact upon them than any that they have been aware of before. This, in turn, focused their attention on how the standards development and adoption process operates, whose interests are primarily served along the way, and whether that process succeeded or failed in serving their own interests in the case of open document formats.
Developing a definition of Civil ICT Standards: Although no specific terms or criteria have thus far been used publicly to articulate why standards such as open document formats stand on a different plane from WiFi specifications, or which standards belong in that class, such a differentiation seems clear. I would submit that the core difference is that such standards are essential to the electronic exercise of one or more civil rights, and hence the choice of the name "Civil ICT Standards" to describe them.
Standards in this class today constitute only a small, but vitally significant, percentage of all standards. But they demand special attention in their selection, and protection in their use, because their impact is both fundamental and far reaching. And, since some standards (like document formats) are intended for very long term use, it is more than usually important to select them carefully.
A number of existing Civil ICT Standards can already be readily identified. By way of example, they include those that enable universal global access in native character sets (Unicode) and the basic standards upon which the Internet and the Web are based. In the future, Civil ICT Standards will include those that relate to health records, privacy, security, electronic voting, federated identity, and much more. Over time, they will become both more numerous and more important.
III Moving Towards a Civil ICT Rights and Standards System
Recognition: Before Civil ICT Standards are likely to be given special attention, their existence must become widely recognized. The ODF-OOXML contest supplies a convenient litmus for assessing initial attitudes on this subject.
For some, technical issues and the past proprietary practices of Microsoft either outweighed or obscured the special characteristics of document formats. A prominent example can be found in Patrick Durusau, the ISO/IEC Project Editor for ISO 26300 (ODF), who came out in favor of adoption of OOXML after the close of the BRM, focusing on the progress that had been made with the OOXML specification rather than on whether OOXML had had all of its technical imperfections resolved,3 or whether it would lead to effective competition in the marketplace at all.4 From this point of view, the primary goal seems to be to negotiate better terms while continuing to live within an already existing ecosystem with a dominant vendor at its center. Other OOXML proponents viewed public concerns over document formats as simply a cover for aggressive tactics by Microsoft's rivals.5
There are many individuals around the world who would bridle at the implication that they have simply been the puppets of Microsoft's enemies. For those who take this view, the successful vote to adopt OOXML was a step away from, rather than a way to advance towards, a future in which Civil ICT Rights are guaranteed.
Process controls: If a consensus arises around the concept that there is a separate class of standards that should enjoy special attention and protection because of a unique relationship to Civil ICT Rights, then the next step is to determine whether the existing standards development and adoption infrastructure is equal to that task.
The first question that arises in that context is whether Civil ICT Standards should be under the control of government, the private sector, or in some manner shared. At the one extreme, there is self-regulation by industry, and at the other there is legislation leading to government regulation. But the former is subject to proprietary pressures and usually does not include meaningful participation by all stakeholders (especially end users), while the latter is slow, cumbersome, and still subject to lobbying by commercial interests.
Already, there is a broad spectrum of practice in place, that extends from de facto standards developed by one or a few vendors, to government regulations, with purely technical specifications being the subject matter in the former case, and standards relating to safety and public health falling more typically into the latter category.
In between, there are many variations, from consortia, to the quasi-governmental ISO and IEC, in which participation is by National Bodies, to the ITU, a treaty organization in which participation is at the national government level. Where along this continuum should authority over Civil ICT Standards come to rest?
Given the speed with which technological advancement occurs, a case can be made for allowing Civil ICT Standards to continue to be developed by the existing consortia and accredited bodies in which they are now addressed. However, as the ODF-OOXML experience has shown, that infrastructure at minimum needs to be "ruggedized" to withstand the onslaught that sometimes descends upon it when significant commercial interests are at stake. And perhaps it needs to be more substantially overhauled as well, because the tenets upon which its structure is based are not necessarily identical to those upon which Civil ICT Standards need to rely.
The following are a few examples of concerns that arose in the course of the OOXML Fast Track process that are worth revisiting in this context:
Quality: In the case of most standards, a poor job in development is likely to be followed by weak adoption. However, this will not always be true. In the case of OOXML, the relative success or failure of the Fast Track process to deliver a quality product will likely have little or no impact on uptake in products that form part of the Microsoft Office ecosystem, due to the pre-existing dominance of Office in the marketplace. In the case of a purely technical standard, the consequences for society in general and the individual in particular for such a result may be low. In the case of a Civil ICT Standard, however, the consequences could be far higher, and therefore, I would submit, the quality controls should be more stringent.
Influence: There have been many allegations of undue influence being exerted on those who cast votes in their respective National Bodies. Whether a call from Steve Ballmer, the Chief Executive Officer of Microsoft, to the United States Secretary of Commerce to discuss the vote of the National Institute of Standards and Technology (NIST), a member of the OOXML voting body, crossed that line may conceivably be a matter of opinion. But I would submit that the reported ability of a single Microsoft employee to block a vote by a National Body to disapprove OOXML represents a failure of the rules of that National Body to protect against the self-interested actions of a single vendor.6 While it is true that vendors often bring significant technical knowledge, and therefore value, to the standards development and adoption process, such an example indicates the need for additional rules to guard against the abuse of such rights of participation.
Transparency: Most governmental activities operate under rules of transparency to the public, in order to ensure that citizens are fully informed, and fully protected. Nominally, the formal standards development process pays lip service to the same value; in practice, however, the rules are far different. No one other than National Body delegates and representatives of Ecma and ISO was permitted to attend the BRM, and the audio record has not been made publicly available. The only written minutes and record released are minimal in the extreme, comprising only a few pages of text to reflect the activities of a full week of meetings. Those who attended were also requested not to divulge what had transpired, although many declined to be bound by this request.7 Even the proposed resolutions that were prepared by Ecma for consideration at the BRM were posted to a Website to which only limited National Body access was granted. Similar practices and rules abound in multiple National Bodies as well.
While it is true that there are some valid reasons for the genesis of this practice (delegates from small vendors may rightly fear reprisals from large vendors upon whom they are dependent if they vote the "wrong way"), such secrecy should be regarded as clearly incompatible with the creation and adoption of Civil ICT Standards. Moreover, there are global consortia, such as OASIS (the developer of ODF), that have created hotly contested standards and yet operate on a fully transparent basis, posting detailed minutes of all meetings to the public portions of their Web sites, conducting all discussions in open electronic fora, and posting review drafts of all proposed standards for public comment prior to adoption. Given the importance of transparency, and the demonstrated ability of organizations such as OASIS to conduct a successful process on these terms, greater transparency should clearly be required in the creation of Civil ICT Standards.
The role of government: There is ample evidence that the status that a standard can attain, and the value that certain purchases place on such credentials, can have a very substantial impact on the conduct of even the most dominant and powerful vendors in the world. In the case of document formats, we have seen that the non-legislative action of a single US state — Massachusetts — dramatically accelerated the credibility of ODF, motivated enormous efforts on the part of many individual as well as commercial supporters to support that standard, and forced Microsoft to take open document formats far more seriously than it is likely to have done otherwise. Increasing interest in the importance of document formats by other governments (and regulators), especially in Europe, has further motivated supporters of both formats, and brought about more movement by Microsoft.
When governments commit to procure only software based upon truly open document formats implemented by multiple competing products, that promise tells both proprietary and open source developers that a sufficiently large market will exist to reward the substantial effort required to produce robust and compliant products. Such governments have provided the first credible incentive for market participants to compete on the desktop in almost two decades. This in turn has provided incentives to Microsoft to more aggressively innovate there as well, rather than simply seek to maintain its installed base while maximizing profits. One need only look to the historical intervals between releases of a product such as Internet Explorer to see this predictable dynamic at work.
During the ODF-OOXML contest, some OOXML proponents have contended that standards-based government procurement preferences and legislative requirements are in some way undesirable. In fact, such actions by governments are entirely consistent with their role, as demonstrated by long-standing past practice. In the United States, for example, government contractors alone must abide by a wide variety of rules that are intended to pursue social goals, such as encouraging minority hiring and other rules that require the preferential award of contracts to women and minority owned businesses. The goal of each is to help historically disadvantaged classes of individuals gain equal access to good jobs, and to successfully launch businesses of their own.
However, these salutary results can only be achieved if the standards that achieve the necessary status are worthy of the benefits that they can bring to society. If that status becomes too easily available, then the legitimacy of the process is lost, as are the benefits that it could otherwise provide.
IV Conclusions and Recommendations
If the existence and importance of Civil ICT Rights and Civil ICT Standards becomes recognized, then I believe that one, or a combination, of three avenues could be employed to protect the former, and properly develop the latter.
ISO/IEC/National Body Reform: In the wake of the ODF-OOXML contest, the existing de jure standards infrastructure would need to undergo a concerted and determined process of self-review, with a commitment to institute new rules to avoid a repeat of the OOXML experience just witnessed. ISO 9001 might indeed serve as an excellent and apt reference point in this process. It should not be necessary to change the rules for standards that continue to be developed and discussed in less contentious settings, given that a serious failure of the system is a rare exception rather than the rule.8
The following are samples of minimally intrusive rules that could be considered for use where Civil ICT Standards are involved:
- Set more stringent rules for standards that hope to receive preference in government procurement.
- Provide a mechanism whereby interested parties can assert Civil ICT Standard status for a given type of specification, for determination by a neutral, in the case of disagreement.
- Provide for "circuit breakers" that can interrupt the normal process and appeal for a determination of the issue at hand, when warranted and necessary. A right to appeal the appropriateness of the Fast Track process for a 6,000-page specification at the very beginning would have avoided many of the issues, and less than desirable results, that followed.
- Require more than one fully compliant implementation of a proposed Civil ICT Standard before it can be submitted for consideration.
- Utilize a more stringent set of rules and requirements relating to transparency and avoidance of undue influence, at both the National Body and the ISO/IEC level.
Form a new global body: In a previous issue of Standards Today9, I reviewed areas of weakness in the existing infrastructure for developing ICT standards, and made a proposal for a new type of entity that would not replace either the existing consortium or de jure standards development systems. Instead, it would create standards by which standard setting organizations of both types (SSOs) could be judged (and, indeed, ISO and IEC as well). Those standards could address the quality, openness, broadness of representation, and other relevant attributes of SSOs, thus providing a more rational basis whereby their work product could be judged by governments, customers, and communities of interest. SSOs and standards could also be rated on other criteria, such as the environmental impact of their standards.
Once such standards existed, market forces could be expected to assert themselves, with governments and some customers giving greater respect to standards from more highly rated SSOs, and to standards that had met openness, representation, and other relevant process criteria. Over time, SSOs in general could be expected to improve, in order to gain access to the best available work (SSOs have competitors, too), and to ensure the widest uptake of their standards. Indeed, there are hundreds of standards bodies of this type already in existence, certifying the credentials of all manner of professionals and the ethics, processes and other activities of businesses of all types. Needless to say, specifications would only be eligible for adoption by ISO or IEC as Civil ICT Standards if they had been created by organizations that satisfied the appropriate ratings.
Involve government: Unless one of the two alternatives suggested above, or some other equally efficacious method, is applied by the private sector, there will be no way to influence the marketplace other than by government action. And, in fact, this has already begun to occur, through the exercise of the very significant procurement power of some governments. It was, after all, the decision of one small commonwealth in the United States that initially ignited the document format standards war. And irrespective of the adoption of OOXML, it can be expected that citizens in some cases may advocate for, and governments in other cases may prefer, to implement ODF rather than OOXML, for a variety of reasons that relate to how the OOXML process was conducted, and in order to influence future activities in the marketplace.
Of course, if governments become convinced that Civil ICT Rights are as important as historical civil rights, then governments may become just as involved in their creation, adoption, and enforcement as their constitutional analogs. If governments do come to such a conclusion and the private sector has not taken up the challenge to protect such rights through its own action, then they will have to cope with the regulatory consequences.
Summary: I am convinced that today we have a problem that requires attention. Although Microsoft might have hoped otherwise, OOXML has provided clear evidence of a dangerous inability on the part of the traditional standard setting infrastructure to satisfactorily address Civil ICT Standards in the face of fierce commercial pressures. This flaw in the existing system is in urgent need of a solution.
The fresh memory of the contentious, severely flawed, and very publicly reported process that has just ended should assist us by providing the incentives needed to begin to grapple with the difficult issues that stand between where we are today, and where we need to be. I believe that it is very important that we do so successfully, and soon, because the speed of technological innovation and adoption is far outrunning the recognition and protection of Civil ICT Rights and Civil ICT Standards.
Nor should it fail to be mentioned that the stakes for society are even higher than I have thus far suggested, because the questions raised in the context of Civil ICT Standards extend beyond the field of ICT. Standards of equal importance are urgently needed in areas such as global warming, and will tell us what we can and cannot do except at our peril, how we will determine whether we are winning or losing that battle, and how we can protect our environment from further degradation. Unless the private sector adopts higher standards and more rigorous processes of its own, government will be forced to intercede here as well — but only after much long term damage has already been done.
So it is that we see that what happened in ISO/IEC JTC1 and in National Bodies around the world was about far more than whether Microsoft would win and IBM and its allies lose, or vice versa, even if that has been the immediate and superficial result. In a larger sense, the battle that has just ended has been about fundamental human rights, about not only seizing but also securing the opportunities of the future for the benefit of all. Only by thinking clearly and deeply about these larger issues will we be able to adapt the practices of the past to meet the challenges of a future that has already arrived, whether we wish to realize it or not.
Copyright 2008 Andrew Updegrove
1 It should be noted that while the following summary is intended to be objective, I have been a consistent and vocal advocate for denying ISO/IEC JTC1 adoption of OOXML, for reasons that include my belief that approval will support continued domination of the desktop by a single vendor, thereby diminishing the competition and innovation that widespread adoption of ODF may bring, the inappropriateness of the Fast Track process for so large and initially flawed a specification, leading to low quality in the final result, the fact that approval will reward Microsoft's continuing refusal to support ODF natively, proposing its own standard instead, and the likelihood that approval under such circumstances will encourage, rather than discourage, similar conduct by Microsoft and other vendors in the future.
2 In February, the Wall Street Journal reported that European regulators were inquiring into whether Microsoft had violated antitrust laws during the voting period. See Forelle, Charles, Microsoft's Office Push Scrutinized by EU, February 8, 2008, at http://online.wsj.com/article/SB120242867034452081.html?mod=technology_main_whats_news accessed March 30, 2008.
3 In one of several open letters, he referred to the OOXML process, with all of its flaws, as a "Poster Child for Standards Development," and criticized OOXML opponents as follows: "The OpenXML project has made a large amount of progress in terms of the openness of its project development. Objections that do not recognize that are focusing on what they want to see and not what is actually happening with OpenXML." See http://www.durusau.net/publications/OpenXMLPosterChild.pdf, accessed March 30, 2008.
4 In an interview, Durusau noted: "The other question that only time will answer is whether OOXML will be so complex and lengthy that it will have a universe of adopters of 1." See, http://www.bibfor.theoconsult.de/?p=39, accessed March 30, 2008.
5 Jonathan Zuck, president of the Association for Competitive Technology, which represents 4,000 businesses in the United States and Europe, including Microsoft, and supports ISO/IEC adoption of OOXML, stated: "This is a purely commercial battle masquerading as a principled debate over open document standards." See O'Brien, Kevin, Vote on a Web Standard Too Close to Call, International Herald Tribune, March 30, 2008, at http://www.iht.com/articles/2008/03/30/technology/msft31.php, accessed March 30, 2008.
6 The vote to which I allude was taken in the first round of voting, which ended on September 2, 2007, and was reported to me by the chair of the committee in question. The vote was 8 to 1 in favor of disapproval, and was held within a National Body whose rules require unanimity of vote in order to record a position. The result was that the National Body registered a vote of "Abstain." Note that under the complex voting rules that apply in ISO/IEC JTC1, abstentions are disregarded, and thus can be almost as useful to securing adoption as a vote to approve.
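The arithmetic behind that observation can be illustrated with a short, hypothetical sketch. The vote counts below are invented, and the two-thirds approval threshold is stated here as an assumption about the applicable JTC1 criterion rather than a quotation of the rules:

```python
# Hypothetical illustration: where abstentions are excluded from the
# denominator, converting a single "disapprove" into an "abstain" can push
# an otherwise failing ballot over an assumed 2/3 approval threshold.

from fractions import Fraction  # exact arithmetic, no rounding surprises

def approval_ratio(approve, disapprove):
    """Ratio of approvals to votes cast; abstentions never enter the count."""
    return Fraction(approve, approve + disapprove)

THRESHOLD = Fraction(2, 3)  # assumed approval criterion

# 20 approvals against 11 disapprovals: 20/31 falls short of 2/3.
print(approval_ratio(20, 11) >= THRESHOLD)  # False

# The same ballot with one disapproval turned into an abstention:
# 20/30 exactly meets the 2/3 threshold.
print(approval_ratio(20, 10) >= THRESHOLD)  # True
```

This is why, as noted above, an abstention can be almost as useful to securing adoption as an affirmative vote.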
7 I have collected excerpts and links from nine first-hand accounts of delegates representing a total of seven National Bodies here: http://www.consortiuminfo.org/standardsblog/article.php?story=20080309054524379#delegates
8 That said, I believe that the rules should change systemically in areas such as transparency and the exercise of undue influence through methods such as "stacking" committees.
9 See, A Proposal for a New Type of Global Standards Certification, ConsortiumInfo.org, Standards Today, Vol. VI, No. 8, October-November 2007, at http://www.consortiuminfo.org/bulletins/oct07.php#feature
Sign up for a free subscription to Standards Today.
return to top
Tracking the Man with the Gavel:
Alex Brown on the BRM
Date: January 30, 2008
As many of you are aware, Alex Brown will be the "Convenor" of the OOXML Ballot Resolution Meeting (BRM) that will run from February 25 through 29 in Geneva, Switzerland. Alex has a variety of unenviable tasks, including:
Trying to interpret various standing Directives and other ISO/IEC JTC1 rules and practices that were created for what might be described as kinder, gentler times (not to mention for shorter specifications).
Figuring out how to process c. 1,000 comments (after elimination of duplicates) during a 35-hour meeting week, with no currently contemplated possibility of an extension.
Herding 120 cats, some of whom will have strong opinions on individual points, others of whom will have alternative suggestions on how to resolve a given point, and many of whom may be just plain bewildered, due to the lack of time to be fully prepared.
For better or worse, the rules that Alex will be interpreting and applying are not as comprehensive, and certainly not as detailed, as the situation might demand to put everyone on exactly the same page regarding what should (or at least could) be done at many points in time. As a result, knowing how Alex's thoughts are shaping up is both interesting and important. To his credit, he has been generous about sharing those thoughts, and often how he arrived at them, at his blog, which can be found here.
While I've often linked to Alex's blog and have had a permanent link in the "Blogs I Read" category for some time, I'd like to point to Alex's latest entry, which covers several important points that others have recently blogged on. In many cases, Alex comes out differently from some others who have stated firm opinions, and since Alex has the gavel, his opinion will be the one that counts.
Alex's latest blog entry, posted yesterday, is titled Tracking OOXML issues; here are some of the things that I found instructive.
For example, last week I posted a blog entry where I noted the difficulty that delegates would have preparing for the BRM, given the number of comments and the lengthy proposed dispositions document (2,300 pages). Rick Jelliffe took issue with that, posting the following in a comment:
Third, a reviewer for a national body will be primarily interested in the comments from that body. In most cases, the NBs comments are much less than 100 comments.
Fourth, within each National Body, different members of the committees (and delegations) have different interests and strengths, so of course there is a division of labour. There will be touchstone issues for each reviewer that they will often check thoroughly, and flip through the responses to issues that don't interest them or which are out of their expertise. There will also be some die-hard reviewers who will want to read all the responses, but they would not be the majority, for a large standard.
Where did you get the idea that every person needs to understand every part of every response? That is not the way things work in large standards, it is committee work. Why don't you join an international standards committee, so you can enhance your excellent prose with experience? I am sure everyone would benefit.

Among other things, I noted that this is hardly a normal standards situation; that some delegations will consist of only one individual, who will therefore need to cover all comments; that many National Bodies (NBs) were indeed interested in the comments of other NBs; and that the dispositions of comments can cause new issues, which can be of concern (something Rob Weir wrote about a few days ago here).
So there you have the view of two pundits. What's the real story? Here's what the man with the gavel has to say:
The work however does not end there as the UK must finalise its view on other NBs' comments too. As the JTC 1 Directives explicitly state, the reason why all NB comments are distributed is to allow all NBs to form an opinion on all of them:
Upon receipt of the ballot results, and any comments, the SC Secretariat shall distribute this material to the SC NBs […] The NBs shall be requested to consider the comments and to form opinions on their acceptability. (13.6)
By extension, of course, NBs shall naturally be considering Ecma's responses to these comments too. It is this considered national position that delegations will be taking to Geneva:
NBs […] shall appoint to the ballot resolution group one or more representatives who are well aware of the NB's position. (13.7)
So, NBs need to do their homework so that delegations arriving at the BRM in Geneva are fully briefed. The delegation should ideally know their national position on all 1,000 or so distinct comment/responses that could be discussed. It is the responsibility of the delegation to faithfully represent their national position (not individual divergent delegate views), and to be prepared to respond to any fresh issues that arise in line with guidance their NB has given them.
The point of the above isn't to show that I'm right and Rick is wrong, but rather that it doesn't matter — what matters is what Alex thinks, and what the Directives say, and not what people may be used to from previous, less contentious settings, where keeping a copy of the Directives at your elbow, and then acting in strict compliance with them (or at least with one's good faith interpretation of them), was not as important. And that's why I'm going out of my way to point you to Alex's blog, so that you can keep current with how his planning for the BRM is progressing.
Here are some other significant items from Alex's latest blog entry:
Given the five day time limit of the BRM, a frequently asked question is: how can 1,000 issues be addressed in the time, even if NBs already know what their position is? The answer, I think, must lie in paper voting. I am sure that the overwhelming majority of meeting resolutions will be decided by voting (as allowed for by the JTC 1 Directives), and delegations will be given lengthy voting papers allowing them to approve, abstain, or disapprove for any proposed resolution. The voting papers are likely to have three kinds of proposed resolution listed on them:
Verbatim responses from Ecma's proposed disposition of comments (as contained in the document published by SC 34 as N 980)
Ecma responses that have been amended by the BRM
Fresh responses arising from BRM discussion
For the latter two types, consensus might well be reached during in-session discussion, in which case there is obviously no need to put the proposed resolution to the additional test of a redundant vote.
Anyone who has had to bother with paper voting in a crowd will know that even this will be quite time consuming. Hence, the definition of what "consensus" means will likely become quite important. Here's what Alex has to say on that point:
In ISO (and as adopted by JTC 1), the word "consensus" has a specific meaning:
[…] general agreement, characterised by the absence of sustained opposition to substantial issues by any important part of the concerned interests and by a process that involves seeking to take into account the views of all parties concerned and to reconcile any conflicting arguments. Consensus need not imply unanimity.
Different meeting chairs take different approaches to determining consensus. In general, if the existence of consensus is not beyond doubt on any issue at the BRM, it will be deferred to paper balloting alongside the undiscussed issues.
So what happens when the clock runs out, especially if not all 1,000 comments have been addressed to the satisfaction of the NB delegates?
I asked Alex that question in a comment to his preceding blog entry, which he answered as follows:
I'm curious how the decision was made to take one week, without provision for an extension? Clearly, no one wants to spend forever discussing one standard, but other than the practical reality that five days is the space between two weekends, there doesn't seem to be any magic to that amount of time.
[Alex] The decision to schedule the BRM strictly as a five-day meeting was taken at a very high level within ISO/IEC. I am informed that such timetabling has happened before.
Certainly, having a deadline will help people focus, but deciding in advance the amount of time that can be spent doesn't seem to serve anyone very well otherwise. It seems that with this much to cover, you're as likely as not to be unable to cover everything to everyone's satisfaction.
[Alex] You're right in that some people from both "sides" consider 5 days too little. However, on the other hand some standards veterans think this DIS is taking quite enough time already, thank you very much!
If that's the case, some people are going to be unhappy, no matter which way the final vote comes out — either proponents will be unhappy, if DIS 29500 fails because some NBs think it's not "done," or opponents (and neutrals) will be unhappy, because they think that the standard is now adopted, even though it's not "done." And the process itself is shown to be inadequate to the challenge as well.
[Alex] Ecma chose to fast track the DIS, knowing perhaps more than anyone about both the spirit and the letter of the Fast Track process. In the end, it's up to the NBs whether the DIS gets passed or not, and a consideration of whether the process is adequate is a perfectly legitimate question they can ask themselves in making their decision, as this FAQ item makes clear.
I found Alex's last comment particularly interesting from a strategic point of view. As I've repeatedly noted in a variety of prior blog entries over the past two years, Microsoft has adopted a high-risk strategy by pushing OOXML so aggressively through the Ecma, and then the ISO/IEC JTC1, process. Already, it has received one setback, in that its failure to gain approval in the first voting period has resulted in much bad press, and a seven-month delay (through the expiration of the second consideration period, which will end on March 30).
If those extra months had been invested in a voluntary extension of the Fast Track period, perhaps more comments could have been resolved to everyone's satisfaction prior to the BRM. On March 30, we'll find out whether this pedal-to-the-floor strategy succeeds or backfires, because if not enough votes change to the plus column, then any reconsideration of OOXML will take a very long time under JTC1 rules.
Until then, Alex's blog is the place to stay in touch with What Happens Next. Hopefully, he will still be in as good cheer at the end of the day on February 29 as he was when he closed his latest blog entry, as follows:
This will be no love-in: I am expecting some hard work and high-quality technical discussion!
I'll give odds he's right on the first point. It will be interesting to see if his hopes are fulfilled on the second.
Bookmark the Standards Blog at http://www.consortiuminfo.org/newsblog/ or set
up an RSS feed at http://www.consortiuminfo.org/rss/
Copyright 2008 Andrew Updegrove
WAR OF THE WORDS:
Chapter 4: Eric Kriss, Peter Quinn and the ETRM
This is the fourth chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
By the end of December 2005, I had been blogging on ODF developments in Massachusetts for about four months, providing interviews, legal analysis and news as it happened. In those early days, not many bloggers were covering the ODF story, and email began to come my way from people that I had never met before, from as far away as Australia, and as near as the State House in Boston. Some began with, "This seems really important — what can I do to help?" Others contained important information that someone wanted to share, and that I was happy to receive.
One such email arrived just before Christmas in 2005. In its entirety, it read:
Enjoy reading your consortiuminfo blog … keep it up.
Happy New Year,
This was a pleasant and welcome surprise. Until the end of September, Eric Kriss had been the Massachusetts Secretary of Administration and Finance, and therefore Peter Quinn's boss. Together, they had conceived, architected and launched the ambitious IT upgrade roadmap that in due course incorporated ODF into the state's procurement guidelines.
Naturally I responded to Kriss's email, and suggested that perhaps he might consider granting me an interview. The answer was "no." But over the next several months Eric was nonetheless willing to help me out from time to time when I needed insight into current events as they evolved — as they rapidly began to do, starting with the abrupt resignation of Peter Quinn less than a week later.
Eric usually responded to my questions with single-sentence emails or pointers to material already available online. His reluctance to share more than might be deemed appropriate after serving in government was as frustrating as it was admirable. Inevitably, the exchange reminded me of Bob Woodward's erratic interactions with the gnomic source he called "Deep Throat" during the Watergate era. But early in April I received an email from Eric that included not only several helpful links, but this message as well:
Also, I've been thinking about your offer of an interview and perhaps the time is right.
We settled on a time and place, and a few weeks later I was waiting for Eric in the lobby of a business hotel on Route 128, looking out over the offices of technology companies large and small.
When Eric arrived, my first reaction was to wonder if someone faking his email address had spoofed me. I had never seen his picture, but knew that he was 57, had helped Mitt Romney launch in 1983 what grew to be a $50 billion private equity fund, and had also served in state government before, in Massachusetts Governor Weld's administration from 1991 through early 1993. That first time around, he had been both the Chief Financial Officer of Massachusetts as well as its Assistant Secretary of Administration and Finance. In between, he had been CEO of several very successful technology-based companies, two of which he had founded. Others he had been brought in to turn around — and did. But the person shaking my hand looked twenty years younger, and was wearing a T-shirt, sneakers and jeans — hardly the stereotypical Bain Capital, Beacon Hill type.
Had I done my homework more thoroughly, I would not have been surprised, as Eric Kriss is a man of many parts, most of which aren't usually found in the same package. In addition to the stints in coat and tie, he had written many articles, reviews and anthologies, authored four books on music (three of which focused on popular piano styles), produced and performed on a Grammy Award-nominated album, and recorded a CD of his own music. Depending on the year, you could find him as either the CEO of a major Victoria's Secret lingerie supplier, or the Managing Editor of Shuttle, Spindle & Dyepot, a glossy quarterly for craft weavers. In other words, not someone who would be likely to feel constrained by the usual boundaries of state government and finance, let alone document formats.
As we settled in, Kriss told me that he had never expected to reenter public service after leaving the Weld administration in 1993. But some time after he merged a company he led called MediQual into a larger company, the phone rang. It was his old friend Mitt Romney, telling him he was now running for governor. Kriss agreed to help out, and in March of 2002 became Romney's chief policy advisor, working up stands on topics like education reform, tax policy, and "smart growth." After Romney was elected governor in November of 2002, Kriss led the transition team. Perhaps inevitably, in January of 2003 he found himself part of the administration, sitting in the same offices he had expected never to return to a decade earlier. Only this time he was not Assistant Secretary of Administration and Finance, but Secretary, responsible for annual budgets of $25 billion in operating funds and $2 billion in capital spending.
As numbers and a title like that would suggest, the Secretary of Administration and Finance wore many hats. The former entrepreneur, editor, turnaround specialist, musicologist and fund manager was now responsible for overseeing an equally diverse range of functions, including tax collection, collective bargaining, purchasing, construction and real estate — and the state's information technology (IT) infrastructure.
Kriss wasn't trained as an engineer himself, but he did have a strong background in technology-based, fast-growing companies, and most recently had founded a Web development firm that he still operates today. Consequently, he was comfortable with IT, and with what goes into purchasing, scaling and managing IT. He also knew how important IT systems were to any enterprise, and particularly to those that dealt with large amounts of information — like MediQual, which had been a medical data management company. Governments, with their millions of records containing data on everything from legislation to voter data to entitlement programs, were even more data intensive, and therefore IT dependent. In his new role, Kriss knew that he would have ultimate responsibility to the governor and the voters for how well those systems functioned in the service of the electorate.
One of the people he quickly got to know was Peter Quinn. When he met Quinn, Quinn's title was the vague and uninformative "Commissioner of the Information Technology Division" (ITD). Kriss recalled from his Weld administration days that whenever the then-Commissioner of the ITD was introduced to someone, the inevitable reaction was a blank stare. The first time Kriss and Quinn entered a meeting, Kriss therefore decided to utilize what he described to me as "the general ability of executive heads to make up the titles they wish." As he and Quinn shook hands with their guests, Eric introduced Peter as "the state's CIO." In fact, the position did not then legally exist (it does now). But life for both of them was easier from then on, because Quinn's real role in the administration was immediately clear.
Peter Quinn was also a newcomer to state government, having come aboard during the last months of the previous administration. Peter's job, regardless of title, was managing the IT structure of the twenty agencies of the Executive Branch of state government. The CIO title, however, implied greater control than was in fact the case. In Massachusetts, each of these agencies not only created and managed its own budget, but also had the authority to purchase whatever IT goods and services it wished within its budget, once approved. As a result, while Peter had responsibility for making sure that over 50,000 state employees could communicate and create and use information, the equipment that supported all of those desktops was an unbelievable hodge-podge of computers, software and other technology that had accumulated over the last twenty years. Or, as Quinn liked to say, "If any computer manufacturer that was ever in existence had ever made something — well, we had one of those."
All of which was hardly surprising, nor too different from what Kriss had expected. He knew how purchasing was done in Massachusetts from his previous stint on Beacon Hill, and had also done enough consulting to know that most governments spend only about 1% of their budgets on IT systems, as compared to the 5 to 7% that large businesses dedicate to the same purpose. And yet governments were far more information intensive than the great majority of private sector enterprises of comparable size. Moreover, governments were mindful of the benefits of stability, and were typically much slower to migrate from existing systems to newer technologies. Understandable or not, Kriss thought the IT infrastructure in Massachusetts was out of date and ripe for an overhaul.
Quinn thought so as well. Like Kriss, he had spent most of his life in private industry, primarily in the financial sector. Most recently, he had been CIO of a company that managed shareholder records for mutual funds, and was used to better-integrated systems. Kriss thought Quinn was a "breath of fresh air" compared to what he had expected, and the two hit it off immediately.
Soon, they were talking about what they could do together to modernize the systems for which they were responsible. Clearly, the state's IT infrastructure needed upgrading and rationalizing. With a new governor happy to cultivate a tech-savvy image and to let a trusted lieutenant have his head, why not do the job right? Instead of simply replacing this computer and that, why not assume a clean slate, and leapfrog from the funky systems they had inherited to the type of infrastructure that an information age government should have?
As Kriss recounted the tale (and as I struggled to keep up with my note taking), it was clear that even three years later he was excited by the opportunity that he and Quinn had embraced. What had captured their imaginations was not the mundane task of reconfiguring a vast, out of date patchwork system into something more state of the art, but of designing the type of architectural model upon which such a system could be built — in Massachusetts or anywhere else. At the core of such an architecture would be two fundamental building blocks: "open source software," and "open standards." The rules for using those building blocks would be laid out in a master plan, which would function going forward as a living, periodically updated framework that they would call the Enterprise Technical Reference Model, or "ETRM." And although neither was an expert in either open source or open standards, they were excited by the prospect these tools offered of achieving two crucial end states: "interoperability" and "vendor independence."
Those aims were important for achieving certain central goals that were becoming both popular in IT circles and feasible in the marketplace. One of those goals was to make it easier for one agency to trade information with another. Such a capability might seem obvious, until the mongrel nature of the state's IT infrastructure was taken into account. Nor was Massachusetts alone in that regard, as state and national governments around the world were struggling with the same mixed environments of systems that individual vendors like IBM, HP, Sun and others had sold them over the decades, each in an effort to lock in its customers. Each of these systems was "proprietary" rather than "open," meaning that they were built to unique designs that made it difficult for the systems sold by one vendor to operate with the systems of others. In fact, even systems within a single agency often couldn't communicate with each other, because some equipment had been purchased from one vendor at one point in time to perform a particular function, while other vendors had fulfilled other discrete requirements. None of these vendors, of course, had any reason to make communication between its wares and those of its competitors simple.
Open standards promised a way to ameliorate this situation, because if two systems used the right standards, they were each referred to as "open systems," and could communicate with each other — becoming, in IT-Speak, "interoperable." In truth, it was rarely, if ever, that simple, but using open standards as pervasively as possible could make it far easier for IT managers to enable their systems to "talk" to each other. And that would save a great deal of time and money.
Kriss and Quinn believed that using open source software, instead of proprietary software, would also reap large rewards. In this case, proprietary referred to software that had to be used as it was delivered from the vendor, because the vendor not only withheld the type of computer code ("source code") that would make changing the software comparatively easy, but also made the software available under a license agreement that prohibited the customer from making any changes.
Having access to source code would have benefits for the state beyond the freedom to make changes to software whenever it wanted to. To begin with, Massachusetts could continue to maintain that software even if the original developer no longer supported it, or had gone out of business. In short, the state would become "vendor independent," because it would no longer need to worry about being abandoned by a vendor, or forced to continue to purchase upgrades it didn't really want. Moreover, once an enterprise had converted all its systems to open standards and open source software, it would have far more choices in the marketplace, and therefore could drive harder bargains among vendors, because it could fulfill the same need with more alternative products and services from more vendors.
That concept captured the visionary sides of Kriss and Quinn's personalities. But if they had spent more time in civil service, they might have been less optimistic about their ability to pursue such an ambitious goal in a political environment. After all, not only would they need to invade the turf of multiple career civil servants (some of whom worked in departments that did not report to Kriss), but they would need the cooperation of legislators who had little or no acquaintance with technology at all. Also, it was an open secret that their boss, Mitt Romney, viewed the State House simply as the launch pad for a run at the White House. His appointees, as a result, weren't likely to be long-term forces to be reckoned with.
Finally, and most fatefully (as they would learn), while overhauling the State's IT systems would benefit some large and powerful IT vendors, by definition it would also take business away from others. Those vendors not only had lobbyists, but plenty of money to spend on them. As Peter Quinn in particular would later learn, rocking the government procurement boat was not something to be undertaken lightly.
But in 2003, that was a lesson yet to be learned. For now, Eric Kriss and Peter Quinn could enjoy planning the government enterprise of the future, meeting out of the limelight with their staffs on Beacon Hill in Boston. When Kriss had accepted Mitt Romney's invitation to join the new governor's administration, he had promised himself that his stay in government would be short — just one year. But before he knew it, one year had turned into three as the ETRM slowly took shape.
March 7, 2008
#53 Steve Jobs' Endangered Second Act
In his later years, the American Jazz Age author F. Scott Fitzgerald ruefully observed that "there are no second acts in American lives." That now-famous verdict was based upon the personal experience of the once-celebrated author, by then a self-described "Hollywood Hack," reduced to writing B movie scripts for much-needed current income.
If there is a current exception to Fitzgerald's axiom in the world of technology, it must certainly be Steve Jobs. The company he founded in a garage with partner Steve Wozniak quickly seized the lead in the PC revolution, reaching $100 million in revenues by 1980. Later the same year, Apple launched the largest IPO since Ford Motor Company went public. But the introduction of the IBM PC and the rise of Microsoft wrought a reversal in Apple's fortunes, and in May of 1985 the man Jobs had recruited to be his mentor ousted him from his own company.
The rest, of course, is the stuff of which legends are made. Jobs attempted to vindicate his vision in 1985 by founding a new company that he unsubtly dubbed NeXT Computer. But NeXT never found its market: by 1993, it had sold only c. 50,000 machines. Then, at last, Jobs' fortunes began to improve.
In 1996, NeXT was acquired by Apple, which had itself been largely wandering in the wilderness during the intervening years. By acquiring NeXT, Apple not only obtained the rights to a new operating system, but it reacquired Jobs as well. Moreover, not long after leaving Apple, Jobs had bought an animation studio from Lucasfilm for $5 million, plus a $5 million cash infusion into the studio itself. He later renamed that studio Pixar, and it went on to become wildly successful, making Jobs a very wealthy man twice over.
With the fantastic success of the iPod and iTunes, the successful launch of the tectonically innovative iPhone and the rejuvenation of Mac sales, Jobs now seems poised on the cusp of proving Fitzgerald wrong to the point of stomping on the author's grave. But will he in fact pull it off, leading Apple to dominate the mobile platform of the future after surrendering the emerging PC platform of the past to his rivals?
Given Jobs' announcements of yesterday, I'm afraid that history may be about to repeat itself instead. Here's why (or, to employ my usual phrase, "Consider this…").
Yesterday, Jobs made a number of impressive iPhone related announcements, most significantly (for business users) disclosing that Apple had entered into an agreement with Microsoft whereby the iPhone will support Microsoft Exchange. Jobs contends that the result, along with further details, will allow Apple to more directly challenge Research in Motion's (RIM) dominant BlackBerry mobile device. If so, this would permit Apple to have the same impact on its competitors in the business space that it is already having in the consumer world, and perhaps more so, since the penetration of mobile devices in the business world is still much smaller than the spread of mobile phones among consumers.
Jobs also revealed news of major appeal to consumers, announcing a new release of the iPhone operating system, and here is where I fear he may be making a fatal misstep. But first, the good news: Apple will embrace the innovation of independent software vendors (ISVs), providing them with the technical information to create iPhone-based apps, as well as a ready distribution channel. From the press release:
The iPhone 2.0 software release will contain the App Store, a new application that lets users browse, search, purchase and wirelessly download third party applications directly onto their iPhone or iPod touch. The App Store enables developers to reach every iPhone and iPod touch user....Users can download free applications at no charge to either the user or developer, or purchase priced applications with just one click. Enterprise customers will be able to create a secure, private page on the App Store accessible only by their employees. Apple will cover all credit card, web hosting, infrastructure and DRM costs associated with offering applications on the App Store....
The iPhone SDK provides a reliable, fast and secure way to create innovative applications for the iPhone and iPod touch. In addition to the rich set of iPhone OS APIs, the iPhone SDK also provides advanced tools for creating native iPhone and iPod touch applications including: Xcode® for source code editing, project management and graphical debugging; Interface Builder with drag and drop interface creation and live preview; Instruments to monitor and optimize iPhone application performance in real time; and the iPhone Simulator to run and debug applications.
All of which sounds beyond wonderful, not only for Apple, but for ISVs and Apple's enthusiastic customers as well. But now the bad news: the devil is in the ellipses. Here's a part of the language that I did not include in the extract above:
Developers set the price for their applications — including free — and retain 70 percent of all sales revenues….Third party iPhone and iPod touch applications must be approved by Apple and will be available exclusively through the App Store.
And now you can begin to see why I think that rather than standing on the verge of an unprecedentedly successful second act, Jobs may be about to stage a replay of the same mistakes of three decades ago. Will developers wish to tie themselves to the whims of Steve Jobs, the same way they did 25 years ago to Bill Gates — and pay a toll for the privilege of doing so to boot? Or will they spend their time working to support more open platforms, such as Android, where there will be less control, no toll booth, and multiple channels of distribution? Haven't they all been there before?
Unfortunately, it appears that Steve Jobs learned the wrong lessons the first time around, and is fighting the last war rather than the next one. Apple failed in the early 1980s in large part because the IBM PC platform provided what at the time was a remarkably open platform for ISVs, as well as an ever-growing potential customer market as more and more personal computer buyers flocked to the "WinTel" platform instead of Tandy, Commodore — or Apple.
Soon, this "virtuous cycle" of more platforms attracting more applications attracting more customers (and back around again) became a juggernaut that no computer manufacturer could challenge with a proprietary alternative. Instead, the name of the game became simply to play the game better. While Apple's fortunes fell, Compaq and Dell became ascendant, soon eclipsing Apple in both business and consumer sales. Eventually, Apple flirted with allowing its products to be cloned, but it was too little and too late, and Jobs reversed the experiment upon his return in any event.
I find yesterday's announcements particularly unfortunate, because Jobs really does have a chance for a "do-over" here, but only if he understands the opportunity. Just as the PC revolution allowed new ventures to unseat IBM, we are at the dawn of another unparalleled shift. I recently described that opportunity in a piece I called Going Mobile: the Year of the Smartphone Startup. In that piece, I observed:
2008 will usher in a multiyear period of opportunity for entrepreneurs and investors. The dynamics will echo two boom periods of the past — the rapid expansion of the PC marketplace in the early 1980s, and the Internet explosion of the late 1990s. The device that will most robustly deliver on these antecedents is the smart phone, initially deployed (like the first personal computers) with many competing operating systems, and now able (like the PCs of the Internet boom) to satisfactorily access the Internet and the web.
In many ways, however, this boom will be better. Unlike the early, anemic, expensive PCs that people had never used before, a smart phone is simply a much more versatile telephone — something a billion people already own. With a decade of Internet and web experience behind us, there will be far fewer failed efforts to determine what people really will and won't do online. And these mobile devices will be able to perform new tricks, using as many as nine separate on-board radios to interact with an ever-expanding "Internet of things," such as ATMs, film kiosks, movie posters and much more.
But there is one extremely important difference this time around: now we live in an open IT world. If Jobs fights yesterday's war the way that Bill Gates did — by trying to create an ecosystem with Apple firmly entrenched at its center — ISVs will simply go elsewhere. Crucially, there are only so many iPhones in use today. The prize is simply not yet large enough to offset the cost to ISV independence, and thus to lure the best of them in and keep them there exclusively.
Already, various flavors of Linux are destined to power the majority of mobile devices, and the Google Android project aims to provide developers with greater independence as well. Not long ago, even the dominant telecommunications carriers grudgingly came to realize that they are better served (assuming they still have a choice) by opening their phones to independent software vendors than by shutting them out.
If Steve Jobs is truly to prove F. Scott Fitzgerald wrong, he needs to realize that the game — and the rules — have changed. Will that happen? Unfortunately, I'm not optimistic. Jobs seems constitutionally rooted in proprietary strategy, and to date has only been able to open the door a crack at a time, in the same way as Microsoft — and for Apple's benefit alone.
As I have opined before, this seems to me a tragic flaw in Jobs' leadership. Among the majors, Apple is certainly the unparalleled leader in innovation and design for the consumer market. Were Jobs to truly open the Apple platform, it seems likely to me that Apple could enjoy a substantial and sustained lead in an explosively growing market space. How much better to be king of a much larger and more profitable hill as the acknowledged master of the game than to be a latter-day, unsuccessful adopter of the Bill Gates playbook? Unless Steve can get through the openness knothole, I fear that his chance at a truly successful second act will be squandered.
The choice, of course, is his. We need not worry too much on his account, though, as the remaining years of his career are certain to end more happily than F. Scott Fitzgerald's.
After all, Jobs owns a studio.
Copyright 2008 Andrew Updegrove
Read more Consider This… entries at: http://www.consortiuminfo.org/blog/
Sign up for a free subscription to Standards Today.