The Standards Blog

Vendor Escalation, Process Politicalization, and What Needs to Happen Next

OpenDocument and OOXML

Not so very long ago, most standards were set in a largely collegial atmosphere by career professionals who met in face to face meetings over a period of years.  Along the way, they came to know each other as individuals, and established relationships that helped the process move forward and allowed for productive give and take.

While this process was not without its back scratching and game playing, at least the impact on interests other than those directly involved tended to be limited.  After all, if performance standards for light bulbs had settled out at 45, 65 and 95 watts rather than 40, 60 and 90, no end user's ox would have been gored when it came to lighting.

 

During these more collaborative times, those that defined the rules for organizations such as ISO, IEC and ANSI (the American National Standards Institute) tended not to favor simple majority voting to determine outcomes.  Instead, they put a high value on consensus, in order to lessen the chance that minority interests would be oppressed, or important technical matters ignored.  Other rules required that compliant processes provide a means whereby specific technical decisions could be appealed, so that the consensus process could not be abused.

The rules that included these principles were fairly high level, and often less than legally precise.  For the most part, this worked well enough over time, and allowed an already necessarily slow, surface mail-based process (in pre-Internet days) to move a bit faster than it otherwise would have if more detailed process protections were in place.

 

In much of the standards development world, which encompasses every area of products, services and activity, this is still largely the way the standards system operates.  But sadly, that is no longer the case where information and communications technology (ICT) standards are created.

This is now:  What we have just witnessed with the OOXML adoption process is the catastrophic failure of a system built for one purpose that has been subjected to forces that it was not designed to withstand.  Those forces included intense pressure from vendors, and even political pressure.  The tactics utilized appear to have included taking advantage of rules crafted to foster openness, placing or outmaneuvering committee chairs, and recruiting employers to pressure committee members to vote their employers’ interests rather than their own technical judgment.

This is hardly the first time that such tactics have been employed in standard setting.  But it may be the most global, and is certainly the most public example in recent memory.  Given this publicity, there is a danger that such behavior could become viewed as acceptable by those new to the process.  And even old hands may wonder whether they need to adopt similar tactics on a defensive basis in the future, resulting in an increasingly sordid “race to the bottom” of the abusive practices barrel.

For these reasons, the OOXML process just ended represents a watershed event that needs to be addressed promptly and systemically, in order to lessen the chances of repeat performances, and to deal with them effectively if they nevertheless occur.

In future entries I'll air my ideas about the specific reforms that will be needed.  First, however, I think we need to agree on what just happened, and what the problems are.  Only when that has been accomplished can we intelligently choose the path to follow in designing and instituting the reforms that are needed.

Upping the stakes:  The multinational corporations that have acted as the principal players in the OOXML drama have annual revenues well in excess of the gross national products of many of the sovereign nations that cast their votes on that specification.  Just like those nations, these powerful corporations together engage in alliances, diplomacy, and the commercial equivalent of wars (sometimes even concurrently), deciding when to escalate, and when not.  As they do, they take calculated risks.  If the assumptions they make about what their rivals will do in response prove inaccurate, these gambles can blow up.  And when they do, there can be collateral damage, just as in the real world of nations.

In this case, what we have seen is an example of commercial escalation not very different in scale and economic consequences from a trade war among nations.  And just as a major international crisis can grow from a minor event in an out of the way corner of the world, so did the ODF-OOXML standards war arise from a simple procurement decision in a small US state (Massachusetts) in the late summer of 2005.

As a result of that decision, ODF suddenly became a credible weapon that could be targeted at the soft underbelly of Office, one of the two great profit engines of Microsoft’s ongoing success.  IBM, Sun, and, less publicly, companies like Oracle and Google recognized an opportunity to undermine Microsoft’s hegemony on the desktop, and acted to support ODF.  Microsoft, predictably, pulled out all the stops to defend its franchise.

One of the steps that Microsoft adopted was to accelerate the announcement of commitments it was already finding it needed to make to its government customers by way of opening up its products.  As part of that strategy, it revealed in late 2005 that it would submit its OOXML formats to Ecma, a standards body that represented an apt choice in two respects:  first, it advertised itself as a rapid and willing conduit for introducing standards to ISO/IEC that had not been created through an accredited standards body.  And second, it enjoyed a special “Class A Liaison” relationship with ISO/IEC, which granted it specification submission rights equivalent to those held by the National Bodies themselves.

Up to this point, Microsoft’s conduct was not very different from that of its rivals, some of which had used Ecma for a similar purpose.  Most famously, Sun Microsystems had sought to use Ecma as a conduit for the standardization of  Java.  In that case, it was Microsoft that played hardball, successfully opposing Java’s progress as a standard unless Sun released greater control than it was willing to offer.

But Microsoft went further, embarking on a strategy that sought to secure ISO/IEC approval in the absolute minimum time possible, regardless of the technical challenge or feasibility of that approach.  In so doing, it hoped to limit the window of opportunity during which ODF-compliant products might make incursions into the market of government purchasers that give preference to ISO/IEC certified products.

But unlike the specification for Sun's Java, OOXML was still in a far more primitive state of documentation, due in part to the fact that not even Microsoft's existing products were based on OOXML.  Not until Office 2007 would Microsoft release a version of Office based upon extensible markup language (XML) formats, rather than the binary versions upon which Office had previously been based.  As a result, much more work would be needed before the OOXML specification could be as complete and polished as would normally be the case before Ecma would become involved.

Nor did that strategy change once Ecma became involved.  Instead of bringing OOXML up to full industrial strength, the Ecma working group concerned itself primarily with converting the specification into the proper format for ISO/IEC submission purposes before it was adopted by Ecma.  Finally, Microsoft chose ISO/IEC JTC1's "Fast Track" process for what was intended to be the next and final approval step for what had now become a 6,000-plus page specification.  From the perspective of meeting the needs of anyone other than Microsoft, there was never a valid reason for such inappropriate speed.

The result was that at every step of the way, Microsoft generated ill will and a great deal of extra work for those in National Bodies around the world who were saddled with evaluating the formidable specification.  During a one month "Contradictions" period, many criticisms were offered - but Ecma concluded that none needed to be addressed at that time.  As the specification was then pushed through the five month comment and voting period that followed, great pressure was brought to bear around the world on those in a position to approve the specification, regardless of its many remaining flaws.

Ultimately, more than 900 substantive comments were registered by many of the 87 National Bodies that voted on OOXML.

Escalation:  While Sun was largely neutralized by pre-existing agreements with Microsoft (a fact largely missed by the press, which continued to speak as if Sun’s ongoing efforts were as significant as IBM’s), IBM decided to make the defeat of OOXML a public part of its global competitive strategy, even though Microsoft had not itself acted to oppose ODF.  And IBM was uniquely positioned to do so, possessing the most extensive and coordinated network of standards professionals at work in its far-flung global facilities.  Those employees set to work to press for the stringent evaluation of OOXML in National Bodies around the world.

In doing so, IBM was able to capitalize on the rise of a populist reaction to OOXML fueled in large part by the open source community.  This placed IBM in the enviable position of wearing the white hat in a commercial battle with a competitor that already had an image problem, as well as ongoing problems with European regulators. 

That said, the public opposition to OOXML that grew around the world was genuine.  It was instrumental in generating much of the attention the standards battle received, and differentiated it from other purely commercial standards-based conflicts ongoing in the same time period (e.g., between WiFi and WAPI in China, and between the Blu-ray and HD-DVD video formats).

As the process continued, more and more reports of unsavory conduct emerged from around the world, dozens of which have been well aired, and in some cases well documented (and even admitted, as in Sweden, where Microsoft admitted that an employee offered “marketing incentives” to business partners willing to join the Swedish National Body and vote for OOXML).  The majority of the charges have been leveled against Microsoft, but Microsoft has alluded generally over the past six months to similar conduct by the opposing forces, and in the past week more specific instances have been alleged by various Microsoft employees in their blogs.  For example, some bloggers have alleged (or pointed to allegations) that anti-OOXML companies joined National Body committees late in the day as well (to which some anti-OOXML readers replied that this was a necessary defensive reaction).

Rationalization:  Now we are entering the stage where we need to determine what happens next.  Jason Matusow argued earlier this week that entering into an appeals process is simply an example of a continuing, healthy exercise in standard-setting in a commercial environment.  To his credit, Jason does not accuse IBM of engaging in evil behavior (although he inaccurately states that all calls for appeal ultimately lead back to IBM).  Instead, he states – accurately but not sufficiently – that both IBM and Microsoft are simply pursuing their own commercial self interest.

I believe that such a characterization is both simplistic and dangerous.  In fact, this is not an example of a healthy process in action, nor is the system constructed in such a way as to permit effective appeals in some of the instances in which they are most needed, in light of the actual abuses that have been alleged.  To give but a single example, it appears that the only way an appeal against the vote of Standards Norge, in Norway, could be brought would be by the same individuals whose actions would be the subject of the appeal – making it an appeal that is highly unlikely ever to be filed.  Moreover, by all accounts, ISO/IEC JTC1 would be far happier to see things simply die down and go away than to engage in a vigorous process of self-examination.

What has changed:  In order for any sort of reform to be effective, a fundamental shift in the landscape needs to be recognized, at least in the case of what I have previously referred to as "Civil ICT Standards" (i.e., those ICT standards that play a vital role in the online preservation of civil rights such as freedom of speech, freedom of association, and freedom of creativity).  That shift is one from a time when standards were set predominantly by vendors that could be left to butt heads to their hearts' content without having much of a deleterious effect on public welfare (health and safety standards being subject to more careful scrutiny, as a rule), to one in which profoundly public impacts can be expected.  As a result of what we have just seen, we need to recognize what is essentially the politicalization of the standards development process.

There is both an obvious and a less obvious leg to this characterization.  The obvious leg is that all manner of pressures and tactics have been applied that are more typical of the political process than the standards process.  Because the standards process, and its rules, were never constructed to deal with such tactics, the rules will need to change to incorporate the same sorts of protections that the political system has evolved in order to defend itself.  Until that happens, not only can abuses be expected to attain their goals (and therefore to continue), but existing avenues of appeal will fail to curb them.

The second leg is just as important, but to my knowledge has not as yet been called out for discussion.  That leg recognizes that with the elevation of certain technical standards to the level of Civil ICT Standards, more voices and more balance need to be introduced into the standards process, and more protections provided to ensure that the Civil ICT Rights that these standards serve are in turn safeguarded. 

What needs to change:  Politicalization and the recognition of Civil ICT Rights and Standards are game changers for standards development.  What this means is that we need to pay far greater attention to concerns such as balance, representation, and process.  For example, it would be no more acceptable for open source advocates to "stack" a committee than for advocates of a single vendor, or group of vendors, since all of these groups – and other groups not present at the table at all – have a stake in the outcome.   

While engaging in appeals in the case of OOXML may expose the inadequacy of the system to address such concerns, they will not solve them, nor necessarily result in a change of the vote in question, since existing rules may not in fact have been violated.  Instead, what is needed is a neutral, systemic review of how the process failed - and how it needs to change - so that future abuses can be avoided before they have an impact, and so that avenues of appeal can be designed that would permit even National Body votes to be challenged in the appropriate case.

The time to begin that review is now.  And the way to undertake it is not through the existing appeals process.


For further blog entries on ODF and OOXML, click here

sign up for a free subscription to Standards Today today

Comments


Instead, they put a high value on consensus, in order to lessen the chance that minority interests would be oppressed.

17 of 41 ISO JTC1 P countries (Participant countries, not mere Observers) did not approve DIS 29500 (OOXML); they either abstained (in many countries amid controversy [1]) or disapproved [2]:

Canada
China
Ecuador
India
Iran
New Zealand
South Africa
Venezuela
Australia
Belgium
France
Italy
Kenya
Malaysia
Netherlands
Spain
Turkey

[1] http://www.openmalaysiablog.com/2008/03/the-minister-of.html
[2] http://www.scc.ca/Asset/iu_files/29500-OOXML-Cdn-Pub-Stmt.pdf

That is 41% non-consensus among P-members.  A big number, and a huge number if you consider that 8 of the P-members that approved this beast were ones that upgraded to ISO JTC1 P status a few days before the September ballot closed, just to cast an unconditional yes on everything Microsoft/Ecma proposed:

Jamaica island
Cyprus island
Malta island
Lebanon
Côte d'Ivoire
Pakistan
Uruguay
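The percentage in the comment above can be checked with a quick calculation (the counts are the commenter's, not official ISO tallies):

```python
# Quick check of the vote arithmetic above.
# Counts are the commenter's, not official ISO tallies.
p_members = 41        # JTC1 P-members voting on DIS 29500
not_approving = 17    # abstained or disapproved, per the list above

share = not_approving / p_members
print(f"Non-approving share: {share:.0%}")                  # 41%
print(f"Approving P-members: {p_members - not_approving}")  # 24
```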


Thanks for this great summary of the most essential issues. The OOXML saga raises a debate on standards themselves that goes beyond the technical merits.

It is my understanding that the consensus rule is not just there to protect minority interests.  It is also a tool to ensure technical quality.  People who notice technical deficiencies are often in the minority, at least when they make the initial report of the defect.  When the minority is allowed to veto a deficient standard, there are strong incentives to fix it.

With OOXML, we have witnessed that undermining the consensus-building process results in an inability to ensure technical quality.

To the first comment:  unfortunately, consensus rules can be abused, just as majority rules can be abused.  While I was in Geneva, I spoke to the chair of one National Body where the vote was 8 to 1 to disapprove OOXML last summer.  The one vote was cast by an employee of – how to say this delicately – the company with the greatest interest in the vote.  Unfortunately, the rules of that NB required unanimity - and hence the vote changed to an abstention.  Obviously, any rule that is based on the assumption of good faith and fair dealing is vulnerable to those who choose not to act in sync with those underlying assumptions.

One of the challenges in any rules reform will be not to sacrifice the benefits of the existing rules in the normal situations where new rules would hurt, or at least impede, progress rather than help.  That's one reason why I think circuit breakers could be useful.  What I mean by this is introducing a mechanism whereby two or three participants could call for a quick decision by a neutral overseer, who could say "yes, we need to look deeper," or "no, you're the one playing games - get back to work!"
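A minimal sketch of that circuit-breaker idea, with the threshold and participant names invented for illustration:

```python
# Sketch of a "circuit breaker": once enough participants flag a process,
# a neutral overseer is asked for a quick "look deeper" / "get back to
# work" ruling. Threshold and names below are invented for illustration.

class CircuitBreaker:
    def __init__(self, threshold: int = 3):
        self.threshold = threshold
        self.flags: set[str] = set()

    def flag(self, participant: str) -> bool:
        """Record a call for review; return True once the overseer should rule."""
        self.flags.add(participant)
        return len(self.flags) >= self.threshold

breaker = CircuitBreaker(threshold=3)
print(breaker.flag("NB-Alpha"))  # False - one flag alone changes nothing
print(breaker.flag("NB-Beta"))   # False
print(breaker.flag("NB-Gamma"))  # True - threshold reached, overseer decides
```

The point of the threshold is that no single participant can stall the process on their own, while a small group of good-faith objectors can still force a quick neutral ruling.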

And, of course, no rules can be totally successful in any event.  The US Congress is an excellent example of a democratic institution where all manner of procedural tricks of the trade can swing a vote to one's advantage or add in some self-interested rider.  That is why bringing transparency to the standards development process is so important, in my view: so that when abuses do occur, they come to light.  And it is also why some sort of appeals option - one that is accessible to affected, but otherwise disinterested, stakeholders - is a necessary part of the solution.

Wayne and Terry: 

Thanks for the kind words.  I think it's worth pausing from time to time and reviewing what's happened to get things back in some sort of perspective, especially when so many new things are happening so quickly.

  -  Andy

Hi Andy,

Work commitments have forced me back into lurker mode for the past few months, so I'm not as well versed on details as I used to be, but I'm making an exception here because this seems quite important.

I think I agree with the principle of Civil ICT Rights, but I wonder if this particular problem can't be solved much more simply: automatically disallow any standard over 30 pages from any of the fast-track processes.

My inspiration for this idea comes mainly from the UK NB's old argument that large standards are inappropriate for fast-tracking. Alex Brown occasionally mentions that they lost this argument, so I would be interested to know the counter-argument to their position.

Over the past year, two interesting facts I've learnt are that most standards are very short - on the order of a handful of pages - and that producing a good standard takes about one day of work per page. As such, limiting standards to a maximum of 30 pages would allow through the majority of standards, which can be effectively processed within the one month contradictions period, and deny the minority where the politicians have the capacity to overwhelm the technocrats.

Obviously, this would mean ODF couldn't be fast-tracked either. This strikes me as beneficial for two reasons:

First, it makes it harder for ISO to play favourites. An important issue in this process has been that access to the fast track provides a significant competitive advantage. Disallowing that access to almost all standards of comparable sizes serves to reduce the temptation for people to game the system.

Second, 850 pages is still 850 days of work, not 30, or even 180. I've never really heard a good rebuttal to Brian Jones' argument that a lack of such basic features as formulae makes it more like ODF 0.8 than ODF 1.0. Whether or not he was right, the very lack of such a rebuttal suggests that the 6 month review period wasn't enough to address such issues.

Limiting fast-tracked standards to 30 pages strikes me as a simple, practical way of protecting the ISO system, that can be enacted without asking anyone involved to eat crow.

 - Andrew
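The one-day-per-page heuristic works out roughly as follows (the rate is the commenter's back-of-envelope estimate, not an official figure):

```python
# Back-of-envelope review-time check using the commenter's estimate of
# roughly one day of work per page (an assumption, not an official figure).
DAYS_PER_PAGE = 1

for pages in (30, 850, 6000):   # proposed cap, ODF, OOXML (approximate sizes)
    days = pages * DAYS_PER_PAGE
    print(f"{pages:>5} pages -> ~{days} review days (~{days / 30:.0f} months)")
```

On that estimate, a 30-page standard fits inside a one-month contradictions period, while an 850- or 6,000-page specification does not come close.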

My inspiration for this idea comes mainly from the UK NB's old argument that large standards are inappropriate for fast-tracking. Alex Brown occasionally mentions that they lost this argument, so I would be interested to know the counter-argument to their position.

Huh? The counter-argument you are looking for has been plastered all over the place, and the main thrust of your post is based on it. When a standard exceeds a certain size, you just can't complete an in-depth technical review in a timely manner. The proof is in the result, and the known technical deficiencies of OOXML have been quite well documented. Claiming the argument has been lost disregards the actual technical quality of OOXML.


I've never really heard a good rebuttal to Brian Jones' argument that a lack of such basic features as formulae makes it more like ODF 0.8 than ODF 1.0. Whether or not he was right, the very lack of such a rebuttal suggests that the 6 month review period wasn't enough to address such issues.

Standards can have scope. Problems can be broken down into smaller pieces to make them more manageable and ensure timely completion. It is better to target a smaller scope and do it right than to try to do everything at once and get nothing right. With the ODF approach, additional features are added incrementally when they are ready. With the OOXML approach, you rush the approval without completing the technical review of anything and defer all problems to maintenance. By that measure, if you consider ODF to be 0.8, OOXML is 0.01. This is the rebuttal you are looking for.

By the way ODF was not Fast Tracked. It used the longer PAS process.

Clearly I'm rustier at this than I thought, having managed to mess up both the content and formatting of my previous message :). Here's hoping this one goes better...

First, I should have been more clear about the FT/PAS issue. When I talked before about "fast track processes", I was referring to both the FT and PAS process, as well as any similar process that I'm not aware of. Sorry for the misleading use of language - I'll use the word "accelerated" in future.

My understanding of the UK national body's position is that large standards like ODF and Office Open XML should always be put through a long standardisation process - a position I agree with. From my reading on the issue, I had thought that this position had been argued through and rejected in the general case within the ISO, for reasons with which I was not familiar. When I talked about being interested to know the counter-arguments, I was referring to the arguments that counter the UK's position - arguments for why large standards are appropriate for accelerated handling.

Going back over old comments, I probably misunderstood the situation anyway. For example, in this thread on his blog, Alex Brown left a comment dated 2007-12-11, 03:25 stating that the UK had "lost the argument" about fast-tracking DIS 29500. On reflection, it seems more likely that he was referring here to the specific case of Office Open XML, rather than large standards in general.

I've always been very impressed by the engineering quality that goes into ODF - taking the extra time to produce a really good formula proposal was definitely the right move, for example. However, that doesn't imply that JTC1 can verify that an 850 page standard is up to scratch in an accelerated time-frame. My point in referencing Brian Jones was to show an argument that has not been properly debated before now, not to assert that he was definitely right. A better example might be to go back to Alex Brown again - in a comment marked 2008-04-05, 08:13 in this thread, he claimed that there are more than 100 known defects in ISO/IEC 26300, which suggests that they would have benefited from having more time. For what it's worth, I'm sure that DIS 29500 will gain plenty of defects in time as well, which would suggest that neither format was appropriate for accelerated handling.

Finally, I want to make it clear that I'm not trying to make an argument about the technical quality of Office Open XML, or about ODF. My interest in this thread is solely about discussing ways in which the ISO can improve its procedures for evaluating technical quality.

- Andrew

P.S. a note to Andy - in my previous post, I selected the “plain old text” post mode, but the preview interpreted my HTML, hence my confusion. Please change the preview function in your blogging software so that it shows angle brackets etc. properly.

I am not familiar with the arguments that convinced ISO that large standards can be suitable for accelerated handling.

My guess is it has to do with the degree of confidence they had in the work of upstream standardizing organizations. If a standard has already undergone adequate review and there are already multiple working independent implementations, why should they go back to the drawing board at ISO?

The problem is when the accelerated procedure is used to meet the time to market constraints from the promoter of a rushed specification.

"My understanding of the UK national body's position is that that large standards like ODF and Office Open XML should always be put through a long standardisation process - a position I agree with."

The point is that ODF had been under active standardization development for many years, with everyone willing or interested involved, and the process was completely open, with minutes online. The ODF team could not have done more before standardization.

On the other hand, OOXML was developed WITHOUT input from people not chosen by MS, and in a very short time. Serious work only started AFTER ODF became an ISO standard. Ecma threw out occasional drafts during a 6 month period, but that was all the world saw of OOXML until December 2006. And that for a standard almost 10 times the size of ODF.

OOXML did have a formula "definition" in the end. However, this definition proved to be a dump of Excel help files. They broke several established mathematical principles. And they were not scrutinized at all, just printed out from an Office manual. The chain of events strongly suggests that these Excel formulas were only thrown in to trump ODF on SOMETHING, anything.

The idea that an office document format is broken unless there exists a full and final definition of each and every spreadsheet formula is simply propaganda.

Winter

I suspect that if the truth were known, MS via Ecma got their proposal into the Fast Track on a technicality - that is, there was nothing in the rules that put any size limit on a Fast Track document, nor was there anything in the (recently revised) rules that said the standard had to meet any particular quality level.

Therefore MS/Ecma pushed OOXML into the Fast Track and dared the world to point to the rules that said they could not do so.  Unfortunately for the world, JvdG had recently had the Fast Track rules re-written to remove the teeth from the process - teeth that would have automatically scrapped the OOXML spec when the contradictions failed to get resolved, when the BRM could not complete in 5 days with consensus on ALL items, when the 'final' draft was not produced 30 days after the BRM's completion, etc.

"I think I agree with the principle of Civil ICT Rights, but I wonder if this particular problem can't be solved much more simply: automatically disallow any standard over 30 pages from any of the fast-track processes."
 
I don't think that idea is workable. Organizations like the IEEE are very good at defining good standards, and it does not make sense to apply a full ISO review to anything produced by them. I doubt many in the electrical engineering sector would disagree that the IEEE is the gold standard compared to ISO.

Actually, that is the root of the problem. Every standard needs quality review somewhere. With OOXML, the world got proof that Ecma is not one of the organizations that conduct a professional quality review of submissions. In the past the theory was that the NBs would stop any rubbish sent to them, but with Microsoft's stacking of NBs this last fail-safe disappeared. The problem is basically that ISO has no rules that stop late arrivals to NBs.

You make a good point that different standards organisations feed standards of different appropriateness into the ISO. For example, my understanding is that Ecma standards are about documenting current practice, which means that the ISO has some extra work to do if they want to develop best practice from an Ecma standard. Given that fact, I don't really understand why it's appropriate for Ecma to have access to the FT process - but my failure to understand things has a lousy history of stopping them from existing.

I suppose my previous comment was based on the assumption that it was impractical to require the ISO to allow only the best standards onto the fast track. Thinking about it now, three justifications for that spring to mind. First, even if (for example) the IEEE is the gold standard for standards, they could still slip up one day and let through a standard that is incompatible with South Africa's odd electrical system. Second, my understanding is that NBs have access to the FT process - how can we be certain that Trinidad and Tobago's NB has an expert on South African electrical standards? Third, ejecting a standards body from access to the FT process would have to be a difficult political exercise - how do you downgrade Trinidad and Tobago without causing ill will among other NBs that are worried about making a similar honest mistake?

That said, I don't mind admitting that "one page per day, 30 pages maximum" is something of a back-of-the-envelope metric. If you have something more reasoned, I'd be interested to hear it.

- Andrew

Perhaps we should think about these things in terms similar to a security problem.

We have a process that works well as long as everybody acts in good faith. History shows people act in sufficiently good faith more than 99.9% of the time. But when someone comes along determined to abuse the system, it is unable to defend itself.

In security there are several methods to address such things. You can build a firewall that keeps the intruders out. Alternatively, you can have an IDS/IPS that detects and ejects intruders who have entered the system before too much damage is done. Your approach is a kind of firewall that is overly sensitive and keeps out too much good stuff along with the intruder. What we need is something that detects when a standard that doesn't make sense has made it to the Fast Track and ejects it. The difficulty is how to make that "something" hard for an attacker to subvert.

To be honest, the security analogy doesn't fire my imagination personally - I don't see how to draw a line from “keep a 35-page standard in the normal process” to “drop spam on the floor”, for example. That said, I could be won over with a more solid example :)

- Andrew

OK, let's do a more solid example.

Imagine two standards proposals:

  1. GeekParadise is a solid standard developed over a seven-year process by a reputable organization. There are already half a dozen independent and interoperable implementations. GeekParadise is about 1,000 pages long.
  2. PigWash is a hastily cobbled-together specification with only one partial implementation. It is also about 1,000 pages long.
Let's assume both proposals are brought to ISO, but fortunately safeguards are in place to filter the bad proposals out of Fast Track. Let's consider two scenarios.


Scenario A: ISO uses a firewall-type mechanism, designed to let standards into Fast Track only when it is 100% sure they will not clog the system.

For the sake of discussion, let's assume the filter excludes all oversized standards, on the grounds that only small standards can be carefully examined in the time allotted by Fast Track. What happens? Both GeekParadise and PigWash are kept out of Fast Track because of their length. For PigWash this is a good thing. The specification is hasty, so all the technical work has to be done from scratch. For GeekParadise, this is a waste of time and effort. Seven years of technical work have already been done, and the quality of that work can be empirically verified by the number of working implementations.


Scenario B: ISO uses an IDS/IPS-type solution, where every proposal is allowed onto Fast Track and kicked off when it is found lacking.

What happens? For GeekParadise this works fine. They can't check all 1,000 pages in the time allotted, but it doesn't matter. The work has already been done. The sampling of pages that are verified confirms there are almost no defects, and there is no reason to believe the other pages are not of equal quality. The number of independent implementations empirically confirms this assessment. GeekParadise can be safely approved by ISO. For PigWash, they also don't have the time to process all 1,000 pages, but it doesn't matter either. A sampling of a few pages confirms this is hastily done, shoddy work with a large number of defects per page. The lack of working implementations also empirically confirms this assessment. As soon as enough evidence of poor quality is available, ISO decrees PigWash unsuitable for Fast Track and pushes it back to the normal-length process.
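The sampling idea here is essentially statistical acceptance testing: review a random sample of pages, estimate the defect rate, and eject the proposal if the rate is too high. A minimal sketch, where the page counts, defect numbers, and the one-defect-per-page threshold are all hypothetical illustrations rather than anything ISO actually does:

```python
import random

def sampled_defect_rate(defects_per_page, sample_size, seed=42):
    """Estimate a spec's defect rate from a random sample of its pages."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    sample = rng.sample(defects_per_page, sample_size)
    return sum(sample) / sample_size

def fast_track_verdict(defects_per_page, sample_size=50, max_rate=1.0):
    """Eject a proposal from Fast Track when sampled quality is too poor."""
    rate = sampled_defect_rate(defects_per_page, sample_size)
    return "stays on Fast Track" if rate <= max_rate else "back to normal process"

# Two hypothetical 1000-page specifications:
geek_paradise = [0] * 950 + [1] * 50   # mature: almost no defects
pig_wash = [2] * 600 + [5] * 400       # shoddy: several defects per page

print(fast_track_verdict(geek_paradise))  # stays on Fast Track
print(fast_track_verdict(pig_wash))       # back to normal process
```

The point of the sketch is that a shoddy spec fails on almost any sample, so no full 1,000-page review is needed before ejecting it.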

I've been assuming that a better-prepared standard will pass more quickly through the ISO, as it takes less time to process a standard with fewer issues to resolve. If I were to make the opposite assumption, I would agree with your scenario A. Admittedly I don't have any evidence for my assumption, so unless you're holding back on a long-term study into ISO standardisation times, we'll have to agree to disagree on this one :)

I find scenario B much more interesting, and I'd like to suggest the following based on it:

A 5-day BRM consists of 35 hours of work. When dealing with an accelerated standard, NBs could be required to provide an estimated resolution time with every contradiction they submitted. Then, if the sum of times requested is greater than 35 hours, the process is automatically pushed onto the slow track. If NBs were encouraged to over- rather than under-estimate resolution times, the process would fail safely (slowing standards that should be accelerated, rather than the opposite).
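A gating rule like this is simple enough to state in a few lines of code. A sketch, assuming the 35-hour budget described above (the rule is my reading of the proposal; the example estimates are invented):

```python
BRM_BUDGET_HOURS = 35  # a 5-day Ballot Resolution Meeting, 7 working hours/day

def brm_track(estimated_hours):
    """Decide whether an accelerated standard keeps its fast-track slot.

    estimated_hours: one NB-supplied resolution-time estimate (in hours)
    per submitted contradiction.  The rule fails safe: asking for more
    time than one BRM provides drops the proposal to the normal process.
    """
    return "fast track" if sum(estimated_hours) <= BRM_BUDGET_HOURS \
        else "normal process"

# A handful of small editorial issues fits comfortably in one BRM...
print(brm_track([0.5, 1, 2, 0.25]))      # fast track
# ...but a pile of substantive contradictions does not.
print(brm_track([4, 8, 10, 6, 12, 3]))   # normal process
```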

Benefits of this approach include:

  1. Limits standards by a more natural metric than length
  2. Encourages NBs to consider how a debate will play out at the BRM, leading to better discussions
  3. Provides the convenor with valuable data with which to structure the BRM
  4. Simple, transparent rule that outsiders can understand and verify

Problems include:

  1. Vulnerable to filibustering ("this spelling mistake will take 200 hours to resolve")
  2. Using the sum is very inaccurate - two NBs with identical comments will take a fraction of the allotted time, two NBs with contradictory comments will take a multiple of the allotted time
  3. Accurately guessing how long you need is an art at best

Overall, this strikes me as a strong basis for discussion. It's probably not workable in its present state, because estimating resolution times would be so hard. However, even a significantly watered down version of this proposal would provide the ISO with an additional safety valve for controversial standards.

Although this suggestion would lead to more standards being inappropriately slowed (especially in the short term, as people got used to estimating resolution times), that's always going to be the case with a system that fails safe.

- Andrew

Don't try to equate the ODF standardization process with the abuse that was used to shove MS-OOXML down our collective throats. Please read <A HREF="http://www.robweir.com/blog/2008/02/fast-track-versus-pas.html">this explanation</A> of the differences between Fast-Track and PAS submittal processes.

People who try to assign comparable genesis to ODF and OOXML are either ill-informed or are being disingenuous, to be kind.

Thanks - I'd missed that post, it fills in some blanks in my knowledge. Reading that, I can imagine an argument for abolishing FT altogether and just using PAS (not to say that I agree, just that I can see how such an argument could be made). The post makes a good argument about how PAS is less vulnerable to bad faith than FT, but my objection to accelerated processes isn't based on vulnerability to bad faith - if you made the system totally water-tight, the next move for a bad-faith actor would just be to go outside the system, e.g. to discredit the process amongst consumers.

One of my objections is that if you limit the time-frame, you also have to limit the amount of work done. Given that PAS and FT are both 6 month processes, they're comparable in that regard. I suggested limiting it at 30 pages, but I'd be interested to hear other suggestions.

Another of my objections is that if you accelerate standard A, you encourage backers of competing standard B to act in bad faith, at the risk of otherwise losing a (sometimes pivotal) competitive advantage. Even if favoring standard A is absolutely in line with the technical quality of the standards, the incentives you create are just the same.

- Andrew

Ideally, if you accelerate standard A, there won't be a standard B because the world does not need two standards covering the same space.

If you accelerate standard A, there is no obligation to accelerate standard B as well just because standard A was accelerated.  This is because each standard should (MUST) be evaluated on its own merits.  If standard B is better prepared than standard A (multiple implementations, more consistent, no NB objections, no IPR issues, open (as in OSI's definitions), etc), then standard A should be canceled in favor of standard B (even if it's already started down the Fast Track).  The idea that if you accelerate standard A, then you somehow have an obligation to accelerate standard B ***REGARDLESS OF THE QUALITY OF STANDARD B*** runs counter to the purpose of ISO which is to produce the best possible standards based solely on technical merit.

I think we're talking at cross-purposes a little bit here. I'm proposing what I believe to be a solution that can get consensus in the ISO quickly, and which will reduce the risks of standards inappropriately being accelerated. I'm not debating what the ISO might look like in an ideal world, or trying to develop a fundamental change in the organisation's approach - I'm happy leaving that to Andy's “civil ICT standards” philosophy.

Just to be clear, I'm advocating that accelerated procedures should be harder to gain access to, not easier. My point is that if standard B is not going to be accelerated, then accelerating standard A is going to cause headaches like the ones we've seen recently. My reasoning is that if gentlemanly behaviour would cost a company a significant percentage of its income, then they're naturally going to look at ungentlemanly options.

Please correct me if I'm wrong, but it sounds like you're approaching the problem from the opposite direction - "what would an ideal ISO look like, and how do we implement that?". There's plenty of value in approaching the problem that way, but it takes years or decades for such things to come to fruition, and I'm personally more interested in good solutions available today than great solutions available tomorrow.

- Andrew

I think you're coming dangerously close to confusing 'business reasons' for approving a standard with 'technical reasons' for approving a standard.

I take your point, but any corporation that feels it needs to use abusive ("ungentlemanly") means to force a standard to approval just because someone else has already done so (or is doing so) clearly does not understand the concept of 'standards'.

What should be happening (and does happen in most cases) is that the first standard to approval becomes the basis and other standards are merged into it instead of competing with it.

Your posts sound to me like you would agree that MSOOXML *must* be passed because:
   1. doing otherwise is somehow 'unfair' to Microsoft.  (Remember that MS was on the original ODF committee, but chose to go their own non-compatible way - presumably to keep their business-interests/monopoly/lock-in from being eroded.)
   2. doing otherwise gives IBM some type of advantage in the marketplace. (OpenOffice.org is *far* more popular than Lotus and/or StarOffice though built from similar code-bases.  KDE Office is also quite nice and is NOT made by IBM.)  ...as we all know, if you're against OOXML, you must be an IBM supporter - Microsoft has beaten that horse to death several times.
   3. doing otherwise results in billions of 'orphaned' documents.  (FYI: OpenOffice.org opens MS documents better than MS products at times, and as well as MS products most or all of the time.)
   4. doing otherwise might result in MS ignoring standards in the future, depriving us of a 'seat at the table' for the next Microsoft Office format.
   5. doing otherwise based on technical merit of the standard is just nit-picking because any and all technical problems with the spec can be worked out during the Microsoft-controlled maintenance period.
   6. doing otherwise and objecting to the lack of ethics involved in procurement of this spec is just whining.  After all, Microsoft got their way and that's what's important - not the quality of their spec, the number of implementations, or the inability of anyone other than Microsoft to implement the spec due to IPR and legal issues.

I agree that even in a perfect world, the temptation would be there for the 2nd vendor to try to game the system.  The fact that Microsoft did so so blatantly, so unethically, and with such callous disregard for process, rules, procedures, or the sovereignty of international NBs puts them beneath contempt, and puts their product in the worst possible light and in a category to be shunned for all time by any government or NGO that has even a shred of integrity.

-- Ed

Before I move on to the meat of this debate, I'd just like to make a few boring points:

  • The argument about accelerating competing formats is only one plank of my argument - while I think it's an important issue to discuss, it's less important to me than the issue that you can only do a finite amount of work in 6 months. While you're right that I had half an eye on business issues in my original post, it was more like I considered them a happy side-effect, and I don't want to mislead you about its importance.
  • I used the word “ungentlemanly” before rather than “abusive” because it covers a wider range of issues, and implies a fuzzier distinction. For example, if a company were to hire an excessive number of non-voting people to argue its case, it could claim that it's not an abuse of the system, because non-voters aren't a part of the system. However, it would be much harder to deny that it was ungentlemanly conduct, because it gains an advantage through quantity rather than quality of arguments. In other words, talking about ungentlemanly behaviour avoids the risk of people some day using semantics to sneak badness into the process.
  • I think another difference between our approaches is that the fundamental question you're trying to answer is “how do we stop an abuse like this from happening again?”, whereas I'm looking at “what can we learn from this experience to improve things in future?”. Again, they're both questions worth asking, but it means there'll be some friction in the debate, where you're more interested in coming down hard on rare-but-serious abuses, and I'm more interested in gently nudging common-but-minor mistakes.
  • The first ISO document format standard was ODA, which spent roughly 10 years going through ISO before being standardised in 1989. While everyone agrees that modern document formats shouldn't be merged into ODA, it's an interesting exercise to try and justify that to yourself.

With those out the way, I'll address the argument itself...

First, I'm not talking here about directly approving or disapproving a standard, only about the amount of scrutiny to give a standard before making the decision. While I'd be deeply opposed to making the decision to (dis-)approve based on anything but technical merit, I definitely think that business reality should be considered when deciding whether to accelerate a standard. It might be easier to understand if I turned the tables a bit:

Say Office Open XML went through on the fast track, but ODF was denied access to the PAS procedure because OASIS forgot to dot an 'i' or cross a 't'. ODF-based applications might then be locked out of government contracts for half a decade or more, while Microsoft embedded its format into the world's biggest bureaucracies. While KOffice and OpenOffice would probably survive, a budding ecosystem of ODF-based tools would die for lack of funding.

Obviously that's not the way things played out this time, but who's to say it won't be that way in the future? The only way to be sure of avoiding that sort of thing is to process all competing standards in a similar time-frame - and that means a long time-frame, since you can't accelerate an unready standard.

To be explicit about it, I would definitely disagree that Office Open XML must be passed. My position is that standards as large as ODF or Office Open XML should go through a long standardisation process, and only after that should a decision be taken. I'm sure that the document that would emerge after years of ISO work would bear little resemblance to the document we have today, so it would be premature to say whether I thought such a document should be passed.

On the topic of ethics and whining - I can see how my comments could be read that way, and I should really be more clear about it. I strongly believe that one should start from an ethical basis when deciding how things should be done, but ethical rules are a constant balancing act between creating incentives to do good and disincentives against doing bad. In my opinion, accelerating one competing standard incentivises backers of that standard to do good, but also incentivises backers of competing standards to do bad. However, accelerating all standards doesn't create incentives for good or bad, and leads to lower quality standards too. By accelerating no competing standards, I believe you incentivise good behaviour and disincentivise bad, because good behaviour will lead to a faster, less volatile passage through the ISO.

- Andrew