The Standards Blog

How "Ignorant of Standards" was Microsoft Really?

OpenDocument and OOXML

Regular readers will notice that I've been woefully silent the last few weeks, at first due to having too many irons in the fire, and for the last ten days due to being on a family vacation abroad, not returning until July 2.  As a result, I've been behind not only on blogging, but also on keeping up with the news, limited as I've been primarily to Blackberry access since I left.   But I thought that it might be useful to take a break and share the "Huh?!?" I experienced when I stumbled across this article by Andrew Donoghue at ZDNet while briefly enjoying an island of laptop connectivity in a hotel lobby in Florence.  The article is titled, "Microsoft admits to standards ignorance pre-OOXML" and is based on remarks by Microsoft national technology officer Stuart McKee.  Even more incredibly, it bears the following subtitle:

Microsoft has admitted that, despite being one of the dominant names in IT for over 30 years, it had little or no experience or expertise around software standards until the company was mid-way through the process of getting Office Open XML approved by the International Organization for Standardization.

Why "Huh?"  Because Microsoft has been playing the standards game, butting heads over prior technologies such as ActiveX, Java and much, much more with the best of them for decades as a member of hundreds of standards organizations.  Moreover, it has held many board seats along the way, and has had a staff of attorneys for some time dedicated to standards matters.  That staff includes the former General Counsel of the American National Standards Institute (ANSI).

Still, while McKee has over-spun the point by a few hundred RPMs, there is an important point to be made on the subject of Microsoft's standards-related capabilities, as I'll explain in greater detail below.

McKee's comments were made during a panel debate at a Red Hat conference in Boston last week entitled 'The OOXML battle: Who really won?' - a debate I regrettably had to decline to join, due to my travels.  According to McKee,

We found ourselves so far down the path of the standardisation process with no knowledge. We don't have a standards office. We didn't have a standards department in the company. I think the one thing that we would acknowledge and that we were frustrated with is that, by the time we realised what was going on and the competitive environment that was underway, we were late and there was a lot of catch-up.

Given the history as I know it, you'll have to forgive me for coughing a bit into my hand over that one, although there is a grain of truth in McKee's statement.

That grain comes from the fact that Microsoft, in truth, does not have the kind of global standards infrastructure in place that IBM maintains.  IBM's position is, I believe, unique among US multinational IT companies in this regard, with no other peer company having the first hand experience, presence and participation in standards bodies around the globe that IBM has created over the last fifty years.

IBM has enjoyed this position since Tom Watson, Sr., decided to split responsibility for the company between his two sons at the time that he stepped down from day to day control.  When Tom Watson, Jr. took over as President in 1952, the consolation prize for his brother was assuming responsibility for IBM's foreign operations.  The result was a decades-long degree of focus and autonomy that later caused some issues Lou Gerstner had to wring out in order to make IBM a leaner survivor in the mid-nineties, but also gave birth to a level of local autonomy for managers in many countries around the world that helped foster greater integration into local National Bodies.

As a result, it is certainly true that when IBM, and to a lesser extent companies such as Google and Oracle, decided to apply a full-court, global OOXML press, Microsoft was left to play catch-up ball, and lacked the type of finesse that its more experienced rival could bring to the game in countless venues around the world.  In consequence, much of its activity had to be undertaken not by standards professionals, but by a host of other staff from many disciplines.  Perhaps this is part of the reason for the number of both publicized as well as alleged missteps that attended the OOXML process.  In contrast, the ODF camp was able to make its case with very few accusations of heavy-handedness being made against it.

According to the ZDNet article, McKee had no apologies for any of Microsoft's actions however:

Microsoft did not regret any of its actions during the voting process and claimed the company was merely trying to catch up with a process that it had very little experience of.  "I think the thing is that Microsoft was really, really late to this game," he said. "It was very difficult to enter into conversations around the world where the debate had already been framed."

So if Microsoft is in fact an experienced player in the standards game, why did it allow itself to get so far behind in the standards race infrastructurally? 

The answer is doubtless in part because most of the action in IT standards occurs in consortia, and not in the traditional standards infrastructure made up of National Bodies, ISO, IEC and JTC 1.  Putting boots on the ground in the "mirror committees" for a given technical committee in countries around the world is a significant undertaking, and a far smaller percentage of IT standards are either created in the first instance through that process, or approved after development through the Fast Track process (as OOXML was, through Ecma) or the PAS process (as ODF was, through OASIS).  And few, if any, of the standards that do become so blessed by the traditional process have been as closely contested as was OOXML.  More typically, a far smaller number of National Bodies take an active interest in any single standard, and are generally interested only in improving it, rather than opposing it.  Hence, from a resources point of view, it's hard to make a case for the type of investment in standards professionals on a global basis that IBM maintains.

The result is that, notwithstanding all of the criticism that has been leveled (including by me) at the traditional National Body/JTC 1 system, it does have one thing to be said for it: it's a huge job to try to steamroll (in this case) 86 National Bodies, compared to a single consortium or accredited body - a point that Rick Jelliffe has repeatedly made.  That takes an enormous amount of time if there is serious opposition.

This time around, Microsoft was willing to make the effort, although with four appeals being processed and DIS 29500 OOXML implementation in Office now postponed until the next major release due to the number of changes to the OOXML draft along the way, the jury is still out on whether Microsoft can be said to have succeeded.  Indeed, McKee was also quoted as saying at the same presentation that "ODF has clearly won," although I doubt that this statement is any more reflective of Microsoft's internal strategic decision making consensus than his portrayal of his employer's standards capabilities.

So what will Microsoft have learned from this long, hard process?  Should it enter into a standards equivalent of the Cold War arms race, matching IBM standards professional for standards professional in virtual silos around the world, or become a more effective team player within the existing process? 

Embarking on the first strategy would take not only millions of dollars but many years to bring to fruition.  You can't simply send a green employee into a standards committee where the most influential members have often been working together for years and expect to get results. Thus it would seem that in open standards, as in open source, Microsoft will need to adopt a strategy that involves being a more savvy and collaboratively effective player in the overall ecosystem, rather than trying to maintain the hegemony of its historical, proprietary environment through efforts to promote defensive standards across the industry, at least in the face of determined opposition.

That would be good news for everyone - including Microsoft.  Why?  Because such a strategy would be far easier for Microsoft to execute effectively, since it's not really ignorant about how to participate in this process at all.  Indeed, after being a team player in many, many consortia over the years already, working shoulder to shoulder with its competitors to help create many standards of common interest, it has no catching up to do at all.  Moreover, when competing in this manner, IBM and others should have no great advantage over Microsoft at all.

Or, as the old saying goes, "If you can't beat them, join them."  You'd have to be truly ignorant about standards not to take that advice.


Updated: An anonymous reader below has provided a link to a blog entry by Microsoft's Jason Matusow in response to the ZDNet article.  You can read much more of interest in that blog entry here, but I thought I would paste in here some of the details confirming my account of Microsoft's past and current standards activities:

More than eight years ago, a corporate standards organization was formed in the company to help product teams be better participants in standards orgs, to make more strategic decisions about what and where to contribute specifications, and how to deal with the legal issues surrounding standards bodies (there is an entire specialization in the legal field for this kind of work believe it or not).

Currently, the standards organization at Microsoft has more than 25 full-time employees in it and is focused not only on standards, but how the company thinks about interoperability and standards as a whole. What's more, because we are active in more than 150 standards orgs at any one time, and more than 400 overall - we have more than 600 product team and field employees who have been internally certified for standards work (and most of them are active in some committee or other). Our products have supported literally more than 10,000 standards and we have contributed specifications in the areas of development languages, runtimes, networking protocols, systems management, hardware, mobility, document formats, security,...the list goes on.


Comments

Thanks very much - and yes, I am woefully behind in paying attention to my RSS feeds as well.  I'll update to include some of the useful data in Jason's blog entry, and appreciate the pointer to it.

  -  Andy


Yes, I am baffled by Stuart McKee's statements too, and the Matusow article cleans up a lot of it for me.  It also fits with my experience.  The one time I met Mark Ryland in person was in April 2000 in Virginia where he was situated as a Microsoft representative in standards-related activities.   I have no details of what that involved.  I was shopping for a royalty-free license for some Microsoft-targeted standards development I was involved in, and I happened to be visiting DC on other business.  I knew of Ryland from his involvement in COM and DCOM though I didn't learn he was an attorney until recently.

 - orcmid (Dennis E. Hamilton)

I think there certainly is room for a JTC1 standard for open standards. Actually, I would like to see a standard way of declaring

 * open standard + open format
 * open source
 * open technology (i.e. open format + open source)
 * free standard + free format
 * free source
 * free technology (i.e. free format and free source)
 * RAND standard + RAND format

where the openness criteria would be the classic ones (e.g. Perens) avoiding feature creep but with enough variety that all the different major categories have clear names.

Of course, they would be abstract definitions (e.g. not name particular organizations, etc), but sufficient for use by procurement and regulatory drafting.  Rating particular standards would be a matter for other organizations and initiatives.

The trouble I see is that legislative definitions of openness will end up more or less broad than the academic definition: you can see this in legislation that allows RAND rather than RAND-z, or legislation that requires implementations. Having a standard which provides a clear vocabulary would let legislators argue more in terms of substance - do we want RAND or RAND-z, for example - rather than arguing over whether "open" includes RAND-z.   (My major concern with ISO standards is the MPEG/MHEG groups, where RAND licensing has basically locked out free developers from using them, which has been a real showstopper for desktop Linux: clear terms would help policy because governments *might* be able to see more clearly that supporting technologies that are standard, popular but RAND is little different from supporting technologies that are proprietary, popular and RAND, from the POV of moving to a free or open desktop.)

If someone wants to contribute to or to draft such a thing, I would be happy to work on it.

Cheers
Rick Jelliffe

Rick,

Unfortunately it's far easier to start a dialogue on this topic than it is to reach a valuable endpoint.  I've long been frustrated by how hard it is to achieve consensus on even the high-level concept of "openness," although the debate has been constant for as long as I can remember - and changing all the while.  To traditional standard setting (non-IT) folks, it's all about process.  To open source folks it's all about IPR.  To consortium folks IPR is the hard part, and process falls into place pretty easily, in that people who opt in work out the process they like and follow it without too much angst that I've ever seen - and in a well-run consortium there's little difference from an SDO.  In SDOs there's more allegiance to RAND and less willingness to go RAND-Z than in consortia, and in many industry verticals the right to charge a royalty is hallowed - that's a major reason to be there at all.

Hence, I'd be inclined to de-emotionalize the issue by dropping the word "open" out of it entirely and try to be more empirical, focusing on practical outcomes and process features rather than on value-associated labels.   Governments could then pick what they cared about from a list that might include the following, for starters, and call that "open" for their purposes.  And attributes like these would be easy to confirm, making certification a simple process.  Hopefully, market forces would take it from there, given that in many cases there would be more than one SSO that could host an activity, and therefore there would sometimes be competition among them that would lead to making tracks available that would provide these attributes where desired.

For starters, here are two main categories, with examples:

Outcome points (the following would relate to commitments made by process participants, and would not be a guarantee that there were no patents in existence that could be problematic):

-  Standard is free to download

-  Free to implement

-  Able to be implemented in GPL 2 or 3 (or LGPL 2 or 3) software

-  Appropriate test tool available

Process points:

-  All able to participate for a reasonable price

-  Drafts at some point are publicly posted and may be commented on by anyone

-  Ex ante disclosure of licensing terms is required

-  Process is transparent (e.g., no closed meetings, listservs open and archived, minutes are detailed and public)

  -  Andy

Although it is obviously better to refrain from including names, the fact is that both in number and combined size of projects, the GNU GPL licenses are the most used ones (2/3 of SourceForge projects).

If a "Free" or "Open" (or whatever) standard contains license conditions that are not compatible to the GPL, then it will be neither "Free" not "Open" to use in the majority of "Free and Open Source Software".

Several companies, e.g., Apple, Sun, and Microsoft, have been able to formulate "Open" standard license conditions that might pass the Debian test, but were carefully written to exclude the GNU GPL licenses. Personally, I think many of these incompatibilities were intentional and an attempt to "game" the FLOSS community. That you can do better is shown by the European Union Public Licence (EUPL v 1.0). The EUPL actually contains a list with the names of compatible licenses, which includes the GNU GPL v2.

So, even without actually adding the names of the licenses to the conditions, care must be taken that the standard for openness is compatible with the important licenses. If a group wants to be "open", they should really mean "open to all", and not "open to non-competitors".

Winter

A further example.

Most (or at  least many) of the licenses offered for implementing open standards by the large corporations (directly or through consortia) w.r.t. their technologies have two major flaws, IMHO:

 1) They only extend to using the technology for a particular standard, not for all standards

 2) They only extend to necessary (or essential) claims, not to all claims.  "Necessary" includes optional normative parts of a standard, but the way the term is used means that, strictly, coverage only exists if the claimed technology is the only way to implement the standard: the license guarantees merely that there is at least one way to implement the standard that is open.

However, I don't see that this is nearly good enough for the requirements. For example, say you implement OOXML and you find there is some IBM patent (and assuming IBM has this kind of license and has not extended it to OOXML): you might be able to use it for an ODF implementation but not for OOXML under problem 1) above.  Or say you implement ODF and you find there is some MS patent (assuming the MS covenant not to sue is this kind of license, regardless of it being extended to ODF): the fact that there was some other, perhaps horrible, way to implement it is enough to lose the protection.   Now this is a real issue: think of how Adobe's anti-aliasing patents have held back implementation of PostScript fonts and PDF on Linux, for example.

To the extent that my concerns are valid, I think the standards and open community need to go much further than essential claims in order to bring in the idealized golden age of open source (let alone free software) by riding the white stallion of standards. 

Cheers
Rick Jelliffe

The issue under discussion is about open standards and not about open source implementations.
Having the ability to freely distribute/sell an implementation of a standard must also be available for companies that create proprietary products. Also these companies shouldn't have to pay any money or have a patent cross license deal in place to be able to sell their proprietary product.

That is the reason why I think it should not mention a specific Open Source license, because it is not only for Open Source.

A minor point against using explicitly the GPL in there is that ISO doesn't have control over the content of the GPL, so in theory it could change in such a way that standards that used to be OK are no longer OK. That is a situation that should be prevented and therefore a generic description should be made and no reference to any specific license.

Yes, standards that reference externally developed technologies always have to be careful of treading on people's toes or poaching. Sometimes technologies are standardized in the full knowledge that people will prefer the most recent version of the original: ISO HTML is an example of this.  These kinds of standards give the lower limit of what a technology means, rather than the upper limit. 

However, there are ways of referring to existing concrete licenses without actually standardizing them by doing so: mentioning them in informative notes in particular. You really want to avoid the situation where you mean something in particular (e.g. GPL) but you circumlocute and so cause readers to miss the rationale.

Cheers
Rick Jelliffe

Winter,

Yes, that's what I was driving at - getting away from defining "open," which has been a black hole for years now and is again having a bumpy ride in the IDABC, and instead focusing simply on results: can this or can this not be implemented under the GPL or LGPL, as appropriate?

That said, I did have one other thought behind choosing that criterion, which was that the stringency of these licenses puts them at the top of a "tree" of many other licenses.  Hence, referencing is a shorthand way of enabling many others.  For this reason, I should have been a bit more precise, and instead said something like:

"Can be implemented in GPL, LGPL, and under terms permitting implementation under other licenses compatible with the GPL and LGPL"

  -  Andy


I still think it is very important to distinguish between the license for the standard and the license for an implementation of the standard. But I think in general we agree.

Although a GPL style license for standards may be better than a BSD style license. The interpretation of a GPL style license for a standard is that if a company extends the standard, it should make the extension available under the same terms as the original standard. This way it is guaranteed that even though a company extends a standard, everybody who wants to can still interact or integrate based on the same conditions as the official standard.

This gets into very different waters.  That isn't to say that there isn't a good case to be made for it, but just by way of information, traditional standard setting commitments are limited to implementing the standard itself - sometimes only in its entirety - and nothing outside of that.  The concept is that the IPR owner can be fairly asked to give up that much to achieve interoperability, but no more, or the incentive to innovate will be undercut.  Everything else is a feature that is elective, and for that, the IPR owner should be entitled to expect to be paid, since nobody says that you need to infringe on its patent to comply.  Instead, you're free to come up with your own useful, differentiating innovation (and patent it, too, if it qualifies).

There is something to be said for that view from an economics and logical point of view, if you believe in the validity of patents to begin with.  Otherwise, there's no way to say where the standard stops and the unique product begins, and the whole concept of patent value becomes impractical.

Be that as it may, I am not aware of any standards organizations that require free use of patents underlying extensions to standards that are not incorporated into the standard itself through the technical process of the organization that developed the standard.

  -  Andy