ODF vs. OOXML: War of the Words, Chapter 5

This is the fifth chapter in a real-time eBook writing project I launched and explained in late November.   Constructive comments, corrections and suggestions are welcome.  All product names used below are registered trademarks of their vendors.

Chapter 5:  Open Standards

One of the two articles of faith that Eric Kriss and Peter Quinn embraced in drafting their evolving Enterprise Technical Reference Model (ETRM) was this:  products built to "open standards" are more desirable than those that aren't.  Superficially, the concept made perfect sense – only buy products that you can mix and match.  That way, you can take advantage of both price competition and a wide selection of alternative products from multiple vendors, each with its own value-adding features.  And if things don't work out, well, you're not locked in, and can swap out the loser and shop for a winner.

But did that make as much sense with routers and software as it did with light bulbs and lamps?  And in any event, if this was such a great idea, why hadn't their predecessors been demanding open standards-based products for years?  Finally, what exactly was that word "open" supposed to mean?

To answer these questions properly requires a brief hop, skip and jump through the history of standards, from their origins up to the present.  And that's what this chapter is about.

At a high level, standards have been around for millennia.  The basic concept is that when people agree upon a common definition for something once, they can save a great deal of time and trouble in the future, because they never need to describe that "something" again.  This holds true in an astonishingly wide range of situations – so wide, in fact, that we take the concept of standards entirely for granted.  Language, for example, is one of the earliest examples of a standards-based system.  If we both agree to use the word "deer" for the same animal, then conveying the news that supper can be found nearby becomes much easier and faster.  Of course, each person must agree to give up his right to call a deer something else, but that's a small price to pay in comparison to the mutual benefit to be obtained – in this case, survival.  Stated another way, a standard is a voluntary agreement that is reached because of the mutual benefit that adopters expect to gain from honoring that agreement.  That's really all there is to it, although the specific benefits and challenges vary with the situation.  The concept is almost infinitely extensible, embracing not just what we think of as technical standards, but also professional credentials, moral systems, laws, and much more.  Indeed, the concept of consensus-based standards is one of the bedrocks of society.

One of the earliest sets of formal standards evolved to quantify the physical characteristics of objects, and the genealogy of what eventually became technical standards can be traced back to this root.  In each case, the specifics of the standards adopted (e.g., feet, bushels, gallons, and pounds for length, solid volume, liquid volume, and weight, respectively, in the English system of measurement) were totally arbitrary, in the sense that any specific measurement would serve just as well as a standard as any other.  And, in each early society, it did.  Through the use of such measures, meaningful orders and deliveries could be made, and value more easily compared and assessed.  As the same weights and measures became more widely adopted, or at least familiar (especially around the Mediterranean), trade was dramatically facilitated and commerce could become far more sophisticated.

Coinage was a logical next step, and recognized the need to standardize another physical property: the purity of precious metals with intrinsic, but easily adulterated, value.  Impressing a coin with the seal of a monarch applied another key concept that would remain fundamental to the usefulness of standards down through the ages: certification.  Just as an Underwriters Laboratories logo tells us today that a consumer product meets relevant safety standards, the King's mint mark certified that the metal comprising a coin had been tested to be of a certain minimum purity, and therefore value (today, you can buy hundreds of different standardized "reference materials" from NIST, which certifies their purity; some of these materials are quite surprising).

As time went on, standards were assigned to most other important characteristics that could be measured, some of which were non-physical, and therefore presented new and unique challenges.  Time, for example, proved to be particularly challenging, once you got past the concept of "day," because a standard that can’t be easily measured has little utility.  Only with the discovery and application of astronomical principles and the development of hourglasses, sundials, and eventually clocks could date and time standards become precise and reliable in application.

There things more or less stayed until comparatively recent times, when the next great leap in standards development occurred.  As has so often been the case throughout human history, the driver of the next great innovation was war.

The problem to be solved was this: over the years, the arts of the armourer had progressed to the point where the initial gunpowder-powered engines of war (cannons) could be miniaturized (becoming guns).  Once miniaturized, infantry could carry these new devices, resulting in a vastly larger market for the product.  Still, creating a flintlock musket was in some respects more challenging than casting a one-piece, cast-iron cannon barrel.  A musket used multiple small parts that needed to fit together precisely, and which could be damaged relatively easily.  Special skills and tools were therefore required to fabricate or repair a gun, unlike the carriage of a cannon, which was more robust, and which any wagon maker could repair.  This made a musket not only expensive to fabricate, but also effectively one of a kind.  A musket was also very slow to produce, because a gunsmith and his apprentice typically made every part.  If a musket was damaged in battle, it might therefore be weeks before an opportunity could be found to repair it unless skilled gunsmiths, and all of the tools of their trade, were part of the afterguard.  The obvious solution would be to carry spare parts, but since the fit of the parts in each gun was slightly different, this would require carrying spare parts for each gun.  Eventually a less obvious solution occurred to someone: it would be highly advantageous if the parts of a gun were made exactly alike, and therefore interchangeable.  Repairs could then be made quickly in the field by semi-skilled camp followers using only light tools.

While the concept of interchangeable parts would eventually lead to Henry Ford's production lines, it did not immediately result in the birth of technical standards, because the spare parts created were unique to the manufacturer of the gun.  But with the dawn of the Industrial Revolution, technologies and manufacturing processes began to evolve rapidly, and three things changed that set the stage for wide adoption of technical standards:  first, machines became much more complex, requiring more and more parts, often similar, that needed periodic replacement.  Second, the costs of producing machine parts began to drop, as new manufacturing devices were designed that could do more and more of the work of fabrication.  And third, manufacturing gradually became more specialized, such that vendors might no longer need to manufacture every single part of their final products.

This last development followed inevitably when it became more cost-effective to specialize at both the high and the low ends of manufacture, and as the separate elements of design, assembly and sales became more valuable in their own right, allowing greater profit margins at the high end than the low.  If you manufactured power looms, for example, why not buy the simpler parts, if available, leaving fewer custom parts to make in order to create a salable product?  At the low end, why not specialize in a limited number of products fabricated with a high degree of efficiency, allowing you to offer lower prices and undercut your competitors?  Manufacturing thus became more layered and granular: instead of making a business out of cooperage, blacksmithing or spar making, a concern could make a business out of supplying individual parts that were needed in high volume by upstream manufacturers.  This strategy, of course, would favor suppliers able to manufacture uniform parts of consistent quality over those that were only able to fabricate rough parts that had to be sized to final dimensions by the customer prior to installation.  The continuing evolution of manufacturing techniques soon made this possible.

The next major step in the evolution of modern standards therefore involved a humble item at the very bottom of the production pyramid: the screw fastener.  By the 1800s, the screw had evolved from a tour de force of the blacksmith's art to a fully machine-produced item that was in dramatically increasing demand.  Since manufacturing screws at an economically feasible price required very sophisticated machinery and resulted in finished products that had no features unique to individual machine shops, they were a perfect example of a product destined to be made by companies specializing in fasteners of all types, rather than by the makers of the products that consumed them.  They were also ideal to become the inspiration for modern technical standards as we know them today.

Consider, if you will, what makes a screw a screw:  its value lies as much in its total uniformity of design as in its ability to hold two things together.  Everything about it is specified: length, diameter, number of threads per inch, shape of head, and type of screwdriver slot.  Even the material (brass, stainless steel, galvanized, etc.) is standardized today.  Moreover, a screw is part of a larger system (nuts, drill bits, taps and dies) that mirrors the same standards.  Indeed, a screw is as much a collection of standardized features as it is a physical object.
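To make that idea concrete, here is a minimal, purely illustrative sketch in Python (the names and values are hypothetical, not drawn from any actual fastener standard): a screw standard can be thought of as a record that pins down every feature, so that parts produced to the same specification by different manufacturers are interchangeable.

```python
# Illustrative sketch only: a "standard" as a record that fixes every feature.
from dataclasses import dataclass

@dataclass(frozen=True)
class ScrewSpec:
    length_in: float        # overall length, in inches
    diameter_in: float      # nominal (major) diameter, in inches
    threads_per_inch: int   # thread pitch
    head: str               # e.g. "pan", "flat"
    drive: str              # e.g. "slotted", "Phillips"
    material: str           # e.g. "brass", "stainless steel"

# Two manufacturers working to the same specification produce interchangeable parts:
maker_a = ScrewSpec(1.0, 0.25, 20, "pan", "slotted", "brass")
maker_b = ScrewSpec(1.0, 0.25, 20, "pan", "slotted", "brass")
assert maker_a == maker_b   # same spec, so either screw fits the same nut and tap
```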

Because the precise dimensions of a screw (just like a weight or measure) were arbitrary and easily adjusted, the customer could specify those measurements and put the order out to bid rather than manufacture the screws itself.  But unlike a nail, the screw did have one unique element: its threads.  If a screw were to have a nut, or if it were to be used to fasten metal to metal, the threads of the screw needed to be identical to those in the nut or the machined hole.  And unless the screw manufacturer was willing to also manufacture and sell drills, taps and dies (or the manufacturers of those tools were willing to sell a different set of their tools for use with the wares of each manufacturer of screws), then the threads of all screws would need to be uniform.

Moreover, unless every screw manufacturer wanted to sell directly to every single one of its customers, it would need to offer a standardized product that could be sold at retail.  Ultimately, the value of an individual screw was too low, and the design too obvious, to avoid commoditization under such tremendous pressure from practical considerations.  The threads of the screw, along with its other dimensions, therefore made it the perfect progenitor for standards, resulting in what we would now call an "interoperable product."  Standardization of other products followed, resulting in lower prices per piece, but also in far larger markets for the standardized products.

Railways provided the impetus for another major advance in standards.  As the landscape became increasingly divided among the new joint stock companies created to build railroads, an obvious need developed for the rails of one system to be placed exactly as far apart as those of its neighbors.  Otherwise, the long distance shipment of goods would require unloading the freight from one set of cars and loading it onto another each time a commercial border was passed.  At first, standardizing railway gauges required government intervention.  But soon the benefit to the railway owner became apparent as well, because the value of each railway as a stepping-stone for high volumes of freight and passengers going long distances was far greater than the value of a railway carrying low volumes within its discrete island of transportation.

The standardization and linking of the railroads resulted in one of the first and most powerful examples of the "network effect."  By the late 1800s, life had been transformed, as goods and people began to travel quickly, easily and economically over long distances.  Not only could goods arrive faster, but perishable goods could access markets they could never reach before.  Just as the value of the Internet and the Web increases as the nodes and information that are part of it expand, so too did the value of the railway system increase incrementally, mile by interlinked mile, and station by newly added station.

The effects on standardization were equally profound, as access to rail transport provided benefits that offset the costs of standardizing other goods and packaging in order to be loaded more efficiently in boxcars and on other rolling stock.  Eventually, the effect was extended across oceans as well, with the popularization of railway-car-sized containers.  Even time standards were affected.  Before the railways, local time was whatever the town clock said, which was usually different from what the clock the next town over indicated.  And why not?  With travel between the two taking so long, minor differences were irrelevant.  Once the railways ran, however, time needed to be synchronized over long distances, or you would miss your train.  And as the railway lines became longer, time needed to be divided into zones as well – all so that railway schedules could become reliable.

Standards of all types rapidly proliferated as the evolution of manufacturing, technology and society accelerated in modern times.  Boilers that were apt to explode were harder to sell and uninsurable, so boiler manufacturers banded together to develop, adopt, maintain and promote design standards that would make their products safe – a process that required forming some of the first standards associations.  Other types of safety standards followed.

The same realities that led to standardizing the design of screws soon drove the standardization and commoditization of more, and over time more sophisticated, products.  Initially, smaller manufacturers would simply copy the dimensions of the products of larger manufacturers, but eventually the process became institutionalized in organizations created for that purpose.  The increasing prevalence of domestic systems utilizing multiple parts that needed to be connected together, such as plumbing and electrical systems, further drove standards.  And the rise of an urban middle class increased the market for consumer products that logically demanded standards as well, leading (for example) to standards for light bulb bases and light sockets.

Some new types of products led to the need for new types of standards, and especially "performance" standards, such as the power demand of light bulbs (measured in watts) and the light output of the same bulbs (measured in lumens), so that not only were the bulbs of various manufacturers interoperable in fact, but their prices could be meaningfully and easily compared as well.  Of course, more and more associations were needed to develop, maintain and promote all of these standards, because government had neither the resources nor the inclination to provide them.
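As a simple, hypothetical illustration of why performance standards make comparison easy (the figures below are invented for the example), once power draw and light output are expressed in standardized units, a buyer can reduce any vendor's bulb to directly comparable numbers such as lumens per watt and cost per thousand lumens:

```python
# Hypothetical figures; the point is that standardized units (watts, lumens)
# make otherwise different products directly comparable.
bulbs = {
    "Vendor A": {"watts": 60, "lumens": 800, "price": 1.00},
    "Vendor B": {"watts": 40, "lumens": 450, "price": 0.75},
}

for name, b in bulbs.items():
    efficacy = b["lumens"] / b["watts"]                  # lumens per watt
    cost_per_klm = b["price"] / (b["lumens"] / 1000.0)   # dollars per 1000 lumens
    print(f"{name}: {efficacy:.1f} lm/W, ${cost_per_klm:.2f} per 1000 lm")
```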

Initially, these associations were by definition national.  But the emergence of telecommunications and the laying of trans-oceanic cables led to the formation of the first truly global standards body (the ITU).  With time, the increasing volume of international trade led to others.  Not long after the end of the Second World War, all three "Big I" organizations were in place, with the addition of the ISO and IEC.

While the ITU became a treaty organization, the ISO and IEC did not.  But the quasi-governmental role of the ISO, IEC and the many niche and national bodies setting standards was also recognized.  Process concepts were therefore developed to ensure that all of those affected by a standard, as well as those that would directly benefit as vendors ("stakeholders," in standards parlance), could have a say in the development of a standard.  These concepts were general rather than codified in specific rules of process.  The standards that emerged from this system became known as "de jure" standards (Latin for "by law"), despite the fact that use of these standards remained consensual and market driven, rather than required by law, unless incorporated into law by legislatures, as occurred with increasing frequency as the value of private sector standards became recognized.  Standards that emerged in the marketplace purely as a result of the market dominance of one or a few vendors, in contrast, came to be known as "de facto" standards, indicating that although widely adopted "in fact," they had not been developed in a process open to all stakeholders.

In modern times in the IT industry, standards created through the de jure process, or through well-respected consortia, sometimes also began to be referred to as "open standards," although the exact definition of those two simple words has recently become the subject of spirited disagreement, in part because specific standards have become increasingly important to the business strategies of individual vendors, and in part due to the growing popularity of open source software, and the restrictions and requirements of many of the licenses under which such software is made available.

This rapid romp through the history of standards provides not only the background for a discussion of standards in the IT industry, but also the basis for a number of important insights into why open standards in the computer world have not evolved as neatly, or been adopted as universally for software and servers, as those implemented in light bulbs and lamps.

The first insight is that the IT industry is, by historical measures, still a new industry.  Standards tend to follow, rather than lead, innovation for a number of reasons, some obvious and others less so.  When new technologies emerge, vendors need to design and manufacture the unique elements of their new products above the level of existing commoditized parts.  Such products can therefore frequently be proprietary, because a significant portion of their value is, by definition, new.  Stated another way, technology has to exist before a need can develop for it to become standardized.  Moreover, since many technologies fail in the marketplace and because standards require time and resources to create, products tend to come first and standards second, and only for those products that gain traction.  Also, unless a technology is likely to be useful over a long period of time, the effort of standardization may not seem warranted.  Finally, standards do have costs beyond the time and effort of development: inevitably, they restrict the degree of design freedom that a vendor can exercise.  Standardization purely for standardization's sake therefore does not make sense.  For all these reasons, throughout modern history the standardization of new technologies has tended to arise after an era of rapid, unrestricted innovation.  This process allows multiple experiments to succeed or fail in the marketplace before one is anointed as the candidate to be fixed in the amber of standardization.

The result of all of these forces is that, until the 1990s (as discussed in earlier chapters), the first decades of the computer era were typified by proprietary vendors selling high margin, sophisticated products to customers that the vendors sought to hold on to for as long as possible once they had been secured.  Most customers therefore lived in proprietary "silos," and entrenched vendors had far more to lose than to gain if switching costs were to decline.  Gradually, standards did become pervasive in ways that allowed vendors to buy the modern equivalent of very sophisticated screws (think disk drives) at extremely low prices.  But the proliferation of standards could be a two-edged sword if the result was a level of interoperability that would allow customers to mix and match hardware and software as if they were home stereo components, or to migrate easily to a competitor's products entirely.

As IBM found to its sorrow with its ground-breaking line of PCs, this could even occur at the product design level.  The good news for IBM was that it had succeeded in setting "the standard" for desktop computers, in the sense that the PC design became wildly popular in the marketplace.  Unfortunately, the same processors, as well as the operating system, that IBM had selected for use in its PCs were available to its competitors, permitting other vendors to build systems capable of running software originally developed for IBM PCs.  The bad news was therefore that IBM's PC architecture became not only a commercial benchmark, but an available de facto standard as well.  Soon, scores of competitors were selling competing, and cheaper, desktop computers upon which the same software would run.

To be fair, the slow process of achieving interoperability at the product level through deliberate standard setting efforts was not purely the result of vendor-proprietary impulses.  Just as the industrial revolution brought about the need for entirely new types of standards, so also did the IT revolution (and how could it not be so?).  Unlike screw threads, which are easily implemented with complete fidelity, it is sometimes only feasible to create a standard for software that, in a given case, at best will enable two products to become close to interoperable.  After that, tinkering and testing is necessary to accomplish the final "fit."  Similarly, the costs to innovation of achieving true "plug and play" interoperability, even when that result is feasible, may be unacceptably high, leading to a decision to create a standard that (like ODF) only locks in a very significant amount of functionality, rather than complete uniformity (as OOXML strives to achieve).
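To make the software side of this concrete, here is a small sketch in Python (standard library only; a toy text extractor, not a converter, and the file names are assumed for illustration). Both ODF and OOXML word-processing files are ZIP archives of XML, but each uses its own schema and element names, so even fully standardized documents require format-specific handling, and moving anything beyond plain text (styles, tables, embedded objects) between them with fidelity takes considerably more work:

```python
# Toy sketch: extract plain paragraph text from an .odt (ODF) or .docx (OOXML) file.
# Each format is a ZIP of XML, but with its own package layout and element names.
import zipfile
import xml.etree.ElementTree as ET

ODF_TEXT_NS = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"
OOXML_NS = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"

def plain_text(path: str) -> str:
    """Return the unformatted paragraph text of a word-processing document."""
    with zipfile.ZipFile(path) as z:
        if path.endswith(".odt"):
            root = ET.fromstring(z.read("content.xml"))        # ODF main content part
            paras = root.iter(f"{{{ODF_TEXT_NS}}}p")           # <text:p> paragraphs
            return "\n".join("".join(p.itertext()) for p in paras)
        elif path.endswith(".docx"):
            root = ET.fromstring(z.read("word/document.xml"))  # OOXML main document part
            paras = root.iter(f"{{{OOXML_NS}}}p")              # <w:p> paragraphs
            return "\n".join(
                "".join(t.text or "" for t in p.iter(f"{{{OOXML_NS}}}t"))  # <w:t> runs
                for p in paras
            )
        raise ValueError("unrecognized format: " + path)
```

Even this toy extractor makes the point: the plain text comes out of either format easily enough, but every feature beyond that has to be mapped, tested and tinkered with separately, which is why "close to interoperable" is so often the realistic outcome.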

Until recently, then, while vendors would frequently extol the virtues of "open standards" for marketing purposes, their commitment to actually deliver them was often selective at best.  As with previous industries, with time this position began to change: as the industry matured and became more multifaceted, as production became more layered and internationalized, as those seeking to compete with entrenched incumbents became hungrier for their share of the end-user pie, and – most importantly – as the Internet truly did begin to change everything.

It would require far more than this single chapter to fully analyze the causes underlying the increasing allure of open standards, but the influence of the Internet cannot be ignored, if only to share one final insight derived from this brief review of the history of standard setting.  Just as the joining of the railways created an explosion in value that radically changed the cost/benefit ratio of standardizing not only railway gauges, but everything from time zones to the dimensions of all manner of freight items, the value of the Internet is reordering almost everything in the IT industry to some degree.

At the macro level, the ability to connect to the Internet is all-important to customers, and trumps any vendor's desire to keep its customers trapped in a silo.  Once connected, however, customers become exposed to new kinds of competition, such as the provisioning of software as a service (SaaS).  Moreover, the Internet enables entirely new platforms to exist that initially are not owned by anyone, such as today's increasingly powerful smartphones and other mobile devices.  With so enormous a market at stake, vendors can easily conclude that they are better off pushing for open standards in order to ensure that they have a chance at securing some piece of the pie, rather than rolling the dice in a high stakes gamble to achieve monopoly power and end up completely out of the game.  The success of Linux in the mobile marketplace and the recent opening up of platforms by major telecom carriers are evidence of just such a market conclusion.

Looking back a decade from now, we will see few products, few services – and few vendors – that do not show the dramatic effects of this transition.  Resisting the power of the network effect of the Internet will be like resisting the impact of the railways.  Vendors that do not embrace that reality, including by adopting the standards that will continue to be developed to serve the Internet, will find themselves left in the dust as absolutely as a 19th century town that the railway passed by.

Next installment:  The Development of the ETRM


Comments (21)

  1. Two points that I would have liked to see, even if they might confuse the flow of this chapter:

    1. Software development was standardized and collaborative before it became proprietary (for more on this, read up on the history of UNIX).
    2. Of the various definitions of open standards, understanding the difference between patent-ambush (RAND) standards and truly open standards is important.
  2. For those just joining us, ETRM is the Massachusetts framework: the Enterprise Technical Reference Model.

    (see the last part and, it seems, the next part)

    • Thanks, and good point.  I’ll add that into the text as well.

        –  Andy

  3. So we're now at the point where IBM has put Lotus Symphony (symphony.lotus.com) out, as a kind of 'Public Service'. Or a 'Marketing Novelty of Nominal Value'.

    I can be absolutely certain that IBM will have no commercial objection if I download this, use it, and distribute 6 billion copies of it, one to every person on the planet, for any price or none.

    But I can't be certain that some other corporation will not complain; I also can't be certain that some government body (Trading Standards or Customs) won't complain; and I can't be certain which way a judge in a court of law (more likely a court of equity) would decide a case if brought. It would probably fall to the highest-priced lawyers, or to 'home pitch advantage'. Commercial laws such as copyright, patent, and antitrust/competition might all be argued.

    So the current position poses huge uncertainties to those who are not protected by the ‘Mother Ships’  of IBM and Microsoft (either as customers or as employees).

    Where does this leave the scientist ? The engineer ? The small-to-medium business ? The school or university teaching the next generation ?

    • Although I understand your general point, I don’t think that it applies in this specific instance.  The only person who can complain about what you do with someone else’s intellectual property is the owner of that property.  If some third party could claim that your wide distribution of that intellectual property somehow hurt them, they would only have a cause of action against the IPR owner, because that owner had made your conduct possible.  There are exceptions, such as if the work in question was clearly libelous, and you knowingly redistributed it.

        –  Andy

      • Well, anyone (adequately funded) can file suit against anyone, for any reason or none, and it has to be responded to, otherwise you lose by default. The lawyers will ask for expert opinions from the engineers and scientists, but ultimately it is a lawyer … a judge … who will tell a corporation what it may not do and how many dollars it must pay in restitution.

        Does your answer hold true in all jurisdictions in the world ? USA ? Europe ? China ? Russia ? Myanmar ?

        ‘Common law’ jurisdictions like USA and UK ? ‘Napoleonic code’ ones like France and its ex-colonies ?

        It somehow seems likely that a corporation with a large amount of capital and a threatened revenue stream might consider its options on that one, with a view to prolonging the revenue stream.

        The way OS/2 ‘died’ was that IBM took $750M off Microsoft as settlement in an anti-trust case, and within a week IBM announced that OS/2 would be withdrawn as a product at the end of the year. It was 2005; not really very long ago. Now, instead of selling OS/2 Software and OS/2 Services, IBM sells Linux Services.

        Getting an individual tangled up in this … or getting a small-or-medium business tangled up … would be rather like being squashed between two aircraft carriers named IBM and Microsoft. There will be a slight dent on each big ship, but the individual or the dinghy won’t survive the encounter, or will be severely financially wounded.

    • The same applies to proprietary software – a patent troll can sue you for using IBM or Microsoft software. Of course the vendor of the software may indemnify you (but do they? read that EULA!); on the other hand, Red Hat and Novell offer indemnification on open source software. So I don't get your point.

      Lou Steinberg

      • I thought of differentiating between copyright and patent when I gave the above response, and then didn’t; my point was addressed to copyright, which is where I thought he was headed.  Be that as it may, I guess I don’t really get the comment.  Why distribute 6 million copies of a program that anyone can download direct from the source?

          –  Andy

      • Why distribute copies of the campaign material for your favourite presidential candidate, when anyone can download the material from the candidate's web site ?

        Why distribute ’10 cents off Pepsi’ vouchers, when anyone can download and print their own from Pepsi’s web site ?

        Patents, copyrights, and antitrust all have to do with ‘Control of the distribution channel’. By all accounts, Microsoft likes to control the distribution channel for software, whereas IBM is willing to scrap control of the distribution channel for software, but take back control of the distribution channel for service (and for enterprise-class hardware).

        So there are commercially-opposed views. And it is like opposing armies staring across a no-man’s land. And since neither is going to leave the business any time soon, it’s permanent.

      • Oh, besides, it was ‘6 billion’. If everyone on the planet had a copy of Symphony (or OpenOffice.org, or Google Star Office, or ‘koffice’) then it would be similar to what happened when everyone who wanted a typewriter had one.

        The market for typewriters collapsed.

        Businesses bought other stuff; but not typewriters.

  4. Andy, this is perhaps the best article on the subject I have read. Great historical background and you’ve once again demonstrated your comprehensive understanding of the landscape and communicated it clearly and without flourish. thank you !
    David J Patrick
    http://www.linuxcaffe.ca

    • Thanks very much for the kind words, David.  Each one of these chapters takes 20 – 30 hours of research and writing (including several revision drafts, but not counting additional background reading), so the positive reinforcement is appreciated.

        –  Andy

  5. Andy,

    This is great, and I hesitate to suggest any changes, but …

    For many IT folks (at least for those of us old enough to have lived through it) the early development of the internet, and the culture of interoperability that gave rise to and matured with it, serves as perhaps the paradigm of open standards and the benefits they bring. It would be great if you could include a bit more of this history, perhaps here or perhaps as a separate chapter.

    — Lou Steinberg

    • Indeed, I completely agree.  I've been involved with standards for more than 20 years, and I well remember for how long every vendor talked about open standards, but none of them really believed in them.  But the success of the Internet gave them no choice but to get on the bus.  I tried to convey this at the end of this chapter, and not to worry – this is a theme I'll be returning to.

        –  Andy

      • Very interesting again, thanks.

        It might also be worth giving an example of companies denying themselves increased income rather than standardise their industry (touched on by your mention of rail gauge). One that springs to mind is Australian mobile phone network operators, who for a long time all refused to allow inter-network SMS. When they finally gave in, SMS volumes leapt orders of magnitude, the resulting revenue increase way more than compensating for any loss caused by churn due to freeing families, groups of friends, and businesses from network lock-in.

        Of course, finding an example more appropriate to your subject that is also publicly documented might be difficult – companies can be a little shy about revealing to their shareholders years of revenue lost for no better reasons than fear of the unknown, or even plain spite for their competitors. I learnt of the above example, I think, from a brief comment during a radio interview with an industry insider about network technology. I didn't see any other mention of this huge, self-inflicted "revenue hole" in any other media.

        Huw

      • Consider the current situation with the RIAA and DRM.  For years they've been suing individuals & trying to shut down p2p network systems while advocating DRM (especially MS DRM), which is intended to 'protect' their so-precious IP (copyrights).

        Sony was the last of the big labels to finally abandon DRM due to dropping sales, customer complaints about interoperability and their continuing failure to stop ‘piracy’.  Additionally, examples like RadioHead & others where artists bypassed the labels and actually *increased* profits & sales (for the band – not the label) pose a possible threat to long-term income if more bands abandon the labels & follow the lead of those doing direct distribution.

        To tie this back into the thread at hand, how will the labels explain to their shareholders the choice to deploy DRM (including the Sony rootkits), persecute p2p sites, and sue tens of thousands of their own customers for 'piracy' (whether or not they own a computer), and persuade them that doing so was the moral and ethical thing to do as a business ?
        It is my opinion that the RIAA may have gone a step too far and may have irredeemably damaged their reputation and businesses.

        -Ed

    • Well, they bring benefits to some, but at the expense of others.

      None of the 3 ‘current generation’ games consoles (PS3, XBox360, Wii) are designed to open standards, and the games are not interoperable.

      What would the consequences of an ISO-standard games console be ? How about an ISO-standard game ?

      • Gaming isn't an area I've paid a lot of attention to, but I believe that there has been some standardization.  If you do a site search here, I think that you'll find a consortium or two, or some standards news, involved in this.

        Unfortunately, the culture in consumer electronics like games and video formats has been very "winner take all" proprietary, so they're not likely to join in any standards activity at the console level.  Also unfortunately, it's a lot easier to go open source with software than it is to deal with proprietary hardware hooks, although hackers are getting better at that, too.

          –  Andy

  6. But because the same processors as well as the operating system IBM had selected for use in its PCs were available to its competitors as well.

    It looks like this started out as a two-clause sentence, but at some point it was separated from the following sentence and you forgot to remove the "because".

    • Thanks for the catch.  I’ve split and slightly modified the paragraph while correcting the error you spotted.

        –  Andy

  7. Look what I just got from a UK Microsoft business manager:

    Hello

     

    As you may already be aware one of Microsoft’s top priorities at the moment is to have our new Office 2007 XML File Format (Open XML) ratified as an international standard by the ISO in March.

     

    To secure a favourable vote from the ISO we need to mobilise as much support as possible from customers and I was hoping you might be able to lend your support by assisting us with the following:

     

    1)      Go to the Open Xml Community site and register your support here, by providing a quote. Your quote is then published here.

     

    The new file format was ratified as a standard earlier in the year by Ecma International and we think the new open file format will have a significant benefit to customers and the industry in general, in areas such as:

     

    1)      Open XML is platform independent, meaning it can co-exist with other file formats including Open Document Format (ODF)

    2)      Improves Document Preservation by ensuring document formats can be consumed long into the future without vendor-specific clients or applications

    3)      Document Assembly is improved as server-based or user-assisted construction of documents from archived content or database content is now much easier.

    4)      Content Re-Use: with Open XML it is much easier to move content between documents, including different document types.

    5)      Due to reduced file sizes, Open XML should also drive reduced storage costs in the long term.

    6)      See attached for further benefits

     

    As always your support is greatly appreciated, let me know if you have any concerns or questions on the above.

     

    Thanks in advance

     

    Business Manager
