It was in September of 2010 that a group of key members of the OpenOffice.org developer team announced that they were no longer willing to wait out the uncertain future of OpenOffice, especially in the face of the lack of interest shown by Oracle, the new owner of the project following its acquisition of Sun Microsystems nine months before.
Their announced intent was to form an independent foundation to host a fork of the OpenOffice code base, thereby achieving a goal they had sought throughout ten years of control by Sun – to work in an environment free from the control of a single vendor.
It's now two and a half years later, and with the release of LibreOffice 4.0, that Foundation is not only flourishing, but forging a path independent of its predecessor.
According to a press release issued today by the Portuguese Open Source Business Association (reproduced in full at the end of this blog entry), the government of Portugal has decided to approve a single editable, XML-based document format for use by government, and in public procurement. And that format is not OOXML.
Instead, the Portuguese government has opted for ODF, the OpenDocument Format, as well as PDF and a number of other formats and protocols, including XML, XMPP, IMAP, SMTP, CALDAV and LDAP. The announcement is in furtherance of a law passed by the Portuguese Parliament on June 21 of last year requiring compliance with open standards (as defined in the same legislation) in the procurement of government information systems and when exchanging documents at citizen-facing government Web sites (an unofficial English translation is here).
Yesterday, Microsoft made an unobtrusive announcement that brings a degree of closure to a seven-year epic battle between some of the largest technology companies in the world. The saga pitted open source advocates against proprietary vendors and, for the first time, brought the importance of technical standards to the attention of millions of people around the world. At the center of the action were Microsoft and IBM, the latter supported by Google and Oracle, among other allies.
The standards in question described the format specifications that can allow documents created by one proprietary software product to be opened, edited and saved in another.
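Both formats take the same basic approach: a document is a ZIP container holding XML streams that any conforming implementation can parse. As a rough sketch only (not a spec-conformant ODF package, which would also require a META-INF/manifest.xml, styles, and metadata streams), here is roughly what the ODF side of that container looks like, built with Python's standard library:

```python
import zipfile

# The MIME type that identifies an ODF text document.
ODT_MIMETYPE = "application/vnd.oasis.opendocument.text"

# A minimal content stream: one paragraph of body text.
CONTENT_XML = """<?xml version="1.0" encoding="UTF-8"?>
<office:document-content
    xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0"
    xmlns:text="urn:oasis:names:tc:opendocument:xmlns:text:1.0"
    office:version="1.2">
  <office:body>
    <office:text>
      <text:p>Hello, ODF.</text:p>
    </office:text>
  </office:body>
</office:document-content>
"""

def write_minimal_odt(path):
    """Write a stripped-down ODF container (illustration only)."""
    with zipfile.ZipFile(path, "w") as zf:
        # Per the ODF packaging rules, "mimetype" comes first, uncompressed.
        info = zipfile.ZipInfo("mimetype")
        zf.writestr(info, ODT_MIMETYPE, compress_type=zipfile.ZIP_STORED)
        zf.writestr("content.xml", CONTENT_XML)

def read_mimetype(path):
    """Any implementation can identify the format from the first entry."""
    with zipfile.ZipFile(path) as zf:
        return zf.read("mimetype").decode("ascii")

write_minimal_odt("hello.odt")
print(read_mimetype("hello.odt"))
# prints "application/vnd.oasis.opendocument.text"
```

The point of the sketch is the interoperability claim in the paragraph above: because the container layout and the XML vocabulary are publicly specified, a second, independently written program can open, edit, and re-save what the first one produced.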
Since 2005, I see that I have written more than 227 blog entries about ODF (I say more than, because the very earliest were lost in an earlier platform migration). Throughout the greatest part of this six-year period, OpenOffice was the poster child ODF implementation - the one with the most users, the most press attention, the most corporate support - tens of millions of dollars of it, from Sun Microsystems. Of course, there were other impressive implementations, open source and proprietary alike. OpenOffice, though, was always the default ODF implementation referenced by the press.
Poor OpenOffice. It’s been open source for so long, and yet its adoption and market importance have always lagged far behind those of peer software like Linux – despite the fact that it’s free and implements a standard (ODF) aggressively promoted by some of the most powerful technology companies in the world. Can this ever change?
If yesterday’s announcement by IBM is any indication, the answer is “not likely,” despite the fact that Big Blue’s latest commitment to OpenOffice, on its surface, sounds like good news. The reason? It’s too little, and too late. Here’s why.
After sixteen years of working in parallel with the traditional standards infrastructure, the World Wide Web Consortium has taken an interesting decision: to begin submitting selected W3C Recommendations to that same system for endorsement. In doing so, it joins the small handful of consortia (seven, to be exact), out of the hundreds currently active in the information and communications technology (ICT) sector, that have applied for this option.
If this process sounds vaguely familiar, that’s likely because it is the same process that OASIS used to gain global endorsement of its OpenDocument Format (ODF). Microsoft took a similar, but procedurally distinct, route with OOXML, its competing document format, when it offered it to ECMA, which enjoys a special “Fast Track” relationship with JTC1. What won't sound familiar are the conditions that the W3C has successfully attached to its application to make submissions, on which more below.
When news of Oracle's intended acquisition of Sun Microsystems broke long ago, many people wondered what that would mean for OpenOffice, the most widely adopted full desktop implementation of ODF. But Oracle immediately imposed a company-wide "no comment" policy on that topic, so everyone has been wondering what the answer might be ever since.
So like many others, I expect, I’m trying to get my brain around Oracle’s reasoning in deciding to charge $90 for a formerly free ODF conversion plug-in developed by Sun Microsystems. That downloadable plug-in was intended for Microsoft Office users who wanted to import ODF-compliant documents created, most obviously, by users of the free, open source OpenOffice.org (OOo) version, or of Sun’s StarOffice, the for-sale, supported productivity suite based on the free OOo code.
Moreover, it’s not just $90 you’ll need to fork over – the plug-in is only available in packages of 100.
In reviewing my RSS feed this morning, I found this interesting blog entry by Alex Brown, titled Microsoft Fails the Standards Test. In it, Alex makes a number of statements, and reaches a number of conclusions, that are likely to startle those who followed the ODF-OOXML saga. The bottom line? Alex thinks that Microsoft has failed to fulfill crucial promises upon which the approval of OOXML was based. He concludes that unless Microsoft reverses course promptly, “the entire OOXML project is now surely heading for failure.”
Mea Culpa. I am uncharacteristically late in commenting on the XML Wars of August, 2009, which have already received so much attention in the press and in the blogs of the technology world. The wars to which I refer, of course, broke out with the announcement early in the month that Microsoft had been granted an XML-related patent. The opening of that front gave rise to contentions that patenting anything to do with XML was, in effect, an anti-community effort to carve a piece out of a public commons and claim it as one's own.
The second front opened when a small Canadian company, named i4i, won a stunning and unexpected remedy (note that I specifically said "remedy" and not "victory," on which more below) in an ongoing case before a judge in Texas, a jurisdiction beloved of patent owners for its staunch, Red State dedication to protecting property rights - including those of the intangible, intellectual kind.
So if this is war, why have I been so derelict in offering my comments, as quite a few people have emailed to tell me they are waiting to hear? Here's why.
Last week, Microsoft and the European Commission each announced that Microsoft had proposed certain concessions in response to a "Statement of Objections" sent to Microsoft by the EC on January 15 of this year relating to Microsoft's bundling of Internet Explorer with Windows. If you've been reading the reams of articles that have been written since then, you may have noticed that the vast majority of the virtual ink spent on the story has been directed at the terms relating to browser choice. Typically, and as an afterthought, most of these stories have added a brief mention that Microsoft also proposed commitments relating to "another" dispute, this one relating to interoperability.
While the browser question is certainly important, in many ways it is far less important than the interoperability issue. After all - the primary benefit for consumers under the browser settlement is that they can choose their favorite browser when they first boot up their new computer, as compared to investing a few extra clicks to download it from the site of its developer - as they can already do now. Interoperability, of course, goes far deeper. There's no way that you can make one program work the way you really want it to with another unless it comes out of the box that way, or unless you have not only the ability, but also the proprietary information, to hack it yourself. And if both programs don't support the same standards, well, good luck with that.
So what exactly did Microsoft promise to the EC, regarding interoperability? Let's use ODF as a reference point and see.
Quote of the Day
“Should the provision of a hyperlink leading to a work or other subject matter protected under copyright require the authorisation of the rightholder?”
AltExchange Alliance Publishes Private Equity’s First Ever Data Standard Press Release AltExchange Alliance December 20, 2013 - The AltExchange Alliance, a global private equity industry group launched in May 2013, has taken a major step towards its aim of developing a comprehensive global data standard for sharing private equity information. The Alliance has published its Group One standard – the industry's first ever detailed template for the exchange of data related to capital accounts, schedules of investments, and cash flow activity.
The guidelines previously outlined by the ILPA and IPEV provided a starting point for the working group's efforts. However, the new standard goes much further than anything the industry has seen before. It provides a robust, highly-detailed, fixed format data standard, and is now available for download.... ...Full Story
US PIN debit networks form EMV alliance Press Release Debit Network Alliance December 20, 2013 - Ten leading PIN debit networks in the United States have formed a new company, Debit Network Alliance, to provide a structure for the governance, deployment and implementation of the EMV debit standard.
The goal of this collaborative effort is to help facilitate the adoption of an interoperable EMV standard for debit payments in the U.S. through a common governance structure that fosters regulatory compliance, equal access and ability to innovate for all debit networks, routing choice for merchants, and portability for issuers.
The debit networks have a long history of collaboratively working together - especially with regard to improving security - to define standards that maintain the integrity and quality of the U.S. payment industry. In particular, the networks have been working together on chip standards under the support of the Secure Remote Payment Council's Chip and PIN Work Group since April 2012.
The founding networks of Debit Network Alliance include AFFN®, ATH®, CO-OP Financial Services ®, NETS®, NYCE®, Presto!®, PULSE®, SHAZAM®, and STAR®.... ...Full Story
NIST Special Publication Expands Government Authentication Options NIST NIST Techbeat December 19, 2013 - A newly revised publication from the National Institute of Standards and Technology (NIST) expands the options for government agencies that need to verify the identity of users of their Web-based services. Electronic Authentication Guideline (NIST Special Publication 800-63-1) is an extensive revision and update of the original document, released in 2006, and it recognizes that times, and technologies, have changed.
“Changes made to the document reflect changes in the state of the art,” explains NIST computer security expert Tim Polk, Cryptographic Technology Group manager at NIST. “There are new techniques and tools available to government agencies, and this provides them more flexibility in choosing the best authentication methods for their individual needs, without sacrificing security.”... ...Full Story
Government expands private sector cyber security partnerships in NCSS drive ComputerWeekly.com December 19, 2013 - The UK government plans to concentrate on expanding partnerships around cyber security with the private sector in 2014 as part of the National Cyber Security Strategy (NCSS).
This includes introducing a cyber security kitemark for firms that do business with the government, to help boost UK cyber exports, and a cyber security baseline standard....
The NCSS is supported by £860m funding from the National Cyber Security Programme for delivering projects as part of the government’s response to growing threats in cyberspace.... ...Full Story
Standardization Priorities for Smart and Sustainable Cities Discussed at ANSI Workshop ANSI Weekly December 19, 2013 - The American National Standards Institute (ANSI) convened a workshop on November 21, 2013, in Washington, DC, to examine the role of standardization in achieving the promise of smart and sustainable cities. The full workshop report is available online.
The inspiration for both the workshop and the larger smart cities movement is the ongoing growth of urban communities, particularly in developing countries, along with the proliferation of information and communications technologies (ICTs), such as sensors, smart phones, intelligent transport systems, building energy management systems, etc., that can assist cities in making their operations more efficient, more sustainable, and more resilient. Countries in Europe and Asia, with support from their national governments, have undertaken strategic initiatives to explore this area. Likewise, a number of new standardization roadmapping activities have emerged at the national, regional, and international levels to assess what standards and conformance programs already exist and what additional activity may be needed....
The workshop identified a number of priority areas where standardization can contribute to smart and sustainable cities. These included:
- a standardized set of definitions/lexicon for smart cities applicable across sectors
- interoperability for systems of systems, including common data formats and communication protocols to enable sharing of data between systems
- key performance indicators so that measurements are consistent and comparable
- a baseline guidance document which can be adapted to address the specific needs of sectors
- resiliency for disaster preparedness and recovery
As a result of the workshop, ANSI will develop a proposal for a collaborative to further define standardization needs, particularly through outreach and engagement of public-sector stakeholders.... ...Full Story
EU challenges US hegemony in global internet governance Cécile Barbière EurActiv December 18, 2013 - French lawmakers, supported by the EU's Digital Agenda Commissioner Neelie Kroes, are pressing the European Union to stand up more firmly against American domination in cyberspace....“The European Union is not present enough in the different international fora on Internet governance although the future of the Internet is a significant challenge,” said Catherine Morin-Desailly, vice-president of the EU Affairs Committee in the French Senate....
“Only the EU has the necessary power to influence this new cyberspace where the USA dominates,” she added.
The MPs’ concerns stem largely from the massive and illegal wiretapping done by the Americans, which was revealed by whistleblower Edward Snowden.
... ...Full Story
One European copyright law-to-rule-them-all? EU launches review OUT-LAW.com The Register December 18, 2013 - The European Commission is seeking industry views on whether to completely harmonise copyright laws across the EU....Respondents are being asked for views on matters ranging from the accessibility of digital content across the trading bloc, limitations and exceptions to copyright protection and remuneration for rights holders.
However, it is also consulting on whether to set copyright rules that apply consistently across the whole of the EU...."Some see this as the only manner in which a truly Single Market for content protected by copyright can be ensured,"...The Commission has also asked whether the act of linking to copyrighted material should require the permission of rights-holders.... ...Full Story
Switch to open source successfully completed, city of Munich says PCWorld December 17, 2013 - Munich’s switch to open-source software has been successfully completed, with the vast majority of the public administration’s users now running the city’s own version of Linux, city officials said Thursday.
In one of the premier open-source software deployments in Europe, the city migrated from Windows NT to LiMux, its own Linux distribution. LiMux incorporates a fully open-source desktop infrastructure. The city also decided to use the Open Document Format (ODF) as a standard, instead of proprietary options....As of November last year, the city saved more than €11.7 million (US$16.1 million) because of the switch. More recent figures were not immediately available, but cost savings were not the only goal of the operation. It was also done to be less dependent on manufacturers, product cycles and proprietary OSes, the council said Thursday.... ...Full Story
Christmas comes early for the Open Document Faithful (ODF) Mark Ballard Public Sector IT December 16, 2013 - The UK government has spruced its open document policy up for Christmas.
The Cabinet Office began a public consultation on open document formats this week, three and a half years after it came to power promising they would be one of the first things it delivered....The Cabinet Office Open Standards Board issued a "challenge" for public comment on a proposal this week that government documents be published in a format that anyone can read....
The European Commission is meanwhile coming to the latest break point in contracts that have made Microsoft the sole supplier of desktop office and operating software for more than 20 years. The Commission had been aspiring to find an open format alternative to Microsoft standards even when it signed the first contract to buy Microsoft Office in 1992.... ...Full Story
Consortium Advances Spatial Computing Standard Tiffany Trader HPCWire December 13, 2013 - A new programming standard, called the Open Spatial Programming Language (or OpenSPL), debuted today “to enable the next generation of high performance parallel spatial computers.”
The open standard was developed by the Open Spatial Programming Language (OpenSPL) consortium, which formed to promote the use of spatial computing among a wide set of users and to standardize the OpenSPL language. The overarching goal of the consortium is for spatial computing to become the industry standard for mission critical computations.... ...Full Story