Well, it’s been a busy week in Lake Wobegon, hasn’t it? First, the Wall Street Journal broke the story that Microsoft had unwittingly sold 22 patents, not to the Allied Security Trust (which might have resold them to patent trolls), but to the Open Invention Network. A few days later, perhaps sooner than planned, Microsoft announced the formation of a new non-profit organization, the CodePlex Foundation, with the mission of “enabling the exchange of code and understanding among software companies and open source communities.”
Not surprisingly, more articles were written about the apparent snookering of Microsoft by AST and OIN than about the new Foundation. But while the tale of the 22 patents is now largely over, the CodePlex story is just beginning. Microsoft says that its goal for the new Foundation is to create an open and neutral environment, and that the formation documents posted and governance structure described at the CodePlex Foundation site can provide a foundation for such an organization. The CodePlex site also makes clear that the Bylaws you can find there are just a starter set, stating, “Our governance documents are deliberately sparse, because we expect them to change.”
That’s good to hear, because I’ve reviewed all of the material at the CodePlex site, and I think that quite a bit of the governance structure will need to change before CodePlex can expect to attract broad participation.
Steve Jobs is a genius of design and marketing, but his track record on striking the right balance between utilizing proprietary arts and public resources (like open source and open standards) is more questionable. Two news items caught my eye today that illustrate the delicacy of making choices involving openness for the iPhone platform - geopolitical as well as technical.
The first item can be found in today's issue of the London Sunday Times, and the second appears at the MacNewsWorld.com Web site. The intersecting points of the two articles are the iPhone and, less obviously, openness. But the types of openness at issue in the two articles are at once different and strangely similar.
The Sunday Times piece recounts the (unsuccessful) efforts of Andre Torrez, the chief technology officer at Federated Media in San Francisco, to switch from the iPhone to an Android-based G1 handset, because he objects to the closed environment that the iPhone represents. But after just a week, Torrez reverts to the better app-provisioned iPhone. The Sunday Times author concludes in part as follows:
Modern society harbors many bad habits. One is its penchant for enthusiastically embracing the benefits of new technologies before considering their less desirable side effects. Whether we look at the development of automobiles (first) and safety features (much later), or industrialization (first) and environmental protection (much, much later), the story is always much the same: we reach for the candy before we grasp the reality of the cavities. Only after the problems become too great to ignore do we investigate the unintended consequences, realize how difficult and expensive they are to address, and grudgingly start to rein in our appetites and exercise a bit of prudent self-discipline.
Perhaps we should not be surprised, then, that the U.S. government is only now becoming alarmed over the vulnerability to which we have become exposed as a result of our whole-hearted embrace of the Internet. With the operations of government, defense, finance, commerce, power distribution, communications, transportation, and just about everything else now dependent on the healthy operation of the Internet, that alarm is well-justified. And with virtually all data now created and stored in digital, rather than physical, form, exposure of our financial information, as well as our most intimate personal and health information, is only a hack away.
Man's ability to affect the land is all too evident in these times of climate change, pollution and habitat destruction. Happily, the landscape can change man as well.
The weather finally broke last night, dropping 30 degrees by dawn, and thanks be for that. The night before I had camped in the Sheyenne National Grasslands, heavy with heat and humidity. But the next day it was pleasantly cool (upper 60s), albeit overcast rather than sunny.
Nor was this the only change. It took over 2400 driving miles to finally leave the Eastern, and then Midwestern, terrain behind, but today I reached the beginnings of what I think of as the West. More than anything else, in my mind that means “dry.” For the last 800 miles, the landscape had been primarily flat, lush - and transitionally post-glacial. That last factor means an area where the great ice sheets completed their periodic southward pulses, dumping rich, black earth born of thousands of miles of ice grinding down stone, some deposited by glacial streams, and others as windblown “loess” – very fine mineral particles.
Mea Culpa. I am uncharacteristically late in commenting on the XML Wars of August, 2009, which have already received so much attention in the press and in the blogs of the technology world. The wars to which I refer, of course, broke out with the announcement early in the month that Microsoft had been granted an XML-related patent. The opening of that front gave rise to contentions that patenting anything to do with XML was, in effect, an anti-community effort to carve a piece out of a public commons and claim it as one's own.
The second front opened when a small Canadian company, named i4i, won a stunning and unexpected remedy (note that I specifically said "remedy" and not "victory," on which more below) in an ongoing case before a judge in Texas, a jurisdiction beloved of patent owners for its staunch, Red State dedication to protecting property rights - including those of the intangible, intellectual kind.
So if this is war, why have I been so derelict in offering my comments, as quite a few people have emailed me to tell me they are waiting to hear? Here's why.
Cybersecurity is an increasingly frequent topic in the news, and this week brought word of the indictment of someone who must be the leading contender for the title of Master Cybercriminal of All Time (Payment Card Fraud Division): Albert Gonzalez. More recent press reports point to additional conspirators who Gonzalez's attorney contends were the real masterminds. Top honors aside, government prosecutors contend that the team is responsible for all of the most high-profile data breaches publicized to date: Heartland, Hannaford, TJX, and more - gaining access to information relating to an astonishing 130 million credit and debit cards or more.
With so many breaches in the news, you might understandably be wondering how safe your own financial information is, and whether anyone is doing anything to protect you. Happily, the answer is "yes." As it happens, the organization that has been tackling this problem is a client of mine: the PCI Security Standards Council, which creates and enables a global, end-to-end ecosystem of standards, certifications, auditors and more to secure payment card data from the moment your card is swiped at a reader to the time it reaches its ultimate destination.
In 2001, I took a one month solo cross country trip, driving from Massachusetts across the Northeast, the Midwest, and then the prairie states, until I reached what we generally think of as “the West” – the land of canyons and buttes, deserts and mesas. Once there, I spent the rest of the time backpacking in the canyonlands of Utah, and then meandering North on dirt roads until I reached Glacier National Park, in the Northwest corner of Montana. After that, I zigzagged back East until I reached the Mississippi. Then, it was just a straight highway shot till I arrived back home once again. It was during that trip that I began writing in earnest, although I haven’t (yet) posted anything from that journey to the Web.
Last week, Microsoft and the European Commission each announced that Microsoft had proposed certain concessions in response to a "Statement of Objections" sent to Microsoft by the EC on January 15 of this year relating to Microsoft's bundling of Internet Explorer with Windows. If you've been reading the reams of articles that have been written since then, you may have noticed that the vast majority of the virtual ink spent on the story has been directed at the terms relating to browser choice. Typically, and as an afterthought, most of these stories have added a brief mention that Microsoft also proposed commitments relating to "another" dispute, this one relating to interoperability.
While the browser question is certainly important, in many ways it is far less important than the interoperability issue. After all - the primary benefit for consumers under the browser settlement is that they can choose their favorite browser when they first boot up their new computer, as compared to investing a few extra clicks to download it from the site of its developer - as they can already do now. Interoperability, of course, goes far deeper. There's no way that you can make one program work the way you really want it to with another unless it comes out of the box that way, or unless you have not only the ability, but also the proprietary information, to hack it yourself. And if both programs don't support the same standards, well, good luck with that.
So what exactly did Microsoft promise to the EC, regarding interoperability? Let's use ODF as a reference point and see.
I'm pleased to report this morning on the formation of a new advocacy group for the use of free and open source software in the U.S. Government. I'm also pleased to have been asked to serve on its Board of Advisors, alongside other proponents of free and open source software, such as Roger Burkhard, Dawn Meyerriecks, Eben Moglen, Tim O'Reilly, Simon Phipps, Mark Shuttleworth, Michael Tiemann, Bill Vass, and Jim Zemlin.
The new organization is called Open Source for America (OSA), and you can find its Web site here. Tim O'Reilly will officially announce OSA at OSCON later today, and you can find the launch press release here, as well as pasted in at the end of this blog post for archival purposes. I'm sure that you'll also see quite a few articles blossom across the Web today relating to its announcement, but having been in on the planning, here's what it's all about.
The dominance of Microsoft's Office in the marketplace would be logical (if frustrating, to those who think that competition breeds better products) if it were simply a matter of developer seats. After all, Microsoft deployed hundreds, and then thousands, of engineers to develop and evolve its flagship app over the last 25 years. How could anyone expect a less well-funded commercial competitor, much less an open source project, to equal Office for features, performance and interoperability with other office suites?
At the same time, people keep trying - a lot of them. Not just long-established competitors, like Corel, with the venerable and estimable WordPerfect office suite it bought from Novell, but also open source projects like OpenOffice and KOffice, as well as projects launched by much larger players, such as IBM (Lotus Symphony) and Google (Docs).
WordPerfect aside, most of these offerings disappoint when it comes to round tripping documents with Office users, although many provide perfectly fine alternatives for stand-alone use, particularly by those that don't need to create the most complex business document.
The funny thing is, though, that the quality of the result, and even the ability to interoperate in a world dominated by Microsoft's Office, doesn't necessarily equate to the depth of the resources of the developer. Now isn't that an interesting observation?
Quote of the Day
“[D]o you want to [hand] a 500-page specification...to a light bulb manufacturer, or do you want source code that you can hand to that manufacturer that enables interoperability?”
- Linux Foundation Executive Director Jim Zemlin, on why open source software is replacing open standards
W3C Declares HTML5 Standard Complete Frederic Lardinois TechCrunch October 30, 2014 - More than four years ago, Steve Jobs declared war on Flash and heralded HTML5 as the way to go. You could be forgiven if you thought the HTML5 standard — the follow-up to 1997’s HTML 4 — has long been set in stone, given that developers, browser vendors and the press have been talking about it for years now. In reality, however, HTML5 was still in flux — until today. The W3C today published its Recommendation of HTML5 — the final version of the standard after years of adding features and making changes to it....the W3C today notes in its press release that the next version of the standard needs to focus on a number of core “application foundations” like tools for security and privacy, device interactions, application lifecycle, media and real-time communications and services around the social web, payments and annotations. All of these are meant to make it easier for developers to support the web platform.... ...Full Story
Alliance to Promote Multi-Gigabit Ethernet Technology for Enterprise Wired and Wireless Access Networks Press Release NBASE-T Alliance October 30, 2014 - Cisco, Aquantia, Freescale and Xilinx today announced that they have formed the NBASE-T Alliance, an industry-wide cooperative effort to promote the development of 2.5 and 5 Gigabit Ethernet (2.5GbE and 5GbE) technology for enterprise network infrastructure. The objective of the nonprofit organization is to advance multi-gigabit Ethernet technology that enables faster data rates on existing enterprise cabling originally designed for 1 Gigabit Ethernet (1GbE) technology....Early promoters Cisco, Aquantia, Freescale and Xilinx welcome interested parties to join the alliance and contribute to its objectives. More details can be found on the alliance website, at www.nbaset.org.
According to the Cisco Visual Networking Index (VNI), total mobile data traffic will surpass 30 exabytes per month in 2018. An estimated 52 percent of that traffic will be offloaded from cellular networks to the fixed network through WiFi, adding to the vast amount of wireless data transmitted over WLAN in enterprise branch and campus networks. The 802.11ac WiFi standard was developed to deal with this massive amount of wireless data. As Wave 2 of the technology gets introduced, traffic aggregated on APs will quickly surpass multiple gigabits per second, and therefore require both the access point and the Ethernet switch ports to scale beyond the 1GbE used in most networks....In most enterprise campus networks around the world, Category 5e (Cat5e) and Category 6 (Cat6) twisted-pair copper cables are the most commonly deployed. These cables do not support 10 Gigabit Ethernet (10GbE) up to 100 meters; therefore, the need for intermediate rates between 1 and 10 Gigabit has gained support throughout the industry. To advance the enormous potential for rates greater than 1GbE on legacy cabling, the NBASE-T Alliance founding companies teamed up to promote the development of 2.5GbE and 5GbE that will extend the life of the installed cable plant.... ...Full Story
NIFO observatory: interoperability platforms boost data exchange, eServices and eSignature EC Joinup October 29, 2014 - The National Interoperability Framework Observatory (NIFO) community is making available an updated series of NIFO factsheets. The updates track interoperability initiatives in European countries.
Recently published on the Joinup platform, the updated NIFO factsheets provide new information on interoperability for over half of the countries. The update replaces factsheets from May this year. The observatory identified new interoperability platforms in many fields, including data exchange, eServices and eSignature.... ...Full Story
Take Control With Open Source Hardware Carla Schroder Linux.com October 29, 2014 - Free and open source software are no good without open hardware. If we can't install our software on a piece of hardware, it's not good for anything. Truly open hardware is fully-programmable and replicable. So what is open hardware, exactly? OSHWA, the Open Source Hardware Association, defines it as:
"Open source hardware is hardware whose design is made publicly available so that anyone can study, modify, distribute, make, and sell the design or hardware based on that design. The hardware's source, the design from which it is made, is available in the preferred format for making modifications to it. Ideally, open source hardware uses readily-available components and materials, standard processes, open infrastructure, unrestricted content, and open-source design tools to maximize the ability of individuals to make and use hardware. Open source hardware gives people the freedom to control their technology while sharing knowledge and encouraging commerce through the open exchange of designs."... ...Full Story
NISO Launches Open Discovery Initiative (ODI) Standing Committee NISO October 28, 2014 - The National Information Standards Organization (NISO) is pleased to announce the next phase for the Open Discovery Initiative, a project that explores community interactions in the realm of indexed discovery services. Following the working group’s recommendation to create an ongoing standing committee as outlined in the published recommended practice, Open Discovery Initiative: Promoting Transparency in Discovery (NISO RP-19-2014), NISO has formed a new standing committee reflecting a balance of stakeholders, with member representation from content providers, discovery providers, and libraries. The ODI Standing Committee will promote education about adoption of the ODI Recommended Practice, provide support for content providers and discovery providers during adoption, conduct a forum for ongoing discussion related to all aspects of discovery platforms for all stakeholders, and determine timing for additional actions that were outlined in the recommended practice.... ...Full Story
How a USB key drive could remove the hassles from two-factor authentication Tony Bradley PC World October 28, 2014 - We've had enough malware campaigns and data breaches to confirm the need for better data protection online. The Universal 2nd Factor (U2F) standard is a step in the right direction, and the first compatible devices are coming out now.
U2F is an open authentication standard. It was initially developed by Google, but it's now managed by the FIDO (Fast Identity Online) Alliance....Two-factor, or multi-factor authentication has long been promoted as a more effective security mechanism, but it's a hassle, requiring you to juggle passwords with a second factor such as a texted code or an authentication app. U2F proposes to streamline the process using a U2F-enabled USB or NFC key fob, card, or mobile device alongside traditional authentication methods.... ...Full Story
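To make that flow a little more concrete, here is a minimal, hypothetical sketch of the browser side of enrolling such a key, using the WebAuthn API that grew out of the FIDO Alliance's U2F work. The site name, user details, and parameter choices below are illustrative assumptions, not details drawn from the article.

```typescript
// Hypothetical sketch: enrolling a U2F/FIDO2 security key from the browser.
// All identifiers below (example.com, alice@example.com, etc.) are made up.

async function registerSecurityKey(): Promise<Credential | null> {
  const publicKey: PublicKeyCredentialCreationOptions = {
    // In practice the challenge is a random nonce issued by the server.
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    rp: { name: "Example Service", id: "example.com" }, // the web site ("relying party")
    user: {
      id: new TextEncoder().encode("user-1234"),
      name: "alice@example.com",
      displayName: "Alice",
    },
    // -7 is the COSE identifier for ES256, the signature algorithm U2F keys use.
    pubKeyCredParams: [{ type: "public-key", alg: -7 }],
    // Treat the key purely as a second factor alongside the usual password.
    authenticatorSelection: { userVerification: "discouraged" },
    timeout: 60_000,
  };

  // The browser prompts the user to insert and tap their USB (or NFC) key.
  const credential = await navigator.credentials.create({ publicKey });

  // The resulting attestation would be sent back to the server, which stores
  // the credential's public key for use in later login challenges.
  return credential;
}
```

On the server side, a library implementing the FIDO2/WebAuthn verification steps would validate that attestation and then check the key's signature over a fresh challenge at each later login, which is how the key can stand in for texted codes and authenticator apps as the second factor.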
The Future of the Internet - 20 Years Ago The birth of Netscape and its browser Glyn Moody ComputerWorld.uk October 27, 2014 -
Last week, the following tweet appeared:
Netscape Navigator was released 20 years ago [last week]...The fall of Netscape was not entirely down to Microsoft's aggressive moves. Netscape made a number of serious missteps, and the quality of the Netscape Navigator code started deteriorating. Eventually, that led to most of the Netscape program being released as open source, and the creation of the Mozilla project - something I wrote about in detail in an Open Enterprise column published seven years ago.
But here, I'd like to dwell on that moment in October 1994 when the first beta version of Netscape Navigator was released, and many of us sensed that this was the start of a new era in computing. Below is a column I wrote at that time, exactly as it first appeared; I hope it conveys a little of the atmosphere of those heady times.... ...Full Story
NIST's Cloud Computing Roadmap Details Research Requirements and Action Plans NIST Techbeat October 27, 2014 - The National Institute of Standards and Technology (NIST) has published the final version of the US Government Cloud Computing Technology Roadmap, Volumes I and II. The roadmap focuses on strategic and tactical objectives to support the federal government’s accelerated adoption of cloud computing. This final document reflects the input from more than 200 comments on the initial draft received from around the world.
The roadmap leverages the strengths and resources of government, industry, academia and standards development organizations to support technology innovation in cloud computing.
The first volume, High-Priority Requirements to Further USG Agency Cloud Computing Adoption, describes the roadmap’s purpose and scope....The second volume, Useful Information for Cloud Adopters, introduces a conceptual model, the NIST Cloud Computing Reference Architecture and Taxonomy, and presents U.S. government cloud target business and technical use cases.... ...Full Story
ITU Plenipotentiary Conference elects Houlin Zhao as next Secretary-General ITU.org October 24, 2014 - The ITU Plenipotentiary Conference roundly endorsed Houlin Zhao of China as its next Secretary-General. Zhao will take office on 1 January 2015 for a term of four years, with the possibility of re-election for one additional four-year term.
The election took place in Busan, Republic of Korea, during the Plenary session of the PP-14 conference this morning. Zhao won the position with 152 votes, with 156 countries present and voting. He contested the position unopposed. Full election results are available here.
Addressing the conference after the vote, Zhao told some 2,000 conference participants from around the world that he would do his best to “fulfil ITU's mission, and, through our close cooperation, ensure ITU delivers services to the global telecommunication and information society at the highest level of excellence."... ...Full Story
Why Open Source is Replacing Open Standards Glyn Moody ComputerWorld.uk October 23, 2014 - ...Here's [Linux Foundation Executive Director Jim] Zemlin's perspective on why the Foundation is becoming involved in so many collaborative industry projects:
"Companies are now as the norm using open source to shed comunity R&D, to do collective innovation, particularly at the infrastructure layer, for almost every aspect of technology, not just Linux - SDN, IOT, network functions virtualisation, cloud computing, etc. What you have seen as a result is this proliferation of organisations who facilitate that development, on a very large professional scale. That's a permanent fixture of how the tech sector operates. We launch a new one of these about every 3 months. Next year we'll have many many more of these type of projects....The largest form of collaboration in the tech industry for 20 years was at standards development organisations - IEEE, ISO, W3C, these things - where in order for companies to interoperate, which was a requirement in tech, they would create a specification, and everyone would implement that. The tech sector is moving on to a world where, in the Internet of things [for example], do you want to have a 500-page specification that you hand to a light bulb manufacturer, or do you want source code that you can hand to that manufacturer that enables interoperability? I think that's a permanent fixture. People have figured out for a particular non-differentiating infrastucture they want to work on that through open source, rather than creating a spec."... ...Full Story