Welcome to ConsortiumInfo.org
Sunday, February 24 2008 @ 02:34 PM CST
Contributed by: Andy Updegrove
This rather long essay is in one sense a reply to the open letter recently released by Patrick Durusau, in which he suggested that it was time to acknowledge progress made and adopt OOXML. But it is also an explanation of why I have for the first time in my career become personally involved in supporting a standard. The reason is that I believe that we are at a watershed in public standards policy, and that there is much more at stake than ODF and OOXML. In this essay, I explain why I think we need to recognize the existence and vital importance of what I call “Civil ICT Standards,” and why more than simple technical compromises are needed to create them in order to protect our “Civil ICT Rights.”
As I write this entry, hundreds of people from around the world are converging on Geneva, Switzerland. 120 will meet behind closed doors to hold the final collaborative discussions that will determine whether OOXML will become an ISO/IEC standard. When their work is complete, not everyone will be pleased with the changes agreed upon, but all will acknowledge that the specification that eventually emerges will be much improved from the version that was originally submitted to Ecma two years ago.
Most will also agree that Microsoft’s customers and independent software vendors (ISVs) will be far better off with OOXML publicly available than they would if Microsoft had not offered the specification up at all.
To reach this final draft, hundreds of standards professionals in many nations have spent a great deal of time and effort, including many at Microsoft. And while Microsoft, working with Ecma, has not agreed to all of the changes that have been requested, my impression is that it has agreed to many that will, if implemented by Microsoft, require a substantial amount of work and technical compromise on its part.
Thursday, February 21 2008 @ 09:28 AM CST
Contributed by: Andy Updegrove
Microsoft has just made a major announcement relating to its core products and involving the degree and manner in which it will make the details of those products available to developers. The importance of the announcement was underlined by those that were brought together for the press event at which the decisions were announced: chief executive Steve Ballmer, chief software architect Ray Ozzie, senior vice president of the server and tools business Bob Muglia, and Brad Smith, the senior vice president and general counsel for legal and corporate affairs.
At first glance, this appears to be an important decision by Microsoft indicating a greater willingness to be both open and cooperative. There are a number of promises in the announcement that I like, including the commitment to publish a great deal of material on the Web, as well as the freedom that will be offered to developers to take certain actions without the necessity of first obtaining a license. However, I have not had the opportunity to read any of the supporting details, and those details will be extremely significant, especially as regards the open source community, where subtle differences in legal terms can permit use under some open source licenses, but not others.
Similarly, with respect to ODF, it will be important to see what kind of plug ins are made available, how they may be deployed, and also how effective (or ineffective) those translators may be. If they are not easy for individual Office users to install, or if their results are less than satisfactory, then this promise will sound hopeful but deliver little. I am disappointed that the press release does not, as I read it, indicate that Microsoft will ship Office with a "save to" ODF option already installed. This means that ODF will continue to be virtually the only important document format that Office will not support "out of the box."
Friday, February 08 2008 @ 08:25 AM CST
Contributed by: Andy Updegrove
The Wall Street Journal reported this morning that EU regulators have announced a third investigation into Microsoft's conduct on the desktop. This latest action demonstrates that while the EU has settled the case against Microsoft that ran for almost a decade, it remains as suspicious as ever regarding the software vendor's conduct, notwithstanding Microsoft's less combative stance in recent years. The news can be found in a story by Charles Forelle, bylined in Brussels this morning.
According to the Journal, the investigation will focus on whether Microsoft "violated antitrust laws during a struggle last year to ratify its Office software file format as an international standard." The article also says that the regulators are "stepping up scrutiny of the issue." The Journal cites the following as the type of activity it will look into:
In the months and weeks leading up to [last summer's vote on OOXML], Microsoft resellers and other allies joined standards bodies en masse -- helping swell the Italian group, for instance, from a half-dozen members to 85. Opponents said Microsoft stacked committees. People familiar with the matter say EU regulators are now questioning whether Microsoft's actions were illegal. Microsoft said at the time that any committee expansion had the effect of making more voices heard; it also said rival International Business Machines Corp. mobilized on the other side of the vote.
A Microsoft spokesman referred to a statement issued last month, in which the company said it would "cooperate fully" with the EU regulator and was "committed to ensuring" the company is in compliance with EU law.
Wednesday, January 30 2008 @ 06:21 AM CST
Contributed by: Andy Updegrove
As many of you are aware, Alex Brown will be the "Convenor" of the OOXML Ballot Resolution Meeting (BRM) that will run from February 25 through 29 in Geneva, Switzerland. Alex has a variety of unenviable tasks, including:
Trying to interpret various standing Directives and other ISO/IEC JTC1 rules and practices that were created for what might be described as kinder, gentler times (not to mention for shorter specifications).
Figuring out how to process c. 1,000 comments (after elimination of duplicates) during a 35-hour meeting week, with no extension currently contemplated.
Herding 120 cats, some of which will have strong opinions on individual points, others of which will have alternative suggestions on how to resolve a given point, and many of which may be just plain bewildered, due to the lack of time to be fully prepared.
For better or worse, the rules that Alex will be interpreting and applying are not as comprehensive, and certainly not as detailed, as the situation might demand to put everyone on exactly the same page regarding what should (or at least could) be done at many points in time. As a result, knowing how Alex's thoughts are shaping up is both interesting and important. To his credit, he has been generous about sharing those thoughts, and often how he arrived at them, at his blog, which can be found here.
While I've often linked to Alex's blog and have had a permanent link in the "Blogs I Read" category for some time, I'd like to point to Alex's latest entry, which covers several important points that others have recently blogged on. In many cases, Alex comes out differently than some others that have stated firm opinions, and since Alex has the gavel, his opinion will be the one that counts.
Thursday, January 17 2008 @ 01:03 PM CST
Contributed by: Andy Updegrove
If you're reading this blog entry, you've probably been following the battle between ODF and OOXML. If so, you may be thinking of that conflict as a classic standards war, but in fact, it goes much deeper than that label would suggest. What is happening between the proponents of ODF and OOXML is only a skirmish in a bigger battle that involves a fundamental reordering of forces, ideologies, stakeholders, and economics at the interface of society and information technology.
Today, open source software is challenging proprietary models, hundreds of millions of people in emerging societies are choosing their first computer platforms from a range of alternatives, major vendors are converting from product to service strategies, and software as a service is finally coming into its own - to mention only a few of the many forces that are transforming the realities that ruled the IT marketplace for decades. When the dust settles, the alignments and identities of the Great Powers of the IT world will be as different as were the Great Powers of the world at the end of the First World War.
It is in this light that the ODF vs. OOXML struggle should really be seen, and for this reason I've dedicated the latest issue of Standards Today to exploring these added dimensions on the eve of the OOXML Ballot Resolution Meeting that will begin on February 25 in Geneva, Switzerland.
Monday, January 14 2008 @ 10:50 AM CST
Contributed by: Andy Updegrove
Regulators in the EU today announced that they are opening two new investigations against Microsoft, this time focusing not on peripheral functionalities like media players, but on the core of Microsoft's business: its operating system and office suite software. The investigations are in response to a recent complaint filed by Norwegian browser developer Opera Software ASA and a 2006 complaint brought by the European Committee for Interoperable Systems (ECIS), which includes Microsoft rivals IBM, Nokia, Sun, RealNetworks and Oracle among its members.
Both investigations focus on the benefits that Microsoft gains by combining features, such as search and Windows Live, into its operating system. But the investigation sparked by the Opera complaint also includes some novel and interesting features, based upon Opera's contention that Microsoft's failure to conform Internet Explorer to prevailing open standards puts its competitors at a disadvantage (Opera also asks that either IE not be bundled with Windows, or that other browsers, including its own, should be included as well, with no browser being preset as a default).
The investigations will also look into whether Microsoft has failed to adequately open OOXML, or to take adequate measures to ensure that Office is "sufficiently interoperable" with competing products. This would seem to indicate that Microsoft's strategy of offering OOXML to Ecma, and then ISO/IEC JTC1, may fail to achieve its objective, whether or not OOXML is finally approved as a global standard.
Thursday, January 03 2008 @ 01:43 PM CST
Contributed by: Andy Updegrove
It's not often I find myself at a loss for words when I read something, but this is one of those times.
Or perhaps it would be more accurate to say that it isn't really necessary for me to add any words to the following news, other than to characterize them with a Latin phrase lawyers use: Res ipsa loquitur, which translates as "the thing speaks for itself." I'll give one clue, though: I've added this blog post to the "ODF and OOXML" folder. That's "OOXML" as in "the world must have this standard so that our customers can open the billions of documents that have already been created in older versions of" a certain office productivity suite.
So without further ado, here's the news, along with what a few other people have had to say about it [Update: see also the comments that readers have added below interpreting the original Microsoft information]:
Thursday, December 13 2007 @ 04:55 AM CST
Contributed by: Andy Updegrove
As the date for the February BRM (Ballot Resolution Meeting) on ISO/IEC JTC1 DIS 29500 (a/k/a Ecma 376, a/k/a Microsoft OOXML) approaches, more and more attention is being paid to how Ecma will propose the disposition of the comments submitted during the general voting period. This heightened interest is legitimately urgent, due both to the great number of comments that need to be resolved, even after elimination of duplicates, and to the late date upon which the proposed resolutions will be made public (the deadline, if memory serves, is January 19, while the BRM will commence its deliberations on February 25 of next year).
The words are therefore flying fast and furious at the many blogs covering this question, and tempers are rising in the comments appended to the posts of bloggers that have a direct interest in the outcome. A particularly contentious issue has been whether Ecma is trying to make it as easy as possible, or as difficult as possible while still scoring PR points, for interested parties to view proposed dispositions of comments, and whether it does, or does not, have the latitude under ISO rules to be more transparent. The fairly opaque, and sometimes contradictory, nature of those rules has not made the debate any easier, and gives rise to the possibility of confusion, at best, and serious mistakes, at worst, as Pamela Jones pointed out at Groklaw this morning.
The result is that there will be very little real data available to the general public until Ecma opens the curtains on January 19. And the import of what little data does become available is usually the subject of instant disagreement.
With that as prelude, I've pasted in the text of a press release at the end of this blog entry that Ecma issued yesterday. The release gives only a peek at some of the issues addressed in the new dispositions, giving varying degrees of detail on each area highlighted - but that's more than we've had to go on so far. Here is my summary of the press release and its significance, when viewed in the context of other reliable, available information:
Saturday, November 17 2007 @ 08:15 AM CST
Contributed by: Andy Updegrove
Those of us who live in America are currently in the midst of that most protracted, expensive and (often) tedious of all democratic processes: the quadrennial quest to find, and perhaps even elect, the most able leader to guide the nation into the future. Part and parcel of that spectacle is a seemingly endless torrent of printed words and video. These emanate from more than a dozen candidates, each of whom is trying to convince the electorate that he or she is The One, while at the same time hoping to avoid offering any point of vulnerability that can be exploited by the opposition.
It is an overwhelming and leveling experience for all concerned, electorate and candidates alike.
Out of the campaign cacophony of the last week emerged a handful of words from Senator and Democratic party hopeful Barack Obama that could not fail to catch my attention. He used them during the presidential debate held in Las Vegas, and they also appear in the "Innovation Agenda" that Obama had released a few days before. He announced this agenda in a speech he delivered on November 14 at an aptly selected venue: the Google campus in Mountain View, California. One of the pledges he made in the course of that speech reads in part as follows:
To seize this moment, we have to use technology to open up our democracy. It's no coincidence that one of the most secretive Administrations in history has favored special interests and pursued policies that could not stand up to sunlight. As President, I'll change that. I'll put government data online in universally accessible formats. [emphasis added]
A presidential candidate who includes "universally accessible formats" in his platform? How did that come about?
Friday, November 09 2007 @ 07:00 AM CST
Contributed by: Andy Updegrove
Wednesday I attended the W3C Technical Plenary Day festivities, which included a brief press conference with Tim Berners-Lee, interesting insights into the W3C's work in progress and future plans, and much more (you can view the agenda here). And it also gave me a chance to sit down with Chris Lilley, a W3C employee whose roles include Interaction Domain Leader, Co-Chair of the W3C SVG Working Group, W3C Graphics Activity Lead, and Co-Chair of the W3C Hypertext CG. What that combination of titles means is that he is the "go to" guy at the W3C for learning what the W3C's CDF standard is all about.
CDF is one of the very many useful projects that W3C has been laboring on, but not one that you would have been likely to have heard much about. Until recently, that is, when Gary Edwards, Sam Hiser and Marbux, the management (and perhaps sole remaining members) of the OpenDocument Foundation decided that CDF was the answer to all of the problems that ODF was designed to address. This announcement gave rise to a flurry of press attention that Sam Hiser has collected here. As others (such as Rob Weir) have already documented, these articles gave the Foundation's position far more attention than it deserved.
Quote of the Day
“A difficult issue that needs to be solved.”
— Ian Skerrett, VP of Marketing and Ecosystem at the Eclipse Foundation, commenting on the challenge of making the IoT secure
Latest News
Web Storage (Second Edition) is a W3C Recommendation
Press Release, W3C.org
May 2, 2016 - The Web Platform Working Group has published a W3C Recommendation of "Web Storage (Second Edition)." This specification defines an API for persistent data storage of key-value pair data in Web clients. It introduces two related mechanisms, similar to HTTP session cookies, for storing name-value pairs on the client side. The first mechanism is designed for scenarios where the user is carrying out a single transaction, but could be carrying out multiple transactions in different windows at the same time. The second mechanism is designed for storage that spans multiple windows, and lasts beyond the current session.... ...Full Story
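The two mechanisms the Recommendation describes are exposed in browsers as sessionStorage (the single-transaction, per-window mechanism) and localStorage (the mechanism that spans windows and outlives the session), both implementing the same Storage interface. As a rough sketch of that interface's semantics — string keys and string values, with null for missing entries — here is a minimal in-memory stand-in; MemoryStorage is an illustrative name for this sketch, not something defined by the spec:

```javascript
// Minimal in-memory sketch of the Storage interface from the Web
// Storage Recommendation (setItem/getItem/removeItem/clear/key/length).
// In a browser, window.localStorage and window.sessionStorage expose
// this same interface; this stand-in only illustrates the API shape.
class MemoryStorage {
  constructor() {
    this.pairs = new Map(); // Map preserves insertion order for key(n)
  }
  get length() {
    return this.pairs.size;
  }
  key(n) {
    // Name of the nth key, or null if n is out of range.
    return n < this.pairs.size ? [...this.pairs.keys()][n] : null;
  }
  getItem(name) {
    // Missing keys yield null, not undefined.
    return this.pairs.has(name) ? this.pairs.get(name) : null;
  }
  setItem(name, value) {
    // Keys and values are always coerced to strings, per the spec.
    this.pairs.set(String(name), String(value));
  }
  removeItem(name) {
    this.pairs.delete(name);
  }
  clear() {
    this.pairs.clear();
  }
}

const store = new MemoryStorage();
store.setItem("theme", "dark");
store.setItem("visits", 3);          // stored as the string "3"
console.log(store.getItem("theme")); // "dark"
console.log(store.length);           // 2
store.removeItem("theme");
console.log(store.getItem("theme")); // null
```

In a real page the same calls work on window.sessionStorage for per-session state and window.localStorage for state that persists across sessions, which is the practical difference between the spec's two mechanisms.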
Survey Highlights Security Concern Among IoT Developers
Programmable Web April 29, 2016 - According to the second annual IoT Developer Survey, security is the top concern of IoT developers. The survey, which polled 528 IoT developers, was conducted by the Eclipse IoT Working Group in partnership with the IEEE IoT and the AGILE-IoT research project.
Of developers working in organizations that have deployed IoT solutions, nearly half (48.3%) identified security as their leading concern. In the same group of respondents, interoperability and performance were the second and third biggest concerns, with 31.9% and 21%, respectively....
Not only can vulnerabilities in IoT applications be the source of privacy breaches; as the IoT extends its reach to things like cars, security vulnerabilities could theoretically put lives in danger....In this year's IoT Developer Survey, nearly half (46%) of those polled indicated that their company is developing and deploying IoT solutions, and 29% indicated that their company plans to within the next 18 months, suggesting that adoption of IoT technologies is accelerating.... ...Full Story
The advantages of open source in Internet of Things design
DesignWorldOnline April 28, 2016 - The Internet of Things is booming and with millions of devices to be connected over the coming years, many developers are focusing on the IoT opportunity....There are many commonalities between IoT solutions across different applications—the need for wireless connections, communication between devices and back-end systems, and data collection/interpretation are a few examples. But the proliferation of proprietary systems that are often in silos makes developing and building these solutions more complex and time consuming than needed. In a fast-moving, fragmented industry, open source technologies will play an increasingly fundamental role in mitigating these challenges and enabling seamless systems to further fuel innovation.
One way to circumvent the interoperability challenge is by establishing and using standards. Thoughtful and collaborative standardization improves choice and flexibility. As a result, developers can use devices from multiple vendors to build a solution that is innovative and meets their specific needs. We’ve outlined a few key channels that are essential to unlocking the potential of open source in IoT development.
Standards are necessary across the whole ecosystem and are being addressed by the industry in multiple ways. For example, industry standards organizations like oneM2M (a consortium of industry stakeholders) have developed technical specifications to address the need for a common M2M Service Layer that can be embedded within various hardware and software and relied on to connect a wide range of devices to M2M application servers.
Another complementary approach to standards development is the release of designs and specifications into the open source community as open hardware and interface standards for others to adopt. Examples include Arduino, Raspberry Pi, and Beaglebone, which enable quick prototyping, as well as the mangOH open hardware reference design, an open source design that is more easily scalable in commercial settings and is built specifically for IoT cellular connectivity.
Open source platforms like these enable developers that may have limited hardware, wireless or low-level software expertise to start developing IoT applications in days—rather than months. If executed properly, these can significantly reduce the time and effort to get prototypes from paper to production by ensuring that various connectors and sensors work together automatically with no additional coding required. With industrial-grade specifications, these next-generation platforms not only allow quick prototyping, but also rapid industrialization of IoT applications.
On the software side, using widely supported open source software application frameworks and development environments, such as Linux—itself an open source solution—can be extremely helpful by providing developers the head start that is required to get a product to market faster. When it comes to proprietary solutions, support for its development framework tends to rest on the original vendor, whose agenda may not align with the needs of the community. Open source solutions ensure a future-proof investment and longevity, so that resources and tools are available and continually enhanced for years to come....
To further advance the industry, we must commit to a standards-based and open-source strategy. Not only will it continue to be critical to the health of IoT innovation, but it will lay the groundwork for real innovation. Just as it supported many other areas of technology development—including nothing less than the Internet itself—open standards are the key to realizing the unforeseen benefits of a more connected world. ...Full Story
ANSI Energy Efficiency Standardization Coordination Collaborative (EESCC) Releases Roadmap Progress Report
ANSI.org April 27, 2016 - The American National Standards Institute (ANSI) Energy Efficiency Standardization Coordination Collaborative (EESCC) announced today the publication of a Progress Report detailing the standardization community’s activity to advance recommendations outlined in the EESCC’s Standardization Roadmap: Energy Efficiency in the Built Environment. Published in June 2014 to serve as a national framework for action and coordination, the roadmap identified gaps where standards and codes were needed to improve energy and water efficiency in the built environment.
Available as a free resource, the Progress Report features updates on 71 of the 109 standards-based gaps identified in the roadmap, demonstrating significant progress within the standardization community to advance energy and water efficiency through standards-based solutions. The report also includes a summary of all of the standards-based roadmap gaps, including those for which there is no known progress at this time, so that readers may easily identify opportunities to take action on closing the gaps.... ...Full Story
Anti-innovation: EU excludes open source from new tech standards
Ars Technica April 27, 2016 - As part of its Digital Single Market strategy, the European Commission has unveiled "plans to help European industry, SMEs, researchers and public authorities make the most of new technologies." In order to "boost innovation," the Commission wants to accelerate the creation of new standards for five buzzconcepts: 5G, cloud computing, internet of things, data technologies, and cybersecurity.
The key document is one entitled "ICT Standardisation Priorities for the Digital Single Market," which says: "Open standards ensure ... interoperability, and foster innovation and low market entry barriers in the Digital Single Market, including for access to media, cultural and educational content." The word "open" occurs 26 times in the document, and is also frequently found in the other "communications" just released by the European Commission: on digitising European industry (9 times), and on the European Cloud Initiative (50 times).
"Open" is generally used in the documents to denote "open standards," as in the quotation above. But the European Commission is surprisingly coy about what exactly that phrase means in this context. It is only on the penultimate page of the ICT Standardisation Priorities document that we finally read the following key piece of information: "ICT standardisation requires a balanced IPR [intellectual property rights] policy, based on FRAND licensing terms."...
The problem for open source is that standard licensing can be perfectly fair, reasonable, and non-discriminatory, but would nonetheless be impossible for open source code to implement. Typically, FRAND licensing requires a per-copy payment, but for free software, which can be shared any number of times, there's no way to keep tabs on just how many copies are out there. Even if the per-copy payment is tiny, it's still a licensing requirement that open source code cannot meet....Ars has asked the European Commission for comment on its decision to use FRAND, rather than a royalty-free approach. We'll update this story when the EC responds.... ...Full Story
Open Data Barometer 2015: 5 European countries in the Top 10
EU Joinup April 26, 2016 - Five European countries ranked in the top 10 of the 2015 Open Data Barometer, recently published by the World Wide Web Foundation.
The UK is still at the top of the barometer, but is now followed by the USA and France, both ranked second. France, which was third in 2014, received good marks in three criteria: government action, political impact, and citizens and civil rights.
Denmark ranked 5th and moved up by four positions. The Netherlands ranked 7th and Sweden 9th, with both losing ground (-1 for the former, -6 for the latter)....Other conclusions from 2015 include the fact that “Open Data is entering the mainstream”, with 55% of the 92 countries listed in the survey now having an open data initiative in place. However, almost 90% of data are still locked, the report said. Only 10% of the published data are open (following the open data definition), and even those are often of poor quality, “making it difficult for potential data users to access, process, and work with it effectively”.
Lastly, this Open Data Barometer warns about “open-washing” behavior, which is “jeopardizing progress”. “Open data initiatives cannot be effective if not supported by a culture of openness where citizens are encouraged to ask questions and engage, and are supported by a legal framework”, the report said. “Disturbingly, in this edition we saw a backslide on freedom of information, transparency, accountability, and privacy indicators in some countries.” ...Full Story
European Cloud Initiative to give Europe a global lead in the data-driven economy
European Commission April 25, 2016 - Europe is the largest producer of scientific data in the world, but insufficient and fragmented infrastructure means this 'big data' is not being exploited to its full potential. By bolstering and interconnecting existing research infrastructure, the Commission plans to create a new European Open Science Cloud that will offer Europe's 1.7 million researchers and 70 million science and technology professionals a virtual environment to store, share and re-use their data across disciplines and borders. This will be underpinned by the European Data Infrastructure, deploying the high-bandwidth networks, large scale storage facilities and super-computer capacity necessary to effectively access and process large datasets stored in the cloud. This world-class infrastructure will ensure Europe participates in the global race for high performance computing in line with its economic and knowledge potential.
While the initiative focuses initially on the scientific community, in Europe and among its global partners, the user base will over time be enlarged to the public sector and to industry. This initiative is part of a package of measures to strengthen Europe's position in data-driven innovation, to improve competitiveness and cohesion and to help create a Digital Single Market in Europe (press release)....The European Cloud Initiative will make it easier for researchers and innovators to access and re-use data, and will reduce the cost of data storage and high-performance analysis. Making research data openly available can help boost Europe's competitiveness by benefitting start-ups, SMEs and data-driven innovation, including in the fields of medicine and public health. It can even spur new industries, as demonstrated by the Human Genome Project.... ...Full Story
ANAB and ASCLD/LAB Merge Forensics Operations
ANSI.org Weekly News April 25, 2016 - The ANSI-ASQ National Accreditation Board (ANAB) has signed an affiliation agreement with the American Society of Crime Laboratory Directors/Laboratory Accreditation Board (ASCLD/LAB), merging ASCLD/LAB into ANAB.
Like ANAB, ASCLD/LAB provides accreditation based on international standards for public and private sector crime laboratories. Both ANAB and ASCLD/LAB are grounded in conducting scientific and technical assessments and committed to assuring competent and credible test and inspection results. The merger with ASCLD/LAB allows ANAB to enhance its expertise in the field of forensics accreditation while providing uninterrupted service to the customers of both organizations.... ...Full Story
Commission publishes reports on eGovernment and Standards public consultations
EU Joinup April 22, 2016 - Today the European Commission published the analysis reports on two public consultations: eGovernment Action Plan 2016-2020 and Standards....The majority of the respondents to the consultation on standardisation in the Digital Single Market supported the Commission’s initial problem analysis on ICT standardisation, in particular the need to define clearer priorities for core ICT related technologies. These recommendations to the Commission, along with the advice of the European Multi-Stakeholder Platform on ICT standardisation will form the basis for the Communication setting up priorities on ICT standardisation for the Digital Single Market.
Building on the results of these public consultations, the Commission proposed measures on the digitisation of European industry on 19 April. ...Full Story
LocalGovDigital agrees 15 service standards
UKAuthority.com April 22, 2016 - Agile methodologies, consistency with other government digital services, open standards and making use of common platforms are among the key features of the final draft of the Digital Service Standard for local government, which was released by the practitioners' group LocalGovDigital late last week....Open standards are highlighted as important, along with using existing data and registers, and where possible making source code and service data open and reusable.... ...Full Story