Sunday, February 24 2008 @ 02:34 PM CST
Contributed by: Andy Updegrove
This rather long essay is in one sense a reply to the open letter recently released by Patrick Durusau, in which he suggested that it was time to acknowledge progress made and adopt OOXML. But it is also an explanation of why I have for the first time in my career become personally involved in supporting a standard. The reason is that I believe that we are at a watershed in public standards policy, and that there is much more at stake than ODF and OOXML. In this essay, I explain why I think we need to recognize the existence and vital importance of what I call “Civil ICT Standards,” and why more than simple technical compromises are needed to create them in order to protect our “Civil ICT Rights.”
As I write this entry, hundreds of people from around the world are converging on Geneva, Switzerland. 120 will meet behind closed doors to hold the final collaborative discussions that will determine whether OOXML will become an ISO/IEC standard. When their work is complete, not everyone will be pleased with the changes agreed upon, but all will acknowledge that the specification that eventually emerges will be much improved from the version that was originally submitted to Ecma two years ago.
Most will also agree that Microsoft's customers and independent software vendors (ISVs) will be far better off with OOXML publicly available than they would be if Microsoft had not offered the specification up at all.
To reach this final draft, hundreds of standards professionals in many nations have spent a great deal of time and effort, including many at Microsoft. And while Microsoft, working with Ecma, has not agreed to all of the changes that have been requested, my impression is that it has agreed to many that will, if implemented by Microsoft, require a substantial amount of work and technical compromise on its part.
Thursday, February 21 2008 @ 09:28 AM CST
Contributed by: Andy Updegrove
Microsoft has just made a major announcement relating to its core products and involving the degree and manner in which it will make the details of those products available to developers. The importance of the announcement was underlined by those that were brought together for the press event at which the decisions were announced: chief executive Steve Ballmer, chief software architect Ray Ozzie, senior vice president of the server and tools business Bob Muglia, and Brad Smith, the senior vice president and general counsel for legal and corporate affairs.
At first glance, this appears to be an important decision by Microsoft indicating a greater willingness to be both open and cooperative. There are a number of promises in the announcement that I like, including the commitment to publish a great deal of material on the Web, as well as the freedom that will be offered to developers to take certain actions without the necessity of first obtaining a license. However, I have not had the opportunity to read any of the supporting details, and those details will be extremely significant, especially as regards the open source community, where subtle differences in legal terms can permit use under some open source licenses, but not others.
Similarly, with respect to ODF, it will be important to see what kind of plug-ins are made available, how they may be deployed, and also how effective (or ineffective) those translators may be. If they are not easy for individual Office users to install, or if their results are less than satisfactory, then this promise will sound hopeful but deliver little. I am disappointed that the press release does not, as I read it, indicate that Microsoft will ship Office with a "save to" ODF option already installed. This means that ODF will continue to be virtually the only important document format that Office will not support "out of the box."
Friday, February 08 2008 @ 08:25 AM CST
Contributed by: Andy Updegrove
The Wall Street Journal reported this morning that EU regulators have announced a third investigation into Microsoft's conduct on the desktop. This latest action demonstrates that while the EU has settled the case against Microsoft that ran for almost a decade, it remains as suspicious as ever regarding the software vendor's conduct, notwithstanding Microsoft's less combative stance in recent years. The news can be found in a story by Charles Forelle, bylined in Brussels this morning.
According to the Journal, the investigation will focus on whether Microsoft "violated antitrust laws during a struggle last year to ratify its Office software file format as an international standard." The article also says that the regulators are "stepping up scrutiny of the issue." The Journal cites the following as the type of activity it will look into:
In the months and weeks leading up to [last summer's vote on OOXML], Microsoft resellers and other allies joined standards bodies en masse -- helping swell the Italian group, for instance, from a half-dozen members to 85. Opponents said Microsoft stacked committees. People familiar with the matter say EU regulators are now questioning whether Microsoft's actions were illegal. Microsoft said at the time that any committee expansion had the effect of making more voices heard; it also said rival International Business Machines Corp. mobilized on the other side of the vote.
A Microsoft spokesman referred to a statement issued last month, in which the company said it would "cooperate fully" with the EU regulator and was "committed to ensuring" the company is in compliance with EU law.
Wednesday, January 30 2008 @ 06:21 AM CST
Contributed by: Andy Updegrove
As many of you are aware, Alex Brown will be the "Convenor" of the OOXML Ballot Resolution Meeting (BRM) that will run from February 25 through 29 in Geneva, Switzerland. Alex has a variety of unenviable tasks, including:
Trying to interpret various standing Directives and other ISO/IEC JTC1 rules and practices that were created for what might be described as kinder, gentler times (not to mention for shorter specifications).
Figuring out how to process c. 1,000 comments (after elimination of duplicates) during a 35-hour meeting week - roughly two minutes per comment - with no extension currently contemplated.
Herding 120 cats, some of whom will have strong opinions on individual points, others of whom will have alternative suggestions on how to resolve a given point, and many of whom may be just plain bewildered, due to the lack of time to be fully prepared.
For better or worse, the rules that Alex will be interpreting and applying are not as comprehensive, and certainly not as detailed, as the situation might demand to put everyone on exactly the same page regarding what should (or at least could) be done at many points in time. As a result, knowing how Alex's thoughts are shaping up is both interesting and important. To his credit, he has been generous about sharing those thoughts, and often how he arrived at them, at his blog, which can be found here.
While I've often linked to Alex's blog and have had a permanent link to it in the "Blogs I Read" category for some time, I'd like to point to Alex's latest entry, which covers several important points that others have recently blogged on. In many cases, Alex comes out differently from others who have stated firm opinions, and since Alex has the gavel, his opinion will be the one that counts.
Thursday, January 17 2008 @ 01:03 PM CST
Contributed by: Andy Updegrove
If you're reading this blog entry, you've probably been following the battle between ODF and OOXML. If so, you may be thinking of that conflict as a classic standards war, but in fact, it goes much deeper than that label would suggest. What is happening between the proponents of ODF and OOXML is only a skirmish in a bigger battle that involves a fundamental reordering of forces, ideologies, stakeholders, and economics at the interface of society and information technology.
Today, open source software is challenging proprietary models, hundreds of millions of people in emerging societies are choosing their first computer platforms from a range of alternatives, major vendors are converting from product to service strategies, and software as a service is finally coming into its own - to mention only a few of the many forces that are transforming the realities that ruled the IT marketplace for decades. When the dust settles, the alignments and identities of the Great Powers of the IT world will be as different as were the Great Powers of the world at the end of the First World War.
It is in this light that the ODF vs. OOXML struggle should really be seen, and for this reason I've dedicated the latest issue of Standards Today to exploring these added dimensions on the eve of the OOXML Ballot Resolution Meeting that will begin on February 25 in Geneva, Switzerland.
Monday, January 14 2008 @ 10:50 AM CST
Contributed by: Andy Updegrove
Regulators in the EU today announced that they are opening two new investigations against Microsoft, this time focusing not on peripheral functionalities like media players, but on the core of Microsoft's business: its operating system and office suite software. The investigations are in response to a recent complaint filed by Norwegian browser developer Opera Software ASA and a 2006 complaint brought by the European Committee for Interoperable Systems (ECIS), which includes Microsoft rivals IBM, Nokia, Sun, RealNetworks and Oracle among its members.
Both investigations focus on the benefits that Microsoft gains by combining features, such as search and Windows Live, into its operating system. But the investigation sparked by the Opera complaint also includes some novel and interesting features, based upon Opera's contention that Microsoft's failure to conform Internet Explorer to prevailing open standards puts its competitors at a disadvantage (Opera also asks that either IE not be bundled with Windows, or that other browsers, including its own, be included as well, with no browser preset as the default).
The investigations will also look into whether Microsoft has failed to adequately open OOXML, or to take adequate measures to ensure that Office is "sufficiently interoperable" with competing products. This would seem to indicate that Microsoft's strategy of offering OOXML to Ecma, and then ISO/IEC JTC1, may fail to achieve its objective, whether or not OOXML is finally approved as a global standard.
Thursday, January 03 2008 @ 01:43 PM CST
Contributed by: Andy Updegrove
It's not often I find myself at a loss for words when I read something, but this is one of those times.
Or perhaps it would be more accurate to say that it isn't really necessary for me to add any words to the following news, other than to characterize it with a Latin phrase lawyers use: Res ipsa loquitur, which translates as "the thing speaks for itself." I'll give one clue, though: I've added this blog post to the "ODF and OOXML" folder. That's "OOXML" as in "the world must have this standard so that our customers can open the billions of documents that have already been created in older versions of" a certain office productivity suite.
So without further ado, here's the news, along with what a few other people have had to say about it [Update: see also the comments that readers have added below interpreting the original Microsoft information]:
Thursday, December 13 2007 @ 04:55 AM CST
Contributed by: Andy Updegrove
As the date for the February BRM (Ballot Resolution Meeting) on ISO/IEC JTC1 DIS 29500 (a/k/a Ecma 376, a/k/a Microsoft OOXML) approaches, more and more attention is being paid to how Ecma will propose the disposition of the comments submitted during the general voting period. This heightened interest is legitimately urgent, due both to the great number of comments that need to be resolved, even after elimination of duplicates, and to the late date upon which the proposed resolutions will be made public (the deadline, if memory serves, is January 19, while the BRM will commence its deliberations on February 25 of next year).
The words are therefore flying fast and furious at the many blogs covering this question, and tempers are rising in the comments appended to the posts of those bloggers who have a direct interest in the outcome. A particularly contentious issue has been whether Ecma is trying to make it as easy as possible, or as difficult as possible while still scoring PR points, for interested parties to view proposed dispositions of comments, and whether it does, or does not, have the latitude under ISO rules to be more transparent. The fairly opaque, and sometimes contradictory, nature of those rules has not made the debate any easier, and gives rise to the possibility of confusion, at best, and serious mistakes, at worst, as Pamela Jones pointed out at Groklaw this morning.
The result is that there will be very little real data available to the general public until Ecma opens the curtains on January 19. And the import of what little data does become available is usually the subject of instant disagreement.
With that as prelude, I've pasted in the text of a press release at the end of this blog entry that Ecma issued yesterday. The release gives only a peek at some of the issues addressed in the new dispositions, giving varying degrees of detail on each area highlighted - but that's more than we've had to go on so far. Here is my summary of the press release and its significance, when viewed in the context of other reliable, available information:
Saturday, November 17 2007 @ 08:15 AM CST
Contributed by: Andy Updegrove
Those of us who live in America are currently in the midst of that most protracted, expensive and (often) tedious of all democratic processes: the quadrennial quest to find, and perhaps even elect, the most able leader to guide the nation into the future. Part and parcel of that spectacle is a seemingly endless torrent of printed words and video. These emanate from more than a dozen candidates, each of whom is trying to convince the electorate that he or she is The One, while at the same time hoping to avoid offering any point of vulnerability that can be exploited by the opposition.
It is an overwhelming and leveling experience for all concerned, electorate and candidates alike.
Out of the campaign cacophony of the last week emerged a handful of words from Senator and Democratic party hopeful Barack Obama that could not fail to catch my attention. He used them during the presidential debate held in Las Vegas, and they also appear in the "Innovation Agenda" that Obama had released a few days before. He announced this agenda in a speech he delivered on November 14 at an aptly selected venue: the Google campus in Mountain View, California. One of the pledges he made in the course of that speech reads in part as follows:
To seize this moment, we have to use technology to open up our democracy. It's no coincidence that one of the most secretive Administrations in history has favored special interests and pursued policies that could not stand up to sunlight. As President, I'll change that. I'll put government data online in universally accessible formats. [emphasis added]
A presidential candidate who is including "universally accessible formats" in his platform? How did that come about?
Friday, November 09 2007 @ 07:00 AM CST
Contributed by: Andy Updegrove
Wednesday I attended the W3C Technical Plenary Day festivities, which included a brief press conference with Tim Berners-Lee, interesting insights into the W3C's work in progress and future plans, and much more (you can view the agenda here). And it also gave me a chance to sit with Chris Lilley, a W3C employee whose roles include Interaction Domain Leader, Co-Chair of the W3C SVG Working Group, W3C Graphics Activity Lead, and Co-Chair of the W3C Hypertext CG. What that combination of titles means is that he is the "go to" guy at W3C to learn what W3C's CDF standard is all about.
CDF is one of the many useful projects that W3C has been laboring on, but not one that you would have been likely to have heard much about. Until recently, that is, when Gary Edwards, Sam Hiser and Marbux, the management (and perhaps sole remaining members) of the OpenDocument Foundation, decided that CDF was the answer to all of the problems that ODF was designed to address. This announcement gave rise to a flurry of press attention that Sam Hiser has collected here. As others (such as Rob Weir) have already documented, these articles gave the Foundation's position far more attention than it deserved.
Quote of the Day
“We are certain that the Internet of Things will only be successful if it is built on open technologies.”
- Eclipse Foundation Executive Director Mike Milinkovich
Latest News
A Data Model to Support the Publishing of Legislation as Linked Open Data
Jens Scheerlinck, EU Joinup
July 22, 2016 - Citizens, professionals in the legal domain, businesses and civil servants all need to know what legislation is in force. Legislation is often amended, repealed and codified, making it difficult to have a clear view of what text is in force at any specific point in time. In this context, the Hellenic Ministry of Interior and Administrative Reconstruction and the Italian Anti-corruption Agency contacted the ISA Programme of the European Commission to develop a pilot with the twofold objective of making legislation available in both human- and machine-readable formats and visualising the evolution of legislation over time, to enable user-friendly consultation.
To allow legislative information to be published as Open Data, a data model was proposed to support this publishing process. The suggested data model is based on the ELI ontology and extended with concepts from Akoma Ntoso and the Core Public Organisation Vocabulary, thereby facilitating interoperability with other EU Member States. The full pilot can be downloaded or forked from the SEMICeu GitHub repository, and the documentation on the data model can be consulted on the pilot website.
The Ministry has put the data model out for public deliberation until 15 July 2016. ...Full Story
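For readers curious what "publishing legislation as linked open data" looks like in practice, here is a minimal sketch using Python's rdflib library and the public ELI ontology namespace. The URIs, titles and dates below are hypothetical, and the pilot's actual model extends ELI with Akoma Ntoso and Core Public Organisation Vocabulary concepts not shown here.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

# Public ELI ontology namespace; the pilot's model extends this vocabulary.
ELI = Namespace("http://data.europa.eu/eli/ontology#")

g = Graph()
g.bind("eli", ELI)

# Hypothetical URIs for an act and a later act that amends it.
act = URIRef("http://example.org/eli/act/2014/100")
amending_act = URIRef("http://example.org/eli/act/2016/42")

g.add((act, RDF.type, ELI.LegalResource))
g.add((act, ELI.title, Literal("Example Act on Public Records", lang="en")))
g.add((act, ELI.date_document, Literal("2014-03-01", datatype=XSD.date)))

g.add((amending_act, RDF.type, ELI.LegalResource))
# eli:changes records that the 2016 act amends the 2014 act; chains of
# such links let a consumer reconstruct the text in force at a given date.
g.add((amending_act, ELI.changes, act))

print(g.serialize(format="turtle"))
```

Serialized as Turtle, amendment links like eli:changes are what allow a consuming application to walk the chain of modifications and determine which text was in force at a given date - the user-friendly consultation the pilot aims to enable.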
TC260 Drafts New Standard for China's Cloud Security Review Regime
USITO.org Weekly July 21, 2016 - Recently, TC260 has published the draft "Information Security Technology - Security Capability Evaluation Methods of Cloud Computing Services" for comments. The public comment period will end on August 11. This draft standard aims to provide guidance for third-party agencies on how to conduct cloud service capability evaluation via interviews, inspections and testing.
This standard, along with two others, covers guidelines for cloud service providers' size and operational experience, business dealings between cloud service providers and government customers, cloud computing services cybersecurity management and a range of other issues. The three standards have also been adopted as main references in the CAC's Cloud Computing Services Cybersecurity Review, which was announced on June 26, 2015 and targets services for Party and government departments. ...Full Story
IoT Security: What IoT Can Learn From Open Source
Businesses are hugely concerned about IoT
Datamation July 20, 2016 - When personal computers were introduced, few manufacturers worried about security. Not until the early 1990s did the need for security become widely understood. Today, the Internet of Things (IoT) is following the same pattern -- except that the need for security is becoming obvious far more quickly, and manufacturers should have known better, especially given the overwhelming influence of open source.
The figures speak for themselves. In 2014, a study by Hewlett-Packard found that seven out of ten IoT devices tested contained serious security vulnerabilities, an average of twenty-five per device. In particular, the vulnerabilities included a lack of encryption for local and Internet transfer of data, no enforcement of secure passwords, and inadequate security for downloaded updates. The devices tested included some of the most common IoT devices currently in use, including TVs, thermostats, fire alarms and door locks.
Given that Gartner predicts that 25 billion smart devices will be in use by 2020, no one needs to be a prophet to foresee a major security problem that will make even the security problems of the basic Internet seem insignificant....how have IoT manufacturers failed to be more security conscious?...
That smart devices, like OpenStack before them, are being built on the shoulders of open source is too obvious for anyone to doubt. In early 2015, VisionMobile's survey of 3,700 IoT developers indicated that 91% used open source in their work.
This figure suggests that, without open source, the development of the IoT would be much slower if it happened at all. If nothing else, the use of open source and open standards helps to reduce compatibility problems between manufacturers' devices.... ...Full Story
Ultracode Standard Introduced by AIM
AIM July 19, 2016 - AIM announced today the release of the Ultracode international standard, establishing a significant enhancement in barcode technology for the automatic identification and data capture (AIDC) industry and for consumer applications.
Ultracode is the first 2D, error-correcting color barcode; it can be displayed on smartphones or printed, and read using a digital color camera or smartphone app. Its development was motivated by the ubiquitous use of color electronic displays and digital cameras, and especially the development of the smartphone. Using Ultracode, standard color technology can create an image that encodes the same data in less than half the area of a QR Code, minimizing the display space required.
The effort to develop Ultracode as a formal standard began more than a decade ago.... ...Full Story
ITU announces new standard for High Dynamic Range TV
ITU July 18, 2016 - ITU has announced a new standard for High Dynamic Range Television that represents a major advance in television broadcasting. High Dynamic Range Television (HDR-TV) brings an incredible feeling of realism, building further on the superior colour fidelity of ITU’s Ultra-High Definition Television (UHDTV) Recommendation BT.2020. ITU’s Radiocommunication Sector (ITU-R) has developed the standard – or Recommendation – in collaboration with experts from the television industry, broadcasting organizations and regulatory institutions in its Study Group 6.
This latest ITU-R HDR-TV Recommendation BT.2100 brings a further boost to television images, giving viewers an enhanced visual experience with added realism. The HDR-TV Recommendation allows TV programmes to take full advantage of the new and much brighter display technologies. HDR-TV can make outdoor sunlit scenes appear brighter and more natural, adding highlights and sparkle. It enhances dimly lit interior and night scenes, revealing more detail in darker areas, giving TV producers the ability to reveal texture and subtle colours that are usually lost with existing Standard Dynamic Range TV.... ...Full Story
New NERC Rules for Critical Cyber Assets Expand the Scope of U.S. Federal Regulation to New Facilities and Practices
Lexology July 15, 2016 - As a result of federal legislation enacted after the large Northeast/Midwest blackout in 2003, electric utilities and other electric market participants in the United States are subject to mandatory reliability standards developed through stakeholder processes by the North American Electric Reliability Corporation (NERC) and enforced by the Federal Energy Regulatory Commission (FERC), with substantial financial penalties of up to US$1 million per day for each standard violation.
Among the categories of mandatory electric reliability standards are Critical Infrastructure Protection (CIP) standards that were first adopted in 2008. Those standards required owners and operators of “Critical Cyber Assets” (CCAs) to develop, maintain, and implement cybersecurity policies that cover, among other things, training and access restrictions for personnel with access to CCAs, procedures for managing electronic and physical security perimeters, software security, incident reporting and response planning, and recovery plans to restore CCAs following an incident.
In 2013, NERC proposed and FERC approved version 5 of the CIP standards, a wholesale revision and significant change in approach under the standards. The new standards will be phased in, starting on 1 July 2016. The most significant change in the version 5 standards is the methodology to be used and the requirements for identifying assets subject to the standards, as described below for standard CIP-002-5. The scope of the new standards is significantly broader than that of the prior version, and owners and operators of smaller electric generation and transmission facilities and generation control centers will now be subject to the CIP standards for the first time.... ...Full Story
Automotive Grade Linux wants to help open source your next car
Tech Republic July 15, 2016 - ...The [Linux Foundation] started Automotive Grade Linux (AGL) to create open source software solutions for automotive applications. Their initial focus is on In-Vehicle-Infotainment (IVI) and their long-term goals include the addition of instrument clusters and telematics systems. Already AGL has the likes of Ford, Jaguar, Land Rover, Mazda, Mitsubishi Motors, Nissan, Subaru, and Toyota on board and that list will only continue to grow....Instead of depending on a separate device to serve as the operating system to drive the platform, AGL will be a stand-alone platform...Because AGL is open source, car manufacturers won't be dealing with a collection of proprietary code that will work for a single model, only to have to turn around and purchase another collection of proprietary code for the next model. Instead, the manufacturer downloads the source for AGL and makes it work to their exact specifications each time. Couple this with the idea that, according to Emily Olin, senior PR representative for the Linux Foundation, most auto manufacturers don't want to hand over control to the likes of Google or Apple and AGL starts to make a lot of sense.... ...Full Story
A Call for Developing—and Using—Consensus Standards to Ensure the Quality of Cell Lines
NIST July 14, 2016 - Mainstays of biomedical research, permanent lines of cloned cells are used to study the biology of health and disease and to test prospective medical therapies. Yet, all too often, these apparent pillars of bioscience and biotechnology crumble because they are crafted from faulty starting materials: misidentified or cross-contaminated cell lines.
Writing in the June 2016 issue of PLOS Biology, scientists from the National Institute of Standards and Technology (NIST) call for “community action” to assemble a “comprehensive toolkit for assuring the quality of cell lines,” employed at the start of every study.
As important, they assert, more researchers and laboratories should use the tools that already exist. The NIST authors point to the American National Standard for authentication of human cell lines, which can be implemented to detect cell-line mix-ups and contamination before embarking on studies of cancer or other research using human cells.
Unfortunately, the four-year-old standard has not been widely adopted, even though cell-line authentication is a growing priority among funders and publishers of research.
Cell lines are populations of clones: genetically uniform animal or plant cells that are bioengineered to proliferate indefinitely in culture....A “high level of confidence” in published research results requires valid underpinning data on methods and materials—cell lines, instrument performance and more, explain the researchers, who work in the Biosystems and Biomaterials Division of NIST’s Material Measurement Laboratory. “One might argue that these control data are as important as the study data themselves.”...The authors advocate using inclusive, consensus standards-setting processes—like the one used for human cell-line authentication—to address these needs as well as to seize new opportunities that are arising with the commercialization of genome-sequencing technologies.... ...Full Story
Automotive Grade Linux Releases Unified Code Base 2.0
AGL/Linux Foundation July 13, 2016 - Automotive Grade Linux (AGL), a collaborative open source project developing a Linux-based, open platform for the connected car, today announced the release of AGL Unified Code Base (UCB) 2.0. Built from the ground up through a joint effort by automakers and suppliers, the AGL UCB is an In-Vehicle-Infotainment (IVI) platform that can serve as the de facto standard for the industry.
The latest version of the Linux distribution includes new features such as audio routing and rear seat display. Ideal for deploying navigation, communications, safety, security and infotainment functionality, the AGL UCB distribution is supported by a broad community of participants with significant contributions from AGL members.... ...Full Story
Agencies push for open standards across cloud services
GCN.com July 13, 2016 - Agencies are adopting a growing range of cloud solutions, but more-robust open standards would better support hybrid clouds and integrate cross-vendor workflows [according to the International Trade Administration’s CIO Joe Paiva]....While the open standards for web services and application programming interfaces allow ITA to easily move and exchange data on the web, Paiva said, there are no standards for workflows across multiple clouds’ application programming interfaces....Additionally, Paiva said he is pushing industry for open standards for workflow metadata as well, which would eliminate the need to use each vendor’s proprietary coding and settings and allow agencies to easily modify workflows, data and the way data is presented.... ...Full Story