Monday, June 08 2015 @ 01:20 AM CDT
Contributed by: Andy Updegrove
Last July, the UK Cabinet Office adopted a rule requiring government purchasers to limit their technology acquisitions to products that implement an established list of “open standards.” Last week, Sweden took another step down the same road as it further refined a list of information and communications technology (ICT) standards. That list currently comprises sixteen standards. A posting at the European Commission EU Joinup Web site reports that other standards are to be added this year.
Monday, September 15 2014 @ 09:57 AM CDT
Contributed by: Andy Updegrove
OpenForum Europe, an advocacy group focusing on IT openness in government, issued a press release earlier today announcing its launch of a new public Internet portal. At that site, anyone can report a government page that offers a document intended for collaborative use for download if that document is not available in an OpenDocument Format (ODF) compliant version. The portal is called FixMyDocuments.eu, and you can show your support for the initiative (as I have) by adding your name here (the first supporter listed is the EU's indomitable digital champion, Neelie Kroes).
Friday, January 04 2008 @ 06:24 AM CST
Contributed by: Andy Updegrove
This is the fifth chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All product names used below are registered trademarks of their vendors.
Chapter 5: Open Standards
One of the two articles of faith that Eric Kriss and Peter Quinn embraced in drafting their evolving Enterprise Technical Reference Model (ETRM) was this: products built to "open standards" are more desirable than those that aren't. Superficially, the concept made perfect sense – only buy products that you can mix and match. That way, you can take advantage of both price competition and a wide selection of alternative products from multiple vendors, each with its own value-adding features. And if things don't work out, well, you're not locked in, and can swap out the loser and shop for a winner.
But did that make as much sense with routers and software as it did with light bulbs and lamps? And in any event, if this was such a great idea, why hadn't their predecessors been demanding open standards-based products for years? Finally, what exactly was that word "open" supposed to mean?
To answer these questions properly requires a brief hop, skip and jump through the history of standards, from their origins up to the present. And that's what this chapter is about.
Friday, December 28 2007 @ 12:07 PM CST
Contributed by: Andy Updegrove
This is the fourth chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
Chapter 4 – Eric Kriss, Peter Quinn and the ETRM
By the end of December 2005, I had been blogging on ODF developments in Massachusetts for about four months, providing interviews, legal analysis and news as it happened. In those early days, not many bloggers were covering the ODF story, and email began to come my way from people that I had never met before, from as far away as Australia, and as near as the State House in Boston. Some began with, "This seems really important – what can I do to help?" Others contained important information that someone wanted to share, and that I was happy to receive.
One such email arrived just before Christmas in 2005. In its entirety, it read:
Enjoy reading your consortiuminfo blog ... keep it up.
Happy New Year,
This was a pleasant and welcome surprise. Until the end of September, Eric Kriss had been the Massachusetts Secretary of Administration and Finance, and therefore Peter Quinn's boss. Together, they had conceived, architected and launched the ambitious IT upgrade roadmap that in due course incorporated ODF into the state's procurement guidelines.
Monday, December 10 2007 @ 07:05 AM CST
Contributed by: Andy Updegrove
This is the third chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
This chapter was revised at 8:30 AM on 12/11/07, most significantly by adding the "Lessons applied" section.
Chapter 3: What a Difference a Decade Can Make
In 1980, Microsoft was a small software vendor that had built its business primarily on downsizing mainframe programming languages to a point where they could be used to program the desktop computers that were then coming to market. The five-year-old company had total revenues of $7,520,720, and BASIC, its first product, was still its most successful. By comparison, Apple Computer had already reached sales of $100 million, and the same year launched the largest public offering since the Ford Motor Company had itself gone public some twenty-four years before. Microsoft was therefore far smaller than the company that Steve Jobs and Steve Wozniak had formed a year after Bill Gates and Paul Allen sold their first product.
Moreover, in the years to come, PC-based word processing products like WordStar, and then WordPerfect, would become far more popular than Microsoft's own first word processing program (originally called Multitool Word), providing low-cost alternatives to the proprietary minicomputer based software offerings of vendors like Wang Laboratories. IBM, too, provided a word processing program for the PC called DisplayWriter. That software was based on a similar program that IBM had developed for its mainframe systems customers. More importantly, another program was launched at just the right time to dramatically accelerate the sale of IBM PCs and their clones. That product was the legendary "killer app" of the IBM PC clone market: Lotus 1-2-3, the spreadsheet software upon which Mitch Kapor built the fortunes of his Lotus Development Corporation.
Sunday, December 02 2007 @ 02:07 PM CST
Contributed by: Andy Updegrove
This is the second chapter in a real-time eBook writing project I launched and explained last week. The following is one of a number of stage-setting chapters to follow. Comments, corrections and suggestions gratefully accepted. All Microsoft product names used below are registered trademarks of Microsoft.
Chapter 2 – Products, Innovation and Market Share
Microsoft is the envy of many vendors for the hugely dominant position it enjoys in two key product areas: PC desktop operating systems – the software that enables and controls the core functions of personal computers - and "office productivity software" – the software applications most often utilized by PC users, whether at work or at home, to create documents, slides and spreadsheets and meet other common needs. Microsoft's 90% plus market share in such fundamental products is almost unprecedented in the technical marketplace, and this monopoly position enables it to charge top dollar for such software. It also makes it easy for Microsoft to sell other products and services to the same customers.
Microsoft acquired this enviable position in each case through a combination of luck, single-minded determination, obsessive attention to detail, and a willingness to play the game fast and hard – sometimes hard enough to attract the attention of both Federal and state antitrust regulators. Early on, Bill Gates and his team acquired a reputation for bare-knuckle tactics that they sometimes seemed to wear with brash pride. Eventually, these tactics (as well as tales of Gates' internal management style) progressed from industry rumors to the stuff of best sellers, like Hard Drive: Bill Gates and the Making of the Microsoft Empire.
With the emergence of the Web, of course, the opportunity for widely sharing stories, both real (of which there were many) and apocryphal, exploded. Soon Web sites such as Say No to Monopolies: Boycott Microsoft enthusiastically collected and posted tales of alleged technological terror and dirty deeds. More staid collections were posted at sites such as Wikipedia. The increasing tide of litigation involving Microsoft, launched not only by state and federal regulators but by private parties as well, generated embarrassing documents. Such original sources were not only difficult to deny, but almost impossible to repress in the age of the Web - and of peer-to-peer file sharing as well.
Moreover, while Bill Gates and his co-founders rarely displayed the creative and innovative flair of contemporaries like Apple's Steve Jobs, neither were they troubled by the type of "not invented here" bias that led other vendors to pursue unique roads that sometimes ended in dead ends.
Sunday, November 25 2007 @ 02:51 PM CST
Contributed by: Andy Updegrove
For some time I've been considering writing a book about what has become a standards war of truly epic proportions. I refer, of course, to the ongoing, ever-expanding, still escalating conflict between ODF and OOXML, a battle that is playing out across five continents, in the halls of government and the marketplace alike. And, needless to say, at countless blogs and news sites all the Web over as well.
Arrayed on one side or the other, either in the forefront of battle or behind the scenes, are most of the major IT vendors of our time. And at the center of the conflict is Microsoft, the most successful software vendor of all time, faced with the first significant challenge ever to one of its core businesses and profit centers – its flagship Office productivity suite.
Quote of the Day
“A difficult issue that needs to be solved”
-Ian Skerrett, VP of Marketing and Ecosystem at the Eclipse Foundation, commenting on the challenge of making the IoT secure
Latest News
Survey Highlights Security Concern Among IoT Developers
Patricio Robles, Programmable Web
April 29, 2016 - According to the second annual IoT Developer Survey, security is the top concern of IoT developers. The survey, which polled 528 IoT developers, was conducted by the Eclipse IoT Working Group in partnership with the IEEE IoT and the AGILE-IoT research project.
Of developers working in organizations that have deployed IoT solutions, nearly half (48.3%) identified security as their leading concern. In the same group of respondents, interoperability and performance were the second and third biggest concerns, with 31.9% and 21%, respectively....
Not only can vulnerabilities in IoT applications be the source of privacy breaches; as the IoT extends its reach to things like cars, security vulnerabilities could theoretically put lives in danger....In this year's IoT Developer Survey, nearly half (46%) of those polled indicated that their company is developing and deploying IoT solutions, and 29% indicated that their company plans to within the next 18 months, suggesting that adoption of IoT technologies is accelerating.... ...Full Story
The advantages of open source in Internet of Things design
DesignWorldOnline April 28, 2016 - The Internet of Things is booming and with millions of devices to be connected over the coming years, many developers are focusing on the IoT opportunity....There are many commonalities between IoT solutions across different applications—the need for wireless connections, communication between devices and back-end systems, and data collection/interpretation are a few examples. But the proliferation of proprietary systems that are often in silos makes developing and building these solutions more complex and time consuming than needed. In a fast-moving, fragmented industry, open source technologies will play an increasingly fundamental role in mitigating these challenges and enabling seamless systems to further fuel innovation.
One way to circumvent the interoperability challenge is by establishing and using standards. Thoughtful and collaborative standardization improves choice and flexibility. As a result, developers can use devices from multiple vendors to build a solution that is innovative and meets their specific needs. We’ve outlined a few key channels that are essential to unlocking the potential of open source in IoT development.
Standards are necessary across the whole ecosystem and are being addressed by the industry in multiple ways. For example, industry standards organizations like oneM2M (a consortium of industry stakeholders) have developed technical specifications to address the need for a common M2M Service Layer that can be embedded within various hardware and software and relied on to connect a wide range of devices to M2M application servers.
Another complementary approach to standards development is the release of designs and specifications into the open source community as open hardware and interface standards for others to adopt. Examples include Arduino, Raspberry Pi, and Beaglebone, which enable quick prototyping, as well as the mangOH open hardware reference design, an open source design that is more easily scalable in commercial settings and is built specifically for IoT cellular connectivity.
Open source platforms like these enable developers that may have limited hardware, wireless or low-level software expertise to start developing IoT applications in days—rather than months. If executed properly, these can significantly reduce the time and effort to get prototypes from paper to production by ensuring that various connectors and sensors work together automatically with no additional coding required. With industrial-grade specifications, these next-generation platforms not only allow quick prototyping, but also rapid industrialization of IoT applications.
On the software side, using widely supported open source software application frameworks and development environments, such as Linux—itself an open source solution—can be extremely helpful by providing developers the head start that is required to get a product to market faster. When it comes to proprietary solutions, support for its development framework tends to rest on the original vendor, whose agenda may not align with the needs of the community. Open source solutions ensure a future-proof investment and longevity, so that resources and tools are available and continually enhanced for years to come....
To further advance the industry, we must commit to a standards-based and open-source strategy. Not only will it continue to be critical to the health of IoT innovation, but it will lay the groundwork for innovations yet to come. Just as they have supported many other areas of technology development—including nothing less than the Internet itself—open standards are the key to realizing the unforeseen benefits of a more connected world. ...Full Story
ANSI Energy Efficiency Standardization Coordination Collaborative (EESCC) Releases Roadmap Progress Report
ANSI.org April 27, 2016 - The American National Standards Institute (ANSI) Energy Efficiency Standardization Coordination Collaborative (EESCC) announced today the publication of a Progress Report detailing the standardization community’s activity to advance recommendations outlined in the EESCC’s Standardization Roadmap: Energy Efficiency in the Built Environment. Published in June 2014 to serve as a national framework for action and coordination, the roadmap identified gaps where standards and codes were needed to improve energy and water efficiency in the built environment.
Available as a free resource, the Progress Report features updates on 71 of the 109 standards-based gaps identified in the roadmap, demonstrating significant progress within the standardization community to advance energy and water efficiency through standards-based solutions. The report also includes a summary of all of the standards-based roadmap gaps, including those for which there is no known progress at this time, so that readers may easily identify opportunities to take action on closing the gaps.... ...Full Story
Anti-innovation: EU excludes open source from new tech standards
Ars Technica April 27, 2016 - As part of its Digital Single Market strategy, the European Commission has unveiled "plans to help European industry, SMEs, researchers and public authorities make the most of new technologies." In order to "boost innovation," the Commission wants to accelerate the creation of new standards for five buzzconcepts: 5G, cloud computing, internet of things, data technologies, and cybersecurity.
The key document is one entitled "ICT Standardisation Priorities for the Digital Single Market," which says: "Open standards ensure ... interoperability, and foster innovation and low market entry barriers in the Digital Single Market, including for access to media, cultural and educational content." The word "open" occurs 26 times in the document, and is also frequently found in the other "communications" just released by the European Commission: on digitising European industry (9 times), and on the European Cloud Initiative (50 times).
"Open" is generally used in the documents to denote "open standards," as in the quotation above. But the European Commission is surprisingly coy about what exactly that phrase means in this context. It is only on the penultimate page of the ICT Standardisation Priorities document that we finally read the following key piece of information: "ICT standardisation requires a balanced IPR [intellectual property rights] policy, based on FRAND licensing terms."...
The problem for open source is that standard licensing can be perfectly fair, reasonable, and non-discriminatory, but would nonetheless be impossible for open source code to implement. Typically, FRAND licensing requires a per-copy payment, but for free software, which can be shared any number of times, there's no way to keep tabs on just how many copies are out there. Even if the per-copy payment is tiny, it's still a licensing requirement that open source code cannot meet....Ars has asked the European Commission for comment on its decision to use FRAND, rather than a royalty-free approach. We'll update this story when the EC responds.... ...Full Story
Open Data Barometer 2015: 5 European countries in the Top 10
EU Joinup April 26, 2016 - Five European countries ranked in the top 10 of the 2015 Open Data Barometer, recently published by the World Wide Web Foundation.
The UK is still at the top of the barometer, but is now followed by the USA and France, both ranked second. France, which was third in 2014, received good marks in three criteria: government action, political impact, and citizens and civil rights.
Denmark ranked 5th and moved up by four positions. The Netherlands ranked 7th and Sweden 9th, with both losing ground (-1 for the former, -6 for the latter)....Other conclusions from 2015 include the fact that “Open Data is entering the mainstream”, with 55% of the 92 countries listed in the survey now having an open data initiative in place. However, almost 90% of data are still locked, the report said. Only 10% of the published data are open (following the open data definition), and what is published is often of poor quality, “making it difficult for potential data users to access, process, and work with it effectively”.
Lastly, this Open Data Barometer warns about “open-washing” behavior, which is “jeopardizing progress”. “Open data initiatives cannot be effective if not supported by a culture of openness where citizens are encouraged to ask questions and engage, and are supported by a legal framework”, the report said. “Disturbingly, in this edition we saw a backslide on freedom of information, transparency, accountability, and privacy indicators in some countries.” ...Full Story
European Cloud Initiative to give Europe a global lead in the data-driven economy
European Commission April 25, 2016 - Europe is the largest producer of scientific data in the world, but insufficient and fragmented infrastructure means this 'big data' is not being exploited to its full potential. By bolstering and interconnecting existing research infrastructure, the Commission plans to create a new European Open Science Cloud that will offer Europe's 1.7 million researchers and 70 million science and technology professionals a virtual environment to store, share and re-use their data across disciplines and borders. This will be underpinned by the European Data Infrastructure, deploying the high-bandwidth networks, large scale storage facilities and super-computer capacity necessary to effectively access and process large datasets stored in the cloud. This world-class infrastructure will ensure Europe participates in the global race for high performance computing in line with its economic and knowledge potential.
Focusing initially on the scientific community (in Europe and among its global partners), the initiative will over time enlarge its user base to the public sector and to industry. This initiative is part of a package of measures to strengthen Europe's position in data-driven innovation, to improve competitiveness and cohesion and to help create a Digital Single Market in Europe (press release)....The European Cloud Initiative will make it easier for researchers and innovators to access and re-use data, and will reduce the cost of data storage and high-performance analysis. Making research data openly available can help boost Europe's competitiveness by benefitting start-ups, SMEs and data-driven innovation, including in the fields of medicine and public health. It can even spur new industries, as demonstrated by the Human Genome Project.... ...Full Story
ANAB and ASCLD/LAB Merge Forensics Operations
ANSI.org Weekly News April 25, 2016 - The ANSI-ASQ National Accreditation Board (ANAB) has signed an affiliation agreement with the American Society of Crime Laboratory Directors/Laboratory Accreditation Board (ASCLD/LAB), merging ASCLD/LAB into ANAB.
Like ANAB, ASCLD/LAB provides accreditation based on international standards for public and private sector crime laboratories. Both ANAB and ASCLD/LAB are grounded in conducting scientific and technical assessments and committed to assuring competent and credible test and inspection results. The merger with ASCLD/LAB allows ANAB to enhance its expertise in the field of forensics accreditation while providing uninterrupted service to the customers of both organizations.... ...Full Story
Commission publishes reports on eGovernment and Standards public consultations
EU Joinup April 22, 2016 - Today the European Commission published the analysis reports on two public consultations: eGovernment Action Plan 2016-2020 and Standards....The majority of the respondents to the consultation on standardisation in the Digital Single Market supported the Commission’s initial problem analysis on ICT standardisation, in particular the need to define clearer priorities for core ICT related technologies. These recommendations to the Commission, along with the advice of the European Multi-Stakeholder Platform on ICT standardisation will form the basis for the Communication setting up priorities on ICT standardisation for the Digital Single Market.
Building on today's results of the public consultations, the Commission proposed measures on the digitisation of European industry on 19 April. ...Full Story
LocalGovDigital agrees 15 service standards
UKAuthority.com April 22, 2016 - Agile methodologies, consistency with other government digital services, open standards and making use of common platforms are among the key features of the final draft of the Digital Service Standard for local government, which was released by the practitioners' group LocalGovDigital late last week....Open standards are highlighted as important, along with using existing data and registers, and where possible making source code and service data open and reusable.... ...Full Story
NIST Releases New Document on its Cryptographic Standards and Guidelines Process
NIST Techbeat April 21, 2016 - The National Institute of Standards and Technology (NIST) has released the final version of a document outlining its process for developing cryptographic standards and guidelines. NIST Cryptographic Standards and Guidelines Development Process (NISTIR 7977) is an integral part of NIST’s effort to ensure a robust, widely understood and participatory process for developing cryptography, which is the technology used to store and transmit data in a particular form so it can only be read or processed by the intended recipient....The “global acceptability” principle was added to this final draft in response to public comments and reflects the global nature of today’s commerce. The document also explains the different types of cryptographic publications NIST releases and how they are made available for public review, as well as how they are managed over their lifecycle....NIST acknowledges the “possibility for tension between NIST’s mission to promulgate the use of strong cryptography, and the law enforcement and national security missions of other agencies,” and affirms that it makes independent decisions and is committed to using open and transparent processes.... The final document can be found on NIST’s website. ...Full Story