Welcome to ConsortiumInfo.org
Friday, July 29 2016 @ 10:58 AM CDT
Monday, June 08 2015 @ 01:20 AM CDT
Contributed by: Andy Updegrove
Last July, the UK Cabinet Office adopted a rule requiring government purchasers to limit their technology acquisitions to products that implement an established list of “open standards.” Last week, Sweden took another step down the same road as it further refined a list of information and communications technology (ICT) standards. That list currently comprises sixteen standards. A posting at the European Commission EU Joinup Web site reports that other standards are to be added this year.
Monday, September 15 2014 @ 09:57 AM CDT
Contributed by: Andy Updegrove
OpenForum Europe, an advocacy group focusing on IT openness in government, issued a press release earlier today announcing the launch of a new public Internet portal. At that site, anyone can report a government page that offers a document intended for collaborative use for download when that document is not available in an OpenDocument Format (ODF) compliant version. The portal is called FixMyDocuments.eu, and you can show your support for the initiative (as I have) by adding your name here (the first supporter listed is the EU's indomitable digital champion, Neelie Kroes).
Friday, January 04 2008 @ 06:24 AM CST
Contributed by: Andy Updegrove
This is the fifth chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All product names used below are registered trademarks of their vendors.
Chapter 5: Open Standards
One of the two articles of faith that Eric Kriss and Peter Quinn embraced in drafting their evolving Enterprise Technical Reference Model (ETRM) was this: products built to "open standards" are more desirable than those that aren't. Superficially, the concept made perfect sense – only buy products that you can mix and match. That way, you can take advantage of both price competition and a wide selection of alternative products from multiple vendors, each with its own value-adding features. And if things don't work out, well, you're not locked in, and can swap out the loser and shop for a winner.
But did that make as much sense with routers and software as it did with light bulbs and lamps? And in any event, if this was such a great idea, why hadn't their predecessors been demanding open standards-based products for years? Finally, what exactly was that word "open" supposed to mean?
To answer these questions properly requires a brief hop, skip and jump through the history of standards, from their origins up to the present. And that's what this chapter is about.
Friday, December 28 2007 @ 12:07 PM CST
Contributed by: Andy Updegrove
This is the fourth chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
Chapter 4 – Eric Kriss, Peter Quinn and the ETRM
By the end of December 2005, I had been blogging on ODF developments in Massachusetts for about four months, providing interviews, legal analysis and news as it happened. In those early days, not many bloggers were covering the ODF story, and email began to come my way from people that I had never met before, from as far away as Australia, and as near as the State House in Boston. Some began with, "This seems really important – what can I do to help?" Others contained important information that someone wanted to share, and that I was happy to receive.
One such email arrived just before Christmas in 2005. In its entirety, it read:
Enjoy reading your consortiuminfo blog ... keep it up.
Happy New Year,
This was a pleasant and welcome surprise. Until the end of September, Eric Kriss had been the Massachusetts Secretary of Administration and Finance, and therefore Peter Quinn's boss. Together, they had conceived, architected and launched the ambitious IT upgrade roadmap that in due course incorporated ODF into the state's procurement guidelines.
Monday, December 10 2007 @ 07:05 AM CST
Contributed by: Andy Updegrove
This is the third chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
This chapter was revised at 8:30 AM on 12/11/07, most significantly by adding the "Lessons applied" section.
Chapter 3: What a Difference a Decade Can Make
In 1980, Microsoft was a small software vendor that had built its business primarily on downsizing mainframe programming languages to a point where they could be used to program the desktop computers that were then coming to market. The five-year-old company had total revenues of $7,520,720, and BASIC, its first product, was still its most successful. By comparison, Apple Computer had already reached sales of $100 million, and the same year launched the largest public offering since the Ford Motor Company had itself gone public some twenty-four years before. Microsoft was therefore far smaller than the company that Steve Jobs and Steve Wozniak had formed a year after Bill Gates and Paul Allen sold their first product.
Moreover, in the years to come, PC-based word processing products like WordStar, and then WordPerfect, would become far more popular than Microsoft's own first word processor (originally called Multi-Tool Word), providing low-cost alternatives to the proprietary, minicomputer-based software offerings of vendors like Wang Laboratories. IBM, too, provided a word processing program for the PC called DisplayWriter. That software was based on a similar program that IBM had developed for its mainframe systems customers. More importantly, another program was launched at just the right time to dramatically accelerate the sale of IBM PCs and their clones. That product was the legendary "killer app" of the IBM PC clone market: Lotus 1-2-3, the spreadsheet software upon which Mitch Kapor built the fortunes of his Lotus Development Corporation.
Sunday, December 02 2007 @ 02:07 PM CST
Contributed by: Andy Updegrove
This is the second chapter in a real-time eBook writing project I launched and explained last week. The following is one of a number of stage-setting chapters to follow. Comments, corrections and suggestions gratefully accepted. All Microsoft product names used below are registered trademarks of Microsoft.
Chapter 2 – Products, Innovation and Market Share
Microsoft is the envy of many vendors for the hugely dominant position it enjoys in two key product areas: PC desktop operating systems – the software that enables and controls the core functions of personal computers – and "office productivity software" – the software applications most often utilized by PC users, whether at work or at home, to create documents, slides and spreadsheets and meet other common needs. Microsoft's 90%-plus market share in such fundamental products is almost unprecedented in the technical marketplace, and this monopoly position enables it to charge top dollar for such software. It also makes it easy for Microsoft to sell other products and services to the same customers.
Microsoft acquired this enviable position in each case through a combination of luck, single-minded determination, obsessive attention to detail, and a willingness to play the game fast and hard – sometimes hard enough to attract the attention of both Federal and state antitrust regulators. Early on, Bill Gates and his team acquired a reputation for bare-knuckle tactics that they sometimes seemed to wear with brash pride. Eventually, these tactics (as well as tales of Gates's internal management style) progressed from industry rumors to the stuff of best sellers, like Hard Drive: Bill Gates and the Making of the Microsoft Empire.
With the emergence of the Web, of course, the opportunity for widely sharing stories, both real (of which there were many) and apocryphal, exploded. Soon Web sites such as Say No to Monopolies: Boycott Microsoft enthusiastically collected and posted tales of alleged technological terror and dirty deeds. More staid collections were posted at sites such as Wikipedia. The increasing tide of litigation involving Microsoft, launched not only by state and federal regulators but by private parties as well, generated embarrassing documents. Such original sources were not only difficult to deny, but almost impossible to repress in the age of the Web – and of peer-to-peer file sharing as well.
Moreover, while Bill Gates and his co-founders rarely displayed the creative and innovative flair of contemporaries like Apple's Steve Jobs, neither were they troubled by the type of "not invented here" bias that led other vendors down unique roads that sometimes ended in dead ends.
Sunday, November 25 2007 @ 02:51 PM CST
Contributed by: Andy Updegrove
For some time I've been considering writing a book about what has become a standards war of truly epic proportions. I refer, of course, to the ongoing, ever expanding, still escalating conflict between ODF and OOXML, a battle that is playing out across five continents, in the halls of government and the marketplace alike. And, needless to say, on countless blogs and news sites across the Web as well.
Arrayed on one side or the other, either in the forefront of battle or behind the scenes, are most of the major IT vendors of our time. And at the center of the conflict is Microsoft, the most successful software vendor of all time, faced with the first significant challenge ever to one of its core businesses and profit centers – its flagship Office productivity suite.
Quote of the Day
“We are certain that the Internet of Things will only be successful if it is built on open technologies.”
- Eclipse Foundation Executive Director Mike Milinkovich
Latest News
Get ready to use your fingerprints to make bank withdrawals
Dave Chambers, BDLive
July 28, 2016 - Using a PIN for cash withdrawals and credit card transactions will soon seem as old-fashioned as signing your name.
The Payments Association of SA has announced a new standard for biometric authentication, which will mean fingerprints, palms, voices, irises and even faces can be used to identify cardholders at any bank or shop.
But the association says it has no plans to force businesses to use the technology, and none of the big banks has plans to use biometrics. ... The technology would be enforced only when there was large-scale deployment.
Bob Reany, global boss of identity solutions at MasterCard, predicted the first biometrics payment could happen as early as this year, because smaller players wanted to innovate to be seen as "cool".
The new system will enable fingerprints to be securely accepted by a biometric reader, encrypted, then validated. Reany said it would prevent fraud, as there would be no passport database.
Fingerprints would be saved only on the user’s bank card chip. ...Full Story
EC to audit Apache HTTP Server and Keepass
EC Joinup July 27, 2016 - The European Commission is preparing a software source code security audit on two software solutions, Apache HTTP server and Keepass, a password manager. The source code will be analysed and tested for potential security problems, and the results will be shared with the software developers. The audits will start in the coming weeks.
The security test is the next phase in the pilot project, involving the IT departments of both the Commission and the European Parliament.
The choice of Apache HTTP Server and Keepass is the result of a public survey. Between 17 June and 8 July, the EU-FOSSA project asked the public to help select the most appropriate software solution, based on a pre-selection of open source solutions in use at the two European institutions. The survey received 3282 comments, with respondents favouring Keepass and Apache HTTP Server.... ...Full Story
ETSI bundles standards for EU eID regulation
EU Joinup July 26, 2016 - The European Telecommunications Standards Institute has published a collection of standards for electronic signatures, electronic seals, electronic time-stamps, and for trust services providers. The publication coincides with the entering into force on 1 July of the European Union's eIDAS regulation on eID and trust services for electronic transactions.
The bundle of standards was created in April by ETSI’s Technical Committee on Electronic Signatures and Infrastructures. “The set includes a total of 19 European Standards along with guidance documents and test specifications”, ETSI writes.
The standards can be used to audit trust service providers and to assess their conformity with eIDAS Regulation requirements. Others cover the creation and validation of digital signatures and seals.
In April, ETSI updated its ‘technical report on Electronic Signatures and Infrastructures’. This report details the standards that are involved, or could be involved, in electronic signatures.... ...Full Story
Analog 2.0 Specification Available Now
NFC Forum July 25, 2016 - The NFC Forum published the adopted Analog 2.0 Technical Specification today. Members may download the specification from the Adopted Specifications page.
The Analog 2.0 Candidate Specification was published in October 2015 and introduced Active Communication Mode for P2P data exchange and NFC-V technology in poll mode. The Analog 2.0 Technical Specification ensures full interoperability with devices conformant to ISO/IEC 14443 or ISO/IEC 18092 by harmonizing the analog parameters for contactless communication. This interoperability is important to enable the reliable use of NFC devices with existing infrastructure using ISO-compatible RF readers and/or cards (e.g. for contactless public transport applications).
...Full Story
A Data Model to Support the Publishing of Legislation as Linked Open Data
EU Joinup July 22, 2016 - Citizens, professionals in the legal domain, businesses and civil servants all need to know what legislation is in force. Legislation is often amended, repealed and codified, making it difficult to have a clear view of what text is in force at any specific point in time. In this context, the Hellenic Ministry of Interior and Administrative Reconstruction and the Italian Anti-corruption Agency contacted the ISA Programme of the European Commission to develop a pilot with the twofold objective of making legislation available in both human- and machine-readable formats and visualising the evolution of legislation over time, to enable user-friendly consultation.
In order to allow legislative information to be published as Open Data, a data model was proposed to support this publishing process. The suggested data model is based on the ELI ontology and extended with concepts from Akoma Ntoso and the Core Public Organisation Vocabulary, thereby facilitating interoperability with other EU Member States. The full pilot can be downloaded or forked from the SEMICeu Github repository and the documentation on the data model can be consulted on the pilot website.
The data model was open for public consultation by the Ministry until 15 July 2016. ...Full Story
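To make the data-model idea concrete, here is a minimal, illustrative Python sketch (not the pilot's actual code) that renders a single piece of legislation as ELI-style Turtle for publication as Linked Open Data. The URI, title, date and property names shown are assumptions chosen for illustration in the spirit of the ELI ontology; the real pilot extends ELI with Akoma Ntoso and Core Public Organisation Vocabulary concepts.

```python
# Sketch: emit one legal resource as a Turtle snippet using ELI-style
# properties. All identifiers below are hypothetical examples.

def legislation_to_turtle(uri, title, date, lang="en"):
    """Render a legal resource as Turtle with ELI-style predicates."""
    lines = [
        "@prefix eli: <http://data.europa.eu/eli/ontology#> .",
        "@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .",
        "",
        f"<{uri}> a eli:LegalResource ;",
        f'    eli:title "{title}"@{lang} ;',
        f'    eli:date_document "{date}"^^xsd:date .',
    ]
    return "\n".join(lines)

print(legislation_to_turtle("http://example.org/eli/act/2016/42",
                            "Example Act on Open Data", "2016-07-01"))
```

Because each amended or codified version would get its own dated resource, a consumer can query across versions to reconstruct which text was in force at a given point in time, which is precisely the consultation problem the pilot targets.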
TC260 Drafts New Standard for China's Cloud Security Review Regime
USITO.org Weekly July 21, 2016 - Recently, TC260 has published the draft "Information Security Technology - Security Capability Evaluation Methods of Cloud Computing Services" for comments. The public comment period will end on August 11. This draft standard aims to provide guidance for third-party agencies on how to conduct cloud service capability evaluation via interviews, inspections and testing.
This standard, along with two others, covers guidelines for cloud service providers' size and operational experience, business dealings between cloud service providers and government customers, cloud computing services cybersecurity management, and a range of other issues. The three standards have also been adopted as the main references in the CAC's Cloud Computing Services Cybersecurity Review, which was announced on June 26, 2015 and targets services for Party and government departments. ...Full Story
IoT Security: What IoT Can Learn From Open Source
Businesses are hugely concerned about IoT
Datamation July 20, 2016 - When personal computers were introduced, few manufacturers worried about security. Not until the early 1990s did the need for security become widely understood. Today, the Internet of Things (IoT) is following the same pattern -- except that the need for security is becoming obvious far more quickly, and manufacturers should have known better, especially given the overwhelming influence of open source.
The figures speak for themselves. In 2014, a study by Hewlett-Packard found that seven out of ten IoT devices tested contained serious security vulnerabilities, an average of twenty-five per device. In particular, the vulnerabilities included a lack of encryption for local and Internet transfer of data, no enforcement of secure passwords, and a lack of security for downloaded updates. The devices tested included some of the most common IoT devices currently in use, including TVs, thermostats, fire alarms and door locks.
Given that Gartner predicts that 25 billion smart devices will be in use by 2020, no one needs to be a prophet to foresee a major security problem that will make even the security problems of the basic Internet seem insignificant. ... How have IoT manufacturers failed to be more security conscious?...
That smart devices, like OpenStack before them, are being built on the shoulders of open source is too obvious for anyone to doubt. In early 2015, VisionMobile's survey of 3,700 IoT developers indicated that 91% used open source in their work.
This figure suggests that, without open source, the development of the IoT would be much slower if it happened at all. If nothing else, the use of open source and open standards helps to reduce compatibility problems between manufacturers' devices.... ...Full Story
Ultracode Standard Introduced by AIM
AIM July 19, 2016 - AIM announced today the release of the Ultracode international standard, establishing a significant enhancement in barcode technology for the automatic identification and data capture (AIDC) industry and for consumers.
Ultracode is the first 2D, error-correcting color barcode, which can be displayed on smartphones or printed, and read using a digital color camera or smartphone app. Its development was motivated by the ubiquitous use of color electronic displays and digital cameras, and especially the development of the smartphone. Using Ultracode, standard color technology can create an image that encodes the same data in less than half the area of a QR Code, minimizing the display space required.
The effort to develop Ultracode as a formal standard began more than a decade ago.... ...Full Story
ITU announces new standard for High Dynamic Range TV
ITU July 18, 2016 - ITU has announced a new standard for High Dynamic Range Television that represents a major advance in television broadcasting. High Dynamic Range Television (HDR-TV) brings an incredible feeling of realism, building further on the superior colour fidelity of ITU’s Ultra-High Definition Television (UHDTV) Recommendation BT.2020. ITU’s Radiocommunication Sector (ITU-R) has developed the standard – or Recommendation – in collaboration with experts from the television industry, broadcasting organizations and regulatory institutions in its Study Group 6.
This latest ITU-R HDR-TV Recommendation BT.2100 brings a further boost to television images, giving viewers an enhanced visual experience with added realism. The HDR-TV Recommendation allows TV programmes to take full advantage of the new and much brighter display technologies. HDR-TV can make outdoor sunlit scenes appear brighter and more natural, adding highlights and sparkle. It enhances dimly lit interior and night scenes, revealing more detail in darker areas, giving TV producers the ability to reveal texture and subtle colours that are usually lost with existing Standard Dynamic Range TV.... ...Full Story
New NERC Rules for Critical Cyber Assets Expand the Scope of U.S. Federal Regulation to New Facilities and Practices
Lexology July 15, 2016 - As a result of federal legislation enacted after the large Northeast/Midwest blackout in 2003, electric utilities and other electric market participants in the United States are subject to mandatory reliability standards developed through stakeholder processes by the North American Electric Reliability Corporation (NERC) and enforced by the Federal Energy Regulatory Commission (FERC), with substantial financial penalties of up to US$1 million per day for each standard violation.
Among the categories of mandatory electric reliability standards are Critical Infrastructure Protection (CIP) standards that were first adopted in 2008. Those standards required owners and operators of “Critical Cyber Assets” (CCA) to develop, maintain, and implement cybersecurity policies that cover, among other things, training and access restrictions for personnel with access to CCAs, procedures for managing electronic and physical security perimeters, software security, incident reporting and response planning, and recovery plans to restore CCAs following an incident.
In 2013, NERC proposed and FERC approved version 5 of the CIP standards, a wholesale revision and significant change in approach under the standards. The new standards will be phased in, starting on 1 July 2016. The most significant change in the version 5 standards is the methodology to be used and the requirements for identifying assets subject to the standards, as described below for standard CIP-002-5. The scope of the new standards is significantly broader than the prior version's, and owners and operators of smaller electric generation and transmission facilities and generation control centers will now be subject to the CIP standards for the first time.... ...Full Story