Welcome to ConsortiumInfo.org
Friday, January 04 2008 @ 06:24 AM CST
Contributed by: Andy Updegrove
This is the fifth chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All product names used below are registered trademarks of their vendors.
Chapter 5: Open Standards
One of the two articles of faith that Eric Kriss and Peter Quinn embraced in drafting their evolving Enterprise Technical Reference Model (ETRM) was this: products built to "open standards" are more desirable than those that aren't. Superficially, the concept made perfect sense – only buy products that you can mix and match. That way, you can take advantage both of price competition and of a wide selection of alternative products from multiple vendors, each with its own value-adding features. And if things don't work out, well, you're not locked in, and can swap out the loser and shop for a winner.
But did that make as much sense with routers and software as it did with light bulbs and lamps? And in any event, if this was such a great idea, why hadn't their predecessors been demanding open standards-based products for years? Finally, what exactly was that word "open" supposed to mean?
To answer these questions properly requires a brief hop, skip and jump through the history of standards, from their origins up to the present. And that's what this chapter is about.
Friday, December 28 2007 @ 12:07 PM CST
Contributed by: Andy Updegrove
This is the fourth chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
Chapter 4 – Eric Kriss, Peter Quinn and the ETRM
By the end of December 2005, I had been blogging on ODF developments in Massachusetts for about four months, providing interviews, legal analysis and news as it happened. In those early days, not many bloggers were covering the ODF story, and email began to come my way from people that I had never met before, from as far away as Australia, and as near as the State House in Boston. Some began with, "This seems really important – what can I do to help?" Others contained important information that someone wanted to share, and that I was happy to receive.
One such email arrived just before Christmas in 2005. In its entirety, it read:
Enjoy reading your consortiuminfo blog ... keep it up.
Happy New Year,
This was a pleasant and welcome surprise. Until the end of September, Eric Kriss had been the Massachusetts Secretary of Administration and Finance, and therefore Peter Quinn's boss. Together, they had conceived, architected and launched the ambitious IT upgrade roadmap that in due course incorporated ODF into the state's procurement guidelines.
Monday, December 10 2007 @ 07:05 AM CST
Contributed by: Andy Updegrove
This is the third chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
This chapter was revised at 8:30 AM on 12/11/07, most significantly by adding the "Lessons applied" section.
Chapter 3: What a Difference a Decade Can Make
In 1980, Microsoft was a small software vendor that had built its business primarily on downsizing mainframe programming languages to a point where they could be used to program the desktop computers that were then coming to market. The five-year-old company had total revenues of $7,520,720, and BASIC, its first product, was still its most successful. By comparison, Apple Computer had already reached sales of $100 million, and the same year launched the largest public offering since the Ford Motor Company had itself gone public some twenty-four years before. Microsoft was therefore far smaller than the company that Steve Jobs and Steve Wozniak had formed a year after Bill Gates and Paul Allen sold their first product.
Moreover, in the years to come, PC-based word processing products like WordStar, and then WordPerfect, would become far more popular than Microsoft's own first word processing program (originally called Multitool Word), providing low-cost alternatives to the proprietary, minicomputer-based software offerings of vendors like Wang Laboratories. IBM, too, provided a word processing program for the PC called DisplayWriter. That software was based on a similar program that IBM had developed for its mainframe systems customers. More importantly, another program was launched at just the right time to dramatically accelerate the sale of IBM PCs and their clones. That product was the legendary "killer app" of the IBM PC clone market: Lotus 1-2-3, the spreadsheet software upon which Mitch Kapor built the fortunes of his Lotus Development Corporation.
Sunday, December 02 2007 @ 02:07 PM CST
Contributed by: Andy Updegrove
This is the second chapter in a real-time eBook writing project I launched and explained last week. The following is one of a number of stage-setting chapters to follow. Comments, corrections and suggestions gratefully accepted. All Microsoft product names used below are registered trademarks of Microsoft.
Chapter 2 – Products, Innovation and Market Share
Microsoft is the envy of many vendors for the hugely dominant position it enjoys in two key product areas: PC desktop operating systems – the software that enables and controls the core functions of personal computers – and "office productivity software" – the software applications most often utilized by PC users, whether at work or at home, to create documents, slides and spreadsheets and meet other common needs. Microsoft's 90%-plus market share in such fundamental products is almost unprecedented in the technical marketplace, and this monopoly position enables it to charge top dollar for such software. It also makes it easy for Microsoft to sell other products and services to the same customers.
Microsoft acquired this enviable position in each case through a combination of luck, single-minded determination, obsessive attention to detail, and a willingness to play the game fast and hard – sometimes hard enough to attract the attention of both Federal and state antitrust regulators. Early on, Bill Gates and his team acquired a reputation for bare-knuckle tactics that they sometimes seemed to wear with brash pride. Eventually, these tactics (as well as tales of Gates' internal management style) progressed from industry rumors to the stuff of best sellers, like Hard Drive: Bill Gates and the Making of the Microsoft Empire.
With the emergence of the Web, of course, the opportunity for widely sharing stories, both real (of which there were many) and apocryphal, exploded. Soon Web sites such as Say No to Monopolies: Boycott Microsoft enthusiastically collected and posted tales of alleged technological terror and dirty deeds. More staid collections were posted at sites such as the Wikipedia. The increasing tide of litigation involving Microsoft, launched not only by state and federal regulators but by private parties as well, generated embarrassing documents. Such original sources were not only difficult to deny, but almost impossible to repress in the age of the Web – and of peer-to-peer file sharing as well.
Moreover, while Bill Gates and his co-founders rarely displayed the creative and innovative flair of contemporaries like Apple's Steve Jobs, neither were they troubled by the type of "not invented here" bias that led other vendors to pursue unique roads that sometimes ended in dead ends.
Sunday, November 25 2007 @ 02:51 PM CST
Contributed by: Andy Updegrove
For some time I've been considering writing a book about what has become a standards war of truly epic proportions. I refer, of course, to the ongoing, ever expanding, still escalating conflict between ODF and OOXML, a battle that is playing out across five continents and in both the halls of government and the marketplace alike. And, needless to say, at countless blogs and news sites all the Web over as well.
Arrayed on one side or the other, either in the forefront of battle or behind the scenes, are most of the major IT vendors of our time. And at the center of the conflict is Microsoft, the most successful software vendor of all time, faced with the first significant challenge ever to one of its core businesses and profit centers – its flagship Office productivity suite.
Quote of the Day
“CP = velocity x weight x sin (trajectory)”
– Nissan's formula for calculating "Desert Camel Power" as a new power standard for off-road vehicles
Latest News
Nissan hopes ‘camelpower’ can become new global standard
Alexander Maveal, Global News
March 29, 2017 - ...Horsepower is a unit of measure that is as widely accepted as Einstein’s matter theory, the speed of sound, and pi.
But now horsepower – used to determine the power of a vehicle’s engine – is being thrown into question due to Nissan’s creation of “Desert Camel Power.”
Introduced earlier this month by Nissan’s Middle East division, camelpower aims to provide car buyers in the region a better understanding of how new vehicles perform in the desert.
Nissan says it was about time a new metric was introduced to provide “an at-a-glance indication of a vehicle’s desert fitness.” ...
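The quoted formula lends itself to a quick sketch. Nissan's quote specifies no units or scaling, so everything below – the function name, the argument names, and the reading of "trajectory" as a climb angle in degrees – is an illustrative assumption, not Nissan's published methodology:

```python
import math

def desert_camel_power(velocity, weight, trajectory_deg):
    """Sketch of the quoted 'Desert Camel Power' formula:
    CP = velocity x weight x sin(trajectory).

    velocity, weight: in whatever units Nissan intends (unspecified
    in the quote); trajectory_deg: climb angle in degrees (assumed).
    """
    return velocity * weight * math.sin(math.radians(trajectory_deg))

# A vehicle on flat ground (0-degree trajectory) scores zero,
# since sin(0) = 0 -- steeper dune climbs score higher.
flat = desert_camel_power(10, 2000, 0)
dune = desert_camel_power(10, 2000, 30)
```

Whatever the real units, the sine term means the metric rewards sustained speed while climbing: the same speed and mass score twice as much on a 30-degree dune face as on a roughly 14.5-degree slope.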
New CDISC Data Standard Aids Development of Therapies for Ebola Virus
CDISC.org March 28, 2017 - The Clinical Data Interchange Standards Consortium (CDISC) and the Infectious Diseases Data Observatory (IDDO) announce the availability of a new standard to assist in the collection, aggregation and analysis of Ebola virus disease (EVD) research data. This standard is for use in EVD trials, leading to potential treatments and public health surveillance for this disease....
According to the WHO, the epidemic that began in 2014 in West Africa was “the largest and most complex Ebola outbreak since Ebola virus was first discovered in 1976.” This singular outbreak resulted in over 11,000 fatalities, which is more than 7 times the fatalities of all previous outbreaks combined.
Version 1.0 of the CDISC Ebola Therapeutic Area User Guide (TAUG-Ebola) describes data concepts for use in Ebola clinical studies, so that investigators, data managers, statisticians, programmers and others that handle Ebola clinical trial data can understand the data and apply the standards appropriately. Having this data in CDISC standard format allows for more efficient aggregation and analysis of data collected from various studies across outbreak settings, thereby leading to an enhanced, automated process for developing evidence in evaluating Ebola treatments....
ITU launches global dialogue on Artificial Intelligence for good
ITU March 27, 2017 - AI for Good Global Summit aims to ensure that AI benefits humanity
Geneva, 23 March 2017 – The AI for Good Global Summit in Geneva, 7-9 June 2017, aims to accelerate the development and democratization of Artificial Intelligence (AI) solutions to address global challenges such as poverty, hunger, health, education, equality and the protection of our environment.
Organized by ITU and the XPRIZE Foundation – in partnership with UN agencies, including OHCHR, UNESCO, UNICEF, UNICRI, UNIDO, UNITAR and UN Global Pulse – the summit will evaluate the opportunities presented by AI with a view to ensuring that AI benefits all of humanity.
The event will offer tangible guidance on the tenets of responsible AI development, from the perspectives of technology, ethics, standardization and policy....
ITU adopts Chinese-made interactive content format as new global standard
GlobalTimes March 24, 2017 - On March 16 the International Telecommunication Union (ITU) adopted as a global standard a new file structure for interactive mobile comic and animation content designed independently in China, China's Ministry of Culture announced on Monday.
Known as T.621, the file structure will be able to be used on all types of mobile devices and platforms to provide high-definition content in a relatively small file.
The new file structure will also allow content creators to provide interactive content such as motion graphics and audio for online comics, which is in high demand in today's market....
IEEE Approves New Standards Project IEEE P2755™—Guide to Terms and Concepts in Intelligent Process Automation
IEEE March 24, 2017 - IEEE and the IEEE Standards Association (IEEE-SA) today announced the approval of the IEEE P2755™—Guide to Terms and Concepts in Intelligent Process Automation project. The new standards project aims to build a framework for terminology to help advance related standards efforts. Sponsored by IEEE’s Board of Governors Corporate Advisory Group, the newly formed IEEE P2755 Working Group is defining initial terminology that addresses a range of application spaces, including Robotic Process Automation, Artificial Intelligence (AI), Cognitive Computing, Autonomics, Machine Learning and related technologies that enable businesses and governments to improve performance and lower costs....
Lee Coulter, chair, IEEE Guide to Terms and Concepts in Intelligent Process Automation Working Group, [said] “It’s important to establish a framework now that can evolve in step with related industry developments to ensure a commonality for understanding related products, services and concepts, and to help advance the market space for the benefit of all.”...
Patent Advisory Group Recommends Continuing Work on Web Payments Specifications
W3C.org March 23, 2017 - The Web Payments Working Group Patent Advisory Group (PAG), launched in August 2016, has published a report recommending that W3C continue work on the Web Payments Specifications. W3C launches a PAG to resolve issues in the event a patent has been disclosed that may be essential, but is not available under the W3C Royalty-Free licensing terms.
Government Agencies to be Rated on Cybersecurity Using NIST Framework
National Law Review March 22, 2017 - The Trump administration has announced that it will impose new metrics on federal agencies related to cybersecurity. Agencies and departments will be required to comply with the framework developed by the National Institute of Standards and Technology (NIST) and report back to the Department of Homeland Security (DHS), the Office of Management and Budget (OMB), and the White House....
Plans to impose the NIST cybersecurity framework on federal agencies illustrate the Framework’s increasing importance as a standard for cybersecurity, not just for government agencies, but more broadly throughout the information ecosystem. With security breaches, state-sponsored cyber-attacks, and ransomware demands increasing, the Framework offers useful guidance on processes and actions designed to enhance data security for government and industry alike.
OGC approves new standard for geological science data
OGC.org March 21, 2017 - The membership of the Open Geospatial Consortium (OGC®) has approved GeoSciML as an OGC Standard. The OGC GeoSciML Standard defines a model and encoding for geological features commonly described and portrayed in geological maps, cross sections, geological reports, and databases.
GeoSciML provides a mechanism for storage and exchange of a broad range of geologic data enabling users to generate geologic depictions (such as maps) in a consistent and repeatable fashion....This standard describes a logical model and GML/XML encoding rules for geological map data, geological time scales, boreholes, and metadata for laboratory analyses....
The GeoSciML standard includes a Lite model, used for simple map-based applications; a basic model, aligned with INSPIRE, for basic data exchange; and an extended model to address more complex scenarios. The standard also provides patterns, profiles (most notably of OGC Observations and Measurements - also ISO 19156), and best practices to deal with common geoscience use cases....
Three challenges for the web, according to its inventor
The Open Web Foundation March 20, 2017 - Today is the world wide web’s 28th birthday. Here’s a message from our founder and web inventor Sir Tim Berners-Lee on how the web has evolved, and what we must do to ensure it fulfils his vision of an equalising platform that benefits all of humanity.
Today marks 28 years since I submitted my original proposal for the world wide web. I imagined the web as an open platform that would allow everyone, everywhere to share information, access opportunities and collaborate across geographic and cultural boundaries. In many ways, the web has lived up to this vision, though it has been a recurring battle to keep it open. But over the past 12 months, I’ve become increasingly worried about three new trends, which I believe we must tackle in order for the web to fulfill its true potential as a tool which serves all of humanity.
1) We’ve lost control of our personal data
The current business model for many websites offers free content in exchange for personal data. Many of us agree to this – albeit often by accepting long and confusing terms and conditions documents – but fundamentally we do not mind some information being collected in exchange for free services. But, we’re missing a trick. As our data is then held in proprietary silos, out of sight to us, we lose out on the benefits we could realise if we had direct control over this data,...
2) It’s too easy for misinformation to spread on the web
...through the use of data science and armies of bots, those with bad intentions can game the system to spread misinformation for financial or political gain.
3) Political advertising online needs transparency and understanding
Political advertising online has rapidly become a sophisticated industry. The fact that most people get their information from just a few platforms and the increasing sophistication of algorithms drawing upon rich pools of personal data, means that political campaigns are now building individual adverts targeted directly at users. One source suggests that in the 2016 US election, as many as 50,000 variations of adverts were being served every single day on Facebook, a near-impossible situation to monitor. And there are suggestions that some political adverts – in the US and around the world – are being used in unethical ways – to point voters to fake news sites, for instance, or to keep others away from the polls....
These are complex problems, and the solutions will not be simple. But a few broad paths to progress are already clear. We must work together with web companies to strike a balance that puts a fair level of data control back in the hands of people, including the development of new technology like personal “data pods” if needed and exploring alternative revenue models like subscriptions and micropayments. We must fight against government over-reach in surveillance laws, including through the courts if necessary. We must push back against misinformation by encouraging gatekeepers such as Google and Facebook to continue their efforts to combat the problem, while avoiding the creation of any central bodies to decide what is “true” or not. We need more algorithmic transparency to understand how important decisions that affect our lives are being made, and perhaps a set of common principles to be followed. We urgently need to close the “internet blind spot” in the regulation of political campaigning....
It has taken all of us to build the web we have, and now it is up to all of us to build the web we want – for everyone. If you would like to be more involved, then do join our mailing list, do contribute to us, do join or donate to any of the organisations which are working on these issues around the world.
A Standard for Lighting Color Preference?
NIST Techbeat March 20, 2017 - One of the goals of artificial lighting is to make things look natural....To hit the “sweet spot” between too dull and too vivid, lighting manufacturers rely on an international standard that helps them determine whether their white lights will render objects “correctly” – that is, the way they might look in sunlight. This standard is based on an old system called the Color Rendering Index (CRI), which scores lamps on their color fidelity: The higher the CRI score, the more natural objects should look when illuminated. A score of 100 is considered “perfect.” Most good white light lamps get scores of 80 or higher.
But just because something looks natural does not mean that people like it....The final goal is to allow a new version of the CRI to remain as a “color fidelity” metric, but also to create a new standard for “color preference” to give companies further guidance for manufacturing LED lights. Companies could use one or both of these metrics depending on the intended applications....