In a short while, an important vote will be taken in downtown Denver, Colorado. If as expected that vote is in the affirmative, a unique and important public-private partnership will spring into being. It will also have an extremely ambitious goal: to assess, assemble, explain and promote the complex and evolving web of standards that will be needed to make the vision of a Smart Grid in the United States a reality. It will also mark the end of the first chapter in a journey that began with the passage of the Energy Independence and Security Act of 2007.
What is a Smart Grid, compared to what we have now? Today, we have centralized production of electricity, with distribution of that power to commercial and home users handled by somewhat interconnected, regional networks. We also have burgeoning greenhouse gas emissions and growing dependence on foreign oil, both in part a result of our need to keep increasing our generating capacity in order to meet whatever the peak national electrical demand may be.
It's been more than a month since I last wrote about the CodePlex Foundation, the new open source initiative announced by Microsoft in early September. While things were pretty quiet at the Foundation site for some time, that changed on October 21, when the Foundation posted its new Project Acceptance and Operation Guidelines, a key deliverable that gives insight into a variety of aspects of the Foundation's developing purpose and philosophy. A "house" interview of Sam Ramji by Paul Gillin was posted a week later.
Surprisingly, though, there was very little pickup on any of this new information until yesterday (perhaps with a little nudging from the PR side of the house), when several stories popped up online, including this one, at InternetNews.com, and another at ZDNet.com. Each is based on a conversation between Sam Ramji and the reporter (Sean Michael Kerner, at InternetNews, and Dana Blankenhorn, at ZDNet.com).
In this blog entry, I'll give my impressions of how the CodePlex Foundation is developing, and (as before) my opinions on how effective the decisions being made are likely to be in achieving the Foundation's goals.
Some of the most beautiful artistic treasures created during the millennium we refer to in the Western world as the Dark Ages are books — usually of a religious nature, they were transcribed by hand in sumptuously precise calligraphy, illuminated with wonderfully colorful and imaginative borders, and graced with elegant inset illustrations that were themselves jewels of inspiration, meticulously set down with pen, brush and burnisher in inks, tempera and gold leaf on laboriously stretched and scraped sheets of parchment. When complete, these beautiful pages were bound in volumes large and small, from enormous folios that were easily read in the pulpits of candlelit cathedrals, to breviaries that nestled comfortably in the pocket of a monk's cassock. Lovingly preserved through many centuries, they are as wonderful to observe today as they were when they were fresh from the standing desks of the monks who gave them birth.
One of the realities that every standards professional must deal with is the sad fact that everyone else in the world thinks that standards are…
[start over; no one else thinks about standards much at all]
Ahem. One of the things that standards folks must come to terms with is the fact that on the rare occasions when anyone else thinks about standards at all, likely as not it's to observe that standards are…
[There. I've said it]
But really, now, this perception has got to change. And with the recent release of Dan Brown's latest potboiler, The Lost Symbol, I believe I've figured out how to make standards really, really exciting. Really.
Legislators are expected to deal with every new topic that comes down the pike, from regulating securitized credit swaps to beefing up cybersecurity, whether they've had any previous experience with it or not. Of course, there's never a shortage of people who want to educate them, but the "educators" with the greatest access are likely to be lobbyists. And when one paid advocate is promoting one action, political physics dictates that another highly paid individual in somebody else's pocket will be promoting an equal and opposite action. Soon, all potential solutions become obscured by a fog of business propaganda.
Two weeks ago, I wrote a critical analysis of the governance structure of the CodePlex Foundation, a new open source-focused foundation launched by Microsoft.
But what about the business premise for the Foundation itself? Let’s say that Microsoft does restructure CodePlex in such a way as to create a trusted, safe place for work to be done to support the open source software development model. Is there really a need for such an organization, and if so, what needs could such an organization meet?
As with my last piece, I’ll use the Q&A approach to make my points.
Well, it’s been a busy week in Lake Wobegon, hasn’t it? First, the Wall Street Journal broke the story that Microsoft had unwittingly sold 22 patents, not to the Allied Security Trust (which might have resold them to patent trolls), but to the Open Invention Network. A few days later, perhaps sooner than planned, Microsoft announced the formation of a new non-profit organization, the CodePlex Foundation, with the mission of “enabling the exchange of code and understanding among software companies and open source communities.”
Not surprisingly, more articles were written about the apparent snookering of Microsoft by AST and OIN than about the new Foundation. But while the tale of the 22 patents is now largely over, the CodePlex story is just beginning. Microsoft says that its goal for the new Foundation is to create an open and neutral environment, and that the formation documents posted and governance structure described at the CodePlex Foundation site can provide a foundation for such an organization. The CodePlex site also makes clear that the Bylaws you can find there are just a starter set, stating, “Our governance documents are deliberately sparse, because we expect them to change.”
That’s good to hear, because I’ve reviewed all of the material at the CodePlex site, and I think that quite a bit of the governance structure will need to change before CodePlex can expect to attract broad participation.
Steve Jobs is a genius of design and marketing, but his track record on striking the right balance between utilizing proprietary arts and public resources (like open source and open standards) is more questionable. Two news items caught my eye today that illustrate the delicacy of making choices involving openness for the iPhone platform - geopolitically as well as technically.
The first item can be found in today's issue of the London Sunday Times, and the second appears at the MacNewsWorld.com Web site. The intersecting points of the two articles are the iPhone and, less obviously, openness. But the types of openness at issue in the two articles are at once both different, and strangely similar.
The Sunday Times piece recounts the (unsuccessful) efforts of Andre Torrez, the chief technology officer at Federated Media in San Francisco, to switch from the iPhone to an Android-based G1 handset, because he objects to the closed environment that the iPhone represents. But after just a week, Torrez reverts to the better app-provisioned iPhone. The Sunday Times author concludes in part as follows:
Modern society harbors many bad habits. One is its penchant for enthusiastically embracing the benefits of new technologies before considering their less desirable side effects. Whether we look at the development of automobiles (first) and safety features (much later), or industrialization (first) and environmental protection (much, much later), the story is always much the same: we reach for the candy before we grasp the reality of the cavities. Only after the problems become too great to ignore do we investigate the unintended consequences, realize how difficult and expensive they are to address, and grudgingly start to rein in our appetites and exercise a bit of prudent self-discipline.
Perhaps we should not be surprised, then, that the U.S. government is only now becoming alarmed over the vulnerability to which we have become exposed as a result of our whole-hearted embrace of the Internet. With the operations of government, defense, finance, commerce, power distribution, communications, transportation, and just about everything else now dependent on the healthy operation of the Internet, that alarm is well-justified. And with virtually all data now created and stored in digital rather than physical form, exposure of our financial as well as our most intimate personal and health information is only a hack away as well.
Man's ability to affect the land is all too evident in these times of climate change, pollution and habitat destruction. Happily, the landscape can change man as well.
The weather finally broke last night, dropping 30 degrees by dawn, and thanks be for that. The night before I had camped in the Sheyenne National Grasslands, heavy with heat and humidity. But the next day it was pleasantly cool (upper 60s), albeit overcast rather than sunny.
Nor was this the only change. It took over 2400 driving miles to finally leave the Eastern, and then Midwestern terrain behind, but today I reached the beginnings of what I think of as the West. More than anything else, in my mind that means “dry.” For the last 800 miles, the landscape had been primarily flat, lush - and transitionally post-glacial. That last factor means an area where the great ice sheets completed their periodic southward pulses, dumping rich, black earth born of thousands of miles of ice grinding down stone, some deposited by glacial streams, and others as windblown “loess” – very fine mineral particles.
Quote of the Day
“Sometimes upholding constitutional ideas just isn't enough; sometimes you have to uphold the actual Constitution”
-Excerpt from the dedication of a new "dark email" protocol to the NSA by Lavabit founder Ladar Levison
New NIST Tools to Help Boost Wireless Channel Frequencies and Capacity NIST Techbeat February 27, 2015 - Smartphones and tablets are everywhere, which is great for communications but a growing burden on wireless channels. Forecasted huge increases in mobile data traffic call for exponentially more channel capacity. Boosting bandwidth and capacity could speed downloads, improve service quality, and enable new applications like the Internet of Things connecting a multitude of devices. To help solve the wireless crowding conundrum and support the next generation of mobile technology—5G cellular—researchers at the National Institute of Standards and Technology (NIST) are developing measurement tools for channels that are new for mobile communications and that could offer more than 1,000 times the bandwidth of today’s cell phone systems.... ...Full Story
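Why does moving to wider, higher-frequency channels promise such large capacity gains? The Shannon-Hartley theorem says channel capacity scales linearly with bandwidth, so a channel a hundred times wider carries roughly a hundred times the data at the same signal quality. A back-of-envelope sketch (the bandwidth and SNR figures below are illustrative assumptions, not NIST's numbers):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + SNR), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative figures only: a 20 MHz LTE-style channel vs. a hypothetical
# 2 GHz millimeter-wave channel, both at a linear SNR of 100 (20 dB).
lte_like = shannon_capacity_bps(20e6, 100)
mmwave_like = shannon_capacity_bps(2e9, 100)

print(f"20 MHz channel: {lte_like / 1e6:.0f} Mbit/s")
print(f"2 GHz channel:  {mmwave_like / 1e9:.1f} Gbit/s")
print(f"Capacity ratio: {mmwave_like / lte_like:.0f}x")   # 100x: ratio of bandwidths
```

At equal SNR the capacity ratio is exactly the bandwidth ratio, which is why the spectrum NIST is measuring, rather than cleverer coding, is where the next thousand-fold gain is expected to come from.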
HTTP/2 Will Make The Web ‘Faster And Safer’ Steve McCaskill Tech Week Europe February 27, 2015 - The Internet Engineering Steering Group (IESG) has approved the final standard for the HTTP/2 protocol, which could make browsing the Internet quicker and safer.
HTTP/2 is a major update to the Hypertext Transfer Protocol (HTTP), which is the foundation of data communication for the World Wide Web. The most widely used version of the standard, HTTP/1.1, was defined in 1999.
A working group has been developing HTTP/2 since 2012 and adopted Google’s SPDY protocol as an initial blueprint, with community feedback resulting in “substantial changes” to the standard, such as the compression scheme and the format of the protocol.... ...Full Story
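The compression scheme referred to above is HPACK, which (among other techniques) replaces frequently sent header fields with small indices into a table of common entries, rather than retransmitting them as text on every request. A toy sketch of that indexing idea - the five entries below mirror real HPACK static-table entries, but the real table has 61 entries and the wire encoding differs:

```python
# Toy illustration of HPACK's static-table lookup: common header
# field/value pairs become one small integer; anything else is sent
# as a literal pair. Not wire-accurate, just the core idea.
STATIC_TABLE = {
    2: (":method", "GET"),
    3: (":method", "POST"),
    6: (":scheme", "http"),
    7: (":scheme", "https"),
    8: (":status", "200"),
}
INDEX_OF = {pair: idx for idx, pair in STATIC_TABLE.items()}

def encode(headers):
    """Replace each known header pair with its table index."""
    return [INDEX_OF.get(h, h) for h in headers]

def decode(encoded):
    """Expand indices back into full header pairs."""
    return [STATIC_TABLE[x] if isinstance(x, int) else x for x in encoded]

request = [(":method", "GET"), (":scheme", "https"), ("x-custom", "abc")]
encoded = encode(request)
print(encoded)  # [2, 7, ('x-custom', 'abc')]
assert decode(encoded) == request
```

Two verbose text headers collapse to two bytes-worth of indices; across the many near-identical requests a page load generates, that (plus Huffman coding and a dynamic table in the real protocol) is a large part of where HTTP/2's speedup comes from.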
NIST Releases Update of Industrial Control Systems Security Guide for Final Public Review NIST Techbeat February 26, 2015 - The National Institute of Standards and Technology (NIST) has issued proposed updates to its Guide to Industrial Control Systems (ICS) Security (NIST Special Publication 800-82) for final public review and comment....Downloaded more than 3 million times since its initial release in 2006, the ICS security guide advises on how to reduce the vulnerability of computer-controlled industrial systems to malicious attacks, equipment failures, errors, inadequate malware protection and other threats. Industrial control systems encompass the hardware and software that control equipment and the information technologies that gather and process data. They are commonly used in factories and by public utilities and other owners and operators of major infrastructure.
Most industrial control systems began as proprietary, stand-alone collections of hardware and software that were walled off from the rest of the world and isolated from most external threats. Today, widely available software applications, Internet-enabled devices and other nonproprietary IT offerings have been integrated into most such systems. This connectivity has delivered many benefits, but it also has increased the vulnerability of these systems.... ...Full Story
Big Data, Hadoop Standards Group: Who's In, Who's Missing? Joe Panettieri Information Management February 25, 2015 - All eyes in the big data world are on the Open Data Platform -- a new association that strives to promote big data technologies and open source platforms like Hadoop. While promising and backed by big names like GE and IBM, the Open Data Platform initiative also lacks some key names....
Several industry giants and startups are driving the Open Data Platform group -- including Altiscale, Capgemini, CenturyLink, EMC, GE, Hortonworks, IBM, Infosys, Pivotal, SAS, Splunk, Teradata, Verizon and VMware.
Still, some key names also are missing from the effort.... ...Full Story
Security Standard Proposed for Bitcoin Exchanges and Wallets Stan Higgins Coindesk February 25, 2015 - A group composed of developers and security professionals has proposed a set of rules aimed at standardizing security protocols used by companies that handle or store digital currencies for their clients.
The proposal, created by the Cryptocurrency Certification Consortium (C4)...aims to provide an industry-level standard by which exchanges and wallet providers can operate.
The Cryptocurrency Security Standard (CCSS) draft proposal calls for 10 standardized approaches to key and seed generation, storage and usage, proof-of-reserve and security audits, among other areas. The framework consists of three levels per section, with each grade signifying a higher degree of security based on the proposed guidelines.... ...Full Story
How can you tell when the standards process isn't working? Perhaps the best indication is when a vendor decides it has to go to the time and cost (passed through to customers) of implementing two different standardized technologies in the same product. Hopefully this approach doesn't represent the future of wireless charging.
Samsung's Solution To Wireless Charging Fragmentation: Use All The Standards Lucian Armasu Gigaom February 24, 2015 - In a recent post on one of its websites, Samsung talked about the recent history of wireless charging and how the company has been working on bringing this technology to market since late 2000. It finally did it in 2011 when the company brought wireless charging support to its Droid Charge smartphone....Because we're talking about a brand new type of technology, having multiple standards can hurt adoption, so Samsung, which is a member of both consortiums, has decided that it's best to just use both technologies in its upcoming devices. This way, a device such as the Galaxy S6 could be backwards compatible with both standards and all the accessories that support them. Soon, for example, Samsung's devices could be charged wirelessly either at McDonald's restaurants, which use Qi charging, or at Starbucks stores, which use PowerMat chargers.... ...Full Story
LTE standards group targeting mission-critical push-to-talk specifications for early 2016 UrgentComm February 23, 2015 - Officials for 3GPP, the standards body for LTE technology, recently said the organization plans to establish a standard for mission-critical-voice functionality over LTE early next year. That action could have significant impact on both 4G LTE initiatives and LMR plans for public-safety and critical-communications entities.
To help ensure that this aggressive timeline can be met, 3GPP has created a new working group—called SA6—specifically to tackle the challenges associated with mission-critical applications, with an initial focus on mission-critical voice, according to 3GPP officials.... ...Full Story
Call for Papers: Conference Theme: Interoperability, Intellectual Property and Standards IEEE-SIIT.org February 23, 2015 - Interoperability has never been more important than it is today. It can be achieved by design, following the market or through standardization. How does intellectual property impact interoperability? How do these factors interact with standardization? IEEE-SIIT 2015 will explore these, and other, important questions.
IEEE-SIIT conferences aim to bring together academia, government and industry participants engaged in standardization to foster the exchange of insights and views on all issues surrounding standards, standardization, interoperability and innovation. Contributing academic disciplines include, but are not limited to: Business Studies, Computer Science, Economics, Engineering, History, Information Systems, Law, Management Studies and Sociology....[the deadline for submissions is April 3, 2015] ...Full Story
Wireless Power Consortium Achieves Key Technology Milestones for Fast Charging and Resonant Multi-Device Charging with Spatial Freedom Press Release WPC.com February 20, 2015 - The Wireless Power Consortium (WPC), the driving force and leader in the global adoption of wireless power technology, today made two draft specifications available to its members that extend the capabilities of the Qi wireless power standard.
The first extension of the Qi specification, called "Volume II: Medium Power," enables fast charging of smartphones with up to 15 Watts delivered into the battery....The second extension of the Qi specification, called "Volume III: Shared Mode," enables multi-device charging with a single inverter, a resonant technology that reduces the cost of manufacturing multi-device chargers while providing large freedom of spatial positioning.... ...Full Story
Web standard promising faster page loads wins approval Steven Musil and Stephen Shankland CNET February 20, 2015 - A new version of the HTTP standard that promises to deliver Web pages to browsers faster has been formally approved, the Internet protocol's first revision in 16 years.
The specifications for HTTP 2.0 have been formally approved, according to a blog post by Mark Nottingham, who as chairman of the IETF HTTPBIS Working Group serves as the standard effort's leader. The specifications will go through a last formality -- the Request for Comments documenting and editorial processes -- then be published, Nottingham wrote.
HTTP, short for Hypertext Transfer Protocol, is one of the seminal standards of the Web. It governs how a browser communicates with a Web server to load a Web page. HTTP 2.0, the protocol's first major revision since HTTP 1.1 in 1999, is designed to load Web pages faster, allowing consumers to read more pages, buy more things and perform more and faster Internet searches.
The new standard is based on SPDY, a protocol Google introduced in 2009.... ...Full Story