The Launch of the AllSeen Alliance (and the Next Generation of Open Collaboration)

If you read the technology press today, odds are you already know about the launch of the AllSeen Alliance (a Google News search I just did produced 412 results in a wide range of languages). That’s not a surprise, because this is an important and ambitious project. But there’s a story behind the story that likely won’t get the attention it deserves, and that’s what this blog post is about. (Disclosure: the AllSeen Alliance is a Linux Foundation Collaborative Project – the 11th so far – and I assisted in its structuring and launch.)

By way of background, the Alliance was created to play a critical role in enabling the “Internet of Things” to become a reality. The Internet of Things is one of those phrases that has been around for a long time, while people are still waiting to see whether it will ever come into existence. In a nutshell, the concept is to have an almost infinite number of devices – many intensely humble in their purpose, such as lights, switches, and thermostats (and yes, your refrigerator, if you must) – each of which is in direct or indirect connection with every other similarly enabled thing in its local cloud (and perhaps in direct or indirect connection to the Internet as well).

As an example, if you turned the key in your front door, your lights, heat, sound system, and all the rest of your everyday, inanimate, powered objects would spring into action. And as you left one room, your loyal helpmates there would be alerting their peers in the next room that the Master Cometh, so snap to! There’s more to it than that, of course, but you get the idea.
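To make the idea concrete, here is a toy sketch in Python – not tied to any real vendor’s products or APIs, and with device names and event names I made up for illustration – of the event-driven pattern at work: a single “front door unlocked” event fans out to whatever nearby devices have registered an interest in it.

    # A toy sketch of the event-driven idea above -- not any real vendor's API.
    # One "front door unlocked" event fans out to every nearby device that has
    # registered an interest in it.

    from collections import defaultdict


    class HomeBus:
        """Minimal publish/subscribe bus standing in for a local device network."""

        def __init__(self):
            self._subscribers = defaultdict(list)   # event name -> list of handlers

        def subscribe(self, event, handler):
            self._subscribers[event].append(handler)

        def publish(self, event, payload):
            for handler in self._subscribers[event]:
                handler(payload)


    bus = HomeBus()
    bus.subscribe("door.unlocked", lambda e: print("hall light: on for", e["who"]))
    bus.subscribe("door.unlocked", lambda e: print("thermostat: raising to 21 C"))
    bus.subscribe("door.unlocked", lambda e: print("speakers: resuming playlist"))

    # Turning the key in the front door becomes a single published event.
    bus.publish("door.unlocked", {"who": "owner"})

In a real deployment the “bus” would be the local network of things itself rather than an object in one program, but the pattern – publish an event, let interested devices react – is the same.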

If this sounds like one of those future fantasies that never actually happens, you’re starting to grasp why something new and different is needed to make the Internet of Things possible, or at least to make it happen much faster. The reason? Before it’s worthwhile for any one vendor to buy into this vision, a whole lot of other vendors need to buy in as well, or there’s no personal cloud for their “thing” to communicate with – and hence no reason for them to invest time or resources in adding the new capability to their existing products.

That’s not unusual, of course – it’s been true of standards since the very beginning. But unlike most traditional standards, each developed by a single organization to serve a single industry, an Internet of Things needs to include, well, a whole bunch of different types of things from a whole bunch of different types of vendors – from lamp manufacturers to refrigerator vendors to thermostat designers – before it really starts to make sense. It also takes many different types of standards, such as ones for very low-power radios that use ambient vibrations rather than batteries as a source of power. And that takes more than a single organization serving a single industry to provision.

To see why, let’s look at how a somewhat similar reality came into existence – WiFi – which required only a single standard to initially take off, and which only involved traditional ICT vendors.

First, a number of competing technologies each tried to become the new standard. Some failed entirely; others (like Bluetooth) went in a different, but still useful, direction. So the technological route first had to be hashed out and then turned into a specification. Next, vendors had to be convinced that people would buy a new type of device they had never needed before (a wireless router), laptop vendors had to be convinced that it was worth driving up the cost of their products by adding WiFi capability, and ordinary people like you and me had to be convinced not only to buy a router, but also to upgrade to one of those new laptops – the classic chicken-and-egg situation. Companies like Starbucks needed to be persuaded to play ball, too.

How did this all come about? The standards organization that developed the standard – the IEEE – only creates standards. So those who wanted to see WiFi succeed had to form a new organization – the WiFi Alliance – to create and promote a brand, fund the development of test software, launch and administer a certification program, and more, all in order to persuade the marketplace that the standard would in fact be widely adopted, thereby justifying vendors’ and customers’ investments. All this took a lot of effort, a lot of promotion – and a lot of time. Among other things, it took many, many vendors a great deal of time and money to create their own implementations, each built to the specification and running in their own software. Happily for us all, it took hold.

So now let’s look at what it takes to make an Internet of Things possible, one comprising the wares and services of many different vendors, and many different types of vendors. It represents roughly the same goal – creating another type of local area network – but this time there’s no router. Each thing is its own router, and a router for every neighboring thing as well, passing messages along from device to device, and perhaps eventually back out to the Internet. That requires more than just a single interoperable communication standard, and more than just devices that can send and receive signals. It also requires all sorts of different types of companies, and not just laptop vendors, to make the investment and take the risk to enable their respective products.
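For the curious, here is a purely illustrative Python sketch of that routerless, hop-by-hop idea: a handful of “things” flood a message across whichever neighbors happen to be in range, and it still arrives at a device the sender can’t reach directly. The device names and the naive flooding scheme are invented for this example; real mesh protocols are considerably smarter about routing and loop avoidance.

    # Purely illustrative: a handful of "things" relaying a message hop by hop
    # across whichever neighbors happen to be in range, with no central router
    # anywhere in the picture. Device names and the flooding scheme are invented.

    class Thing:
        def __init__(self, name):
            self.name = name
            self.neighbors = []   # other things within (radio) range
            self.seen = set()     # message ids already relayed

        def link(self, other):
            self.neighbors.append(other)
            other.neighbors.append(self)

        def receive(self, msg_id, body):
            if msg_id in self.seen:          # drop duplicates so flooding terminates
                return
            self.seen.add(msg_id)
            print(self.name, "handling:", body)
            for neighbor in self.neighbors:  # act as a router for the neighbors
                neighbor.receive(msg_id, body)


    lamp, thermostat, speaker, fridge = (Thing(n) for n in
                                         ("lamp", "thermostat", "speaker", "fridge"))
    lamp.link(thermostat)
    thermostat.link(speaker)
    speaker.link(fridge)   # the fridge is out of the lamp's direct range

    lamp.receive("msg-1", "front door unlocked")   # still reaches the fridge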

Granted, it’s a pretty cool concept – creating another, lower-level, more intimate Internet alongside the one we already have. But it takes more than a cool concept to make all this happen.

The old way would be to create a framework of standards describing use cases, allowing different types of vendors to find the standards they would need in order to achieve the common goal (for an example of this approach, check out another one of my clients, SGIP.org, which is creating the standards to support the Smart Grid). But that would only solve the interoperability element, and would not fill the other gaps – such as the fact that there’s no central router mediating the traffic between and among the things, and the fact that some of the standards needed don’t yet exist.

And consider this: we’d also like our Internet of Things to play nicely with the operating systems of all of the equipment we already own. That way we don’t have to buy more equipment, risk being locked into a single vendor or OS environment, or worry about whether the companies that control those operating systems want to participate or not. So we want our own personal clouds to be immediately at home with Linux, iOS, Windows, and so on.

Oh – and did I mention we’d like to do this really quickly?

Roll all of these issues together, and the need for a new type of collaboration becomes clear. Specifically, the mission of the AllSeen Alliance includes creating a layer of software that implements existing standards and finesses the need to create many new ones, because anyone can use the software right out of the virtual box. In other words, it is the framework, rather than a description of one – ready to go, and already interoperable by design. And because it’s open source, it can be readily and easily adapted by anyone to allow their particular things to join the party.
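To give a flavor of what “right out of the virtual box” means in practice, here is a generic sketch of the sort of plumbing such a framework takes off a vendor’s plate: a device announcing a named service on the local network so that its peers can discover it. To be clear, the multicast address, port, and message format below are my own inventions for illustration – this is not the Alliance’s actual wire protocol or API.

    # A generic sketch of the kind of plumbing such a framework handles for a
    # vendor: a device announcing a named service on the local network over UDP
    # multicast so that peers can discover it. The group address, port, and
    # message format are invented for illustration -- they are not the
    # Alliance's actual wire protocol or API.

    import json
    import socket

    MCAST_GROUP = "239.255.42.42"   # illustrative multicast group
    MCAST_PORT = 4242               # illustrative port


    def announce(service, properties):
        """Send a one-shot service announcement to the local network segment."""
        payload = json.dumps({"service": service, "properties": properties}).encode()
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
            sock.sendto(payload, (MCAST_GROUP, MCAST_PORT))


    # A "smart lamp" making itself discoverable to everything else on the segment.
    announce("example.lighting.Lamp", {"room": "hall", "dimmable": True})

The point is that a lamp maker shouldn’t have to design, document and test this kind of machinery itself; the shared framework provides it, already interoperable with everyone else who adopts it.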

That’s a very new, very holistic (but very targeted) approach. It’s also one that meets all of the key needs we identified above, and some others as well:

  • The length of the list of initial members represents a critical mass of players, providing comfort to other vendors that their investment of time, money and strategic direction is likely to be rewarded, whether they decide to become an active member or simply to incorporate the free deliverables into their own products and services.

  • Potential vendors and customers can see that interaction with a wide variety of operating systems will be supported, including the ones that their own businesses or personal systems are based or run on.

  • They can also see the use cases that will be supported, and be confident that their own product and service offerings will be in demand (and if they become a member, they can influence the selection and prioritization of those use cases).

  • They can dramatically lower their own cost of entering into this new and enormous marketplace (forecasted by Gartner to contribute $1.9 trillion to the global economy by 2020), because the essential software will already have been built, be available for free, and will be supported on an ongoing basis by a robust, well respected organization.

  • Because of its mission and the fact that it is a well-funded legal entity, the new Alliance has the ability to do anything else that may be needed to promote the ubiquitous uptake of its software framework, whether that be certification, branding, or whatever else it may take.

  • Because it does not itself create standards, the Alliance can be nimble and agnostic, taking advantage of those standards that show the greatest technical promise, and which are independently enjoying the widest possible adoption. Or it can make the creation of a given standard unnecessary by providing an implementation of the functionality in question.

Just as importantly, all this can be done very quickly, rather than through a long series of sequential steps. Instead of taking a long time to create a standards framework and then trying to persuade vendors to adopt it, the framework has already been substantially instantiated in actual software, which will become richer and more multi-purpose as the effort continues.

That’s a revolutionary rather than an evolutionary step in the history of open collaboration, marrying open standards and open source in a way that hasn’t happened before, except in some other LF-hosted Collaborative Projects (such as OpenDaylight).

Taken to its logical final step, under this model a collaborative project needn’t limit itself to any one discipline, but can employ working groups in all relevant disciplines, using existing, off-the-shelf standards and software where they exist and creating new ones where they don’t. Historically, that’s been hard to do, because most organizations know how to do one or the other, but not both. And most of the people who represent members are skilled in one or the other, but not both.

The Linux Foundation, as it happens, has both skill sets in its DNA, as it was formed through the combination of two different organizations – the Open Source Development Labs (OSDL), which supported open source projects, including Linux, and the Free Standards Group, which developed and maintained the Linux Standard Base. It also already supports the types of activities that the WiFi proponents needed to form a new organization to provision, such as collaborative development, branding, meeting planning, and the other activities needed to spread the word and credential the interoperability of compliant products.

What you see here is the type of approach that is increasingly needed to solve the most interesting challenges of the future – a project that looks to use all of the open collaboration tools needed to conclusively enable an out-of-the-box solution, rather than a blueprint for just a piece of one, and a host organization able to support those efforts in real time.

Net net, if you’d like to get a glimpse of how the most intriguing collaborative challenges of the future will be solved, keep an eye on the AllSeen Alliance.
