The Data Transfer Project and the Hammer

First, the good news: last week, Google, Microsoft, Twitter and Facebook announced the Data Transfer Project, inviting other data custodians to join as well. DTP is an initiative that will create the open source software necessary to allow your personal information, pictures, email, etc. to be transferred directly from one vendor’s platform to another, and in encrypted form at that. This would be a dramatic improvement over the current situation where, at best, a user can download data from one platform and then try to figure out how to upload it to another, assuming that’s possible at all.


So what’s the bad news, and what does a hammer have to do with it?


Let’s look at the bad news first. While the information released by the project’s founders indicates participating organizations will be encouraged to focus on a limited number of data formats relevant to the data “vertical” in question (e.g., pictures, email, etc.), there is no requirement to do so. In other words, every vendor is free to continue to use its own proprietary APIs and chosen data formats, and new market entrants are free to develop their own as well.


It also means that every time a user wants to transfer anything from one platform to another, the data has to go through a conversion process. That process will necessarily be “lossy” to some degree, and will be resource intensive and slow. Does that sound familiar? It should, because that’s what converting Microsoft Word documents into another format has always been like.


Now, about that hammer.


For the last several years, there’s been a dramatic move away from using standards to achieve interoperability, and toward using open source instead. More specifically, that usually means creating an open source product core that can be customized to create separate, competing products, instead of developing a standard to enable interoperability between entirely independently developed, competing products. Oftentimes that makes perfect sense. But sometimes it doesn’t – sometimes developing a new standard would be far more beneficial to users. Stated another way, just because you can use open source to achieve a goal doesn’t mean you always should.


The Data Transfer Project provides an excellent example of a situation where this is the case. It also represents yet another example of how collaboration remains vendor-centric rather than user-centric, although you’ll need to read between the lines to realize that.


How so? Because the blog entries from the participants and the white paper describing the project stress the benefits to users. Those benefits are real and considerable: more freedom to change vendors, more privacy controls, and so on. But when I say “considerable,” that’s on an absolute, not a relative, basis. If you compare the open source approach to the open standards approach, the open source method falls short, particularly for users.


Why? Because the standards approach would allow faster, real-time sharing of data with higher fidelity. That means new use cases – such as conducting joint research, or coordinating disaster relief among first responders – would become possible that are infeasible when a conversion must occur every time the data passes back or forth.


What does the DTP open source approach look like?  Here’s an excerpt from the DTP White Paper:


Adapters provide a method for converting each Provider’s proprietary data and authentication formats into a form that is usable by the system…Data Adapters are pieces of code that translate a given Provider’s APIs into Data Models used by DTP. Data Adapters come in pairs: an exporter that translates from the Provider’s API into the Data Model, and an importer that translates from the Data Model into the Provider’s API.


Got that? There are actually two conversions each time data passes back or forth: first from the proprietary API of Company A into the Data Model for that type of information, and then from the Data Model to the proprietary API of Company B. With the standards approach, Company A simply sends its data to Company B directly without the need for conversion even once, because both companies create and store data using the same format.
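To make the two-conversion round trip concrete, here is a minimal sketch of the adapter pattern the white paper describes. All names here (the `PhotoModel` class, the provider record shapes, the function names) are hypothetical illustrations, not actual DTP code; the point is simply that every transfer chains an exporter and an importer, while a shared standard format would need neither.

```python
from dataclasses import dataclass

# Hypothetical shared "Data Model" for the photos vertical.
@dataclass
class PhotoModel:
    title: str
    pixels: bytes

# Provider A stores photos in its own proprietary shape.
def export_from_a(record: dict) -> PhotoModel:
    """Exporter: Provider A's format -> Data Model (conversion #1)."""
    return PhotoModel(title=record["caption"], pixels=record["image_blob"])

# Provider B expects a differently shaped record.
def import_to_b(photo: PhotoModel) -> dict:
    """Importer: Data Model -> Provider B's format (conversion #2)."""
    return {"name": photo.title, "data": photo.pixels}

def transfer_a_to_b(record: dict) -> dict:
    """A DTP-style transfer always chains both adapters: two conversions per hop."""
    return import_to_b(export_from_a(record))

def transfer_standard(record: dict) -> dict:
    """With a shared standard format, no adapter is needed at all."""
    return record  # both providers already create and store the same format

a_record = {"caption": "Sunset", "image_blob": b"raw-bytes"}
print(transfer_a_to_b(a_record))  # {'name': 'Sunset', 'data': b'raw-bytes'}
```

Each new provider joining the ecosystem must write (and maintain) its own exporter/importer pair for every data vertical it supports, whereas under a standard every provider writes nothing extra at all.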


Stated another way, adapters are a band-aid approach that allows proprietary vendors to continue using proprietary technology to silo your data, while providing just enough mobility for users to tolerate the continuation of life as we know it, and for vendors to comply with evolving regulations, such as the GDPR.


In short, using an open source hammer treats the user as a nail. Using open standards would turn the user into a hammer, empowering her to use whatever vendor she wishes, and putting the maximum incentive on all vendors to compete on services, features and performance to earn the user’s continued business.


I think we can all agree that users would rather be the hammer. We’ve all been the nail for far too long, and all it’s given us is headaches.