February – March 2009
Vol VIII No 2
IT Policy And The Road To Open Government
With Access and Information for All
Over the last two hundred years, poll taxes, literacy tests and other artificial barriers have too often stood between the less privileged and the exercise of their rights of citizenship. Will the migration of government services to the Web be a great leap forward, or an avoidably negligent step back?
Enabling Open Government
The Obama Administration has committed to creating an "unprecedented level of openness in government," made possible by an equally ambitious utilization of information technology. How that technology, and the standards upon which it relies, are selected will determine whether open government is made available to all, or only to the technically sophisticated, the able-bodied, and the well to do.
How Open Must an Open Government Platform be?
Governments represent some of the most complex enterprises in existence, typically comprising multiple "silos" of high-value information trapped within proprietary legacy systems. Government CIOs today are struggling to upgrade their vast IT systems to exchange information across the enterprise, even as President Obama has called upon them to make much of the same information publicly accessible via the Internet. What should the result look like, and why?
"Openness," the Customer and the IDABC
"Openness" in technology means different things to different people in different settings, because different settings attract different participants with different goals. What should it mean to governments?
Killing the Cockroach: The Incredibly Illogical, Fundamentally Odious — but Seemingly Ineradicable — Billable Hour
You'd never write a blank check to a house painter to paint your house. So why hand one to a lawyer? Good question.
With Access and Information for All
This issue marks the third and final (for now) entry in a cycle that addresses the role of technical standards in government, and vice versa, occasioned by the election of America's new, technologically savvy president, Barack Obama. Each of these issues looks back to a greater or lesser extent to a concept I introduced a year ago in an article titled Recognizing "Civil ICT Rights" and Civil ICT Standards.
The first issue in this cycle was intended to provide an overview of the standards opportunities, and standards dependencies, of Mr. Obama's ambitious technology-based agenda, and was called A Standards Agenda for the Obama Administration, while the second focused more narrowly on the President's most ambitious (and costly) standards-dependent program (that issue was called The Electronic Health Record Standards Challenge). This issue addresses another important goal established by Barack Obama early in his campaign, and confirmed on the first day of his presidency: a commitment to bring about a new level of public transparency, participation and collaboration in government, in large part through the innovative use of technology.
I introduce this theme with my Editorial, which makes the point that if Internet-based government is to be truly open, it needs to be accessible to all — and not just those who can afford a laptop or desktop, know how to use it, and are not held back by physical disabilities.
This month's Feature Article addresses the broader IT challenge of making the IT systems of government agencies internally interoperable and externally transparent and collaborative at the same time. Like the Standards Blog selection for the month, it refers in part to the already well advanced and sophisticated efforts of the European Union to achieve similar goals.
The issue concludes, as usual, on a lighter note with my newest installment of Consider This, which shines a harsh light on a standard of a different — and much less popular — type altogether: the much maligned but tenacious billable hour of law firms at home and abroad.
As always, I hope you enjoy this issue. Whether you do or don't, it's always good to hear from you. You can reach me at firstname.lastname@example.org.
Editor and Publisher
2005 ANSI President's Award for Journalism
The complete series of Consortium Standards Bulletins can be accessed on-line at http://www.consortiuminfo.org/bulletins/. It can also be found in libraries around the world as part of the EBSCO Publishing bibliographic and research databases.
Sign up for a free subscription to Standards Today.
Enabling Open Government for All
One of the refreshing promises that Barack Obama made to the nation during the recent campaign was to bring greater openness and transparency to the operation of government. What distinguished this pledge from those of his predecessors was his commitment to go beyond soft promises of greater candor in press conferences and fewer assertions of privilege. This administration, he pledged, would provide an unprecedented level of access to information and direct interaction with government, enabling a richly interactive, ongoing dialogue between citizens and their elected representatives.
Achieving true openness in government will rely as much on the manner in which openness is technically enabled as it will upon the commitment of the administration to be truly forthcoming.
The president reaffirmed that promise immediately upon his inauguration, in a directive he issued to the heads of all federal executive departments and agencies. In that memorandum, he committed his administration to, "work together to ensure the public trust and establish a system of transparency, public participation, and collaboration." The transparency would be achieved in part by harnessing, "new technologies to put information about their operations and decisions online and readily available to the public," and the promised participation and collaboration would be made possible by, "innovative tools, methods, and systems." He charged the nation's first Chief Technology Officer and other senior administration leaders with turning his instructions into an Open Government Directive within 120 days.
As with many of the new administration's lofty goals, reality and the press of the current economic crisis have made implementation significantly more challenging than conceptualization. Indeed, more than half of those 120 days have now gone by, and President Obama's CTO has not yet been appointed, while the start date for Vivek Kundra, the former CTO of the District of Columbia and Mr. Obama's pick to be Chief Information Officer, was unexpectedly delayed, pending completion of an investigation into the acts of two of his former subordinates. Meanwhile, countless bloggers and journalists cast critical eyes over every move and statement the new administration makes, debating whether it lives up to the President's pledge. And Mr. Obama's staff are struggling to adapt to a technical infrastructure that is far less nimble than the state of the art platform they created for the president-to-be's campaign.
Beneath the surface of these public and all-too-typical events, however, a number of less visible challenges stand in the way of keeping the President's open government promises. They can be summed up with a simple question: "transparency, public participation and collaboration for whom?" The answer to that question is obvious, from an aspirational perspective, but not so simple to achieve technically.
The problem arises from the mixed blessings of technology. If the major benefits of the new policy are delivered via the Internet, then by definition these benefits will be available only to those that have the access and skills needed to locate, review, and interact with what has been made available on line. As a result, achieving true openness in government will rely as much on the manner in which openness is technically enabled as it will upon the commitment of the administration to be truly forthcoming.
Having said that, what should be done?
Providing a free computer to every American is clearly not in the economic cards. But ensuring that everything that the government provides on line rigorously supports all accessibility standards and can be accessed by the greatest range of devices (e.g., browser-equipped cell phones), running the most diverse range of operating systems and software applications, will make an enormous difference.
In the past, many government Web sites have served pages that often render poorly (or not at all) on some popular browsers, and have provided or accepted documents in only one or two proprietary document formats. The result has been to force citizens to download or purchase the specific software products that support this technology, or be denied access to the public information at all. Famously, Apple computer users running Safari, the Apple Web browser, found themselves unable to access urgently needed emergency information in the aftermath of Hurricane Katrina, because the government Web sites posting that information supported only Microsoft's Internet Explorer Web browser. Similarly, many government sites still do not support the accessibility standards that are essential to allow many citizens with visual and other physical handicaps to access and interact with these sites.
On the one hand, then, we have the past, where a fully equipped desktop or laptop computer, running proprietary software, was needed to exercise citizenship rights on line, assuming that the owner was fully sighted and able-bodied. If this is also the future, then the promise of transparent, participatory and collaborative government will be a democratic embarrassment, rather than an egalitarian promise fulfilled.
To ensure that better future, the government must set itself the task of provisioning Web sites that citizens can access using not only desktops and laptops, but also the types of devices that almost all citizens can afford and know how to use — increasingly powerful and inexpensive cell phones, smart phones and netbook computers running open source as well as proprietary software. Only then will the true promise of open government be realized.
Though few may recognize it, this goal can be achieved only by committing — now — to the mandatory implementation of the full range of interoperability and accessibility standards needed to ensure that all Web pages will render properly and usefully on all devices running all operating systems and other appropriate software. The great majority of these standards already exist, and the remaining tools are currently under development in consortia such as the W3C and OASIS. All that is required is the decision to select and use them.
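Some of these requirements are concrete enough to be checked by machine. As a minimal sketch (the sample page fragment is invented for illustration), the following uses Python's standard html.parser module to flag two basic problems of the kind the W3C's accessibility guidelines address: images lacking the alternative text that screen readers depend on, and a page that fails to declare its language:

```python
from html.parser import HTMLParser

class AccessibilityChecker(HTMLParser):
    """Flags two basic accessibility problems: <img> tags without
    alternative text, and a missing lang attribute on <html>."""

    def __init__(self):
        super().__init__()
        self.images_without_alt = 0
        self.has_lang = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.images_without_alt += 1
        if tag == "html" and attrs.get("lang"):
            self.has_lang = True

# A hypothetical fragment of a government Web page:
sample = ('<html lang="en"><body><img src="seal.png">'
          '<img src="map.png" alt="Evacuation route map"></body></html>')

checker = AccessibilityChecker()
checker.feed(sample)
print(checker.images_without_alt)  # one image lacks alt text
print(checker.has_lang)
```

A real audit would of course test against the full standards suite rather than two rules, but the point stands: conformance is verifiable, and therefore enforceable in procurement.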
Too often in our history we have allowed our democratic reach to exceed the grasp of the less fortunate. It will be sad indeed if, through indifference, this most hopeful and progressive of recent administrations provides open government not for all, but only for the technically sophisticated, the able-bodied, and the well to do.
Copyright 2009 Andrew Updegrove
How Open Must an Open Government Platform be?
Abstract: The advent of the Internet and the realization of the promise of open standards presents challenges as well as opportunities to governments. Internally, these technological tools can provide the means to finally allow information locked in disparate proprietary "silos" to pass freely between all units and levels of government. Externally, they can provide cheaper and better ways to serve the public, providing not only traditional services more efficiently, but enabling new types of openness and interactivity as well. Achieving either of these goals individually would be difficult, and providing both simultaneously is challenging indeed, in part because providing government services on line raises new issues that must be addressed relating to vendor neutrality, physical and economic accessibility, data preservation, and much more. Governments are only now confronting these new issues. Those that deal with them effectively and rapidly will not only provide better services to their citizens sooner, but ensure that their substantial investments in upgrading legacy systems will be best rewarded.
Fulfilling a campaign pledge, President Barack Obama has committed his administration to providing an "unprecedented level of openness in Government," and called upon all of his heads of Executive Departments and Agencies to, "work together to ensure the public trust and establish a system of transparency, public participation, and collaboration…[in order to] strengthen our democracy and promote efficiency and effectiveness in Government."1
Consistent with an ability to understand and utilize technology that was well-demonstrated in the course of his campaign, Mr. Obama has stated his intention to make full use of technology, and particularly Internet-based technology, to permit citizens to access information and interact with government.
Deploying information technology (IT) effectively will prove to be more difficult from the Oval Office than from the campaign trail, however, due to a number of significant challenges, many of which are well recognized. These hurdles include the comparatively basic level of most in-place government Web-based services, the fact that technology continues to develop at a rapid pace, the difficulties of deploying IT (or indeed any) policy across many government agencies, and the reality that, like so many other complex and long-extant enterprises, the Federal government comprises a bewildering archipelago of proprietary, legacy systems that are only gradually being transitioned into a more interoperable, coherent whole.
Adding to these infrastructural challenges is a lack of consensus over the boundaries of the field upon which the game of "open government" should be played. True, there are general expectations based upon past practices, but the exact locations of the goal posts have typically shifted from administration to administration, with some presidents acting more secretively than others. Similarly, while the Federal Freedom of Information Act grants a legal right to public access to appropriate documents, some administrations and agencies have interpreted this law more narrowly than others, requiring citizens and journalists to seek access via the courts to materials that at other times might have been produced without argument under other presidents.
Deploying state of the art information technology effectively will prove to be more difficult from the Oval Office than from the campaign trail
Physical world issues like these involve subjective interpretations, but the underlying issues are at least well understood at law and in public discourse. Introducing technology into the equation, though, introduces new questions that have to a large extent not as yet been addressed in the courts (or confronted only in loosely analogous, rather than similar, situations). Such issues are only now becoming part of the public dialogue over how interaction between the governors and the governed should occur.
The questions that technology raises involve issues such as whether citizens should be constrained to use certain software or hardware platforms, or even the products of specific vendors, in order to access and use government sites and services, the degree of effort that governments should invest in ensuring that their Web sites are as state of the art as possible with respect to accessibility standards, what level of privacy and security governments owe to their citizens' data, how long such data must be preserved, and what document formats should be mandated to ensure that result.
These decisions are complicated by issues of cost (technology is expensive), difficulty (designing openness into a system adds an extra level of complexity), security (how to protect non-public information, and the information of individuals, from non-privileged users), and integration (governments are simultaneously wrestling with the challenge of making a host of proprietary, legacy systems interoperable). Ideally, governments will wish to support not two systems but one for both internal and external purposes to the greatest extent practical and consistent with the maintenance of security.
But the rewards of taking up this challenge can also be great. They include the prospect of providing better service to citizens, improved efficiency and lower technology costs for governments over time, and the ability to promote social and commercial policy by providing (via the very substantial magnitude of government procurement) attractive incentives for vendors to make their own products more interoperable and accessible. In many ways, the goal to enable what has often been referred to as "eGovernment" is comparable to another ambitious goal of the Obama administration: to deploy a nationally interoperable healthcare system based upon "electronic health records." In the near term, the challenges and the costs of such transitions will be enormous. But in the long term, the efficiencies and cost savings promise to be far more substantial — if the technical challenges are properly addressed and public expectations are satisfied.
In this article, I will review some of the principal technology-related topics that have been identified under the heading of "open government" and discussed (more vigorously abroad than in the United States) to date, as well as the standards-related tools that can be used to address these issues. I will also describe the goals and current status of one of the most sophisticated efforts underway today in this domain: the European Union-based initiative known as the Interoperable Delivery of European eGovernment Services to public Administrations, Businesses and Citizens (IDABC). The IDABC provides a well documented, impressively reasoned, and detailed example of how a government can modernize its IT systems to provide not only internal interoperability among multiple governmental units, but a high degree of openness to its citizens and a way to advance public policy as well.
Legacy systems and changing expectations: Not so long ago, digital information resided on discrete, proprietary systems, typically made up of hardware and software procured from a single vendor, often supplemented by costly custom software. The most successful vendors were understandably well content with this state of affairs, because the resulting barriers to switching to another vendor were very substantial. Certainly this situation was less than ideal, but it was nonetheless tolerable, because although a customer's valuable information was trapped within the proprietary formats and databases its vendor provided, that information was usually only of internal relevance (e.g., payroll and sales data). To the limited extent that data needed to be shared externally, paper provided an adequate means of transfer (or at least so it seemed in the absence of any viable alternative).
In the days before the Web and the Internet provided a practical means to break down data room walls, citizens themselves had little reason to be concerned over a world where public data, like private data, lived in such proprietary walled gardens. At most, the impact of the old paradigm on the public was indirect, involving difficult to investigate issues such as efficiency and cost-effectiveness.
Today, of course, much has changed. From the technical perspective, once dominant vendors, such as IBM, have lost some of their near-monopoly positions in key product areas, leading them to champion open rather than closed systems as a way to lower the switching costs of customers they hope to acquire. At the same time, other vendors, like Microsoft, have successfully acquired equal, if not greater, monopolies in areas such as operating systems and office productivity suites.
More importantly, the technical means of achieving real interoperability within networks assembled from the products of multiple vendors, and between networks owned by multiple parties, have become far more credible in the last fifteen years. The tools enabling this transition include not just the Internet, but a multitude of technical standards (e.g., Ethernet, HTTP, HTML, XML, and many more), new design and delivery concepts such as service oriented architectures (SOA) and the provision of software as a service (SaaS), and the infrastructural buildout needed to support an ever-expanding Internet and the Web.
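The power of a vendor-neutral format such as XML lies in the fact that any conforming tool can read what any other conforming tool has written. As a simple sketch (the record fields here are invented for illustration), the following uses Python's standard xml.etree library to serialize a public record and then parse it back, with no knowledge of, or dependence upon, the software that produced it:

```python
import xml.etree.ElementTree as ET

# Serialize a (hypothetical) public record to vendor-neutral XML...
record = ET.Element("record", id="2009-0417")
ET.SubElement(record, "agency").text = "Department of Example"
ET.SubElement(record, "title").text = "Quarterly procurement summary"
xml_bytes = ET.tostring(record, encoding="utf-8")

# ...and read it back, as any standards-conforming consumer could.
parsed = ET.fromstring(xml_bytes)
print(parsed.find("agency").text)
```

Any other XML-conforming parser, in any language and from any vendor, could consume the same bytes — which is precisely the property that closed, proprietary formats lack.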
The landscape has changed from the citizens' point of view as well, because technology has now become not only more visible, but accessible and non-threatening to the majority of those living in modern societies. Not surprisingly, governments now wish to interact with citizens via the Web, in part to provide better services, and in part to provide those services at lower cost. Citizens are increasingly eager to take advantage of these services, to save time, to gain greater access to hitherto difficult to obtain information, and to make their opinions known.
Even as technology provides the means to achieve true interoperability, it magnifies the challenges to attaining it.
Indeed, the rise of technology gives government little, if any, choice in the matter, because the volume and types of data that citizens, businesses and governments create and consume have exploded. Instead of creating only text and numerical data, we now create documents of many other types, including presentations and spreadsheets (in each case both editable and fixed), graphics, audio and video files, and much more — often in the same electronic file. Most dauntingly, we now wish to exchange such information across public, private, national and social boundaries — all while preserving a broad array of choices in hardware, software and service providers.
Driving our desire to do so is the continuing power of the network effect: the name economists use to recognize the fact that the larger and more efficient a network is, the more valuable it becomes. In the case of information and communications technology (ICT) networks, that value can be found in multiple ways, including increased efficiency (e.g., being able to enter data once, rather than every time we visit a new Website), access to expanding resources (by permitting more people to access and/or add to common data repositories), better decision making (through real-time information sharing and communication), and greater convenience (through remote access).
As the size and nature of such networks expand, of course, so also does the challenge of achieving interoperability across them to satisfy market forces and emerging regulatory requirements.2 Indeed, even as technology provides the means to achieve interoperability, it magnifies the challenges to attaining it in part due to the speed with which underlying technologies continue to evolve, and in part because the enormous profit opportunities at stake provide incentives for one vendor to promote its technology as the basis for standardization over its competitor's.
Interoperability and social policy: Small wonder, then, that most market participants give little weight to achieving social goals in the process of developing and deploying the standards that enable interoperability. And yet such goals are important, including ensuring equal access to those of limited means, accommodating those with disabilities, and maintaining competition in the marketplace. That commercial interests give little attention to such concerns, while regrettable, is not surprising. But for governments to ignore them would be inexcusable. Unless due regard is given to these needs, valuable civil rights may be prejudiced or lost, as governments increasingly move their activities from the physical to the virtual world.3
That commercial interests give little attention to such accessibility concerns, while regrettable, is not surprising. But for governments to ignore them would be inexcusable.
Many governments are aware of the benefits of achieving broad and effective interoperability both within and among their networks, as well as between those networks and the ICT assets of their citizens, but only some have considered the potential for pursuing policy goals through the exercise of targeted ICT procurement. As governments continue to convert paper-based systems and face to face services to ICT based storage and online provisioning, ensuring effective interoperability passes from a desirable goal to an essential requirement.
Transition tools: The short shelf life of modern ICT provides a more rapid, if still difficult, means for governments and other large enterprises to transition from isolated legacy systems to interoperable enterprise-wide networks. Since new hardware and software must be replaced or upgraded every few years, the old can theoretically be swapped out for the new, although doing so while the same systems remain in operation presents challenges. The means of accomplishing this feat is through the careful development and deployment of tools that not only describe the desired interoperability end state and the standards needed to enable it, but also facilitate the gradual migration from the old systems to the new.
Governments utilize the same techniques that complex commercial enterprises use to bring legacy systems under control, such as designing and adopting "interoperability frameworks" that define and mandate the architectures into which new technology purchases must fit and the standards that they must support. The results should be lower costs, increased efficiency, and interoperability not only within, but among, agencies — usually for the first time. Where increasing transparency and interaction is also a goal, the same tools should be designed to ensure that appropriate information can be accessible to, and freely exchangeable with, citizens as well.
Special challenges: Governments have special challenges not shared by commercial enterprises when they seek to establish inter-governmental (e.g., among agencies, states, nations, and/or municipalities) as well as intra-governmental interoperability. These challenges are often exacerbated by the incredibly Byzantine collection of proprietary, legacy systems that many governments have accumulated over the years.4 These additional dimensions can greatly multiply the difficulty of agreeing upon, designing and deploying the solutions needed to achieve success.
For large governmental units, interoperability challenges will therefore often be quantitatively different from those encountered by all but the largest commercial enterprises.5 In the first instance, government ICT systems can be extremely large — more vast than those of even the greatest multinational corporations. They can also be more architecturally diverse, where each department has exercised independence in ICT procurement over several decades. The difficulties of making progress in such a case are multiplied when achieving public goals requires the exchange of data not only with external private industry partners, but also with a diverse host of public sector partners, such as states, municipalities, first responders, and more.
Governments have interoperability needs that are qualitatively different as well. For example, while a corporation may need to retain records for ten years at most, a government must plan to preserve some categories of documents indefinitely, resulting in the need to develop and utilize format standards that will hopefully be accessible a hundred or more years into the future. Similarly, while a private company may concern itself only with communicating with those market segments that it regards as most attractive, a government should make its information and services accessible to all of its citizens, rich and poor, well educated and otherwise, able-bodied and disabled. Proactively, it may also wish to exercise its procurement power in such a way as to provide incentives for industry to move in directions that the government deems to be socially beneficial.
For better or worse, the awareness of such special needs and the inadequacy of current systems to meet them has only begun to emerge in the last several years. As a result, governments are only beginning to consider what special retooling efforts are incumbent upon them. Acting on these needs, once realized, is complicated by factors such as cost, the low level of familiarity of most legislators with technical matters, and the lobbying efforts of vendors that may have more to lose than to gain by progressive government action.
Nonetheless, awareness is rising in many parts of the world: regionally (as in the European Union), nationally (as in Malaysia, South Africa, Brazil, India, and a variety of other primarily emerging countries), and within individual provinces, states, and even municipalities.6 Some of these efforts have emerged as recently as last year, while others, such as those in the EU, have been ongoing for many years. The latter in particular provide useful examples that other governments would do well to examine and emulate to a greater or lesser extent, as best suits their individual policy goals and political sensibilities.
For large governmental units, interoperability challenges will often be quantitatively different from those encountered by all but the largest commercial enterprises.
A case study: the IDABC: One of the most ambitious and interesting efforts to achieve both internal interoperability as well as efficient and open government/citizen interaction bears an imposing title: the Interoperable Delivery of European eGovernment Services to public Administrations, Businesses and Citizens (IDABC) programme of the European Commission (EC). While the IDABC programme is but one of a number of completed and ongoing efforts to be found throughout the world at all levels, it may well be the most sophisticated and well advanced of these efforts. The reasons can be found in the unique political, economic and social history of the European Union itself. This history has led to a far greater level of government awareness in Europe, in comparison to (for example) the United States, of the value that standards can, and at times must, play in achieving commercial and social objectives. The program of the IDABC is also notable for the degree to which it incorporates open source software into its political and practical consideration.
The Vision of a Common Market (and more): One of the great commercial and social successes of the post-World War II era has been the ongoing rationalization and liberalization of commerce, monetary systems and travel within an expanding union (first the European Economic Community, and subsequently the European Union). Many challenges have been encountered and met in that process, with some having a higher public profile than others.
One of the less-visible hurdles the architects of the EU faced as they worked towards the creation of a common market involved deconstructing the many standards-based barriers that European nations, like other countries around the world, had created in order to give preferential treatment to the goods of their domestic industries over foreign products. That process was also constructive, as new regional standards-based strategies and institutions were created to promote European trade abroad.7
The result was the creation of what is the most sophisticated regional standards infrastructure in the world today. In contrast to other such organizations (e.g., the Pan-American Standards Commission, commonly known as COPANT), EU-wide standards activities have been incentivized by economic as well as political goals, to stronger and more sustaining effect. The resulting benefits to the nations of the EU have been numerous, extending beyond the central goal of facilitating internal trade. One indirect result, often noted with some grumbling elsewhere in the world, is the ability of EU member states to exercise disproportionate influence in "one nation, one vote" standards bodies such as the ISO, IEC and ITU, and also to identify standards goals of common interest and concern that can then be jointly addressed, to the benefit of European industry.8
The vision of an interoperable Europe: Although some of the grander political dreams of pan-European proponents have not been realized (e.g., an EU Constitution has not thus far garnered the necessary votes to be adopted), the EC has moved well beyond purely trade-related goals. In consequence, the need to achieve interoperability both horizontally and vertically among governmental entities has arisen in a way not found elsewhere, driven in part by the increasing level of services provided at the EU level to member states and their constituents. The complexity of the interoperability challenges of providing such "Pan-European eGovernment Services" (PEGS), and the essential role of standards in doing so, is suggested by this definition of PEGS:
Cross-border public sector services supplied by either national public administrations or EU public administrations provided to one another and to European businesses and citizens, in order to implement community legislation, by means of interoperable networks between public administrations.9
The difficulty of providing PEGS and the sophistication required to parse out (even linguistically) how to do so are illustrated by the way in which the definition of "interoperability" has evolved over the last several years in IDABC documents. In 2004, a key document defined that word to mean:
[T]he ability of information and communication technology (ICT) systems and of the business processes they support to exchange data and to enable the sharing of information and knowledge.10
By 2008, the political, commercial and technical challenges of achieving consensus around cross-border interoperability had risen, as demonstrated by the new definition for interoperability to be found in a comment draft relating to the revision of the same document:
[T]he ability of disparate and diverse organizations [principally administrations] to interact towards mutually beneficial and agreed common goals, involving the sharing of information and knowledge between the organizations via the business processes they support, by means of the exchange of data between their respective information and communication technology (ICT) systems.11
Moreover, the same comment draft goes on to observe:
It is also worth noting that interoperability is neither ad-hoc, nor unilateral (nor even bilateral) in nature. Rather, it is best understood as a shared value of a community [emphasis in the original].12
Uniquely in the world (at least since the collapse of the Soviet Union), the EU represents a region where enabling standards-based interoperability among and within sovereign nations must rise to the level of a core policy priority. This level of importance, conjoined with the multi-layered, multi-national context within which this policy is being pursued, has led to a similarly unique sensitivity to both the political as well as the technological nuances and complexity of the challenges at hand. Discussions relating to "openness" and "interoperability" therefore reflect a level of awareness of non-technical concerns that is not commonly encountered in dialogues in governments elsewhere in the world — and rarely, if ever, in the United States.
While the EU's situation is unique, its special challenges have led it to consider policy dimensions that the designers of public interoperability frameworks elsewhere should rightly consider. What follows is a brief review of some of the history and elements of the IDABC's work programme.
The IDABC and its projects: The need to achieve EU-wide interoperability was recognized at the Seville Summit, held in June of 2002, at which the members of the EU adopted the "eEurope Action Plan 2005."13 Specifically, the Plan directed the European Commission "to issue an agreed interoperability framework to support the delivery of pan-European eGovernment services to citizens and enterprises," which came to be called the European Interoperability Framework for Pan-European eGovernment Services (or, more briefly, simply the "EIF"). The plan also specified that the framework was to be "based on open standards and encourage the use of open source software."14
The first version of the EIF was released in 2004, and the second version is now in a state of active preparation. In July of 2008, a lengthy "Draft document as basis for EIF 2.0" ("Comment Draft") was released for public comments during a comment period that closed in September of the same year. This document will serve as one of the significant inputs in upgrading the first version of the EIF.
The IDABC Comment Draft is intriguing in the way that it provides a real-time map of how European thinking on interoperability continues to evolve
The Comment Draft is intriguing in the way that it provides a real-time map of how European thinking on interoperability continues to evolve in tandem with emerging public opinion, against the backdrop of events such as the failure of the European Constitution and of new patent legislation to win adoption, the burgeoning success of open source software, and publicly reported investigations and prosecutions of proprietary vendors for anticompetitive behavior by the European Commission.15
The EIF is but one of a suite of related deliverables in process, including the European Interoperability Strategy (EIS), the European Interoperability Architecture Guidelines (EAIG) and the European Interoperability Infrastructure Services (EIIS). Together they are intended to:
…provide the basic technical requirements of consumers of eGovernment services, cover the lifecycle from strategy through to operations, and provide IT vendors and suppliers with reliable information on their customers' needs in this area.
EIF Principles and "Open Standards": The EIF is more informative than the typical private sector interoperability framework, perhaps because the powers of the IDABC vis-à-vis the Member States are only advisory in nature. As a result, the EIF comprises goals, detailed discussions, and a total of 17 recommendations, some of which are designated as "organizational," "semantic" or "technical." The interoperability goals of the EIF are further informed by a set of "underlying principles," intended to serve a range of both universal and EU-specific social values and objectives. Those principles are Accessibility, Multilingualism, Security, Privacy (Personal Data Protection), Subsidiarity,16 Use of Open Standards, Assess the Benefits of Open Software, and Use of Multilateral Solutions. These principles, as stated in Recommendation 2 of EIF 1.0, "should be considered for any eGovernment services to be set up at a pan-European level."17
These principles can be divided into several categories: those that any properly motivated government should want to adopt on principle (e.g., accessibility); those that, for technical reasons, it would need to adopt (open standards); those to which it should adhere in order to be a responsible custodian of its citizens' data (privacy and security); and those that, for economic reasons as a customer, it might wish to adopt (use of open source software).
However, these same principles involve (or can involve) other policy decisions as well: does a government want to drive further development of accessibility features in additional products, to ensure that the standards it supports are vendor neutral, or to encourage the further development of open source software generally?
The answers to those questions will vary, based upon the trade and social policies of the government in question, the degree to which lobbyists affect its decisions, and (in the case of the EU), sensitive and evolving relations between Brussels and Member States. Some decisions (such as supporting multilingualism) may be givens in some situations, and politically charged in others. What is particularly interesting and instructive about the IDABC's work is the degree to which so many of these questions (some subtle), and others (such as subsidiarity) have been considered and addressed.
While other governments are unlikely to find it appropriate to adopt principles identical in all cases to those articulated in the EIF, the IDABC's work provides a very instructive model deserving of close attention for the degree to which its drafters have comprehended the types of social, commercial and political dimensions that a government ICT policy can incorporate, not only to satisfy a government's obligations and maximize the efficiency of its own procurement activities, but to pursue its policy goals as well.
Each of the principles set forth in the EIF could be considered relevant to a larger definition of "open standards," at least to the extent that they depend upon standards to enable them (e.g., Web-based accessibility relies on a number of standards, as does the ability of a document format to accommodate reading from right to left, or of Unicode to incorporate linguistic character sets). Perhaps because so many common open standards criteria are picked up in other principles, the definition of Open Standards that appears in the EIF is comparatively narrow.
That said, the text of the "Use Open Standards" principle in EIF 1.0 is instructive:
The following are the minimal characteristics that a specification and its attendant documents must have in order to be considered an open standard:
- The standard is adopted and will be maintained by a not-for-profit organisation, and its ongoing development occurs on the basis of an open decision-making procedure available to all interested parties (consensus or majority decision etc.).
- The standard has been published and the standard specification document is available either freely or at a nominal charge. It must be permissible to all to copy, distribute and use it for no fee or at a nominal fee.
- The intellectual property - i.e. patents possibly present - of (parts of) the standard is made irrevocably available on a royalty free basis.18
The most notable element of the definition above, from a market perspective, is the last, which precludes any need to pay royalties to a patent owner in order to implement a standard that meets these "minimal characteristics." This is a dramatic departure from traditional practice, as the intellectual property policy rules of each of the "Big I's" (ISO, IEC and ITU), whose standards have in the past been of greatest interest to governments, all permit a patent owner to require payment of a "reasonable royalty" if its patent would be infringed.
The IDABC appears determined to press further rather than retreat from this position. The basis document for the next version of the EIF takes pains to flesh out the intention of the brief statement quoted above:
Since the publication of version 1 of the EIF, several practical cases have however shown the necessity to clearly point out the extent of this definition and to clarify its applicability…
- Open standards or technical specifications must allow all interested parties to implement the standards and to compete on quality and price. The goal is to have a competitive and innovative industry, not to protect market shares by raising obstacles to newcomers. Also, we want to be able to choose open source solutions or proprietary solutions on the basis of price/quality consideration…
- This definition reflects a consumer's viewpoint, with his needs uppermost in mind…19
The requirement of free implementation and the preference for open source software provide examples of two of the types of motivations referred to above at work. The first is the self-interest of government as a customer, desiring access to the widest choice of products at the lowest cost (open standards allow multiple vendors to develop competing products, providing more alternative products with more value-added, differentiating features while driving down prices; open source software is often free, usually resulting in a lower total cost of ownership throughout the life of the product).
The commitment of the Obama administration to bring a new level of transparency and participation to government offers an opportunity to achieve what was promised, but not delivered, by the Bush administration.
The second motivation derives from a policy goal, because government purchasing represents a large enough sales opportunity to incentivize standards developers to create standards that will qualify for recommended EU procurement purposes. Once these standards become widely adopted, citizens can buy the same products, and enjoy the same wider choices and lower costs — all through the exercise of the "soft" power of procurement, rather than the heavy hand of government regulation.
Next Steps: The commitment of the Obama administration to bring a new level of transparency and participation to government offers an opportunity to achieve what was promised, but not delivered, under the E-Government Act of the Bush administration, adopted in 2002. Indeed, the promises of President Obama are in large part simply elaborations of the goals of the prior Act, which read as follows:
- To promote use of the Internet and other information technologies to provide increased opportunities for citizen participation in Government.
- To promote interagency collaboration in providing electronic Government services, where this collaboration would improve the service to citizens by integrating related functions, and in the use of internal electronic Government processes, where this collaboration would improve the efficiency and effectiveness of the processes.
- To promote the use of the Internet and emerging technologies within and across Government agencies to provide citizen-centric Government information and services.
- To make the Federal Government more transparent and accountable.
- To provide enhanced access to Government information and services in a manner consistent with laws regarding protection of personal privacy, national security, records retention, access for persons with disabilities, and other relevant laws.20
Similarly, the E-Government Act included provisions to bring about interoperability within government as well. Given that the statutory basis for action is in place, with origins across the aisle from the President's party, the opportunity for progress is apparent.
The question is whether the new president, beset with so many seemingly more urgent issues, will proceed with greater determination than his predecessor. Hopefully, that will be the case, given that attaining the reality of open government is dramatically less expensive, far less technically challenging, and administratively much simpler (for example) than transitioning to an efficient, cost effective national system of electronic health records (EHRs). Making the transition to open government is one of the few important policy goals that President Obama can actually delegate.
There is a very real danger that the design of open government systems under the Obama initiative will be treated as a purely technical exercise, carried out by IT staff primarily concerned with achieving interoperability among agencies, with public access regarded as a technical veneer to be added as an afterthought. As the example of the IDABC's EIF demonstrates, such a neutered technical exercise would risk not only failing to ensure equal accessibility, but also forgoing other policy goals that the administration might wish to further at little, if any, additional cost.21
What the open government challenge shares with the EHR challenge is that both are standards-dependent, and that neither will succeed unless the proper standards are available and selected before deployment begins. If the wrong standards are chosen, or if the policy goals identified are too limited, then the results will suffer proportionately.
Where open government and EHRs diverge is that in the former case, the standards are largely available, while in the latter many remain to be developed. But in each case, the process must begin by properly defining the goals that the standards must serve. Those goals are comparatively clear in the EHR context. They are less so (or at least consensus has been difficult to achieve) in the case of open government, as demonstrated by the failure of bills in multiple states across multiple legislative sessions to establish rules regulating the use of even a single type of standard (open document formats).
What is urgently needed is a public dialogue about the social and policy goals that open government IT platforms should support, so that the actual work of selecting standards and making procurement decisions can proceed in an informed and efficient manner. Otherwise, we may find ourselves repeating the same exercise under the next administration once again.
Copyright 2009 Andrew Updegrove
1 Memorandum for the Heads of Executive Departments and Agencies, Subject: Transparency and Open Government, at http://www.whitehouse.gov/the_press_office/TransparencyandOpenGovernment/ and subsequently reproduced in the Federal Register. All on-line material cited in this article was accessed on April 7, 2009.
2 A current example on the regulatory front involves XBRL (for "Extensible Business Reporting Language"), one of the vast and still increasing set of XML-based standards. The Securities and Exchange Commission in the United States is requiring public reporting companies to transition to reporting financial information in this format so that it can be more easily searched, compared, and repurposed.
3 I have written at length in the past on the risks that this transition can pose to the exercise of civil rights, such as freedom of expression, freedom of association, and the ability to interact with government. I refer to these freedoms, as exercised on-line, as "Civil ICT Rights," and to the standards that are essential to protect them as "Civil ICT Standards," and have argued that the process and rules regulating the development of such standards should themselves be held to a higher and more rigorous standard in consequence. See, for example, my article "A Proposal to Recognize the Special Status of 'Civil ICT Standards,'" appearing in Standards Today, Vol. VII, No. 2 (February – March 2008) at http://www.consortiuminfo.org/bulletins/feb08.php#feature.
4 Peter Quinn, the Massachusetts CIO who pioneered adoption of the OpenDocument Format (ODF) standard by governments, was fond of observing that if a proprietary system had ever existed, the Massachusetts Executive Agencies had certainly bought at least one of them.
5 By way of example, the Executive Agencies of the Commonwealth of Massachusetts, a small state by U.S. standards, utilize over 50,000 desktop systems spread across more than a dozen agencies, each with its own budget and procurement authority. Moreover, unlike a private sector CIO, the Massachusetts CIO may only issue guidelines, and not mandatory procurement requirements.
6 The recent competition between proponents of ODF and Microsoft's competing XML-based standard, commonly referred to as "OOXML" (for Office Open XML), has provided the most visible sampling of government action in this regard. Both ODF and OOXML have been adopted by ISO/IEC JTC 1 as document format standards. Bills to impose rules for assessing document format standards, such as ODF and OOXML, have been introduced in more than a half dozen U.S. states (mostly without success). Internationally, adoption of ODF is tracked by the ODF Alliance, which reports (as of the end of 2008) that a total of 16 national and 8 provincial governments have either required or recommended use of ODF. See http://www.odfalliance.org/press/Release20081222-annual-report-odf-2008.pdf
7 The new institutions created include officially recognized (but independent) organizations such as the European Telecommunications Standards Institute (ETSI), founded in 1988 by the European Conference of Postal and Telecommunications Administrations and recognized by the European Commission, the European Committee for Standardization (CEN), formed in 1961 by the National Bodies of the then members of the EEC and EFTA, and unofficial bodies such as the European Broadcasting Union (EBU), established in 1950.
8 Europe's progress in aerospace standards is an example. EU efforts to advance the interests of local aerospace companies have been supported by the development of a wide variety of standards by European standards organizations. The result is that it is generally conceded that Europe has taken the lead away from the United States in aerospace standardization. See Updegrove, Andrew, Standards in Space: an Industry and a Process at a Crossroads, ConsortiumInfo.org, Consortium Standards Bulletin, Vol. IV, No. 7, July 2005, at http://www.consortiuminfo.org/bulletins/july05.php#feature
9 Draft document as basis for EIF 2.0, 15/07/08, p.5, at http://ec.europa.eu/idabc/servlets/Doc?id=31508 (accessed August 2, 2008)
10 European Interoperability Framework for Pan-European eGovernment Services v. 1.0, European Commission 2004, Section 1.1.2, p. 5, at http://ec.europa.eu/idabc/servlets/Doc?id=19529 (accessed August 2, 2008).
11 Draft document as basis for EIF 2.0, Ibid.
13 For earlier EU actions recognizing the importance of the "pan European dimension of eGovernment" and the role of interoperability in pursuit of that goal, see Section 1.2 of the Framework.
14 Version 1.0 of the European Interoperability Framework for Pan-European eGovernment Services (2004) can be found at http://ec.europa.eu/idabc/servlets/Doc?id=19529
15 Whether all changes have real substance is open to question. By way of example, one suggested goal for EIF 2.0 reads as follows, followed by the analogous goal included in EIF 1.0:
New: To support the European Union's strategy of providing user-centered eServices by facilitating the interoperability of services and systems between public administrations, as well as between administrations and the public (citizens and enterprises), at a pan-European level.
Old: To serve as the basis for European seamless interoperability in public services delivery, thereby providing better public services at EU level.
16 Subsidiarity is an EU term recognizing a requirement that decisions must be made by the governmental unit that is as "close as possible" to the citizen. Hence, unless a decision is one that has already been reserved to the EU, it should be taken at the national, regional or local level, unless it would be more effectively taken at the EU level. As applied in EIF 1.0, "The guidance provided by the European Interoperability Framework is concerned with the pan-European level of the services. In line with the principle of subsidiarity, the guidance does not interfere with the internal workings of administrations and EU Institutions. It will be up to each Member State and EU Institution to take the necessary steps to ensure interoperability at a pan-European level."
17 While length constraints limit the discussion above to the "use open standards" principle, each of the other principles is worth careful attention.
18 The most analogous provision in the United States can be found in OMB Circular A-119, which generally requires government agencies to specify (with various exceptions) "voluntary consensus standards" when they are available instead of "government unique standards" in procurement orders. The most significant difference between the two definitions, from a commercial point of view, is the EIF exclusion of royalties; OMB A-119 follows the traditional practice of recognizing a patent owner's right to charge a "reasonable royalty" for an infringing implementation of a standard. Office of Management and Budget Circular No. A-119, Section 4.1, at http://www.whitehouse.gov/omb/circulars/a119/a119.html
19 IDABC, Draft document as basis for EIF 2.0, Section 8.4, p. 61, European Community 2008, at http://ec.europa.eu/idabc/servlets/Doc?id=31597
20 The complementary goals, as set forth in President Obama's Memorandum to Heads of Agencies and Departments, read as follows:
Government should be transparent. Transparency promotes accountability and provides information for citizens about what their Government is doing. Information maintained by the Federal Government is a national asset. My Administration will take appropriate action, consistent with law and policy, to disclose information rapidly in forms that the public can readily find and use. Executive departments and agencies should harness new technologies to put information about their operations and decisions online and readily available to the public. Executive departments and agencies should also solicit public feedback to identify information of greatest use to the public.
Government should be participatory. Public engagement enhances the Government's effectiveness and improves the quality of its decisions. Knowledge is widely dispersed in society, and public officials benefit from having access to that dispersed knowledge. Executive departments and agencies should offer Americans increased opportunities to participate in policymaking and to provide their Government with the benefits of their collective expertise and information. Executive departments and agencies should also solicit public input on how we can increase and improve opportunities for public participation in Government.
Government should be collaborative. Collaboration actively engages Americans in the work of their Government. Executive departments and agencies should use innovative tools, methods, and systems to cooperate among themselves, across all levels of Government, and with nonprofit organizations, businesses, and individuals in the private sector. Executive departments and agencies should solicit public feedback to assess and improve their level of collaboration and to identify new opportunities for cooperation.
21 The basic question is whether those that make the decisions will place the civil servant, or the citizen, first. I recently attended a panel discussion at a conference in Europe, where a government speaker noted the difficulties his country was encountering in deploying a "single sign on" requirement that would permit any citizen to access any government Web site using the same user name and password. Despite the costs and complexities involved, his government had taken the enlightened approach that the government should accommodate its citizens, and not the other way around. What decision will the United States make, when faced with the same litmus test decision?
"Openness," Customers and the IDABC
Any old standards hand forced to choose the single most disputed issue in standard setting over the past decade would likely respond with a deceptively simple question: "What does it mean to be an 'open standard?'" A similar debate rages in the open source community between those that believe that some licenses (e.g., the BSD, MIT and Apache licenses) are "open enough," and those that would respond with an emphatic Hell No! (or less printable words to similar effect).
That's not too surprising, because the question of what "open" means subsumes almost every other categorical question that information and communications technology (ICT) standards and open source folk are likely to disagree over, whether they be economic (should a vendor be able to implement a standard free of charge, or in free and open source software (FOSS) licensed under a version of the General Public License (GPL)); systemic (are standards adopted by ISO/IEC JTC 1 "better" than those that are not); or procedural (must the economic and other terms upon which a necessary patent claim can be licensed be disclosed early in the development process)?
One school of thought holds that there should be no single definition of "openness" in standards (or open source code), as this would in some cases needlessly over-constrain the development of standards (or source code). By this line of reasoning (and narrowing the focus just to standards), "openness" should be conceptualized not as an absolute, but as something that exists along a pragmatic continuum, with progressively stricter requirements applying depending upon contextual factors, such as who will use the standard or software, for how long, and for what purpose.
Underlying the debate is the necessity of balancing the interests of the owners of any intellectual property rights (IPR) that might be infringed by the implementers of a standard against the benefits to society and industry that can be obtained from wide adoption — a complex question, when it is remembered that the owner of such a claim can often gain significant indirect as well as direct benefits from such adoption. Given that standard setting is a consensual process and that it is not possible to compel a non-member to make any promises at all, it is hardly surprising that different balance points have been found within any given organization, and in particular among diverse industry niches.
Traditionally, openness-related discussions leading to the adoption of IPR policies have been producer-driven, in that those that develop standards have decided what they are, and are not, willing to do. In other words, the opinion of the customer has been largely missing from the equation, because customers are usually underrepresented, when they are represented at all, in the standards development process. Nor should this be a surprise: although customers are welcome at the standards development table, few have taken up the invitation, because participation in standards development is time consuming and expensive, and individual customers usually have much less to gain or lose from how a standard turns out than a vendor. Consequently, most customers have traditionally been content (or at least willing) to take what they have been given.
Now, however, this dynamic is shifting with respect to one particular type of customer, and the agents of change are governments, in their capacities both as customers and as the developers of policy.
Governments today are beginning to offer new answers to the question, "What does it mean to be an ‘open standard'?"
Governments have always been interested parties in standards development, and especially in areas such as health and safety. But governments have largely played a passive rather than an influential role when it comes to developing ICT openness requirements.
With governments in many nations becoming enamored with the potential for something new called Open Government or "eGovernment," however, legislators and bureaucrats alike are taking a harder look at what openness does, or should, mean. While only some candidate attributes of openness are relevant to interactions with citizens (e.g., accessibility), others are meaningful to governments as consumers of ICT standards for their internal usage, and still others are vital with respect to the performance by governments of specific public functions, such as archival storage, where such standards were never relevant before.
But where to begin? Some governments are only now awakening to these concerns. But others have been studying these issues for some time. Their emerging conclusions are instructive, and can serve as an important roadmap for those that are only now beginning their examinations.
In January of 2005, the European Commission created a new programme, called the Interoperable Delivery of European eGovernment Services to public Administration, Business and Citizens (IDABC), and charged it with investigating how the EU could move forward into a future in which information could freely flow not only among EU agencies, but also between these agencies and those of EU Member States (MS), and between all of the above and their citizens. Such a system would not (indeed, could not) be forced upon the MS, but the recommendations and models developed in collaboration between the IDABC and MS could be employed by the MS as they saw fit to achieve the common goal.
Last year, the IDABC released drafts of several new deliverables that demonstrate what must be one of the most thorough, thoughtful and pragmatic efforts ever mounted to envision what an interoperable eGovernment should look like, and how it can be achieved. These deliverables include a lengthy study intended to serve as one of the principal bases for version 2.0 of the European Interoperability Framework (EIF 2.0), and a Report and Proposal for a Common Assessment Method for Standards and Specifications (CAMSS) that can be profitably employed by governmental entities of any type, anywhere. Importantly, each of these represents a substantial advance in the examination and determination of what should constitute openness in government ICT standards, from both an aspirational and a pragmatic perspective.
Significantly, one of the cornerstone requirements for achieving interoperability identified in the original (2004) version of the EIF is the use of open standards, which in EIF 1.0 are defined as having the following minimum attributes:
- The open standard is adopted and will be maintained by a not-for-profit organisation, and its ongoing development occurs on the basis of an open decision-making procedure available to all interested parties (consensus or majority decision etc.).
- The open standard has been published and the standard specification document is available either freely or at a nominal charge. It must be permissible to all to copy, distribute and use it for no fee or at a nominal fee.
- The intellectual property — i.e. patents possibly present — of (parts of) the open standard is made irrevocably available on a royalty free basis.
- There are no constraints on the re-use of the standard.
These recommendations met with wide, but not universal, approval. Some vendors of proprietary products were especially unhappy that a royalty requirement could disqualify a standard from consideration for inclusion in a tender for government procurement. The IDABC countered by noting that such royalties could make interaction with government too expensive for some citizens, and that royalty-bearing standards could help entrench dominant vendors, decrease competition, and result in less innovation.
The new basis document for EIF 2.0 goes even further, noting in part as follows:
Since the publication of version 1 of the EIF, several practical cases have however shown the necessity to clearly point out the extent of this definition and to clarify its applicability….
- Open standards or technical specifications must allow all interested parties to implement the standards and to compete on quality and price. The goal is to have a competitive and innovative industry, not to protect market shares by raising obstacles to newcomers. Also, we want to be able to choose open source solutions or proprietary solutions on the basis of price/quality consideration…
- Practices distorting the definition of open standards or technical specifications should be addressed by protecting the integrity of the standardisation process….
- This definition reflects a consumer's viewpoint, with his needs uppermost in mind….
The last point, perhaps, is the most significant when read in the context of those that precede it. It puts vendors on notice that while they cannot be forced to make their patent claims available for free or on terms conducive to Free and Open Source Software (FOSS) licensing, nor be prevented from advancing their agendas in compliant development venues (subject to the limits of competition law), neither can they force governments to buy their wares.
These documents, which were posted for public comment through September 22 of last year, represent but the latest deliverables of a carefully considered and practical process. The definition and requirements for open standards that the IDABC has developed are both sound in substance and founded on real and well articulated justifications.
I believe that the EU is following a path that is leading towards the type of interoperability within governments, and between governments and citizens, that should serve as a model for governments everywhere. I hope that governments around the world will reach the same conclusion. If they do decide to follow along on the carefully considered roadmap that the IDABC and the Member States of the EU have laid out, vendors as well as citizens will benefit, as achieving a global consensus on what constitutes an open standard for government procurement must inevitably serve to rationalize and expand the market for compliant products.
Bookmark the Standards Blog at http://www.consortiuminfo.org/newsblog/ or set
up an RSS feed at http://www.consortiuminfo.org/rss/
Copyright 2009 Andrew Updegrove
Sign up for a free subscription to Standards Today.
#57 Killing the Cockroach: The Incredibly Illogical, Fundamentally Odious — but Seemingly Ineradicable — Billable Hour
The service the lawyer renders is his professional knowledge and skill, but the commodity he sells is time — Reginald Heber Smith, inventor of the billable hour
Both reviled and ubiquitous, the billable hour is the cockroach of the legal world — Douglas McCollam, writing in the American Lawyer
Let's imagine that you would like to have your dilapidated, wood-sided house painted. The southern exposure is peeling, the soffits sport dark Rorschach patterns of mildew, and more than a few window sills have that uncomfortably punky feel to the touch that whispers "we're rotting — you must help us." You know that you can't put off facing the music any longer, and hope that the impact on your wallet will be no more painful than absolutely necessary.
So you do what any rational homeowner would — you get some referrals from people you trust, call the folks they recommend, and tell each of them that you'll be soliciting several bids. While you're at it, you also call the painter who, as luck would have it, had dropped a flyer in your mailbox that very afternoon.
Over the next week each housepainter stops by after work, walks around your house, scribbles a few notes, and promises to get back to you with a quote. Within a week, most of them actually do. Like any homeowner would, you select the cheapest, failing to note that it came from the painter you found through the flyer. Soon, the job is done, and he drops by to collect the agreed upon amount. Pleased, you pay him on the spot.
What a nice, logical system, especially for the buyer. You know just what you'll have to pay before you commit to pay it, and gain the benefit of competitive bidding as well. You'd be crazy to take on such a large financial commitment any other way, wouldn't you?
Let's say, though, that a year goes by, and the house starts to peel. Calls to the painter go unanswered, so you get in touch with another painter, who stops by and shakes his head pityingly. He informs you that flyer-man applied cheap latex over the old oil paint, never washed the mildew off the soffits, and never stabilized the rot in those windowsills. You'll need to have the whole job repeated — soon.
More calls to the original painter go unanswered, so it's referral time again — for a lawyer. Soon you are sitting in the office of the attorney with the best recommendation, and are recounting your tale of familiar woe.
After you agree to work together, you meekly ask how much it will cost. The answer is, "Oh, well, that's hard to say. It depends on so many things…" Although she gives you a ballpark estimate, that's all it is. She goes on to say that she will bill you monthly, with the amount invoiced based on the new time she has spent on your case. Oh — and the retainer to begin work will be $1,000, thank you very much.
Welcome to the improbable and seemingly impregnable world of the lawyers' billable hour — perhaps the only pervasive, virtually undisputed commercial standard in existence that puts almost all of the risk on the buyer, and where the cost of the commoditized product goes up, not down, year after year.
How did such an illogical and unequal standard become so well established, and how does it survive? Good questions. The first one is easy. But there's really no good answer for the second one at all, although I invite you to consider the following.
The billable hour traces its origins to one Reginald Heber Smith, a Harvard graduate with a mind that ran to quantification as naturally as water runs downhill (his other lasting contribution was a formula for calculating partner compensation, still referred to by some as the "Smith System"). In 1914, Smith was hired to head Boston Legal Aid, and soon decided that its finances were in need of an overhaul. The concept of "Scientific Management" had just been introduced and was all the rage, and Harvard Business School was conveniently located nearby. Smith enlisted his alma mater's aid, and soon was applying his brand of scientific principles to the management of the practice of law.
Word spread, and soon Mr. Smith was the managing partner of the (even then) venerable Boston law firm of Messrs. Hale and Dorr (now Wilmer Hale, following its merger with the D.C. firm of Wilmer, Cutler & Pickering). And so the seed of the billable hour came to be planted in the flinty soil of the private practice of law.
The billable hour did not initially propagate, kudzu-like, through the legal world. After all, commoditizing a profession seemed demeaning to many attorneys. And for the system to be effective, lawyers needed to submit to another annoying Smith invention: the paper time sheet. On that rigid, ruled rectangle, every lawyer was required to record the time, client name, and task description for everything he did during the day. As if to rub it in, Mr. Smith's time sheet was marked in six-minute increments.
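Those six-minute increments survive to this day as the standard 0.1-hour billing unit, with time conventionally rounded up to the next tenth of an hour. As an illustrative sketch only (the function names, sample tasks, and hourly rate here are hypothetical, not drawn from this article), the arithmetic works out like this:

```python
import math

def billable_tenths(minutes: float) -> int:
    """Round elapsed minutes up to the next six-minute (0.1-hour) increment,
    returned as an integer count of tenths so the arithmetic stays exact."""
    return math.ceil(minutes / 6)

def bill(entries, hourly_rate: float) -> float:
    """Total a list of (task, minutes) entries at a single hourly rate."""
    total_tenths = sum(billable_tenths(m) for _, m in entries)
    return total_tenths * hourly_rate / 10

entries = [("phone call", 4), ("draft letter", 25), ("research", 63)]
# 4 min -> 0.1 h, 25 min -> 0.5 h, 63 min -> 1.1 h  =>  1.7 h total
print(bill(entries, 300))  # 1.7 h at $300/h -> 510.0
```

Note how the rounding cuts only one way: a four-minute phone call bills as six minutes, which is part of why the system so reliably favors the seller over the buyer.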
Meanwhile, what was taking root in other law firms was another, equally insidious system: the minimum fee schedule. Rather incredibly, in light of the spirit, if not the letter, of antitrust law, bar associations were hard at work creating flat fee schedules for common legal tasks. By the 1940s, it had become an ethics violation under the rules of some states to financially undercut thy fellow attorney. Meanwhile, Reginald Heber Smith's seminal management work, "Law Office Organization," was in its eleventh printing, and the process of litigation, in particular, had become more complex (read: risky — for the law firm) to prosecute on a fixed-fee basis.
When the Supreme Court at last (in 1975!) applied to legal services rules similar to those that had long prohibited price fixing in the sale of products, the billable hour's hour had truly come.
Which more or less brings us to today — or almost. For some time now, the billable hour and its many failings have been much commented upon. Associates and partners alike hate the slavery of the time sheet and the billable hour requirements that, like their billable hourly rates, continue to creep upwards. Clients resent the uncertainty of the bills they receive and their consequent lack of control over their legal budgets, not to mention the fact that an hour of busy work is charged out at the same rate as an hour of truly valuable advice. And no one denies that the billable hour system sets the best interests of the law firm against those of the client, resulting in a constant temptation for lawyers to over-staff projects, over-research legal issues and over-estimate time spent.
The result has been an ongoing dialogue in the legal profession over so-called "alternative billing systems," with many opinions offered on what such systems might look like, and how they might work (as if the rest of the professional — or for that matter the house painters' — world doesn't already have enough models from which to choose).
So why does the much-loathed and unwelcome billable hour, like the cockroach, continue to infest the legal kitchen?
The explanation, if not an adequate answer, is pure and simple. On the law firm side, the culprit is unbridled self-interest. Keeping time sheets and meeting hourly expectations may demean and oppress hapless associates, but the results are beloved by law firm managers, because they allow a firm to budget more reliably and lower its risk (Reginald Heber Smith may have been a numbers nerd, but he was after all no fool). Few and far between, then, are the law firms that have abandoned the seductive billable hour once they have succumbed to its siren song.
On the client side, the answer would appear to be mostly one of inertia. Even in-house general counsels who trained in big law firms and should know better continue to tolerate billable hour-based bills, perhaps because the cost and effect are at least trackable, detailed and familiar. Moreover, it's easy to simply ask for a discount off a firm's hourly rack rate, and hourly rates provide a simplistic way to cost-compare between law firms and individual attorneys. Perhaps the fact that lawyers are rarely accused of imagination factors into the mix as well.
The real genius of Mr. Smith, perhaps, was to apply the principles of "scientific management" to legal services early in the development of the business management discipline, affording the billable hour a veneer of respectability and camouflaging the basic illogicality of a system that values an hour spent reading email the same as an hour spent negotiating a multi-billion dollar merger.
Ultimately, though, I expect that the invisible hand of Adam Smith will not remain sprained forever. Surely, some day it will heal, and just as the United Auto Workers in Detroit are finding that the real world has caught up with them, the hard heel of the marketplace will come down on the billable hour as well.
When it does, there will be a crunch that will reverberate around the client world, and the legal kitchen will never be the same again.
Read more Consider This… entries at: http://www.consortiuminfo.org/blog/