In 2011 the Dutch Court of Audit released a report on the benefits of using open standards and open source software in government IT, concluding that there were hardly any benefits to be gained. The Court’s underlying research was widely criticized. In this article, the authors analyze the report’s omissions and weaknesses, introduce an economic framework for evaluating standardization, apply that framework to the subject of switching costs, and conclude that the framework, in combination with elements from other existing methodologies, can provide a starting point for performing international policy research on the benefits of open standards more systematically.
Consensus on what constitutes an ‘open standard’ has always been difficult to achieve. In this article, I review the norms of openness required by traditional SDOs and modern consortia in light of their goals, as well as the traditional and recently adopted definitions of openness that can be found in a variety of national, regional and treaty settings.
While network effects provide great benefits, they can also lead to great risks. The Internet is increasingly becoming the sole host for essential services, as government, finance, energy management, supply chains, telephony and much more abandon traditional channels and migrate to the Web. Now, the rapid adoption of the cloud computing model points towards a future of rapidly growing concentrations of crucial data and software in a limited number of massive data centers that are inadequately protected against cyber attack, and not at all against a determined physical assault. In this article, I outline the growing risks, suggest the types of physical and virtual security standards frameworks that should be developed and implemented to minimize vulnerability to catastrophic attacks, and evaluate whether new cyber security legislation proposed by the Obama administration is sufficient to meet the need.
Since the passage of the National Technology Transfer and Advancement Act of 1995, government has by law taken a back seat to the private sector in standards development. For years, the national interest has been well served by the “bottom up” standards development process mandated by the NTTAA. The advent of globalization and the need to implement policies dependent on the development of complex, cross-sectoral standards profiles, however, indicate that the public-private partnership institutionalized by the NTTAA needs to be rebalanced.
The last 25 years have been marked by an explosion of consortia formed to develop, promote and otherwise support ICT standards. The reasons for creating a new consortium include the absence of appropriate technical expertise, interest, and/or supporting programs in existing organizations, and the benefits to be gained from directing all of the resources and efforts of a self-selecting membership to a particular set of common objectives. This article reviews the benefits to be obtained from launching a new consortium, the criteria to use to determine whether doing so is appropriate, the programs and functionalities available for achieving specific goals, and the stage of institutional maturity at which each function must be added to accomplish a new organization’s mission.
For over 100 years, the National Institute of Standards and Technology has fulfilled a vital role in supporting industry, science, public safety and more. In the standards arena, its mission has focused primarily on defining weights and measures and enabling their accurate measurement. A bill now in Congress would expand NIST’s historical authorization to institutionalize the important role it is now playing in addressing complex, cross-sectoral, standards-dependent challenges such as the SmartGrid.
For decades, legal structures of various kinds have been developed to support collaborative activity, all referred to generically as “consortia.” While superficially quite different, each addresses the same core needs in its own way. In this article, I examine the ever-versatile concept of the consortium, a uniquely appropriate vehicle for the rapidly proliferating collaborative initiatives of the digital age.
Before there were electronic documents, information could only be gathered by hand from multiple sources, and then combined into new documents that in turn became static. The same would be true for electronic documents today, if it weren’t for the Extensible Markup Language and the seemingly endless stream of derivative languages it has made possible.
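As a minimal illustration of the reuse that XML makes possible, the sketch below gathers data from two small, independently maintained XML fragments and combines them into a new document that remains machine-readable rather than static. The element names and sample data are invented for this example:

```python
import xml.etree.ElementTree as ET

# Two hypothetical source documents, each maintained independently.
authors_xml = "<authors><author id='1'>Ada Lovelace</author></authors>"
titles_xml = "<titles><title author='1'>Notes on the Analytical Engine</title></titles>"

# Because the markup is machine-readable, the data can be extracted
# automatically rather than gathered by hand.
authors = {a.get("id"): a.text for a in ET.fromstring(authors_xml)}
titles = {t.get("author"): t.text for t in ET.fromstring(titles_xml)}

# ...and recombined into a new document that is itself machine-readable,
# ready to serve as a source for yet further documents.
catalog = ET.Element("catalog")
for author_id, name in authors.items():
    entry = ET.SubElement(catalog, "entry")
    ET.SubElement(entry, "author").text = name
    ET.SubElement(entry, "title").text = titles.get(author_id)

combined = ET.tostring(catalog, encoding="unicode")
print(combined)
```

The same pattern — extract, merge, re-emit — underlies the derivative languages the abstract refers to, each of which is simply XML with a domain-specific vocabulary.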
Software that could be freely edited existed long before proprietary programs became the norm — but then it largely disappeared. When source-available, “free and open source software” (FOSS) reemerged in the marketplace, it did so in a manner that was novel from both a social and a legal perspective. Today, it is an increasingly important part of the information technology landscape. In this article, I provide an overview of the history, legalities, social theory and commercial impact of the FOSS phenomenon, as well as some thoughts about its future.
Our headlong rush to migrate almost all key aspects of modern society to the Internet means that we must design new virtual defenses to emulate the walls and bars, guards and locks that protect us in the physical world. In this article, I survey the challenges to implementing cybersecurity, the types of standards used to provide it, the organizations that develop such standards, and the federal government’s first steps towards implementing them.