I asked my friend and colleague, Andrew Russell, to give us his take on Snowden, the NSA, and Internet politics, especially focusing on recent discussions amongst members of the Internet Engineering Task Force. Andy's book, Open Standards and the Digital Age: History, Ideology, and Networks, will be published by Cambridge University Press in early 2014. Readers can contact Andy at email@example.com.
“Government and industry have betrayed the internet, and us. By subverting the internet at every level to make it a vast, multi-layered and robust surveillance platform, the NSA has undermined a fundamental social contract. The companies that build and manage our internet infrastructure, the companies that create and sell us our hardware and software, or the companies that host our data: we can no longer trust them to be ethical internet stewards.”
-- Bruce Schneier, September 5, 2013
As details of the NSA PRISM program continue to be published, public interest advocates—a mix of civil libertarians, privacy advocates, and security experts—are struggling to form an effective response. President Obama is trapped in the national security apparatus; Congress is useless; and our beloved online destinations—Google, Facebook, Skype, etc.—are compromised, powerless, or both. Where else can we turn to safeguard the integrity of the Internet?
Bruce Schneier—a prominent author and security expert—addressed this difficult question in a September 5 column in the Guardian, boldly titled "The US government has betrayed the internet. We need to take it back."
His call to action provides a point of entry to reflect on the state of affairs that Edward Snowden’s disclosures have brought to light. Before I continue, I want to make it clear that I admire Schneier’s work (his Eternal Value of Privacy essay is classroom gold), and I very much support his activist impulse. But I also worry that there are some problems with his suggestions—problems that become clearer when we see the issues through the lens of some concepts from the history of technology and STS.
The foundation of Schneier’s argument is a specific vision of the Internet’s history and position in contemporary society—a vision that we can safely assume the Guardian’s readers will share:
“Government and industry have betrayed the internet, and us… This is not the internet the world needs, or the internet its creators envisioned. We need to take it back. And by we, I mean the engineering community.”
With these sentences, Schneier advances two conventional beliefs about the Internet’s history as well as its broader social “imaginaire.” First, Schneier presumes that his readers identify with a techno-utopian Californian Ideology in which the Internet acts as an agent of liberation and democratization. In this interpretation of the Internet’s world-historical role, its birthright is to escape the clutches of governments, memorably described by John Perry Barlow as “weary giants of flesh and steel.” Second, Schneier’s reference to the Internet’s “creators” invites readers to forget the fundamental irony of the Internet’s creation: the American Department of Defense sponsored the research that produced it, and subsidized the implementation of Internet protocols in popular operating systems of the early 1980s. Omission of this fact tempts us to forget the economic realities of Internet standards—a matter I will return to below.
Both beliefs reinforce the Internet’s “imaginaire”—a collective vision and fantasy—that Schneier presents as under attack. With the establishment of this shared understanding, Schneier moves briskly to three specific recommendations for action: “expose what you know,” “re-engineer the internet,” and “influence governance.” For the sake of brevity, I’ll confine my comments to his second recommendation, to use engineering and design to subvert the betrayal of the Internet.
“We can design. We need to figure out how to re-engineer the internet to prevent this kind of wholesale spying. We need new techniques to prevent communications intermediaries from leaking private information.”
To accomplish these tasks, Schneier calls on the “Internet Engineering Task Force, the group that defines the standards that make the internet run.” He notes that the IETF “has a meeting planned for early November in Vancouver,” and declares that it should “dedicate its next meeting to this task.” We have good reason to believe that Schneier’s faith in the IETF is shared widely: over the past decade or so, the IETF’s modus operandi of “rough consensus and running code” has been invoked repeatedly as a manifesto and organizational template for collaborative endeavors. Observers commonly cite the IETF as both a generator and an exemplar of a new, distributed form of social organization that can direct the talents of good-natured programmers to build robust systems.
Schneier’s column quickly generated a thread on the IETF’s open discussion list, where one participant framed the challenge bluntly:

“The gauntlet is in our face. What are we going to do about it?”
The discussion that resulted is difficult to summarize, but fun to read. If you follow the thread, you’ll get a good sense of the IETF’s nuance, experience, humor, touch of paranoia, and—above all—the practical orientation of its organizational culture. It’s too early to declare any group consensus position, but there is certainly sympathy with Schneier’s agenda, an understanding that the crisis presents a “teachable moment,” and a recognition that the IETF cannot offer a technological fix singlehandedly. There are no signs whatsoever that the IETF will even try to organize the sort of ambitious technical project of re-engineering that Schneier recommends: as one experienced engineer summarized, “This whole 'surveillance of online activity' is a lot bigger problem than the IETF's work domain. For us to think we can 'solve' it is massively hubristic.”
For my colleagues in STS and the history of technology who wonder about the technical community’s capacity to protect the values of privacy and security, there is an encouraging message: Schneier and other prominent voices in the Internet technical community believe that their success will rely on their ability to combine technical skill with moral fortitude and political will. I believe that most of my academic friends would endorse this approach, and be especially pleased that Schneier and the IETF are not living up to our caricatures of engineers who believe that “technology” can be separated from “politics.” This is good news. However, there are significant institutional obstacles ahead that Schneier and his allies may not realize, and certainly haven’t publicized.
First, it turns out that it’s not so easy to re-engineer the Internet. The IETF has tried, most visibly between 1992 and 1996 with the specification of IP version 6 (IPv6). But almost two decades after the IPv6 specifications were published, less than 2% of Internet traffic uses the protocol. One difference between the adoption of IPv4 and IPv6 is that a single sponsor—the Department of Defense—forced its contractors to adopt IPv4 in 1983. (For this discussion, it doesn’t matter that the sponsor was the US military; it only matters that there was a single sponsor that could insulate the Internet from competing agendas in the international networking industries.) IPv6’s lackluster adoption can be explained, in part, by the absence of a comparably powerful sponsor or alliance of sponsors.
The upshot of this history, for Schneier’s proposal, is that it’s difficult to see how the economics of Internet standardization would allow significant re-engineering of the standards and infrastructure currently in place. The American government can no longer be trusted, and this Congress has a poor track record of supporting science and technology. There’s no reason why capitalists would be interested, either. The giants of today’s Internet industries (Cisco, Huawei, Google, Comcast, et al.), if trustworthy at all, would only support new standards that fit their existing business plans. Venture capitalists are equally unreliable, since they prefer to invest in high-margin “app economy” start-ups that take advantage of existing infrastructure. They know that investments in a new, open infrastructure would not generate the profits they desire.
Apart from the economics of Internet standards, there are also aspects of its institutional sociology that pose problems. Standards bodies are, by design, incrementalist organizations. In most cases they are not effective venues for conducting research or promulgating new techniques. Again, Internet history clarifies the point: the IETF was created in 1986 as a forum to stabilize implementations of TCP/IP, which was first developed more than ten years earlier. In other words, its foundational value was to sustain technological momentum, not to initiate it. One needs only a passing familiarity with some conceptual foundations of STS and the history of technology—Thomas Hughes’s “momentum,” Ludwik Fleck’s “thought collectives,” and Joseph Schumpeter’s “creative destruction”—to understand that we’re more likely to see radical changes and fresh ideas come from somewhere else, somewhere unexpected.
If Fleck and Schumpeter were right, new ideas will arise from a place unaffected by the stable alliances of technical ideas, cultural norms, and business models in the IETF. We can and should count on the IETF for incremental improvements to the Internet, but only an act of “dot org entrepreneurship” can generate something truly different. If and when that happens, it won’t be the “internet” that Schneier, Snowden, and their allies seek to defend: it will have a new “imaginaire” that, one hopes, will embody the values of privacy and security in ways the Internet does not and never has.
I’ll conclude with a historian’s lament. I worry that we are witnessing a cautionary tale of writing history without the benefit of one of our most powerful tools: long-term perspective. Thus far, it has seemed reasonable to cast the Internet’s brief history as a narrative of success. Perhaps it is time to re-imagine Internet history as a tragedy.