In the wake of Edward Snowden’s revelations about government surveillance, Laura Poitras, director of the Oscar-winning documentary Citizenfour, and many Internet-freedom activists and security engineers have told the public to trust math—encryption—not politics or law to protect their privacy. Our track record of reining in US surveillance through the law is abysmal: To date, there are no proven instances of a law permanently removing an operational, cost-effective, productive foreign-surveillance capability on human rights or constitutional grounds.
If compelling the NSA to respect human-rights obligations is a long shot, regulating Israel’s Unit 8200, Russia’s FSB, or the Third Department of the Chinese army’s General Staff will be harder still. Americans—not to mention the other 95 percent of humanity—are just as vulnerable to Russian, Chinese, or Israeli surveillance as they are to the NSA’s. Even closer to home, domestic abusers, racist law enforcement, and organized crime also aim to violate individuals’ privacy. Much of what the NSA can do now will soon be in other hands: surveillance technology, like the rest of the digital world, is routinely adapted for sale to the rest of us.
* * *
Surveillance gets cheaper by the day. In the 1970s, three minutes of voice traffic between New York City and London cost the US government about $40 (adjusting for inflation). By 2005, the rise of Internet calling had made conversations so cheap that the cost was difficult to meter—well under a penny—and the cost of monitoring them fell just as fast. Intelligence budgets have grown massively over the past 40 years—Australia’s, for example, increased a shocking 600 percent—though exact figures are hard to pin down because they’re classified. Yet even that growth is nothing compared to the thousandfold decrease in the cost of collecting information.
Information collected through surveillance has two useful components: the content of a communication—what is said during a phone call, for example—and its context, or “metadata,” which includes time, location, and identities. While content is what we usually think of as the object of surveillance, it is often the less revealing of the two. Take a sudden burst of SMS traffic between two coworkers who have never previously communicated on their personal phones, followed by a set of calls to an abortion clinic and a PayPal transfer between them six weeks later. The content of those messages adds relatively little to the story. Despite resistance from analysts worried about their careers—and a great deal of wasted money—the past 20 years have seen automated analysis systems deployed against the communications metadata of most of the world, in an attempt to keep up with the flood that NSA surveillance has unleashed. It’s unclear whether the intelligence from these efforts is accurate or useful, but as the former director of the NSA, Gen. Michael Hayden, said in reference to the CIA’s drone program, “We kill people based on metadata.”
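To make the point concrete, here is a minimal sketch in Python of the coworkers scenario, rendered purely as hypothetical metadata records (the names, timestamps, and record fields are all invented for illustration; real call-detail records are far richer). Note that no message content appears anywhere, yet sorting the events by time tells the story on its own:

```python
from datetime import datetime

# Hypothetical call-detail records: metadata only, no content.
records = [
    {"from": "bob", "to": "alice", "kind": "paypal transfer",
     "time": datetime(2015, 4, 13, 18, 0)},
    {"from": "alice", "to": "bob", "kind": "sms burst",
     "time": datetime(2015, 3, 1, 23, 40)},
    {"from": "alice", "to": "clinic", "kind": "phone call",
     "time": datetime(2015, 3, 2, 9, 5)},
]

# Ordering the events by time sketches the whole story
# without a single word of what was actually said.
timeline = sorted(records, key=lambda r: r["time"])
for r in timeline:
    print(r["time"].date(), f'{r["from"]} -> {r["to"]}', r["kind"])
```

This is exactly the kind of inference that automated metadata analysis performs at scale, which is why metadata deserves the same protection as content.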
So, if the law has failed to protect us from giant NSA sweeps of our metadata, and these operations cost the government little, what can encryption do to help?
* * *
Encryption is a category of mathematical operations in which one string, a key, is used to transform another, the plaintext, into an encoded version according to a specific algorithm. Once the text is transformed, reversing the transformation without the key takes tens or hundreds of orders of magnitude longer than the encryption did. A secure, unsurveilled Internet depends on widely shared protocols between different systems—two smartphones, for example, or a smart meter and the local electrical substation—and all secure protocols depend on encryption and related operations. Correctly encrypted content generally cannot be spied on. Intelligence agencies are not magic; we have no reason to believe that the NSA possesses mathematical advances relevant to decryption beyond what the unclassified world has.
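The key-transforms-plaintext idea can be shown in a few lines of Python. This is a toy one-time pad built on XOR, chosen only because it fits on the page—not a real-world algorithm like AES, and not anything the article's subjects use—but it illustrates the asymmetry: with the key, encryption and decryption are instant; without it, every equal-length message is an equally plausible decryption.

```python
import secrets

def xor_bytes(key: bytes, data: bytes) -> bytes:
    """XOR each byte of data with the matching key byte (toy cipher)."""
    return bytes(k ^ d for k, d in zip(key, data))

plaintext = b"meet me at noon"
key = secrets.token_bytes(len(plaintext))  # random key as long as the message

ciphertext = xor_bytes(key, plaintext)     # encrypt
recovered = xor_bytes(key, ciphertext)     # the same operation reverses it

assert recovered == plaintext
```

Real ciphers use short reusable keys and far more elaborate transformations, but the contract is the same: the key holder reverses the operation trivially, and everyone else faces an astronomically hard search.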
Intelligence agencies tend to rely on legal coercion of big centralized providers, theft of encryption keys or data from those providers, or the exploitation of security flaws in order to watch the exchanges occurring between systems. Any of these tactics can give them access to the content and metadata of all communications at that provider—even, as we’ve seen, at giants like Google.
What if all of this data were properly encrypted? Then each communicating host would have to be compromised individually. That is an active attack, feasible only against small numbers of targets rather than against all of us at once in giant sweeps. The cost, and the risk of discovery, relegates it to high-value targets—exactly the situation we want. Protecting metadata is more complex, but the same pattern applies.
Commercial businesses are already implementing better encryption to protect their information from spying—witness Apple’s decision to encrypt data stored on iPhones by default and Google’s improved security between its data centers. But privacy should be easy for everyone to obtain, and most of these changes neither protect communications metadata nor stop companies from snooping on their customers for advertising purposes—snooping that itself leaves the door open for the government to collect data in bulk.
The challenge of protecting the average citizen’s metadata is currently best handled by tools like the Tor project, an international network of volunteer proxies with an associated browser and other client software. Tor is used by journalists, activists, domestic-violence victims, and even US government investigators—all groups for whom surveillance can have serious consequences. Tor encrypts traffic and bounces it between several relays, ensuring that the relay a user connects to doesn’t know where the traffic is going, and the relay passing the traffic to its final destination doesn’t know the source. Decentralized approaches like Tor’s, in which security is designed into the system and there is no central party to coerce or subvert, are as necessary a response to surveillance as encryption is.
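The layered design described above can be sketched with the same toy XOR pads used earlier. To be clear about assumptions: real Tor negotiates AES keys with each relay over TLS and routes variable-length cells, none of which appears here; the three keys and the loop structure below exist only to show why each relay can peel exactly one layer and no relay sees both source and destination in the clear.

```python
import secrets

def xor_bytes(key: bytes, data: bytes) -> bytes:
    """Toy XOR pad standing in for a real per-relay cipher."""
    return bytes(k ^ d for k, d in zip(key, data))

message = b"request for the destination"
# One key shared with each relay on the circuit: entry, middle, exit.
relay_keys = [secrets.token_bytes(len(message)) for _ in range(3)]

# The client wraps the message in three layers, innermost (exit) first.
onion = message
for key in reversed(relay_keys):
    onion = xor_bytes(key, onion)

# Each relay in turn peels exactly one layer; only after the exit
# relay's key is applied does the plaintext reappear.
for key in relay_keys:
    onion = xor_bytes(key, onion)

assert onion == message
```

Because each relay holds only its own key, the entry relay sees where traffic came from but not where it is going, and the exit relay sees the destination but not the source—the property the paragraph above describes.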
Tor is a small project, but the resources required to run it are still significant. It grew out of ideas first developed inside the US Naval Research Labs, and when folks outside of the Department of Defense picked up the project, the Defense Advanced Research Projects Agency, or DARPA, decided it was worth funding as basic research. The code for Tor is open-source (meaning anyone can read and copy it) and thoroughly reviewed, and the operating relays are independent, so the Net-freedom community at large generally doesn’t worry about this arrangement. Not all government funding is so unproblematic, but alternatives are hard to find.
* * *
Exact numbers are hard to come by, but globally, the intelligence activities of governments and corporations consume hundreds of billions of dollars. The cost to protect individuals and communities from surveillance is also hard to estimate, but such projects are funded primarily by grants from NGOs and the State Department—a tiny fraction of the money spent by governments and corporations, and hardly enough to cover all of the necessary work on this issue around the world, of which Tor is just one small part.
Internet-freedom funders, like all organizations that need to justify their budgets, have a bias toward simple fixes that make good stories—the “there’s an app for that” school of countersurveillance. But tools that matter are large, long-term infrastructural investments requiring research and professional support. The current funding model—waving a fistful of checks around and seeing who jumps in with a shiny idea—has produced no coherent solutions for defending against mass surveillance, in the United States or globally. Shaping those solutions will require technical competence and long-term vision, not just more money. The changes to infrastructure that we need to protect ourselves from surveillance are a social good, but the decentralized mind-set required does not come naturally to companies or governments.
In the end, the culture change already taking place in the commercial-technology sector will determine whether we live under surveillance or get to retain our privacy. As demonstrated by the reaction of Google engineers to the revelations of NSA spying on their infrastructure, tech workers often understand this, but eventually management must lead. The current advertising-driven business model for many online services depends on capturing massive amounts of personal information in order to target the ads. In addition to being invasive in its own right, this has enabled a lot of government surveillance.
Silicon Valley companies must recognize that the law won’t do this work for them, and that if they want to avoid undermining freedom globally, it’s time to ditch the dated and dangerous ad model and start building decentralization and content and metadata privacy into everything they create. The result would be a more secure Internet for everyone. Collecting our data would become much more difficult and expensive for the government. We would be able to communicate without worrying that all of our content and metadata was being sorted through by the NSA. As Poitras has claimed, encryption does work, and it’s time that we put our faith, and our funding, toward math instead of our battered privacy regulations to keep us safe from prying eyes.