The Haves and Have-Nots of Big Data

A “global firewall” is redrawing lines between countries—and people.

The use of facial recognition in airports has long been controversial, with citizens expressing outrage over the invasion of their privacy. This confrontation came to a head recently when San Francisco voted to ban the use of facial-recognition software by law enforcement.

But why do state agencies like Customs and Border Protection (CBP) and the Transportation Security Administration have the right to take this data in the first place? What will they do with it, and what limits their use of it? Do we, as citizens, have the right to refuse?

These are important questions. But they only scratch the surface of the problem, which is the increased use of data—so-called Big Data—in statecraft writ large. Today, states see data as the best (and perhaps only) road toward guaranteeing security. This has led to a virtual land grab, with states scrambling for more and better data. As one analyst put it: “Data is the new oil.”

This may make sense for security, but where does it leave us as citizens? Security is often parasitic on rights. Unless we demand our rights, they will disappear.

The relationship between data and security is complex. We first began using fingerprints in criminal cases around 1904, but the move to collect data on citizens took off after 9/11, when US intelligence agencies began to focus on individuals rather than states or even non-state actors. For this they needed copious amounts of data—biometric and biographic, foreign and domestic—from which they could generate risk ratings. These evaluations are the sine qua non of security, and they increasingly include not only identity information but also expectations of future actions (i.e., “predictive analytics”). This makes sense: The greatest risks to states arise not from the threats we know, but from those we don’t.

So when CBP collects your data at the airport, it isn’t simply trying to facilitate your travel; it is deciding whether to facilitate your travel. In particular, it is matching your file against terror watch lists: evaluating your risk level, and registering data for the future. Seen in this way, the use of facial recognition in airports is hardly a test; it’s been 20 years in the making. And soon it will be the norm, not the exception.

This is concerning for many reasons. Right now, biometric technologies are good but not great, with accuracy rates above 95 percent. That’s probably fine when it comes to maintaining security, but definitely not enough to guarantee people’s rights won’t be violated. JFK and LAX each handle well over 100,000 passengers on a typical day, so even a small error rate means thousands of people will be incorrectly identified daily. Countrywide, that means hundreds of thousands of people. These errors will disproportionately harm people of color, who are more heavily watched and, paradoxically (because of shortcomings in algorithmic design), more likely to be misidentified.

The question of storage also looms. Supposedly, only data on foreigners is kept. But surely the government keeps information on some citizens, such as those with immigration records or suspected criminal activity—i.e., citizens of “interest.” Who defines what this interest is, and where it stops? And if this data is so valuable to state agencies, why would they delete it? Unless citizens fight for this protection, it seems likely to fall away.

The dangers inherent to the state’s use of data are manifold, but they are not limited to the domestic stage. Indeed, while we focus our outrage on civil liberty concerns—an important aim, but a circumscribed one—we are missing important developments globally. As I detail in my book, The Politics of Borders: Sovereignty, Security, and the Citizen after 9/11, Big Data is changing the very face of the international bordering system, erasing the dividing lines we know, and replacing them with new ones. This geopolitical transformation demands attention. The boundaries of tomorrow are being drawn today.

The global dimension of Big Data is written into the very nature of the system. Because data-analysis systems need data on noncitizens, they require extensive data-sharing—after all, the whole point is to stop threats before they arrive at your door. This generates networks of exclusion. One data analyst, David Coleman at Novetta Mission Analytics, described this new kind of division to me in an interview as a global firewall—i.e., a virtual wall between states that share data and those that don’t.

“The question is what is the perimeter of what you consider trustworthy?… [In the West] there is truly a cloud of identity artifacts out there that holds you accountable for who you are.… We are basically setting up [a] firewall. It is not quite as free as the Schengen region, but it is a little bit broader. If you are in this [domain] you can move pretty easily; but it is really hard to get into it if you are coming from [some] place where your identity is purely social, and it is not stored in any electronic media.”

Inside the firewall, there is verifiable data. Outside it, there is unstructured data that isn’t placed into a defined model (think of Twitter: words with hashtags are structured), or no data at all, as is common across the developing world. These lines of division exist between states, creating a Cold War–style bipolarity, but also between citizens themselves. A person’s ability to work and travel abroad is increasingly contingent on the data networks into which they were born. When I asked whether this is creating a global caste system, Coleman replied, “We are ghettoizing the world.”

Reducing the world to a structural form of identity and difference in many ways recreates the old imperial division between civilization and barbarism. This history is long and tortured and sits at the heart of Western thought. Aristotle believed the Greeks were naturally free and barbarians naturally slaves, justifying their mastery. In the modern era, John Stuart Mill argued that only civilized peoples can rule themselves and that “despotism is a legitimate mode of government in dealing with barbarians.”

And once a people is labeled “barbarian,” that label is hard to shake. Indeed, we have a long history of using international law to legitimate some international actors and vilify others, casting them as rogues or terrorists.

What happens to those on the wrong side of the firewall? In some deep sense, they are returned to a state of savageness, to bare humanity, arguably beyond human-rights protection. Borrowing from George Orwell, we might say that some people are more equal than others. And worse, the problem is not just that the data-less are excluded, but that their condition is nearly impossible to escape. By what measure can they prove their trustworthiness if they have no data trail?

This mirrors the analysis of stateless peoples after World War I that Hannah Arendt provided in The Origins of Totalitarianism, where she wrote that “the loss of citizenship deprived people not only of protection, but also of all clearly established, officially recognized identity.” Today the data-less are doubly vulnerable: they do not have ID (the old problem), and what they do have is unverifiable (the new one).

The use of Big Data in statecraft is arguably the defining feature of our present era. If we want to protect our rights, as national citizens but also as global ones, the time to act is now. Certainly, inter- and multinational forms of collaboration may be part of the solution. But in the data age, we can’t trust global liberal institutions to put forward progressive solutions. We need democratic processes to create new and binding forms of state regulation: norms against data sharing, mechanisms to protect individuals, systems of transparency and redress, and so forth.

Every generation has its most vulnerable. In the age of nation-states, it was the stateless; in the age of data, it is the data-less: those excluded, left in the dark. The concerns of people outside the global firewall may seem utterly different from our own, but they shouldn’t. The problem for US citizens is too much data; for those on the outside, it is not enough. But at its core, the concern is the same: the status of our rights. The age of data-based statecraft and interminable data accumulation is upon us, and it will affect us all. If we don’t make an effort to define and defend data rights now, it will soon be too late.
