
What Amazon Taught the Cops

Predictive policing is just another form of supply-chain efficiency.

Ingrid Burrington

May 27, 2015

The future of policing, it seems, will look a lot like the present of policing, just faster and with more math. Instead of using inherent bias and simplistic statistics to racially profile individuals on a street, cops of the future will be able to use complicated statistics to racially profile people in their homes. In the summer of 2013, for instance, the Chicago Police Department implemented a pilot program intended to reduce violent crime. It used an algorithm developed by an engineer at the Illinois Institute of Technology to generate a “heat list” of roughly 400 people who were most likely to become perpetrators or victims of violence. Cops tracked down some of these individuals, showed up at their homes, and warned them they were being watched. Similar data-driven programs have been tested in other cities in recent years, all under the rubric of what’s been called “predictive policing.”

This approach has understandably caused concern and outrage among civil-liberties advocates—the very name “predictive policing” sounds like something out of Minority Report, just without psychics hanging out in a pool. As Jay Stanley, senior policy analyst at the ACLU, commented about the Chicago program: “Unfortunately, there are all too many reasons to worry that this program will veer towards the worst nightmares of those who have been closely watching the growth of the data-based society.”

These are real concerns. It’s easy to imagine how biased data could render the criminal-justice system even more of a black box for due process, replacing racist cops with racist algorithms. That said, the cases in which police attempt to predict individual behavior based on data analysis are still relatively rare among the many approaches that have been shoehorned under the heading of predictive policing. Thus far, in fact, predictive policing has been less Minority Report than Groundhog Day—that is, yet another iteration of the same data-driven policing strategies that have proliferated since the 1990s. As it’s currently implemented, predictive policing is more a management strategy than a crime-fighting tactic. Whether it works is perhaps not as useful a question as who it works for—its chief beneficiaries aren’t patrol cops or citizens, but those patrol cops’ bosses and the companies selling police departments a technical solution to human problems.

* * *

Part of predictive policing’s image problem is that it doesn’t actually have a very clear image to begin with. When a police department declares that it’s using predictive policing, it could be doing a whole range of things that vary greatly in their technical sophistication, effectiveness, and ethical concerns. The term itself came into fashion thanks to a 2009 symposium organized by William Bratton, who at the time was the police chief of Los Angeles, and the National Institute of Justice. Following the symposium, the NIJ distributed a series of grants that funded predictive-policing research by universities and pilot programs with several police departments, including in Los Angeles.

The NIJ funding led to a partnership between the LAPD and Jeffrey Brantingham, an anthropologist at the University of California, Los Angeles, who was studying the use of predictive modeling techniques for forecasting civilian deaths and counterinsurgency activities in Iraq. Brantingham’s research, which began in 2006, had been funded partly by grants from the Army Research Office. By 2009, when he began working with the LAPD, he’d apparently determined that data-driven urban warfare wasn’t all that different from data-driven policing. In 2012, Brantingham turned his research into PredPol, the software and company perhaps most associated with the public’s understanding of predictive policing.

PredPol’s aggressive marketing created plenty of buzz (and plenty of controversy), but the logic behind predictive policing isn’t new. Data-driven policing has been standard practice for at least two decades. Compstat—a management strategy developed under Bratton in 1994, during his first stint leading the NYPD—helped form the ideological foundation of predictive policing, and the academic research and development behind the strategy existed well before 2009.

Robert Cheetham is the CEO of the civic-tech company Azavea and chief architect of its predictive-policing software, HunchLab. In 1997, he was hired fresh out of graduate school as part of a two-man team building crime-mapping software for the Philadelphia Police Department. “We were hired at a time when no one in senior staff cared if we were there or not,” Cheetham notes. That changed six months later, when the city hired John Timoney, who had been first deputy commissioner of the NYPD under Bratton, as its new police commissioner. Upon arriving in Philadelphia, Timoney immediately pushed to implement Compstat—and to ramp up support for the crime-mapping tools that Cheetham was building. (Timoney went on to export the Compstat model and other questionable policing tactics to Miami as well, and since 2012 he has been a consultant to the government of Bahrain.)

Cheetham got inspired to build what would become HunchLab after watching Philadelphia cops work with weeks-old Compstat data. He created an early prototype in 1999, shortly before leaving the department, and eventually built the first iteration of HunchLab in 2006, with the support of a National Science Foundation grant.

* * *

HunchLab and PredPol work in similar ways. Both build on the already common practice of crime-mapping. Police departments perform statistical analyses of historical and recent crime data to predict the likelihood of crimes (in most cases, limited to property crimes) within a certain geographical area. Officers are given maps each day as guides for recommended places to patrol. Among other statistical tools, HunchLab adds a technique known as “risk-terrain modeling” that incorporates multiple additional data streams, including things like proximity to bars and local events, to determine the likelihood of crime in an area. The developers of both programs are quick to point out that they don’t use demographic or socioeconomic data about neighborhoods.
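To make the mechanics concrete, here is a minimal sketch of the grid-based forecasting idea in Python. A city is divided into cells, each cell is scored by recency-weighted counts of past incidents, and a risk-terrain layer adds weight for environmental features like bars. The cell size, decay rate, and bar-proximity bonus are assumptions for illustration, not PredPol’s or HunchLab’s actual models.

```python
# Illustrative grid-based crime forecast with one risk-terrain layer.
# All parameters and data below are hypothetical.
from collections import Counter

CELL = 0.005  # grid cell size in degrees of latitude/longitude (~500 m)

def cell_of(lat, lon):
    """Snap a coordinate to its grid cell."""
    return (round(lat / CELL), round(lon / CELL))

def forecast(incidents, bars, top_n=5):
    """Score each cell by recency-weighted incident counts, plus a flat
    bonus for cells containing a bar; return the highest-risk cells."""
    scores = Counter()
    for lat, lon, days_ago in incidents:
        scores[cell_of(lat, lon)] += 0.9 ** days_ago  # older reports count less
    for lat, lon in bars:
        scores[cell_of(lat, lon)] += 0.5  # risk-terrain layer: proximity to bars
    return scores.most_common(top_n)

# Hypothetical inputs: (lat, lon, days since the report) and bar locations.
incidents = [(39.952, -75.165, 1), (39.952, -75.166, 3), (39.960, -75.170, 30)]
bars = [(39.952, -75.165)]
for cell, score in forecast(incidents, bars):
    print(cell, round(score, 2))
```

The top-scoring cells would become the highlighted squares on an officer’s daily patrol map, which is why the quality of the underlying incident data matters so much.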

Whether any of this works to reduce crime is, of course, the general public’s main concern. But the answer remains unclear—in part because the majority of reports on the use of predictive policing have come from police departments and PredPol, both of which have a stake in seeing positive results. The NIJ has funded the RAND Corporation to conduct analyses of two of its grantees, in Shreveport, Louisiana, and Chicago. The complete reports are forthcoming, but the preliminary analysis of Shreveport’s program, released in November, suggests that it’s too soon to tell if the technology works, or how well it works in conjunction with what police officers already do.

But there’s a deeper problem with measuring the success of algorithmically determined criminal justice: All of these applications assume the credibility of the underlying crime data—and the policing methods that generate that data in the first place. As countless scandals over quotas in police departments demonstrate, that is a huge assumption. Kade Crockford, director of the ACLU’s Technology for Liberty Project, describes much of what predictive policing offers as a “tech-washing of racially discriminatory law-enforcement practices” that, even in its most supposedly unproblematic forms, is merely “tinkering around the edges [of the criminal-justice system] and calling it revolutionary reform.”

While police chiefs may champion that supposedly revolutionary reform, it doesn’t appear to have a huge impact on the cops working under them. The average cop is unlikely to concede that a computer’s data-driven knowledge is superior to his or her street knowledge. In Cheetham’s experience, most officers want tools that are easy to use and that “make sense against personal experience”—and, accordingly, they expect the highlighted areas of the maps they’re given to change over time and reflect their actions. To the extent that predictive policing’s forecasts can reinforce patrol cops’ observations, it’s a useful tool. But for many cops, it’s also an irrelevant one. The majority of their day is spent responding to calls or being directed by a dispatcher; if they get around to patrolling the red square on their department-issued map, it’s in their limited downtime.

Cheetham did, however, describe enthusiastic feedback from senior staff and management, who appreciated the software’s ability to more precisely allocate officers’ time. In one city using HunchLab (a city Cheetham couldn’t name, due to a nondisclosure agreement), the police department gets GPS feeds from patrol cars to measure the amount of time that officers spend in HunchLab’s “mission areas” and to assess the impact of the patrol. The appeal of predictive policing as a micromanagement tool harks back to Compstat, where arrest numbers have become a tool for police accountability—if not to the public, then certainly to district commanders.
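For a sense of how that GPS-based accounting might work, here is a minimal sketch: it sums the intervals between consecutive GPS pings that both fall inside a rectangular “mission area.” The ping format and bounding-box geometry are hypothetical; the unnamed department’s actual system is not public.

```python
# Illustrative dwell-time measurement from a patrol car's GPS feed.
# Field names, coordinates, and the rectangular area are assumptions.
from datetime import datetime, timedelta

def seconds_in_area(pings, area):
    """Sum the intervals between consecutive pings that both fall inside
    the bounding box (lat_min, lat_max, lon_min, lon_max)."""
    lat_min, lat_max, lon_min, lon_max = area

    def inside(p):
        return lat_min <= p["lat"] <= lat_max and lon_min <= p["lon"] <= lon_max

    total = timedelta()
    for a, b in zip(pings, pings[1:]):
        if inside(a) and inside(b):
            total += b["t"] - a["t"]
    return total.total_seconds()

# Hypothetical feed: one ping per minute from a patrol car.
t0 = datetime(2015, 5, 1, 22, 0)
pings = [
    {"t": t0, "lat": 39.951, "lon": -75.166},
    {"t": t0 + timedelta(minutes=1), "lat": 39.952, "lon": -75.165},
    {"t": t0 + timedelta(minutes=2), "lat": 39.970, "lon": -75.150},  # leaves area
]
print(seconds_in_area(pings, (39.950, 39.955, -75.170, -75.160)))  # -> 60.0
```

A metric this easy to compute is exactly what makes the software attractive to commanders: it turns patrol time into a number that can be tracked, compared, and enforced.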

It’s telling that one of the first articles to promote predictive policing, a 2009 Police Chief Magazine piece by the LAPD’s Charlie Beck and consultant Colleen McCue, poses the question “What Can We Learn From Wal-Mart and Amazon About Fighting Crime in a Recession?” The article likens law enforcement to a logistics dilemma, in which prioritizing where police officers patrol is analogous to identifying the likely demand for Pop-Tarts. Predictive policing has emerged as an answer to police departments’ assertion that they’re being asked to do more with less. If we can’t hire more cops, the logic goes, we need these tools to deploy them more efficiently.

* * *

Civil-liberties watchdogs should certainly decry predictive policing’s potential harm, but it’s unrealistic to assume that algorithms give police officers powers they don’t already have. Cops haven’t previously needed multivariate logistic regressions, risk-terrain modeling, or algorithmic heat lists to violate civil liberties or murder with impunity. They’ve just needed a badge. Predictive policing does little to change street-level power dynamics, but it does have the potential to increase the power that police management has over cops on the street—and as recent events in Ferguson, Missouri, and New York City demonstrate, the tendency toward micromanagement too often leads to more petty arrests in pursuit of revenue and quotas.

It’s also misleading at best for police departments to cast the future as a choice between more software and more patrol officers. To begin with, both demands ignore the fact that crime rates in the United States remain at historic lows; it’s unclear why more of anything is required. Furthermore, neither option addresses the chief grievances of officers or the citizens they’re supposed to protect. Ironically, one of the biggest wastes of financial resources and time for public safety in America is the “war on drugs”; making its implementation more efficient is akin to making a Rube Goldberg machine move slightly faster. Decriminalization, on the other hand, would free up a lot of resources.

More to the point, activities that might improve police and community relations and allow for deeper investigations—for example, more time for cops to talk to people, or training that encourages them to treat people like human beings—are deliberately and by necessity inefficient. Due process is inefficient. Things that improve quality of life and are generally thought to decrease an individual’s likelihood of committing crimes—things like access to education, housing, food, and healthcare—are also pretty inefficient. There’s a word for law enforcement that emphasizes efficiency over humanity: fascism.

If indeed police departments are expected today to do more with less, perhaps that should actually mean more training, more public accountability, and less micromanagement in the service of perpetuating terrible policing dynamics.

Ingrid Burrington is a writer living in New York.

