The future of policing, it seems, will look a lot like the present of policing, just faster and with more math. Instead of using inherent bias and simplistic statistics to racially profile individuals on a street, cops of the future will be able to use complicated statistics to racially profile people in their homes. In the summer of 2013, for instance, the Chicago Police Department implemented a pilot program intended to reduce violent crime. It used an algorithm developed by an engineer at the Illinois Institute of Technology to generate a "heat list" of roughly 400 people who were most likely to become perpetrators or victims of violence. Cops tracked down some of these individuals, showed up at their homes, and warned them they were being watched. Similar programs using technology have been tested in recent years, all under the rubric of what's been called "predictive policing."
This approach has understandably caused concern and outrage among civil-liberties advocates; the very name "predictive policing" sounds like something out of Minority Report, just without psychics hanging out in a pool. As Jay Stanley, senior policy analyst at the ACLU, commented about the Chicago program: "Unfortunately, there are all too many reasons to worry that this program will veer towards the worst nightmares of those who have been closely watching the growth of the data-based society."
These are real concerns. It's easy to imagine how biased data could render the criminal-justice system even more of a black box for due process, replacing racist cops with racist algorithms. That said, the cases in which police attempt to predict individual behavior based on data analysis are still relatively rare among the many approaches that have been shoehorned under the heading of predictive policing. Thus far, in fact, predictive policing has been less Minority Report than Groundhog Day; that is, yet another iteration of the same data-driven policing strategies that have proliferated since the 1990s. As it's currently implemented, predictive policing is more a management strategy than a crime-fighting tactic. Whether it works is perhaps not as useful a question as who it works for: its chief beneficiaries aren't patrol cops or citizens, but those patrol cops' bosses and the companies selling police departments a technical solution to human problems.
* * *
Part of predictive policing's image problem is that it doesn't actually have a very clear image to begin with. When a police department declares that it's using predictive policing, it could be doing a whole range of things that vary greatly in their technical sophistication, effectiveness, and ethical concerns. The term itself came into fashion thanks to a 2009 symposium organized by William Bratton, who at the time was the police chief of Los Angeles, and the National Institute of Justice. Following the symposium, the NIJ distributed a series of grants that funded predictive-policing research by universities and pilot programs with several police departments, including in Los Angeles.
The NIJ funding led to a partnership between the LAPD and Jeffrey Brantingham, an anthropologist at the University of California, Los Angeles, who was studying the use of predictive modeling techniques for forecasting civilian deaths and counterinsurgency activities in Iraq. Brantingham's research, which began in 2006, had been funded partly by grants from the Army Research Office. By 2009, when he began working with the LAPD, he'd apparently determined that data-driven urban warfare wasn't all that different from data-driven policing. In 2012, Brantingham turned his research into PredPol, the software and company perhaps most associated with the public's understanding of predictive policing.
PredPol's aggressive marketing created plenty of buzz (and plenty of controversy), but the logic behind predictive policing isn't new. Data-driven policing has been standard practice for at least two decades. Compstat, a management strategy developed under Bratton in 1994 during his first stint leading the NYPD, helped form the ideological foundation of predictive policing, and the academic research and development behind the strategy existed well before 2009.
Robert Cheetham is the CEO of the civic-tech company Azavea and chief architect of its predictive-policing software, HunchLab. In 1997, he was hired fresh out of graduate school as part of a two-man team building crime-mapping software for the Philadelphia Police Department. "We were hired at a time when no one in senior staff cared if we were there or not," Cheetham notes. That changed six months later, when the city hired John Timoney, who had been first deputy commissioner of the NYPD under Bratton, as its new police commissioner. Upon arriving in Philadelphia, Timoney immediately pushed to implement Compstat, and to ramp up support for the crime-mapping tools that Cheetham was building. (Timoney went on to export the Compstat model and other questionable policing tactics to Miami as well, and since 2012 he has been a consultant to the government of Bahrain.)
Cheetham was inspired to build what would become HunchLab after watching Philadelphia cops work with weeks-old Compstat data. He created an early prototype in 1999, shortly before leaving the department, and eventually built the first iteration of HunchLab in 2006, with the support of a National Science Foundation grant.
* * *
HunchLab and PredPol work in similar ways. Both build on the already common practice of crime-mapping: police departments perform statistical analyses of historical and recent crime data to predict the likelihood of crimes (in most cases, limited to property crimes) within a certain geographical area. Officers are given maps each day as guides for recommended places to patrol. Among other statistical tools, HunchLab adds a technique known as "risk-terrain modeling" that incorporates multiple additional data streams, including things like proximity to bars and local events, to determine the likelihood of crime in an area. The developers of both programs are quick to point out that they don't use demographic or socioeconomic data about neighborhoods.
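Neither company publishes its models, but the broad mechanics of this kind of geospatial forecasting are easy to illustrate. The sketch below is a minimal, hypothetical version, not either product's actual code: it divides a city into grid cells, combines recent crime counts with a single invented risk-terrain feature (distance to the nearest bar), fits a logistic regression, and flags the highest-scoring cells as the day's patrol map. All of the data is synthetic.

```python
# A minimal, hypothetical sketch of grid-based crime forecasting with one
# "risk terrain" feature. Illustrative only; not PredPol's or HunchLab's
# actual model. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_cells = 400   # city divided into a 20 x 20 grid of cells
n_weeks = 52    # one year of historical data

# Synthetic inputs: weekly property-crime counts per cell, plus a static
# terrain layer (say, distance in km from each cell to the nearest bar).
crime_counts = rng.poisson(lam=1.0, size=(n_weeks, n_cells))
dist_to_bar = rng.uniform(0.1, 5.0, size=n_cells)

# Features for each (week, cell): last week's count, the count over the
# three weeks before that, and the terrain layer. The label is whether
# any crime occurred in that cell the following week.
X, y = [], []
for t in range(4, n_weeks - 1):
    last_week = crime_counts[t]
    prior_weeks = crime_counts[t - 3:t].sum(axis=0)
    for c in range(n_cells):
        X.append([last_week[c], prior_weeks[c], dist_to_bar[c]])
        y.append(int(crime_counts[t + 1, c] > 0))

model = LogisticRegression(max_iter=1000).fit(np.array(X), np.array(y))

# "Today's map": score every cell on the latest data and flag the 20
# highest-risk cells as recommended patrol areas.
latest = np.column_stack([
    crime_counts[-1],                  # last week's counts
    crime_counts[-4:-1].sum(axis=0),   # the three weeks before that
    dist_to_bar,
])
risk = model.predict_proba(latest)[:, 1]
print("Highest-risk cells:", np.argsort(risk)[::-1][:20])
```

Real systems presumably use richer features and fancier estimators, but the output is the same artifact described above: a daily ranking of boxes on a map.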
Whether any of this works to reduce crime is, of course, the general public's main concern. But the answer remains unclear, in part because the majority of reports on the use of predictive policing have come from police departments and PredPol, which both have a stake in seeing positive results. The NIJ has funded the RAND Corporation to conduct analyses of two of its grantees, in Shreveport, Louisiana, and Chicago. The complete reports are forthcoming, but the preliminary analysis of Shreveport's program, released in November, suggests that it's too soon to tell if the technology works, or how well it works in conjunction with what police officers already do.
But there's a deeper problem with measuring the success of algorithmically determined criminal justice: All of these applications assume the credibility of the underlying crime data, and of the policing methods that generate that data in the first place. As countless scandals over quotas in police departments demonstrate, that is a huge assumption. Kade Crockford, director of the ACLU's Technology for Liberty Project, describes much of what predictive policing offers as a "tech-washing of racially discriminatory law-enforcement practices" that, even in its most supposedly unproblematic forms, is merely "tinkering around the edges [of the criminal-justice system] and calling it revolutionary reform."
While police chiefs may champion that supposedly revolutionary reform, it doesn't appear to have a huge impact on the cops working under them. The average cop is unlikely to concede that a computer's data-driven knowledge is superior to his or her street knowledge. In Cheetham's experience, most officers want tools that are easy to use and that "make sense against personal experience"; accordingly, they expect the highlighted areas of the maps they're given to change over time and reflect their actions. To the extent that predictive policing's forecasts can reinforce patrol cops' observations, it's a useful tool. But for many cops, it's also an irrelevant one. The majority of their day is spent responding to calls or being directed by a dispatcher; if they get around to patrolling the red square on their department-issued map, it's in their limited downtime.
Cheetham did, however, describe enthusiastic feedback from senior staff and management, who appreciated the software's ability to more precisely allocate officers' time. In one city using HunchLab (a city Cheetham couldn't name, due to a nondisclosure agreement), the police department gets GPS feeds from patrol cars to measure the amount of time that officers spend in HunchLab's "mission areas" and to assess the impact of the patrol. The appeal of predictive policing as a micromanagement tool harks back to Compstat, where arrest numbers have become a tool for police accountability: if not to the public, then certainly to district commanders.
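That kind of monitoring is trivial to build once the GPS feed exists, which is part of why it appeals to management. Here is a rough sketch of the dwell-time accounting, with invented field names and mission areas simplified to lat/lon bounding boxes; any real deployment would differ.

```python
# Rough sketch of dwell-time accounting from patrol-car GPS pings.
# Field names and the rectangular "mission areas" are invented for
# illustration; this is not HunchLab's actual implementation.
from dataclasses import dataclass

@dataclass
class Ping:
    t: float      # Unix timestamp of the GPS fix
    lat: float
    lon: float

@dataclass
class MissionArea:
    name: str
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

    def contains(self, p: Ping) -> bool:
        return (self.lat_min <= p.lat <= self.lat_max
                and self.lon_min <= p.lon <= self.lon_max)

def dwell_seconds(pings: list[Ping], area: MissionArea) -> float:
    """Credit the interval between consecutive pings to the area
    when both endpoints fall inside it."""
    pings = sorted(pings, key=lambda p: p.t)
    total = 0.0
    for prev, cur in zip(pings, pings[1:]):
        if area.contains(prev) and area.contains(cur):
            total += cur.t - prev.t
    return total

# Example: one car's pings scored against one highlighted box.
area = MissionArea("cell-117", 39.950, 39.955, -75.170, -75.165)
shift = [Ping(0, 39.951, -75.168), Ping(60, 39.952, -75.167),
         Ping(120, 39.960, -75.180)]   # last ping leaves the box
print(dwell_seconds(shift, area))      # -> 60.0 seconds
```

Summed per officer per shift, numbers like these become exactly the sort of supervisory metric the Compstat tradition runs on.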
It's telling that one of the first articles to promote predictive policing, a 2009 Police Chief Magazine piece by the LAPD's Charlie Beck and consultant Colleen McCue, poses the question "What Can We Learn From Wal-Mart and Amazon About Fighting Crime in a Recession?" The article likens law enforcement to a logistics dilemma, in which prioritizing where police officers patrol is analogous to identifying the likely demand for Pop-Tarts. Predictive policing has emerged as an answer to police departments' assertion that they're being asked to do more with less. If we can't hire more cops, the logic goes, we need these tools to deploy them more efficiently.
* * *
Civil-liberties watchdogs should certainly decry predictive policing's potential harm, but it's unrealistic to assume that algorithms give police officers powers they don't already have. Cops haven't previously needed multivariate logistic regressions, risk-terrain modeling, or algorithmic heat lists to violate civil liberties or murder with impunity. They've just needed a badge. Predictive policing does little to change street-level power dynamics, but it does have the potential to increase the power that police management has over cops on the street. And as recent events in Ferguson, Missouri, and New York City demonstrate, the tendency toward micromanagement too often leads to more petty arrests in pursuit of revenue and quotas.
It's also misleading at best for police departments to cast the future as a choice between more software and more patrol officers. To begin with, both demands ignore the fact that crime rates in the United States remain historically low; it's unclear why more of anything is required. Furthermore, neither option addresses the chief grievances of officers or the citizens they're supposed to protect. Ironically, one of the biggest wastes of financial resources and time for public safety in America is the "war on drugs"; making its implementation more efficient is akin to making a Rube Goldberg machine move slightly faster. Decriminalization, on the other hand, would free up a lot of resources.
More to the point, activities that might improve police and community relations and allow for deeper investigations (for example, more time for cops to talk to people, or training that encourages them to treat people like human beings) are deliberately and by necessity inefficient. Due process is inefficient. Things that improve quality of life and are generally thought to decrease an individual's likelihood of committing crimes (things like access to education, housing, food, and healthcare) are also pretty inefficient. There's a word for law enforcement that emphasizes efficiency over humanity: fascism.
If indeed police departments are expected today to do more with less, perhaps that should actually mean more training, more public accountability, and less micromanagement in the service of perpetuating terrible policing dynamics.
