Beyond Data-Driven Policing

In his book The Rise of Big Data Policing, Andrew Guthrie Ferguson examines the good, the bad, and the ugly of law enforcement’s increasing reliance on big data analysis and its driving algorithms.



From the November–December 2017 issue, Volume 105, Number 6, page 377. DOI: 10.1511/2017.105.6.377

Big data is undeniably a big deal. The process of computationally analyzing huge volumes of data to uncover patterns and associations has become a mainstay in fields as diverse as finance and health care. In his book The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement, Andrew Guthrie Ferguson, a law professor specializing in predictive policing, big data surveillance, and the Fourth Amendment, examines the good, the bad, and the ugly of law enforcement’s increasing reliance on big data analysis and its driving algorithms. Ferguson provides in-depth discussion of the often troubling ethics involved, particularly injustices stemming from implicit racial bias that all too easily becomes encoded in the algorithms. (For more on ethics and algorithms, see “A Peek at Proprietary Algorithms.”) In this passage, Ferguson takes a step back to consider how different municipalities and their police forces intervene to try to prevent violent crime and protect potential victims after individuals have been flagged as being at risk.


A knock on an apartment door. A man gives the prognosis to a worried mother. Your son might die. He is at grave risk. Others he knows have already succumbed. An algorithm has identified the most likely to be stricken. He is one of a few thousand young men who may die. In Chicago, Illinois, this scene has played out hundreds of times at hundreds of doors. The danger, however, is not some blood-borne pathogen. This is not a doctor giving a cancer diagnosis but a police detective giving a life diagnosis. Violence is contagious, and you are exposed. As a young man in Chicago, due to your friends, associates, and prior connection to violence, you have been predicted to be the victim or perpetrator of a shooting. Your name is on the “Strategic Suspects List,” also known as the “heat list,” and a detective is at your door with a social worker and a community representative to tell you the future is not only dark but deadly. A vaccine exists, but it means turning your life around now.

Data and Its Limitations

Only a tiny percentage of people commit violent crime. Police tend to know the people who commit those crimes, but the difficulty has always been getting “the criminals” to know that the police know they are committing the crimes. Enter “focused deterrence”—a theory that seeks to understand and dismantle the networks of criminal actors that drive violent crime. Focused deterrence involves a targeted and explicit message to a narrow slice of the population that police, prosecutors, and the community know who is engaged in violence and that the killings must end.

Person-based predictive policing involves the use of data to identify and investigate potential suspects or victims. Part public health approach to violence and part social network approach to risk assessment, big data can visualize how violence spreads like a virus among communities. The same data also can predict the most likely victims of violence. Police data is shaping who gets targeted and forecasting who gets shot.

While these predictive technologies are excitingly new, the concerns underlying them remain frustratingly old-fashioned. Fears of racial bias, a lack of transparency, data error, and the distortion of constitutional protections pose serious challenges to the development of workable person-based predictive strategies. Yet person-based policing systems are being used now, and people are being targeted.

The Chicago Way

In Chicago, 1,400 young men have been identified through big data techniques as targets for a roster called the heat list. Software generates a rank-order list of potential victims and subjects with the greatest risk of violence. Based on an algorithm designed by Miles Wernick of the Illinois Institute of Technology, the heat list uses 11 variables to create risk scores from 1 to 500; the higher the score, the greater the risk of being a victim or perpetrator of gun violence. Who gets shot? The algorithm knows. And the heat-list algorithm has been tragically accurate. On a violent Mother’s Day weekend in 2016, 80 percent of the 51 people shot over two days had been correctly identified on Chicago’s heat list. On Memorial Day 2016, 78 percent of the 64 people shot were on the list. Using the heat list, police have prioritized youth violence to intervene in the lives of the most at-risk men.

The algorithm remains a police secret, but reportedly the factors include past criminal history, arrests, parole status, and whether the target has been identified as part of a gang. As described by the Chicago Police Department, “The software is generated based on empirical data that lists attributes of a person’s criminal record, including the record of violence among criminal associates, the degree to which his criminal activities are on the rise, and the types of intensity of criminal history.” The algorithm ranks these variables to come up with a predictive score of how “hot” individuals might be in terms of their risk of violence.

Selection to the heat list can be accompanied by a “custom notification visit.” As described earlier, it involves a home visit, usually by a senior police officer, a social worker, and a member of the community (perhaps a football coach or pastor). During the visit, police hand deliver a “custom notification letter” detailing what the police know about the individual’s criminal past, as well as a warning about the future. As described in another police department document, “The custom notification letter will be used to inform individuals of the arrest, prosecution, and sentencing consequences they may face if they choose to or continue to engage in public violence. The letter will be specific to the identified individual and incorporate those factors known about the individual inclusive of prior arrests, impact of known associates, and potential sentencing outcomes for future criminal acts.” These custom notification letters symbolize formal messages of deterrence written in black and white. Mess up and you will be prosecuted to the fullest extent of the law. The message is also quite personal. You—the person named in the letter—are known by police and are being watched.


But the hard reality is that violence in Chicago has only increased. In fact, 2016 has seen a heartbreaking uptick in violent shootings, leading to public criticism of the model. Questions remain about the program’s effectiveness and in particular whether enough has been done to remedy the social and economic risks identified. For example, there exists the open question of whether the algorithm adequately distinguishes between targets who are “high risk” (those who might get shot) and those who are “high threat” (those who might shoot). Intensive surveillance and police intervention for those who might be victims may not be as important as targeting those who might engage in violence. But if the heat-list formula counts the risk and threat equally, police resources may be misdirected.

Stepping back, two important insights arise from the heat-list experiment. First, the public health approach of mapping social networks of violence may successfully identify those who might be involved in violence. Second, studying the data to forecast who might be engaged in violence does not automatically end the violence. Custom notifications, while well meaning, may not have the intended effect if not implemented with a focus on addressing underlying social needs. The second step of providing interventions, resources, and redirection must also accompany the risk identification. Without targeted (and funded) social service interventions, the algorithm just became a targeting mechanism for police. Plainly stated, mapping the social network of violence may be easier than ending the violence. Data identifies the disease but offers no cure.

Big Data Meets the Big Easy

An example of a more successful, holistic approach to reducing violence can be found in New Orleans, Louisiana, once the murder capital of the United States. In 2013, the Big Easy averaged 1.46 shootings a day, and Mayor Mitch Landrieu turned to data to get a handle on the societal problems driving violence in the city. Private technology company Palantir has worked with the mayor’s office to identify the 1 percent of violent crime drivers in the city. The city’s ambitious public health approach to violence has relied on these insights, as they illuminated largely hidden relationships in already-existing city databases.

[Image credit: Fotan/Alamy Stock Photo]

Mayor Landrieu’s project—NOLA for Life—began with data. Because the data sources included large-scale city systems with continuously generating records, Palantir engineers had to carefully integrate existing police and public-safety data into the system. This data included police calls for service, electronic police reports, probation and parole records, sheriff’s office arrest and booking records, gang databases, field information cards, ballistics, and the current case-management system. The analysts also added community and infrastructure details, including the location of schools, hospitals, libraries, parks, police districts, liquor stores, and even streetlights.

Using crime-mapping software, particular violent hot spots were identified. Using social network analysis, particular individuals were identified as being most likely to be victims of violent crime. Analysts predicted they could identify between 35 percent and 50 percent of the likely shooting victims from a subpopulation of 3,900 high-risk individuals. In addition, linkages between these individuals in terms of rivalries, retaliations, and relationships could explain why these men were in danger. Analysts using Palantir systems identified 2,916 individuals from the general New Orleans population of 378,750 (the 2013 city estimate) most likely to be the victim of homicide.

The data insights, however, continued beyond people. On the basis of the Palantir analysis, the Fire Department increased its presence around particular schools, and the Department of Public Works repaired missing streetlights. The Health Department targeted high-risk schools for violence prevention, and police mapped gang territory to identify areas of tension. Alcoholic-beverage enforcement targeted liquor-store violations, and neighborhoods were targeted for street cleanup. All of these localized interventions came from the same data set that mapped crime, governmental services, and public infrastructure.

Building from the data, the city launched a holistic strategy to address violence reduction focused on those who were identified as being most at risk. Some focused-deterrence policies were implemented with call-ins, “stop the shooting meetings,” and police targeting of suspected offenders. Since 2013, a multiagency gang unit has indicted 110 targeted gang members. But a host of other, non-law-enforcement social-services programs also were enacted (and funded). These programs included violence-reduction measures involving mediators, violence interrupters, community first responders, and other individuals who worked to defuse conflict and thus prevent retaliatory shootings. City officials also improved social-services programs addressing family violence, mentoring, fatherhood classes, behavioral interventions, and other mental and physical health concerns for those who were at risk. Programs focused on redirecting tension in public schools by addressing trauma and on building restorative justice principles into discipline systems. All told, New Orleans adopted 29 different programs focusing on family, school, job training, reentry, and community and economic development. The goal of all of the changes was to target those individuals most at risk of committing violence and then to provide alternatives, support, and an opportunity to change.

From 2011 to 2014, New Orleans saw a 21.9 percent reduction in homicide, a better statistic than could be found in similar big cities such as Baltimore, St. Louis, Newark, or Detroit. More impressively, the city saw a 55 percent reduction in group or gang-involved murders.

New Orleans’s holistic approach to predictive policing did much more than identify the high-risk people in a community. By using public data to address factors that created an environment for crime, the New Orleans project looked to widen the lens of big data technologies. Funding public resources to respond to the underlying economic and social problems appeared to offer more long-term solutions. Big data alone cannot stop the shooting without resources to address the underlying causes of violence.


From The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement, by Andrew Guthrie Ferguson. Copyright © 2017 by New York University Press. Used by permission of the publisher.
