Los Angeles Times

The era of predictive policing

Crime-targeting technology can enhance public safety. But it also raises privacy and profiling issues.


The Los Angeles Police Department embraced predictive policing in 2015, but it has taken until now for the department’s assortment of once-shadowy data-driven operations to be thoroughly vetted in public.

In the end, that’s the essential problem to be solved: the lack of transparency and public accountability in deploying crime-targeting tools that could so easily be misused to oppress rather than protect neighborhoods already struggling with both crime and heavy-handed policing. It took years of work by activists to bring programs like LASER (a data-crunching operation that identifies crime hot spots) and PredPol (a software program that predicts property crimes) into the light of day, and for that they are to be commended. But they are off base in their demands that police scrap the tools entirely.

Data, used properly, can enhance public safety. Police should be encouraged to use it, as long as they are open about what they are doing, and as long as they heed legitimate criticism and adjust their programs accordingly. Failure to carefully tailor predictive policing programs invites invasion of privacy, racial profiling and other unacceptable side effects.

Problems with the LAPD’s predictive policing project were outlined in a report presented to the Police Commission on Tuesday by Inspector General Mark Smith. Smith found that officers used inconsistent criteria in targeting and tracking people they considered to be most likely to commit violent crimes.

The department is due to respond in full on April 9, but LAPD Chief Michel Moore already told the commission that he would make some adjustments to the program.

The importance of data in policing should be obvious, and in concept is nothing new. Police have always kept an eye out for felons who return to their old neighborhoods after their release from prison. Gang leaders have long been watched to ensure that they don’t restart criminal enterprises that were shut down when they were incarcerated. Now, with the advent of newer technologies, algorithms and other computer programs can even more effectively predict which people bear closer scrutiny.

Promoters of computerized risk-assessment tools argue that the programs eliminate whatever idiosyncrasies or biases individual officers may bring when they act merely on their own hunches, and can help police more accurately target the people most likely to commit violent crimes.

But that enthusiasm may be unwarranted. Opponents make a good case that, instead of eliminating bias, algorithms actually enhance it.

Consider, for example, a program that estimates how likely a person is to be arrested based on a set of factors that include how many times he’s been arrested previously, whether he is on probation or parole, and how many crimes have been committed in the neighborhood where he lives. Using that data, police may find that people who have been arrested three times are likely to be arrested again, cueing officers to track others in the same situation.

The problem is that we also have data showing that police arrest African Americans and Latinos more often than whites who have committed the same crimes, in part because their neighborhoods are more heavily policed. They are also prosecuted more often, and so end up in jail or on probation and parole more often, for the same crimes. If the algorithm crunches arrest, incarceration and probation or parole data and then spits out a risk assessment, it will signal to cops that the black or Latino subjects, already subject to unequal criminal justice treatment, ought to be more closely watched. The cycle of inequity will be repeated, this time reinforced by the data “science” that is supposed to erase bias.
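To make that feedback loop concrete, here is a minimal sketch. Every number, weight and deployment rule in it is invented purely for illustration; it models no real predictive-policing system. Two hypothetical neighborhoods generate identical underlying crime, but one starts out more heavily patrolled, so it accumulates more recorded arrests, scores as riskier, and draws still more patrols each period:

```python
# A minimal, hypothetical sketch of the feedback loop described above.
# The scoring formula, weights and starting numbers are invented for
# illustration only.

def risk_score(prior_arrests: int, on_supervision: bool, area_crime_rate: float) -> float:
    """Toy risk score built from the three factors the editorial names."""
    return 2.0 * prior_arrests + (1.5 if on_supervision else 0.0) + area_crime_rate

# Two hypothetical neighborhoods with identical underlying offending,
# but neighborhood A is patrolled twice as heavily at the start.
true_offense_rate = 10                      # offenses per period, same in both areas
patrol_intensity = {"A": 2.0, "B": 1.0}     # relative patrol levels
recorded_arrests = {"A": 0, "B": 0}

for period in range(5):
    for area, intensity in patrol_intensity.items():
        # Recorded arrests scale with patrol intensity, not just with crime.
        recorded_arrests[area] += int(true_offense_rate * intensity * 0.5)

    # The algorithm sees only the arrest records, so area A looks riskier...
    scores = {
        area: risk_score(recorded_arrests[area], on_supervision=False,
                         area_crime_rate=recorded_arrests[area] / (period + 1))
        for area in patrol_intensity
    }

    # ...and a naive deployment rule sends still more patrols there,
    # widening the gap each period even though underlying crime is equal.
    riskier = max(scores, key=scores.get)
    patrol_intensity[riskier] *= 1.1
    print(f"period {period}: scores={scores}, patrols={patrol_intensity}")
```

Nothing in the toy model requires bad faith: the score faithfully summarizes the arrest records it is fed, and the bias rides in on the records themselves.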

Many activists believe that the misuse of data is inevitable and therefore that these techniques ought to be scrapped entirely. But that would leave us with police who bumble in darkness. That’s no recipe for either equity or public safety.

Many of the same activists object to police use of drones. But the Police Commission studied the issue and crafted a policy that keeps tight reins on deployment, requires meticulous disclosure and monitoring, and still allows drone use in narrowly tailored situations. Surely the commission can do the same with predictive policing tools.
