San Francisco Chronicle

An imperfect tool


San Francisco is learning a hard reality: Algorithms aren’t flawless in predicting the risk of releasing a violent suspect. The court system had been leaning heavily on this big-data approach until now, when it famously, and fatally, failed.

In a series of revelations, it turns out that prosecutor, judge and jail diversion officials alike went along with an algorithm-produced prediction that a suspect with a troubling rap sheet was a candidate for release. The overall program is intended as an alternative to cash bail, a practice increasingly recognized as unfair to poor defendants.

But the result this time was a disaster. The suspect, 19-year-old Lamonte Mims, ignored the terms of his release and five days later was involved in a robbery on Twin Peaks that ended in the murder of Edward French, police say. Mims was eventually arrested, along with an accomplice.

A string of Chronicle stories by reporter Vivian Ho is shining a harsh light on the case. The best explanation to date is that the data used to produce the errant risk assessment didn’t include a telltale fact: The suspect had been jailed in the past. Missing data produces flawed outcomes.

But that doesn’t mean the program should be junked. If all the information had been included, the algorithm might have flagged the danger of letting the suspect go. The judge might have blocked release, and French would still be alive.

San Francisco is all in on the idea of reforming the bail system, with both City Attorney Dennis Herrera and District Attorney George Gascón opposing the longtime practice. That’s the right direction, and algorithms can help. The latest case demonstrates the fateful impact when there are flaws. Demanding improvements is critical, but this is still a promising initiative that should continue.
