Computer tool cited as factor in slaying
Algorithm advised release before Twin Peaks robbery
A computer program that assigns risk scores to San Francisco criminal defendants is itself under scrutiny after it helped free a 19-year-old man who, just days later, allegedly gunned down a 71-year-old stranger on Twin Peaks.
But in the aftermath of the slaying of Edward French, a photographer and film scout who was killed in a robbery, both the district attorney’s office and the public defender’s office are expressing caution — saying that while use of the tool may need to be studied and refined, one tragedy doesn’t necessarily render it broken.
Since May 2016, the city has been experimenting with the algorithm, which was designed by a Texas foundation that advocates for criminal justice reform and seeks to predict whether a defendant awaiting trial can be released safely without reoffending or fleeing. The program, used in dozens of counties around the nation, was offered for free.
The idea is to use cold, efficient data to improve the traditional system of cash bail, which relies heavily on whether defendants can afford bail amounts that are assigned to them based primarily on their charges. Many San Francisco officials, including District Attorney George Gascón, believe the system is vulnerable to racial bias and penalizes the poor.
According to authorities, the new tool recommended that Lamonte Mims, a 19-year-old former resident of Patterson in Stanislaus County, be released through a special pretrial diversion program when he appeared in court July 11 to face charges of being a convicted felon in possession of a gun.
Mims was already on probation in two counties for car burglary and allegedly violated the terms of probation at least twice. But the algorithm determined he was a medium public safety and flight risk, officials said, and recommended he be released on condition that he check in routinely with the pretrial diversion unit.
Judges are not required to heed the algorithm’s advice. But Mims was granted release by Judge Sharon Reardon, who, according to a court representative, could not comment on the case due to judicial ethics rules.
Five days later, French was fatally shot. Investigators said they connected Mims and a co-defendant, 20-year-old Fantasy Decuir of San Francisco, to French’s killing after the two were arrested for robbing a man and woman at gunpoint near St. Mary’s Cathedral on Gough Street.
Some in the city’s criminal justice system were shocked to see a defendant in Mims’ predicament released, whether or not he received a favorable computer score.
“This guy had been given one chance and put on probation, and then another chance and put on probation, and now he’s caught in possession of a firearm?” said Bill Fazio, a former city prosecutor and candidate for district attorney. “Why would he even be considered for release?”
Though the program’s creator, the Laura and John Arnold Foundation, has declined to detail what goes into the algorithm, officials said researchers developed the tool using data from the criminal case histories of more than 1.5 million people, with an eye on how they performed when released.
The program is said to weigh a number of factors, including the pending charges, the person’s age and rap sheet, and their record of showing up to court.
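The foundation has not disclosed its actual formula, but a point-based tool of this general kind can be illustrated in a few lines. The sketch below is purely hypothetical: the factors are the ones named in this article, while the weights, thresholds, and tier labels are assumptions for illustration only.

```python
# Hypothetical sketch of a point-based pretrial risk score.
# The foundation has not disclosed its real formula; the weights
# and cutoffs below are illustrative assumptions, not the actual tool.

def risk_score(age, has_pending_charge, prior_convictions, failures_to_appear):
    """Return a coarse risk tier from a handful of case-history factors."""
    points = 0
    if age < 23:                    # youth is commonly weighted as a risk factor
        points += 2
    if has_pending_charge:          # arrested while another case is open
        points += 2
    points += min(prior_convictions, 3)    # cap how much the record can add
    points += min(failures_to_appear, 2)   # missed court dates
    if points <= 2:
        return "low"
    if points <= 5:
        return "medium"
    return "high"
```

Under these invented weights, a 19-year-old with an open case and two prior convictions would score as high risk, while an older defendant with a clean record would score as low risk; the point of such tools is that a judge sees the tier alongside the rap sheet, not in place of it.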
“Our country’s pretrial justice system relies on judges to determine who stays in jail pending trial and who is released,” David Hebert, a foundation spokesman, said in a statement. He and others with the foundation declined interview requests.
“There is no perfect, foolproof way to make this determination,” Hebert said. “But we believe that as a society we can adopt practices that provide better information for judges to make more informed decisions — decisions that are likely to reduce the risk to our communities.”
San Francisco criminal justice officials could not provide statistics documenting how the algorithm has been used and whether it has been successful.
Critics of the program — among them prosecutors in Gascón’s office — point to deaths like French’s as proof of the tool’s flaws. Last month, a mother in New Jersey filed a federal lawsuit against the Arnold Foundation after her son was fatally shot, allegedly by a man who had just been released from custody.
The defendant in the New Jersey case had a lengthy criminal record, and like Mims, his latest charge was being a felon in possession of a weapon. According to the lawsuit, New Jersey law enforcement officers had expressed concern that the algorithm undervalued the danger of cases involving guns.
“This has been used and implemented in New Mexico, New Jersey and San Francisco, and it sounds like in all of these jurisdictions, there are failures,” said Eric Siddall, vice president of the Los Angeles Association of Deputy District Attorneys.
“We’re trying to use a method that hedge funds use to make money to make a determination of whether someone should be in custody or not. The problem is if a hedge fund makes a mistake, they lose money. If we make a mistake, someone dies.”
The philanthropist behind the program, John Arnold, made billions as a hedge fund manager.
While Gascón’s office acknowledged that some city prosecutors have disagreed with the tool’s assessments, the office continues to support the program and any productive effort to move away from the money bail system.
“I think there is some disagreement to how certain scores are calculated,” said Max Szabo, a spokesman. “But I think it’s important to note that people who get out on bail also commit crimes.”
Another office spokesman, Alex Bastian, said, “If we are trying to enhance public safety and want to do so in an equitable way, then custody decisions based on risk are going to be better than those based on financial means. However, it is vital that risk is calculated as accurately as possible. That is why the system needs to constantly push itself to do the best it can in taking on the difficult task of predicting human behavior.”
Proponents of the tool stressed that judges maintain discretion over whether to release a defendant.
“It is an objective recommendation, but it is not the final word,” said Nancy Rubin, interim executive director of the Pretrial Diversion Project, a nonprofit group funded by the city Sheriff’s Department and the mayor’s office. “It is presented to the judge and the district attorney and the defense, along with the rap sheet, and it is ultimately up to the court to decide.”
Attorneys with the public defender’s office have expressed skepticism of the algorithm ever since it was introduced. Public Defender Jeff Adachi, who is representing Mims’ co-defendant in the Twin Peaks killing, said the concern is that clients be treated as individuals, not as pieces of a formula.
Nevertheless, Adachi said, French’s death “shouldn’t be used as an indictment of the assessment tool.”
“I’ve been around a long time and there are cases where people had been released and something happened that nobody anticipated,” he said. “While it’s certainly tragic, we shouldn’t make any assumptions here.”