San Francisco Chronicle

Algorithms help rule on jail time

Judge still makes decision, but they help sort data objectively

By Matt O’Brien and Dake Kang

CLEVELAND — The centuries-old process of releasing defendants on bail, long the province of judicial discretion, is getting a major assist ... courtesy of artificial intelligence.

In late August, Hercules Shepherd Jr. walked up to the stand in a Cleveland courtroom, dressed in an orange jumpsuit. Two nights earlier, an officer had arrested him at a traffic stop with a small bag of cocaine, and he was about to be arraigned.

Judge Jimmy Jackson Jr. looked at Shepherd, then down at a computer-generated score on the front of the 18-year-old’s case file. Two out of six for likelihood of committing another crime. One out of six for likelihood of skipping court. The scores marked Shepherd as a prime candidate for pretrial release with low bail.

“We ask the court to take that all into consideration,” said Shepherd’s public defender, David Magee.

Not long ago, Jackson would have decided Shepherd’s near-term future based on a reading of court files and his own intuition. But in Cleveland and a growing number of other local and state courts, judges are now guided by algorithms before ruling whether defendants can return to everyday life, or remain locked up awaiting trial.

Experts say the use of these risk assessments may be the biggest shift in courtroom decision-making since American judges began accepting social science and other expert evidence more than a century ago. Christopher Griffin, a research director at Harvard Law School’s Access to Justice Lab, calls the new digital tools “the next step in that revolution.”

Critics, however, worry that such algorithms might end up supplanting judges’ own judgment, and possibly even perpetuate biases in ostensibly neutral form.

AI gets a lot of attention for the jobs it eradicates. That’s not happening to judges, at least not yet. But as in many other white-collar careers that require advanced degrees or other specialized education, AI is reshaping, if not eliminating, some of judges’ most basic tasks — many of which can still have enormous consequences for the people involved.

Cash bail, which is designed to ensure that people charged with crimes turn up for trial, has been part of the U.S. court system since its beginning. But forcing defendants to pony up large sums has drawn fire in recent years for keeping poorer defendants in jail while letting the wealthier go free. Studies have also shown it widens racial disparities in pretrial incarceration.

A bipartisan bail reform movement looking for alternatives to cash bail has found one in statistics and computer science: AI algorithms that can scour through large sets of courthouse data to search for associations and predict how individual defendants might behave.

States such as Arizona, Kentucky and Alaska have adopted these tools, which aim to identify people most likely to flee or commit another crime. Defendants who receive low scores are recommended for release under court supervision.

A year ago, New Jersey took an even bigger leap into algorithmic assessments by overhauling its entire state court system for pretrial proceedings. The state’s judges now rely on what’s called the Public Safety Assessment score, developed by Houston’s Laura and John Arnold Foundation.

That tool is part of a larger package of bail reforms that took effect in January 2017, effectively wiping out the bail-bond industry, emptying many jail cells and modernizing the computer systems that handle court cases. “We’re trying to go paperless, fully automated,” said Judge Ernest Caposela, who helped usher in the changes at the busy Passaic County courthouse in Paterson, N.J.

New Jersey’s assessments begin as soon as a suspect is fingerprinted by police. That information flows to an entirely new office division, called Pretrial Services, where cubicle workers oversee how defendants are processed through the computerized system.

The first hearing happens quickly, and from the jailhouse — defendants appear by videoconference as their risk score is presented to the judge. If released, they get text alerts to remind them of court appearances. Caposela compares the automation to “the same way you buy something from Amazon. Once you’re in the system, they’ve got everything they need on you.”

All of that gives more time for judges to carefully deliberate based on the best information available, Caposela said, while also keeping people out of jail when they’re not a safety threat.

Among other things, the algorithm aims to reduce biased rulings that could be influenced by a defendant’s race, gender or clothing — or maybe just how cranky a judge might be feeling after missing breakfast. The nine risk factors used to evaluate a defendant include age and past criminal convictions. But they exclude race, gender, employment history and where a person lives. They also exclude a history of arrests, which can stack up against people more likely to encounter police — even if they’re not found to have done anything wrong.
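
That description suggests a deliberately simple, transparent scoring scheme: a small set of factors, such as age and prior convictions, feed a tally that is mapped onto the one-to-six scales shown to the judge. The Python sketch below illustrates that general shape only; the factor names, weights and cut points are invented for illustration and are not the Arnold Foundation’s actual Public Safety Assessment formula.

    # Illustrative sketch of a point-based pretrial risk score.
    # The factors and weights are hypothetical, chosen only to show the
    # general shape of such a tool; this is NOT the actual Public Safety
    # Assessment used in Cleveland or New Jersey.
    from dataclasses import dataclass

    @dataclass
    class Defendant:
        age: int
        prior_convictions: int
        prior_failures_to_appear: int
        pending_charge_at_arrest: bool
        # Race, gender, employment and address are deliberately absent,
        # mirroring the exclusions described in the article.

    def raw_points(d: Defendant) -> int:
        """Sum hypothetical weights for each risk factor."""
        points = 0
        if d.age < 23:
            points += 2                              # younger defendants weighted higher
        points += min(d.prior_convictions, 3)        # cap each factor's contribution
        points += 2 * min(d.prior_failures_to_appear, 2)
        if d.pending_charge_at_arrest:
            points += 1
        return points

    def risk_score(d: Defendant) -> int:
        """Map raw points onto the 1-to-6 scale a judge would see."""
        # Simple bucketing; a real tool would calibrate these cut points
        # against historical court data.
        return min(1 + raw_points(d) // 2, 6)

    if __name__ == "__main__":
        shepherd_like = Defendant(age=18, prior_convictions=0,
                                  prior_failures_to_appear=0,
                                  pending_charge_at_arrest=False)
        print(risk_score(shepherd_like))  # low score: candidate for release

In this toy version, a defendant like Shepherd (18 years old, no prior record) lands near the bottom of the scale, the kind of low score the article describes prompting a recommendation for release with little or no bail.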

The Arnold Foundation takes pains to distinguish the Public Safety Assessment from other efforts to automate judicial decisions — in particular, a proprietary commercial system called Compas that’s been used to help determine prison sentences for convicted criminals. An investigative report by ProPublica found that Compas was falsely flagging black defendants as likely future criminals at almost twice the rate as white defendants.

Other experts have questioned those findings, and the U.S. Supreme Court last year declined to take up a case of an incarcerated Wisconsin man who argued the use of gender as a factor in the Compas assessment violated his rights.

Arnold notes that its algorithm is straightforward and open to inspection by anyone — although the underlying data it relies on is not. “There’s no mystery as to how a risk score is arrived at for a given defendant,” said Matt Alsdorf, who directed the foundation’s risk-assessment efforts until late last year.

Advocates of the new approach are quick to note that the people in robes are still in charge.

“This is not something where you put in a ticket, push a button and it tells you what bail to give somebody,” said Judge Ronald Adrine, who presides over the Cleveland Municipal Court. Instead, he says, the algorithmic score is just one among several factors for judges to consider.

But other experts worry the algorithms will make judging more automatic and rote over time — and that, instead of eliminating bias, could perpetuate it under the mask of data-driven objectivity. Research has shown that when people receive specific advisory guidelines, they tend to follow them in lieu of their own judgment, said Bernard Harcourt, a law and political science professor at Columbia.

“Those forms of expertise have a real gravitational pull on decision-makers,” he said. “It’s naive to think people are simply going to not rely on them.”

And if that happens, judges — like all people — may find it easy to drop their critical thinking skills when presented with what seems like an easy answer, said Kristian Hammond, a Northwestern University computer scientist who has co-founded his own AI company.

The solution is to “refuse to build boxes that give you answers,” he says. What judges really need are “boxes that give you answers and explanations and ask you if there’s anything you want to change.”

Before his arrest on Aug. 29, Hercules Shepherd had no criminal record.

Coaches were interested in recruiting the star high school basketball player for their college teams. Recruitment would mean a big scholarship that could help Shepherd realize his dreams of becoming an engineer. But by sitting in jail, Shepherd was missing two days of classes. If he missed two more, he could get kicked out of school.

Judge Jackson looked up. “Doing OK today, Mr. Shepherd?” he asked. Shepherd nodded.

“If he sits in jail for another month, and gets expelled from school, it has wider ramifications,” Magee said.

“Duly noted. Mr. Shepherd? I’m giving you personal bond,” Jackson said. “Your opportunity to turn that around starts right now. Do so, and you’ve got the whole world right in front of you.” (Jackson subsequently lost an election in November and is no longer a judge; his winning opponent, however, also supports use of the pretrial algorithm.)

Smiling, Shepherd walked out of the courtroom. That night, he was led out of the Cuyahoga County Jail; the next day, he was in class. Shepherd says he wouldn’t have been able to afford bail. Shepherd’s mother is in prison, and his aging father is on Social Security.

His public defender said that Shepherd’s low score helped him. If he isn’t arrested again within a year, his record will be wiped clean.

Dake Kang / Associated Press: Judge Jimmy Jackson Jr. speaks on the first day of the use of risk-assessment software in Municipal Court in Cleveland.

Dake Kang / Associated Press 2017: Stephanie Pope-Earley sorts through defendant files scored with risk-assessment software for Jimmy Jackson Jr., a municipal court judge.
