The Maui News

Prosecutors turn to AI to reduce racial bias

San Francisco to use system to scrub identifying details from police reports

By JOCELYN GECKER

SAN FRANCISCO — In a first-of-its-kind experiment, San Francisco prosecutors are turning to artificial intelligence to reduce racial bias in the courts, adopting a system that strips certain identifying details from police reports and leaves only key facts to govern charging decisions.

District Attorney George Gascon announced Wednesday that his office will begin using the technology in July to “take race out of the equation” when deciding whether to accuse suspects of a crime.

Criminal-justice experts say they have never heard of any project like it, and they applauded the idea as a creative, bold effort to make charging practices more colorblind.

Gascon’s office worked with data scientists and engineers at the Stanford Computational Policy Lab to develop a system that takes electronic police reports and automatically removes a suspect’s name, race, and hair and eye colors. The names of witnesses and police officers will also be removed, along with specific neighborhoods or districts that could indicate the race of those involved.
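The lab has not published the system’s internals, so as a rough illustration only, the redaction step described above might look something like the following minimal Python sketch. The report structure and field names here are assumptions chosen for the example, not details from the actual tool.

# A minimal sketch of the redaction step the article describes, NOT the
# Stanford Computational Policy Lab's actual code. The report structure
# and field names below are hypothetical, chosen only to illustrate the idea.

# Details the article says the system strips from electronic police reports.
FIELDS_TO_REDACT = {
    "suspect_name", "suspect_race", "hair_color", "eye_color",
    "witness_names", "officer_names", "neighborhood",
}

def redact_report(report: dict) -> dict:
    """Return a copy of a structured police report with identifying
    details replaced by placeholders, leaving the key facts intact."""
    return {
        field: "[REDACTED]" if field in FIELDS_TO_REDACT else value
        for field, value in report.items()
    }

# Example: the prosecutor's first pass would see only the redacted version.
report = {
    "suspect_name": "John Doe",
    "suspect_race": "...",
    "neighborhood": "Bayview",
    "narrative": "Suspect entered the store at 10:14 p.m. and ...",
}
print(redact_report(report))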

“The criminal-justice system has had a horrible impact on people of color in this country, especially African Americans, for generations,” Gascon said in an interview ahead of the announcement. “If all prosecutors took race out of the picture when making charging decisions, we would probably be in a much better place as a nation than we are today.”

Gascon said his goal was to develop a model that could be used elsewhere, and the technology will be offered free to other prosecutors across the country.

“I really commend them; it’s a brave move,” said Lucy Lang, a former New York City prosecutor and executive director of the Institute for Innovation in Prosecution at John Jay College of Criminal Justice. “Any effort to try to minimize disparate outcomes is laudable.”

The technology relies on humans to collect the facts, which can still be influenced by racial bias. Prosecutors will make an initial charging decision based on the redacted police report. Then they will look at the entire report, with details restored, to see if there are any extenuating reasons to reconsider the first decision, Gascon said.

“Hats off for trying new stuff,” said Phillip Atiba Goff, president of the Center for Policing Equity. “There are so many contextual factors that might indicate race and ethnicity that it’s hard to imagine how even a human could take that all out.”

Studies have shown that bias exists nationwide at all levels of the criminal-justice system, from police making arrests and prosecutors deciding whether to charge suspects to court convictions and sentencing.

A 2017 study commissioned by the San Francisco district attorney found “substantial racial and ethnic disparities in criminal justice outcomes.” African Americans represented only 6 percent of the county’s population but accounted for 41 percent of arrests between 2008 and 2014.

The study found “little evidence of overt bias against any one race or ethnic group” among prosecutors who process criminal offenses. But Gascon said he wanted to find a way to help eliminate an implicit bias that could be triggered by a suspect’s race, an ethnic-sounding name or a crime-ridden neighborhood where they were arrested.

The move comes after San Francisco last month became the first U.S. city to ban the use of facial recognition by police and other city agencies. The decision reflected a growing backlash against AI technology as cities seek to regulate surveillance by municipal agencies.
