San Francisco Chronicle

D.A.’s tech tool to limit bias

Approach cuts race in police reports so it’s not a factor in charging crimes

By Evan Sernoffsky

San Francisco District Attorney George Gascón on Wednesday said he will launch a new “blind-charging” tool that removes racial information from police reports when prosecutors are deciding whether to criminally charge suspects.

The new approach in charging cases is Gascón’s latest effort to use technology to help reform and reduce racial disparities in California’s criminal justice system before he leaves office at the end of this year.

Gascón, who is often cited as one of the more progressive district attorneys in the country, asked the Stanford Computational Policy Lab to build the technology to remove the potential for implicit bias when prosecutors conduct an initial review of a case. He said the district attorney’s office will begin using the tool next month.

“If we can take racial bias out of our system or reduce it significantly, we can be a much better nation,” Gascón said in a recent interview. “It’s not only improving our system but creating a model that can be used elsewhere.”

Stark racial disparities in California’s criminal justice system have endured for decades, and black people and Latinos continue to be arrested and criminally charged more often than white people.

In 2016, Latinos made up 41% of arrests in the state, 36% were white and 16% were African Americans, despite black people making up just 6% of the population, according to a study by the Public Policy Institute of California.

Racial disparities in San Francisco are even more pronounced. African Americans accounted for 41% of people arrested between 2008 and 2014, while making up only 6% of the city’s population, according to a recent study by UC Berkeley and University of Pennsylvania researchers.

Experts say Latinos and African Americans have long been overrepresented in the country’s criminal justice system due to numerous factors, including implicit bias, or assumptions and attitudes someone might not actively consider.

“When I first became district attorney, one concern was to understand how the criminal justice system impacts people of color disproportionately,” Gascón said. “I wanted to see if there was anything in our practice that we could improve.”

The district attorney decided to reach out to the Stanford Computational Policy Lab, which already had many of the tools available to help create the artificial intelligence.

Racial disparities in San Francisco’s criminal justice system are driven by factors that occur before cases reach his office, such as arrests, Gascón said, and his office tries not to exacerbate the disparities. Even so, he wanted to remove any possibility for implicit bias in his office to ensure “the purity of the decision isn’t questionable.”

The system, Gascón said, will create a model that other prosecuting agencies around the country can use, and Stanford has agreed to publicly release the technology at no cost.

The technology organizes a police report and automatically redacts the race of the parties involved in the incident. It also scrubs the names of officers, witnesses and suspects, along with locations and neighborhoods that could suggest a person’s race.


In the complicated world of artificial intelligence, the technology is relatively simple, said Alex Chohlas-Wood, deputy director of the Stanford Computational Policy Lab. It uses pattern recognition and natural language processing to identify which words in a police report should be redacted and replaces them with a general description.

The digital tool uses machine learning, so it can make decisions without human intervention.
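To make the idea concrete, here is a minimal sketch of how an automated redaction pass of this kind might work. It is an illustration only, not the Stanford lab’s code: the spaCy library, the en_core_web_sm model, the placeholder labels and the short list of race terms are all assumptions made for demonstration.

```python
# Illustrative sketch of a "blind-charging" style redaction pass.
# NOT the Stanford Computational Policy Lab's implementation: the entity
# labels, placeholder strings and race-term list below are assumptions.
import re
import spacy

# Assumes spaCy and its small English model are installed
# (pip install spacy && python -m spacy download en_core_web_sm).
nlp = spacy.load("en_core_web_sm")

# Entity types that could reveal identity or neighborhood, mapped to
# the generic descriptions that replace them.
ENTITY_PLACEHOLDERS = {
    "PERSON": "[PERSON]",
    "GPE": "[LOCATION]",
    "LOC": "[LOCATION]",
    "FAC": "[LOCATION]",
    "NORP": "[GROUP]",  # nationalities, religious and political groups
}

# Explicit race or ethnicity descriptors to scrub (hypothetical, non-exhaustive).
RACE_TERMS = re.compile(
    r"\b(black|white|latino|latina|hispanic|asian|african[- ]american)\b",
    re.IGNORECASE,
)

def redact_report(text: str) -> str:
    """Replace names, places and racial descriptors with generic labels."""
    doc = nlp(text)
    redacted = text
    # Work right to left so character offsets stay valid as we substitute.
    for ent in reversed(doc.ents):
        placeholder = ENTITY_PLACEHOLDERS.get(ent.label_)
        if placeholder:
            redacted = redacted[:ent.start_char] + placeholder + redacted[ent.end_char:]
    return RACE_TERMS.sub("[REDACTED]", redacted)

print(redact_report(
    "Officer Smith detained a Black male suspect near the Bayview district."
))
```

A real tool would also have to catch indirect cues, such as misspelled names or informal neighborhood nicknames, which is where the machine-learning component described above would come in.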

The district attorney’s office will start using the tool in the 80% of cases that come in through general intake. Specialized units, such as those handling homicides and domestic violence, will not immediately use the technology.

During the first review process of general intake cases, prosecutors do not look at evidence like videos or pictures that would reveal a person’s race. The case then goes to a second review where a prosecutor makes a decision on whether the evidence is strong enough to move forward with charges.

If a prosecutor decides to reverse a charging decision between the first and second review, by which point he or she will likely have learned the race of the parties, the reversal and the reasons justifying it will have to be documented in a report, Gascón said.
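As a rough sketch of that safeguard, a charging-review record could simply refuse to accept a reversed decision that arrives without a written justification. The class and field names below are hypothetical, not part of the office’s actual system.

```python
# Hypothetical model of the two-step review described above; not the
# D.A.'s actual system. A reversal between the blind first pass and the
# unredacted second pass must carry a documented justification.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChargingReview:
    case_id: str
    first_pass_charge: bool                    # decision made from the redacted report
    second_pass_charge: Optional[bool] = None  # decision made with full information
    reversal_justification: str = ""

    def record_second_pass(self, charge: bool, justification: str = "") -> None:
        """Record the second-pass decision, requiring a reason for any reversal."""
        if charge != self.first_pass_charge and not justification:
            raise ValueError("A reversed charging decision must be documented.")
        self.second_pass_charge = charge
        self.reversal_justification = justification

# Example: a reversal is accepted only with a written reason attached.
review = ChargingReview(case_id="2019-001", first_pass_charge=True)
review.record_second_pass(charge=False, justification="Video showed the suspect was not present.")
```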

The tool, he said, will help streamline charging decisions by expediting the ability to review police reports and quickly analyze the information.

Gascón’s announcement was met with a lukewarm response from some San Francisco social justice groups, many of which still resent the district attorney’s decision not to criminally charge the officers who shot and killed Mario Woods in 2015.

“We don’t know anything about this,” said Amos Brown, president of the San Francisco branch of the NAACP.

The San Francisco public defender’s office — a long-established force in the fight against racial inequity under its late leader, Jeff Adachi — declined to comment “until we can learn more,” a spokeswoman said.

Gascón’s aim to remove race from initial charging decisions is the latest effort by his office to team up with outside groups and use technology to help reform the criminal justice system in California and beyond.

After voters in November 2016 passed Proposition 64, legalizing the recreational use of marijuana, Gascón partnered with Code for America to expunge every eligible pot-related case in San Francisco.

The district attorney’s office in February announced that Code for America’s algorithm helped identify more than 9,000 marijuana-related cases that could be wiped out.

In March, Gascón sponsored AB1076, which would automatically wipe out eligible convictions and arrest records in California with technology like Code for America’s algorithm.

“There are many areas in the criminal justice system, and government in general, where you can take technology and use it to increase the efficiency and increase the dignity in which you do your work,” Gascón said.

Photo: Yalonda M. James / The Chronicle. S.F. District Attorney George Gascón announces the new “blind-charging” tech tool he asked Stanford to create for his office.
