Prosecutors turn to AI to reduce racial bias

San Francisco to use system to scrub identifying details from police reports

The Maui News - FRONT PAGE - By JOCELYN GECKER

SAN FRANCISCO — In a first-of-its-kind experiment, San Francisco prosecutors are turning to artificial intelligence to reduce racial bias in the courts, adopting a system that strips certain identifying details from police reports and leaves only key facts to govern charging decisions.

District Attorney George Gascon announced Wednesday that his office will begin using the technology in July to “take race out of the equation” when deciding whether to accuse suspects of a crime.

Criminal-justice experts say they have never heard of any project like it, and they applauded the idea as a creative, bold effort to make charging practices more colorblind.

Gascon’s office worked with data scientists and engineers at the Stanford Computational Policy Lab to develop a system that takes electronic police reports and automatically removes a suspect’s name, race and hair and eye colors. The names of witnesses and police officers will also be removed, along with specific neighborhoods or districts that could indicate the race of those involved.
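The article does not describe the tool’s internals, but the basic idea can be sketched: scan the report text and replace names, racial descriptors, physical descriptions and neighborhood references with neutral placeholders before a prosecutor reads it. The short Python sketch below is purely illustrative; the term lists, placeholder tags and `redact_report` function are assumptions, not the Stanford Computational Policy Lab’s actual implementation.

```python
import re

# All lists and tag names below are illustrative assumptions; the article
# does not disclose the real system's vocabularies or methods.
RACE_TERMS = ["white", "black", "hispanic", "latino", "asian"]
DESCRIPTION_TERMS = ["brown hair", "black hair", "blond hair",
                     "brown eyes", "blue eyes", "green eyes"]
NEIGHBORHOODS = ["Bayview", "Tenderloin", "Mission District"]


def redact_report(text: str, names: list[str]) -> str:
    """Replace identifying details in a police report with neutral tags.

    `names` stands in for the suspect, witness and officer names that a
    real system could pull from the report's structured fields.
    """
    for name in names:                      # names of people involved
        text = re.sub(re.escape(name), "[PERSON]", text, flags=re.IGNORECASE)
    # Multi-word descriptions go before single-word race terms so that,
    # e.g., "black hair" is not split into "[RACE] hair".
    for term in DESCRIPTION_TERMS:          # hair and eye colors
        text = re.sub(re.escape(term), "[DESCRIPTION]", text,
                      flags=re.IGNORECASE)
    for term in RACE_TERMS:                 # explicit racial descriptors
        text = re.sub(rf"\b{term}\b", "[RACE]", text, flags=re.IGNORECASE)
    for place in NEIGHBORHOODS:             # race-correlated location cues
        text = re.sub(re.escape(place), "[LOCATION]", text,
                      flags=re.IGNORECASE)
    return text


report = "Officer Lee stopped John Doe, a white male with blue eyes, in Bayview."
print(redact_report(report, names=["Officer Lee", "John Doe"]))
# [PERSON] stopped [PERSON], a [RACE] male with [DESCRIPTION], in [LOCATION].
```

A production system would need far richer name and location detection (for example, named-entity recognition) than literal string matching, which is part of why, as Goff notes below, fully removing contextual racial cues is hard.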

“The criminal-justice system has had a horrible impact on people of color in this country, especially African Americans, for generations,” Gascon said in an interview ahead of the announcement. “If all prosecutors took race out of the picture when making charging decisions, we would probably be in a much better place as a nation than we are today.”

Gascon said his goal was to develop a model that could be used elsewhere, and the technology will be offered free to other prosecutors across the country.

“I really commend them, it’s a brave move,” said Lucy Lang, a former New York City prosecutor and executive director of the Institute for Innovation in Prosecution at John Jay College of Criminal Justice. “Any effort to try to minimize disparate outcomes is laudable.”

The technology relies on humans to collect the facts, which can still be influenced by racial bias. Prosecutors will make an initial charging decision based on the redacted police report. Then they will look at the entire report, with details restored, to see if there are any extenuating reasons to reconsider the first decision, Gascon said.

“Hats off for trying new stuff,” said Phillip Atiba Goff, president of the Center for Policing Equity. “There are so many contextual factors that might indicate race and ethnicity that it’s hard to imagine how even a human could take that all out.”

Studies have shown that bias exists nationwide at all levels of the criminal justice system, from police making arrests and prosecutors deciding whether to charge suspects to court convictions and sentencing.

A 2017 study commissioned by the San Francisco district attorney found “substantial racial and ethnic disparities in criminal justice outcomes.” African Americans represented only 6 percent of the county’s population but accounted for 41 percent of arrests between 2008 and 2014.

The study found “little evidence of overt bias against any one race or ethnic group” among prosecutors who process criminal offenses. But Gascon said he wanted to find a way to help eliminate implicit bias that could be triggered by a suspect’s race, an ethnic-sounding name or a crime-ridden neighborhood where they were arrested.

The move comes after San Francisco last month became the first U.S. city to ban the use of facial recognition by police and other city agencies. The decision reflected a growing backlash against AI technology as cities seek to regulate surveillance by municipal agencies.
