Pittsburgh Post-Gazette

Can algorithms be fair?

Pitt group discusses potential bias

- By Kate Giammarise

If you have thoughts about how local governments should use the data they collect about you and your fellow citizens, a University of Pittsburgh task force wants to hear from you.

The group — convened by Pitt’s Institute for Cyber Law, Policy, and Security — is looking at algorithms used by local governments for potential bias. A public meeting that had been scheduled for this week has been postponed, though comments can still be submitted online.

Algorithms are used by government agencies to make predictions that inform decisions, drawing on past data, said Christopher Deluzio, policy director at Pitt Cyber and a task force member.

“We’re worried about locking in and perpetuating inequality and bias” if algorithms are used without proper oversight, he said, speaking at a public meeting of the task force last week.

For instance, neighborhoods that have been historically policed more might have data that reflects more crime there, because of a greater police presence.

At a meeting last week at the Homewood-Brushton YMCA, several dozen people heard from task force members, asked questions and discussed three algorithms used by local governments: the Allegheny Family Screening Tool, predictive policing and algorithms used in bail decisions.

The Allegheny Family Screening Tool is used in screening calls for potential child neglect and has been used by the county’s Department of Human Services since 2016.

Predictive policing by Pittsburgh police uses a “hot spot” tool to predict where crime might occur based on 911 and prior crime data.

Judges in Allegheny County use data such as past criminal history, age and driving record in tools that are used to make bail decisions.

Much of the discussion last week focused on the uses and potential uses of algorithms in criminal justice settings, such as bail and policing. The crowd was generally skeptical that the tools could be applied without bias.

“We live in a world that’s, unfortunately, biased,” said Tricina Cash, who attended the meeting.

Several in attendance lamented that a person might not know when an algorithm is being used to make a decision impacting them.

“I think there needs to be more transparency,” Richard Morris said.

Black youth already face harsher treatment by police than their white peers, said Tim Stevens, chairman of the Black Political Empowerment Project.

“That’s why I have an issue with all this,” Mr. Stevens said.

More information is online at https://www.cyber.pitt.edu/algorithms.

Photo: Steve Mellon/Post-Gazette — Facilitator Ryan Scott, left, talks algorithms with Martell Covington, of Homewood, during Tuesday's public meeting.

Photo: Steve Mellon/Post-Gazette — Eric Hulsey, of Regent Square, left, and facilitator LaTrenda Leonard Sherrill discuss algorithms during Tuesday's public meeting on bias in algorithms at the Homewood-Brushton Branch YMCA. At right is Kenny Chen of Partnership to Advance Responsible Technology (PART).
