Oroville Mercury-Register

An algorithm that screens for child neglect raises concerns

By Sally Ho and Garance Burke

For family law attorney Robin Frank, defending parents at one of their lowest points — when they risk losing their children — has never been easy.

In the past, though, she knew what she was up against when squaring off against child protective services in family court. Now, she worries she’s fighting something she can’t see: an opaque algorithm whose statistical calculations help social workers decide which families should be investigated in the first place.

“A lot of people don’t know that it’s even being used,” Frank said. “Families should have the right to have all of the information in their file.”

From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably dropped by the state of Illinois.

According to new research from a Carnegie Mellon University team obtained exclusively by AP, Allegheny’s algorithm in its first years of operation showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, when compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.

County officials said that social workers can always override the tool, and called the research “hypothetical.”

Child welfare officials in Allegheny County, the cradle of Mister Rogers’ TV neighborhood and the icon’s child-centric innovations, say the cutting-edge tool — which is capturing attention around the country — uses data to support agency workers as they try to protect children from neglect. That nuanced term can include everything from inadequate housing to poor hygiene, but is a different category from physical or sexual abuse, which is investigated separately in Pennsylvania and is not subject to the algorithm.

“Workers, whoever they are, shouldn’t be asked to make, in a given year, 14, 15, 16,000 of these kinds of decisions with incredibly imperfect information,” said Erin Dalton, director of the county’s Department of Human Services and a pioneer in implementing the predictive child welfare algorithm.

This story, supported by the Pulitzer Center on Crisis Reporting, is part of an ongoing Associated Press series, “Tracked,” that investigates the power and consequences of decisions driven by algorithms on people’s everyday lives.

Critics say the approach gives a program powered by data mostly collected about poor people an outsized role in deciding families’ fates, and they warn against local officials’ growing reliance on artificial intelligence tools.

If the tool had acted on its own to screen in a comparable rate of calls, it would have recommended that two-thirds of Black children be investigated, compared with about half of all other children reported, according to another study published last month and co-authored by a researcher who has audited the county’s algorithm.

Advocates worry that if similar tools are used in other child welfare systems with minimal or no human intervention — akin to how algorithms have been used to make decisions in the criminal justice system — they could reinforce existing racial disparities in the child welfare system.

“It’s not decreasing the impact among Black families,” said Logan Stapleton, a researcher at Carnegie Mellon University. “On the point of accuracy and disparity, (the county is) making strong statements that I think are misleading.”

Because family court hearings are closed to the public and the records are sealed, AP wasn’t able to identify first-hand any families who the algorithm recommended be mandatorily investigated for child neglect, nor any cases that resulted in a child being sent to foster care. Families and their attorneys can never be sure of the algorithm’s role in their lives, either, because they aren’t allowed to know the scores.

Child welfare agencies in at least 26 states and Washington, D.C., have considered using algorithmic tools, and at least 11 have deployed them, according to the American Civil Liberties Union.

Larimer County, Colorado, home to Fort Collins, is now testing a tool modeled on Allegheny’s and plans to share scores with families if it moves forward with the program.

“It’s their life and their history,” said Thad Paul, a manager with the county’s Children Youth & Family Services. “We want to minimize the power differential that comes with being involved in child welfare … we just really think it is unethical not to share the score with families.”

Oregon does not share risk score numbers from its statewide screening tool, which was first implemented in 2018 and was inspired by Allegheny’s algorithm. The Oregon Department of Human Services — currently preparing to hire its eighth new child welfare director in six years — explored at least four other algorithms while the agency was under scrutiny by a crisis oversight board ordered by the governor.

It recently paused a pilot algorithm built to help decide when foster children can be reunified with their families. Oregon also explored three other tools: predictive models to assess a child’s risk of death and severe injury, whether children should be placed in foster care, and if so, where.

Case work supervisor Jessie Schemm looks over the first screen of software used by workers who field calls at an intake call screening center for the Allegheny County Children and Youth Services in Penn Hills, Pa. KEITH SRAKOCIC — THE ASSOCIATED PRESS
