The Commercial Appeal

Child welfare algorithm faces DOJ scrutiny

By Sally Ho and Garance Burke

PITTSBURGH – The Justice Department has been scrutinizing a controversial artificial intelligence tool used by a Pittsburgh-area child protective services agency following concerns that it could result in discrimination against families with disabilities, The Associated Press has learned.

The interest from federal civil rights attorneys comes after an AP investigation revealed potential bias and transparency issues about the opaque algorithm that is designed to assess a family’s risk level when they are reported for child welfare concerns in Allegheny County.

Several civil rights complaints were filed in the fall about the Allegheny Family Screening Tool, which is used to help social workers decide which families to investigate, AP has learned.

Two sources said that attorneys in the Justice Department’s Civil Rights Division cited the AP investigation when urging them to submit formal complaints detailing their concerns about how the algorithm could harden bias against people with disabilities, including families with mental health issues.

A third person told AP that the same group of federal civil rights attorneys also spoke with them in November as part of a broad conversation about how algorithmic tools could potentially exacerbate disparities, including for people with disabilities. That conversation explored the design and construction of Allegheny’s influential algorithm, though the full scope of the Justice Department’s interest is unknown.

All three sources spoke to AP on the condition of anonymity, saying the Justice Department asked them not to discuss the confidential conversations, and two said they also feared professional retaliation.

Wyn Hornbuckle, a Justice Department spokesman, declined to comment.

Algorithms use pools of information to turn data points into predictions, whether that’s for online shopping, identifying crime hot spots or hiring workers. Many child welfare agencies in the U.S. are considering adopting such tools as part of their work with children and families.

Though there’s been widespread debate over the moral consequences of using artificial intelligence in child protective services, the Justice Department’s interest in the pioneering Allegheny algorithm marks a significant turn toward possible legal implications.

Supporters see algorithms as a promising way to make a strained child protective services system both more thorough and efficient, saying child welfare officials should use all tools at their disposal to make sure children aren’t maltreated. But critics worry that including data points collected largely from people who are poor can automate discrimination against families based on race, income, disabilities or other external characteristics.

Robin Frank, a veteran family law attorney in Pittsburgh and vocal critic of the Allegheny algorithm, said she also filed a complaint with the Justice Department in October on behalf of a client with an intellectual disability who is fighting to get his daughter back from foster care. The AP obtained a copy of the complaint, which raised concerns about how the Allegheny Family Screening Tool assesses a family’s risk.

“I think it’s important for people to be aware of what their rights are and to the extent that we don’t have a lot of information when there seemingly are valid questions about the algorithm, it’s important to have some oversight,” Frank said.

Mark Bertolet, spokesman for the Allegheny County Department of Human Services, said by email that the agency had not heard from the Justice Department and declined interview requests.

“We are not aware of any concerns about the inclusion of these variables from research groups’ past evaluation or community feedback on the (Allegheny Family Screening Tool),” the county said, describing previous studies and outreach regarding the tool.

Allegheny County said its algorithm has used data points tied to disabilities in children, parents and other members of local households because they can help predict the risk that a child will be removed from their home after a maltreatment report. The county added that it has updated its algorithm several times and has sometimes removed disabilities-related data points.

The Allegheny Family Screening Tool was specifically designed to predict the risk that a child will be placed in foster care in the two years after the family is investigated. It has used a trove of detailed personal data collected from child welfare history, as well as birth, Medicaid, substance abuse, mental health, jail and probation records, among other government data sets. The algorithm produces a risk score from 1 to 20; the higher the number, the greater the risk. The risk score alone doesn’t determine what happens in the case.

The AP first revealed racial bias and transparency concerns in a story last April that focused on the Allegheny tool and how its statistical calculations help social workers decide which families should be investigated for neglect – a nuanced term that can include everything from inadequate housing to poor hygiene, but is a different category from physical or sexual abuse, which is investigated separately in Pennsylvania and is not subject to the algorithm.

A child welfare investigation can result in vulnerable families receiving more support and services, but it can also lead to the removal of children for foster care and, ultimately, the termination of parental rights.

The county has said that hotline workers determine what happens with a family’s case and can always override the tool’s recommendations. It has also underscored that the tool is only applied to the beginning of a family’s potential involvement with the child welfare process. The social workers who later conduct the investigations, as well as families and their attorneys, aren’t allowed to know the scores.

Allegheny’s algorithm, in use since 2016, has at times drawn from data related to Supplemental Security Income, a Social Security Administration program that provides monthly payments to adults and children with a disability; as well as diagnoses for mental, behavioral and neurodevelopmental disorders, including schizophrenia or mood disorders, AP found.

The county said that when the disabilities data is included, it “is predictive of the outcomes” and “it should come as no surprise that parents with disabilities ... may also have a need for additional supports and services.” The county added that there are other risk assessment programs that use data about mental health and other conditions that may affect a parent’s ability to care for a child.

The AP obtained records showing hundreds of specific variables that are used to calculate the risk scores for families who are reported to child protective services, including the public data that powers the Allegheny algorithm and similar tools deployed in child welfare systems elsewhere in the U.S.

The AP’s analysis of Allegheny’s algorithm and those inspired by it in Los Angeles County, California, Douglas County, Colorado, and in Oregon reveals a range of controversial data points that have measured people with low incomes and other disadvantaged demographics, at times measuring families on race, zip code, disabilities and their use of public welfare benefits.

Since the AP’s investigation was published, Oregon dropped its algorithm due to racial equity concerns, and the White House Office of Science and Technology Policy emphasized that parents and social workers needed more transparency about how government agencies were deploying algorithms as part of the nation’s first “AI Bill of Rights.”

The Justice Department has shown a broad interest in investigating algorithms in recent years, said Christy Lopez, a Georgetown University law professor who previously led some of the Justice Department’s Civil Rights Division litigation and investigations.

In a keynote about a year ago, Assistant Attorney General Kristen Clarke warned that AI technologies had “serious implications for the rights of people with disabilities,” and her division more recently issued guidance to employers saying that using AI tools in hiring could violate the Americans with Disabilities Act.

“They are doing their jobs as civil rights investigators to get to the bottom of what’s going on,” Lopez said of the Justice Department scrutiny of Allegheny’s tool.

“It appears to me that this is a priority for the division, investigating the extent to which algorithms are perpetuating discriminatory practices.”

Traci Laliberte, a University of Minnesota expert on child welfare and disabilities, said the Justice Department’s inquiry stood out to her, as federal authorities have largely deferred to local child welfare agencies.

“The Department of Justice is pretty far afield from child welfare,” Laliberte said. “It really has to rise to the level of pretty significant concern to dedicate time and get involved.”

Emily Putnam-Hornstein and Rhema Vaithianathan, the two developers of Allegheny’s algorithm and other tools like it, deferred to Allegheny County’s answers about the algorithm’s inner workings.

They said in an email that they were unaware of any Justice Department scrutiny relating to the algorithm.

Researchers and community members have long raised concerns that some of the data powering child welfare algorithms may heighten historical biases against marginalized people within child protective services.

That includes parents with disabilities, a community that is a protected class under federal civil rights law.

The Americans with Disabilities Act prohibits discrimination on the basis of disability, which can include a wide spectrum of conditions, from diabetes, cancer and hearing loss to intellectual disabilities and mental and behavioral health diagnoses like ADHD, depression and schizophrenia.

This story, supported by the Pulitzer Center on Crisis Reporting, is part of an ongoing Associated Press series, “Tracked,” that investigates the power and consequences of decisions driven by algorithms on people’s everyday lives.

