The Norwalk Hour

Here’s how an AI tool may flag parents with disabilities

By Sally Ho and Garance Burke

PITTSBURGH — For the two weeks that the Hackneys’ baby girl lay in a Pittsburgh hospital bed weak from dehydration, her parents rarely left her side, sometimes sleeping on the fold-out sofa in the room.

They stayed with their daughter around the clock when she was moved to a rehab center to regain her strength. Finally, the 8-month-old stopped batting away her bottles and started putting on weight again.

“She was doing well and we started to ask when can she go home,” Lauren Hackney said. “And then from that moment on, at the time, they completely stonewalled us and never said anything.”

The couple was stunned when child welfare officials showed up, told them they were negligent and took their daughter away.

“They had custody papers and they took her right there and then,” Lauren Hackney recalled. “And we started crying.”

Daughter in foster care

More than a year later, their daughter, now 2, remains in foster care. The Hackneys, who have developmental disabilities, are struggling to understand how taking their daughter to the hospital when she refused to eat could be seen as so neglectful that she’d need to be taken from her home.

They wonder if an artificial intelligence tool that the Allegheny County Department of Human Services uses to predict which children could be at risk of harm singled them out because of their disabilities.

The U.S. Justice Department is asking the same question. The agency is investigating the county’s child welfare system to determine whether its use of the influential algorithm discriminates against people with disabilities or other protected groups, The Associated Press has learned. Later this month, federal civil rights attorneys will interview the Hackneys and Andrew Hackney’s mother, Cynde Hackney-Fierro, the grandmother said.

Lauren Hackney has attention-deficit hyperactivity disorder that affects her memory, and her husband, Andrew, has a comprehension disorder and nerve damage from a stroke suffered in his 20s. Their baby girl was just 7 months old when she began refusing to drink her bottles. Facing a nationwide shortage of formula, they traveled from Pennsylvania to West Virginia looking for some and were forced to change brands. The baby didn’t seem to like it.

Her pediatrician first reassured them that babies sometimes can be fickle with feeding and offered ideas to help her get back her appetite, they said.

When she grew lethargic days later, they said, the same doctor told them to take her to the emergency room. The Hackneys believe medical staff alerted child protective services after they showed up with a baby who was dehydrated and malnourished.

That’s when they believe their information was fed into the Allegheny Family Screening Tool, which county officials say is standard procedure for neglect allegations. Soon, a social worker appeared to question them, and their daughter was sent to foster care.

Over the past six years, Allegheny County has served as a real-world laboratory for testing AI-driven child welfare tools that crunch reams of data about local families to try to predict which children are likely to face danger in their homes. Today, child welfare agencies in at least 26 states and Washington, D.C., have considered using algorithmic tools, and jurisdictions in at least 11 have deployed them, according to the American Civil Liberties Union.

The Hackneys’ story — based on interviews, internal emails and legal documents — illustrates the opacity surrounding these algorithms. Even as they fight to regain custody of their daughter, they can’t question the “risk score” Allegheny County’s tool may have assigned to her case because officials won’t disclose it to them. And neither the county nor the people who built the tool have ever explained which variables may have been used to measure the Hackneys’ abilities as parents.

“It’s like you have an issue with someone who has a disability,” Andrew Hackney said in an interview from their apartment in suburban Pittsburgh. “In that case … you probably end up going after everyone who has kids and has a disability.”

Emerging technologies

As part of a yearlong investigation, the AP obtained the fields of data underpinning several algorithms deployed by child welfare agencies, including some marked “CONFIDENTIAL,” offering rare insight into the mechanics driving these emerging technologies. Among the factors they have used to calculate a family’s risk, whether outright or by proxy: race, poverty rates, disability status and family size. They include whether a mother smoked before she was pregnant and whether a family had previous child abuse or neglect complaints.

What they measure matters. A recent analysis by ACLU researchers found that when Allegheny’s algorithm flagged people who accessed county services for mental health and other behavioral health programs, that could add up to three points to a child’s risk score, a significant increase on a scale of 20.

Allegheny County spokesman Mark Bertolet declined to address the Hackney case and did not answer detailed questions about the status of the federal probe or critiques of the data powering the tool, including by the ACLU.

“As a matter of policy, we do not comment on lawsuits or legal matters,” Bertolet said in an email.

Justice Department spokeswoman Aryele Bradford declined to comment.

Vast amounts of data

Child welfare algorithms plug vast amounts of public data about local families into complex statistical models to calculate what they call a risk score. The number that’s generated is then used to advise social workers as they decide which families should be investigated, or which families need additional attention — a weighty decision that can sometimes mean life or death.
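To make that mechanism concrete, here is a minimal, purely illustrative sketch of such a scoring step. The feature names, weights and the logistic form below are assumptions made for illustration; the actual models used by Allegheny County and other agencies, along with their inputs and weights, are not public.

```python
import math

# Purely illustrative: feature names, weights and the logistic form are
# invented. Real screening tools draw on hundreds of variables and are
# trained on historical records, but the basic shape is the same:
# family data in, a single screening score out.
HYPOTHETICAL_WEIGHTS = {
    "prior_neglect_referrals": 0.8,
    "used_behavioral_health_services": 0.5,
    "children_in_household": 0.1,
}
BIAS = -2.0

def risk_score(family_record: dict) -> int:
    """Map a family's records to a 1-20 screening score (toy example)."""
    linear = BIAS + sum(
        weight * family_record.get(feature, 0)
        for feature, weight in HYPOTHETICAL_WEIGHTS.items()
    )
    probability = 1.0 / (1.0 + math.exp(-linear))  # predicted chance of the outcome
    return max(1, min(20, math.ceil(probability * 20)))  # bucket into a 1-20 score

# A hypothetical referral with two prior complaints and behavioral health use.
print(risk_score({
    "prior_neglect_referrals": 2,
    "used_behavioral_health_services": 1,
    "children_in_household": 3,
}))
```

In a model of this general shape, a single flagged variable can shift the final score by several points, which is the kind of effect the ACLU analysis described above.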

A number of local leaders have tapped into AI technology while under pressure to make systemic changes, such as in Oregon during a foster care crisis and in Los Angeles County after a series of high-profile child deaths in one of the nation’s largest county child welfare systems.

LA County’s Department of Children and Family Services Director Brandon Nichols says algorithms can help identify high-risk families and improve outcomes in a deeply strained system. Yet he could not explain how the screening tool his agency uses works.

“We’re sort of the social work side of the house, not the IT side of the house,” Nichols said in an interview. “How the algorithm functions, in some ways is, I don’t want to say is magic to us, but it’s beyond our expertise and experience.”

Nichols and officials at two other child welfare agencies referred detailed questions about their AI tools to the outside developers who created them.

In Pennsylvania, California and Colorado, county officials have opened up their data systems to the two academic developers who select data points to build their algorithms. Rhema Vaithianathan, a professor of health economics at New Zealand’s Auckland University of Technology, and Emily Putnam-Hornstein, a professor at the University of North Carolina at Chapel Hill’s School of Social Work, said in an email that their work is transparent and that they make their computer models public.

Vaithianathan and Putnam-Hornstein’s work has been hailed in reports published by UNICEF and the Biden administration alike for devising computer models that promise to lighten caseworkers’ loads by drawing from a set of simple factors. They have described using such tools as a moral imperative, insisting that child welfare officials should draw from all data at their disposal to make sure children aren’t maltreated.

Through tracking their work across the country, however, the AP found their tools can set families up for separation by rating their risk based on personal characteristics they cannot change or control, such as race or disability, rather than just their actions as parents.

In Allegheny County, a sprawling county of 1.2 million near the Ohio border, the algorithm has accessed an array of external data, including jail, juvenile probation, Medicaid, welfare, health and birth records, all held in a vast countywide “data warehouse.” The tool uses that information to predict the risk that a child will be placed in foster care two years after a family is first investigated.
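As a rough illustration of what drawing on a “data warehouse” means in practice, the sketch below joins invented record tables on a single family identifier to produce the kind of combined feature row a model would score. The table and field names are assumptions; only the general pattern — many county systems feeding one record, with foster-care placement within two years as the predicted outcome — comes from the reporting above.

```python
from dataclasses import dataclass

# Illustrative only: table and field names are invented. The pattern shown
# is joining records from several county systems on one identifier to build
# the feature row a screening model would score. In a system like the one
# described above, the outcome being predicted is whether a child is placed
# in foster care within two years of the first investigation.

@dataclass
class FamilyFeatures:
    family_id: str
    jail_bookings: int
    juvenile_probation_contacts: int
    medicaid_behavioral_claims: int
    prior_neglect_referrals: int

def build_features(family_id: str, warehouse: dict) -> FamilyFeatures:
    """Count records for one family across hypothetical warehouse tables."""
    def count(table: str) -> int:
        return len(warehouse.get(table, {}).get(family_id, []))

    return FamilyFeatures(
        family_id=family_id,
        jail_bookings=count("jail"),
        juvenile_probation_contacts=count("juvenile_probation"),
        medicaid_behavioral_claims=count("medicaid"),
        prior_neglect_referrals=count("referrals"),
    )
```

Which tables feed that join, and which are left out, is precisely the design choice that differs from county to county, as the following paragraphs describe.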

County officials have told the AP they’re proud of their cutting-edge approach, and even expanded their work to build another algorithm focused on newborns. They have said they monitor their risk scoring tool closely and update it over time, including removing variables such as welfare benefits and birth records.

Vaithianathan and Putnam-Hornstein declined the AP’s repeated interview requests to discuss how they choose the specific data that powers their models. But in a 2017 report, they detailed the methods used to build the first version of Allegheny’s tool, including a footnote that described a statistical cutoff as “rather arbitrary but based on trial and error.”

“This footnote refers to our exploration of more than 800 features from Allegheny’s data warehouse more than five years ago,” the developers said by email.

That approach is borne out in their design choices, which differ from county to county.

In the same 2017 report, the developers acknowledged that using race data didn’t substantively improve the model’s accuracy, but they continued to study it in Douglas County, Colorado, though they ultimately opted against including it in that model.

To address community concerns that a tool could harden racial bias in Los Angeles County, the developers excluded people’s criminal history, ZIP code and geographic indicators, but have continued to use those data points in the Pittsburgh area.

When asked about the inconsistencies, the developers pointed to their published methodology documents.

“We detail various metrics used to assess accuracy — while also detailing ‘external validations,’” the developers said via email.

With no answers on when they could get their daughter home, the Hackneys’ lawyer in October filed a federal civil rights complaint on their behalf that questioned how the screening tool was used in their case.

Before algorithms were in use, the child welfare system had long distrusted parents with disabilities. Into the 1970s, they were regularly sterilized and institutionalized, LaLiberte said. A landmark federal report in 2012 noted parents with psychiatric or intellectual disabilities lost custody of their children as much as 80% of the time.

Across the U.S., it’s extremely rare for child welfare agencies to require disabilities training for social workers, LaLiberte’s research has found. The result: Parents with disabilities are often judged by a system that doesn’t understand how to assess their capacity as caregivers, she said.

The Hackneys experienced this firsthand. When a social worker asked Andrew Hackney how often he fed the baby, he answered literally: two times a day. The worker seemed appalled, he said, and scolded him, saying babies must eat more frequently. He struggled to explain that the girl’s mother, grandmother and aunt also took turns feeding her each day.

Running out of money

Meanwhile, they say they’re running out of money in the fight for their daughter. With barely enough left for food from Andrew Hackney’s wages at a local grocery store, he had to shut off his monthly cell phone service. They’re struggling to pay for the legal fees and gas money needed to attend appointmen­ts required of them.

All they have for now are twice-weekly visits that last a few hours before she’s taken away again. Lauren Hackney’s voice breaks as she worries her daughter may be adopted and soon forget her own family.

“I really want to get my kid back. I miss her, and especially holding her. And of course, I miss that little giggly laugh,” Andrew Hackney said, as his daughter sprang toward him with excitement during a recent visit. “It hurts a lot. You have no idea how bad.”

Andrew and Lauren Hackney sit in their daughter’s room with their dog Scrappy after one of their twice-weekly supervised visits in Oakdale, Pennsylvania. When their daughter was 7 months old, the couple had difficulty feeding her and brought her to the children’s hospital in Pittsburgh. The Hackneys and their lawyer believe the Allegheny County Family Screening tool may have flagged the couple as dangerous because of their disabilities. (Jessie Wardarski/Associated Press)
