The Morning Call

Algorithms don’t counter child welfare bias; they magnify it

- Richard Wexler is executive director of the National Coalition for Child Protection Reform, www.nccpr.org.

“Can an algorithm tell when children are in danger?” asks a headline in the March 12 edition of The Morning Call. The answer is no.

Predictive analytics, the latest fad in child welfare, doesn’t counteract human bias; it magnifies such bias. It only increases the chances that children will be needlessly torn from everyone they know and love, and consigned to the chaos of foster care.

The best indication of this is the extent to which proponents of the algorithm now in use in Allegheny County — the model for the one in Northampton County — have misrepresented both the model and how it has been evaluated.

The first thing to understand is that most cases seen by child welfare agencies are nothing like the horror stories that make headlines. Far more common are cases in which family poverty is confused with neglect. Consider the three hypothetical examples at the start of the Morning Call story:

“My neighbor’s house is dirty, and her daughter looks disheveled,” or “I think my son’s classmate might be homeless,” or “I heard my ex-girlfriend is smoking weed around my kid.”

You don’t need an algorithm to know that none of these calls should be the business of an agency that has the power to take away children — and make no mistake, general protective services cases are investigated by the same workers who have the same nearly unchecked power to destroy a family as cases labeled “abuse.”

In the first case, call an anti-poverty agency; in the second case, call a housing agency. In the third case, do what you’d do if you read about a rich mother bragging about her pot-smoking in a Facebook group — nothing.

The second thing to understand is that algorithms don’t predict actual child abuse. Rather they predict whether a family will be involved with the system. But the system is biased from the start.

Mandated reporters of child abuse, as defined by the state Department of Human Services, are far more likely to call in families that are poor, especially if they’re also nonwhite. Then, even when all else is equal, caseworkers are more likely to substantiate allegations in such cases, and more likely to take away children if the family is Black.

The algorithm then gives a higher risk score to a family just because it was involved with a biased system.

■ When proponents of the Allegheny County algorithm said that it curbed racial bias, it was only because the algorithm led to investigating more white families, not sparing more Black families from the enormous trauma of needless investigations.

■ The claim that not telling investigators the precise risk score coughed up by the algorithm avoids “confirmation bias” is equally disingenuous. Even without the score, the workers know a “scientific” algorithm, in effect, sent them to investigate, and deciding nothing is wrong means countering the “science.” Like racial bias and class bias, confirmation bias is built into the model.

■ The claim that the Northampton County algorithm is somehow less likely to wrongly target people because it uses only child welfare records is equally absurd. How many rich people ever get entangled with a Children, Youth and Families agency?

So it’s no wonder that professor Virginia Eubanks of the State University of New York devoted an entire chapter of her landmark book, “Automating Inequality,” to showing how the Allegheny algorithm is really “poverty profiling.”

Professor Dorothy Roberts of the University of Pennsylvania Law School (and a member of my group’s board of directors) expanded the analysis to show how such algorithms are racially biased as well.

Child welfare cases in general and general protective services cases in particular almost always are rooted in poverty. Study after study finds that even small amounts of money significantly reduce what agencies call neglect.

So if you really must use an algorithm, here’s a better one:

■ Find where the poor people are.

■ Send money.

SHUTTERSTOCK: Northampton County may join a handful of municipalities across the nation using the science of predictive analytics to decide which children are at risk of abuse.
