Chicago Sun-Times

Child welfare decisions should not be made by computer algorithms

BY JEFFERY M. LEVING

Jeffery M. Leving is founder and president of the Law Offices of Jeffery M. Leving Ltd. and an advocate for the rights of fathers.

Computers have become essential to all of our lives. Computer algorithms, in particular, make much of daily life easier.

Simply put, algorithms are nothing more than sets of rules or instructions that computer programs use to streamline processes, from internet search engines to programming traffic signals and scheduling bus routes. Algorithms influence and help us all in ways we don't often realize.

However, it is imperative that we realize that algorithms, like any computer program, are designed by humans and thus can carry the biases of the humans who designed them. That may be benign when you are searching Google for the best pizza place in Chicago, but it can be dangerous when algorithms are relied on for serious matters.

Yet several states are now relying on algorithms to screen for child neglect under the guise of "assisting" child welfare agencies that are often overburdened with cases, feeding a market once estimated to be worth $270 million to the companies that sell these tools.

Who among us would allow a computer to decide the fate of our children?

A recent report from the Associated Press and the Pulitzer Center on Crisis Reporting points out several concerns with these systems, including that they are unreliable, sometimes missing serious abuse cases, and that they perpetuate racial disparities in the child welfare system. Both outcomes are exactly what the creators of these systems often profess to combat.

The children and families most affected by child welfare agencies are largely poor and largely members of minority groups. Translation: They are the most powerless people in America, which is all the more reason for more privileged citizens to speak up and speak out against using algorithms to make critical decisions in child welfare cases.

In Illinois, the state's Department of Children and Family Services used a predictive analytics tool from 2015 to 2017 to identify children reported for maltreatment who were most at risk of serious harm or even death. But DCFS ended the program after the agency's then-director said it was unreliable.

While Illinois wisely stopped using its tool, at least 26 states and Washington, D.C., have considered using such algorithms, and at least 11 have deployed them, according to a 2021 ACLU white paper cited by the AP.

The stakes in determining which children are at risk of injury or death could not be higher, and it is vitally important to get this right. It is also important to realize that the same system that determines whether a child is at risk often separates families.

It is easy for outsiders to say things like "better safe than sorry." But it is no small point that once a child or family comes into contact with an investigator, the chance that the child will be removed and the family separated increases. The road to separation should not begin with computers that have proven fallible.

The AP report also found that algorithm-based systems flagged a disproportionate number of Black children for mandatory neglect investigations and produced risk scores that social workers disagreed with about one-third of the time.

California pursued predictive risk modeling for two years and spent nearly $200,000 to develop a system, but the state ultimately scrapped it because of questions about racial equity. Even so, three counties in the state currently use such tools.

Sadly, the demand for algorithmic tools has only increased since the pandemic. I fear that more and more municipalities will turn to them for child welfare issues without vetting them for problems, and without investigating conflicts of interest with politicians.

This technology, while no doubt helpful in many aspects of our lives, is still subject to human biases and simply not mature enough to be used for life-altering decisions. Government agencies that oversee child welfare should be prohibited from using algorithms.

Workers field calls at an intake call screening center for the Allegheny County Children and Youth Services office in Penn Hills, Pa. | KEITH SRAKOCIC/AP
