The News-Times (Sunday)

Scary flaws in the machines that make government decisions

- JACQUELINE SMITH Jacqueline Smith, a Bethel resident, is a board member of the Connecticut Council on Freedom of Information. She is a former editorial page editor with Hearst Connecticut Media. Contact her at Jacqueline.wordsmith@gmail.com.

Increasingly, it is computers that decide which children are chosen for magnet school seats, which neighborhoods get more policing, who gets hired for state jobs and more.

Many of these decisions are based on algorithms, often built with “machine learning.” This is not necessarily bad, as algorithms enable computers to sort through vast amounts of data for patterns and predictions, ostensibly under the control of the government agencies that use them.

But — and this is a huge but — the algorithms are only as good as the assumptions made in creating them. And this scaffolding often is secret, veiled from public scrutiny under the guise of proprietary information.

The consequences can be unfair, life-altering and expensive.

A recent study by the Media Freedom & Information Access Clinic at Yale University Law School highlights the flaws found in algorithms used by governmental agencies around the country.

A man in Michigan was wrongly accused of unemployment insurance fraud, and in the two years it took to clear the charges, he had to file for bankruptcy. The decision to accuse, which even led to seizure of his $11,000 tax refund check, was made not by a person but by computers — the Michigan Integrated Data Automated System. It was one of about 48,000 accusations of fraud issued by the data system against those who received state unemployment benefits.

In Illinois, a child welfare algorithm overestimated the number of children at high risk of death while failing to “identify several high-profile deaths — including one where the algorithm failed to flag a child” who had been the subject of at least 10 abuse investigations. Illinois dropped the software, made by Mindshare Technology.

Connecticut used that same software for three years, ending the contract in 2019 and citing “resource constraints,” the Yale study reported. Until then, the service was paid for by the vendor and a national foundation. The cost was not divulged.

Through Freedom of Information requests, three state agencies — the Department of Children and Families, the Department of Education and the Department of Administrative Services — were asked in the year-long study to provide data on their use of algorithms. The results were spotty, vague or, in the case of administrative services, nonexistent.

“Algorithms are unaccountable,” the Yale study concluded in an executive summary. “Agencies acquire algorithms without fully understanding how they function or assessing their reliability, and then often fail to test their reliability in use. Deficiencies in current disclosure laws make it impossible for the public to know if government algorithms are functioning properly or to [identify] sources of ineffectiveness or bias.”

The Connecticut Council on Freedom of Information and the Connecticut Foundation for Open Government advocate for stronger oversight of the use of algorithms in decision-making by public agencies.

“To prevent significant errors or miscalculations in the future, many government algorithms need to be transparent so they can be publicly vetted before policy decisions are made or legislation becomes law,” wrote Mitchell W. Pearlman, the state’s foremost expert on FOI and an executive officer of CFOG.

The public has a right to know how public agencies make decisions that affect myriad aspects of everyday life. Possibly your life.

Why now? March 13 to 19 is Sunshine Week, an annual event to “promote a dialogue about the importance of open government and freedom of information,” wrote Justin Silverman, executive director of the New England First Amendment Coalition. “Sunshine” refers to U.S. Supreme Court Justice Louis D. Brandeis’ comment about transparency in governance that “sunlight is said to be the best of disinfectants.”

Why care? As a technology, algorithms can be useful in sorting through volumes of data. But the public — and policymakers — should understand the underlying assumptions built into the formulas, which can be biased, and that takes legal, mandated transparency.

Otherwise, we are giving up our decision-making responsibility to computers.


Illustration by Donna Grethen
