Scary flaws in the machines that make government decisions
Increasingly, it is computers that decide which children are chosen for magnet school seats, which neighborhoods get more policing, who gets hired for state jobs and more.
Many of these decisions are made by algorithms, often built with “machine learning.” This is not necessarily bad, as algorithms enable computers to sort through vast amounts of data for patterns and predictions, ostensibly under the control of government agencies.
But — and this is a huge but — the algorithms are only as good as the assumptions made in creating them. And this scaffolding often is secret, veiled from public scrutiny as proprietary trade secrets.
The consequences can be unfair, life-altering and expensive.
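To make the point concrete, consider a deliberately simplified sketch. Everything in it is invented for illustration; the scoring rule, weights and cutoff resemble no actual state system. It shows how a single buried assumption, here a hard-coded threshold, can decide whether the same person is accused or left alone.

```python
# Hypothetical illustration only: a toy automated fraud-flagging rule.
# The weights and threshold are the hidden "assumptions" an agency
# may never see or test.

def fraud_score(weeks_claimed: int, employer_disputes: int) -> float:
    # Invented weights, standing in for a vendor's undisclosed model.
    return 0.1 * weeks_claimed + 0.5 * employer_disputes

def flag_for_fraud(weeks_claimed: int, employer_disputes: int,
                   threshold: float = 1.0) -> bool:
    # The threshold is an opaque policy choice; nudge it and the same
    # claimant is, or is not, accused.
    return fraud_score(weeks_claimed, employer_disputes) >= threshold

# Same claimant, two thresholds, opposite outcomes:
claimant = dict(weeks_claimed=8, employer_disputes=1)
print(flag_for_fraud(**claimant, threshold=1.5))  # not accused
print(flag_for_fraud(**claimant, threshold=1.0))  # accused
```

Nothing about the claimant changed between those two lines; only the hidden cutoff did. When such choices are shielded from public view, no one outside the vendor can say whether they are reasonable.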
A recent study by the Media Freedom & Information Access Clinic at Yale Law School highlights the flaws found in algorithms used by government agencies around the country.
A man in Michigan was wrongly accused of unemployment insurance fraud, and in the two years it took to clear the charges, he had to file for bankruptcy. The decision to accuse, which even led to seizure of his $11,000 tax refund check, was made not by a person but by computers — the Michigan Integrated Data Automated System. It was one of about 48,000 accusations of fraud issued by the data system against those who received state unemployment benefits.
In Illinois, a child welfare algorithm overestimated the number of children at high risk of death while at the same time failing to “identify several high-profile deaths — including one where the algorithm failed to flag a child” who had been the subject of at least 10 abuse investigations. Illinois dropped the software by Mindshare Technology.
Connecticut used that same software for three years, ending the contract in 2019 and citing “resource constraints,” the Yale study reported. Up until then, the service had been paid for by the vendor and a national foundation. The cost was not divulged.
Through Freedom of Information requests, three state agencies — the Department of Children and Families, the Department of Education and the Department of Administrative Services — were asked in the year-long study to provide data on their use of algorithms. The results were spotty, vague or, in the case of administrative services, nonexistent.
“Algorithms are unaccountable,” the Yale study concluded in an executive summary. “Agencies acquire algorithms without fully understanding how they function or assessing their reliability, and then often fail to test their reliability in use. Deficiencies in current disclosure laws make it impossible for the public to know if government algorithms are functioning properly or to [identify] sources of ineffectiveness or bias.”
The Connecticut Council on Freedom of Information and the Connecticut Foundation for Open Government advocate for stronger oversight in the use of algorithms for decision-making by public agencies.
“To prevent significant errors or miscalculations in the future, many government algorithms need to be transparent so they can be publicly vetted before policy decisions are made or legislation becomes law,” wrote Mitchell W. Pearlman, the state’s foremost expert on FOI and an executive officer of CFOG.
The public has a right to know how public agencies make decisions that affect myriad aspects of everyday life. Possibly your life.
Why now? March 13 to 19 is Sunshine Week, an annual event to “promote a dialogue about the importance of open government and freedom of information,” wrote Justin Silverman, executive director of the New England First Amendment Coalition. “Sunshine” refers to U.S. Supreme Court Justice Louis D. Brandeis’ comment about transparency in governance that “sunlight is said to be the best of disinfectants.”
Why care? As a technology, algorithms can be useful in sorting through volumes of data. But the public — and policymakers — should understand the underlying assumptions built into these formulas, which can be biased, and that takes legal, mandated transparency.
Otherwise, we are giving up our decision-making responsibility to computers.