Hartford Courant (Sunday)

State must protect the public from AI abuse by government

- Kevin Rennie

The Connecticut legislature last week provided a telling contrast between the temptations of attention-seeking and the imperatives of understanding and addressing change.

The Judiciary Committee held a Wednesday hearing on a proposal to pardon the men and women convicted of witchcraft in the 17th century. The day before, legislators of both parties were engaged in serious discussions of how to protect the data that state government collects about, and from, all of us. One received attention; the other was discussed by serious legislators.

State Sen. James Maroney, D-Milford, and House Republican Leader Vincent Candelora, R-North Branford, want to protect the public while allowing government to employ 21st century advances in the collection and analysis of knowledge.

State agencies purchase computer programs from private companies. The agencies add mountains of data about all of us to those programs to make critical decisions. The programs’ data are often used by algorithms programmed into the software to predict human behavior. To an unnerving extent, decisions are made in full or in part by computers, not by humans. The state voluntarily embraces this artificial intelligence — but resists revealing details to the public.

The consequences can be monumental, Colleen Murphy, the executive director of the Freedom of Information Commission, said Thursday. State agencies must be transparent in their use of algorithms and other forms of artificial intelligence, Murphy insisted. The Connecticut Advisory Committee to the U.S. Commission on Civil Rights met in January to raise urgent concerns about the misuse of data that can guide government agency decisions about child welfare, law enforcement, and government benefits, the AP reported.

Only meaningful public scrutiny will curb the inevitable abuse.

“There are different instances of unintended consequences, whether it’s discrimination sometimes in hiring,” Maroney pointed out in the same AP report. “It can discriminate against age. We’ve seen other examples where it’s discriminated against people based on basically being poor… And then also there are unfortunately racial disparities in some of the decisions made when using automated decision-making processes.”

The public needs to have the right to know the elements of technology shaping or making decisions about the people government serves. So far, our state government disagrees.

Yale Law School’s Media Freedom & Information Access Clinic’s Algorithmic Accountability Project undertook “an exhaustive study of the novel challenges to transparency and accountability posed by state agencies’ increasing use of algorithms” in Connecticut state government. Its experience was much like that of others seeking public information under the Lamont administration.

The most alarming revelation of the Yale project was the refusal of the Department of Administrative Services to provide any information in response to a request for documents under the Freedom of Information Act.

“DAS provided no response to our request for information concerning a new algorithm used in hiring state employees and contractors,” the report reveals. A half-dozen follow-up calls failed to elicit any documents in response to the clinic’s request.

This would be dismaying under ordinary circumstances. This is no ordinary time. We live in an age of technological innovation and the tumult that accompanies it.

Secrecy remains the highest virtue in the Lamont administration. DAS Commissioner Michelle Gilman is the chief keeper of the flame in the temple of the clandestine that the governor has erected. Gilman repeatedly claimed during her February confirmation hearing that she could not answer some questions posed by legislators because of an ongoing federal criminal investigation into corruption at the agency she heads.

The secrecy that stains DAS may be one reason some agencies have resisted enabling DAS to become the central repository of data. Even Lamont loyalists can see the danger in giving more power to the politics-before-the-public leadership at DAS.

The legislature should adopt Maroney’s bill to establish and fund a task force to propose detailed and broad legislation to protect the public from AI abuse by government, create a privacy bill of rights, and secure our right to know how our government is using data and technology.

And it should address one more critical issue. Candelora raised a concern that merited more attention than it received during the fraught days of the pandemic. When the state contracted with Sema4 to perform COVID tests, the company included a confusing consent to conduct research with the test samples, including storing the DNA of the people tested. Science did not require that option; commerce did. Protecting the public from similar intrusions should be included in a privacy bill of rights.
