Jamaica Gleaner

California testing new generative AI tools for government services


GENERATIVE ARTIFICIAL intelligence tools will soon be used by California’s government.

Democratic Governor Gavin Newsom’s administration announced Thursday the state will partner with five companies to develop and test generative AI tools that could improve public service.

California is among the first states to roll out guidelines on when and how state agencies can buy AI tools as lawmakers across the country grapple with how to regulate the emerging technology.

Generative AI is a branch of artificial intelligence that can create new content such as text, audio and photos in response to prompts. It’s the technology behind ChatGPT, the controversial writing tool launched by Microsoft-backed OpenAI. The San Francisco-based company Anthropic, with backing from Google and Amazon, is also in the generative AI game.

Newsom touts California as a global hub for AI technology, noting that 35 of the world’s top 50 AI companies are located in the state.

California envisions using this type of technology to help cut down on customer call wait times at state agencies, and to improve traffic and road safety, among other things.

Initially, four state departments will test generative AI tools: the Department of Tax and Fee Administration, the California Department of Transportation, the Department of Public Health, and the Health and Human Services Department.

SIX-MONTH TRIAL

The tax and fee agency administers more than 40 programmes and took more than 660,000 calls from businesses last year, director Nick Maduros said. The state hopes to deploy AI to listen in on those calls and pull up key information on state tax codes in real time, allowing call-centre workers to more quickly answer questions because they don’t have to look up the information themselves.

In another example, the state wants to use the technology to provide people with information about health and social service benefits in languages other than English.

The public doesn’t have access to these tools yet, but may in the future. The state will start a six-month trial, during which the tools will be tested internally by state workers. In the tax example, the state plans to have the technology analyse recordings of calls from businesses afterwards and see how the AI handles them, rather than have it run in real time, Maduros said.

Not all the tools are designed to interact with the public, though. For instance, the tools meant to help ease highway congestion and improve road safety would be used only by state officials to analyse traffic data and brainstorm potential solutions.

State workers will test and evaluate their effectiveness and risks. If the tests go well, the state will consider deploying the technology more broadly.

COST UNCLEAR

The ultimate cost is unclear. For now, the state will pay each of the five companies US$1 to start a six-month internal trial. Then, the state can assess whether to sign new contracts for long-term use of the tools.

“If it turns out it doesn’t serve the public better, then we’re out a dollar,” Maduros said. “And I think that’s a pretty good deal for the citizens of California.”

The state currently has a massive budget deficit, which could make it harder for Newsom to make the case that such technology is worth deploying.

Administration officials said they didn’t have an estimate of what such tools would eventually cost the state, and they did not immediately release copies of the agreements with the five companies that will test the technology on a trial basis. Those companies are Deloitte Consulting LLP, INRIX Inc, Accenture LLP, Ignyte Group LLC, and SymSoft Solutions LLC.

The rapidly growing technology has also raised concerns about job loss, misinformation, privacy and automation bias.

State officials and academic experts say generative AI has significant potential to help government agencies become more efficient, but there’s also an urgent need for safeguards and oversight.

While state governments in the United States are struggling to regulate AI in the private sector, many are exploring how public agencies can leverage the powerful technology for public good. California’s approach, which also requires companies to disclose what large language models they use to develop AI tools, is meant to build public trust, officials said.

Testing the tools on a limited basis is one way to limit potential risks, said Meredith Lee, chief technical adviser for UC Berkeley’s College of Computing, Data Science, and Society.

But, she added, the testing can’t stop after six months. The state must have a consistent process for testing and learning about the tools’ potential risks if it decides to deploy them on a wider scale.

Governor of California Gavin Newsom. (AP)
