San Francisco Chronicle - (Sunday)
State deficit could hamper AI regulation
SACRAMENTO — Some California lawmakers want the state to lead the nation in regulating artificial intelligence, but a looming budget deficit could hamstring their efforts.
They’ve already announced a flurry of AI bills, with more on the way. Their proposals include efforts to require the state to set new safety standards, create an AI research hub and develop protections against deepfake videos and photos that look real but have been digitally altered to mislead the viewer.
Gov. Gavin Newsom is also interested in preventing malicious uses of the new technology, his Deputy Chief of Staff Jason Elliott said. Preventing AI harms is important to ensure the safety of Californians, Elliott said, but also because Newsom wants the industry to grow as an economic force in the state.
“The governor is actually enthusiastic about this as an industry in California,” Elliott said. “What’s happening in San Francisco right now is exciting. We want more of that, not less. In order for that to take hold and flourish, more good things need to happen than bad things.”
But the budget deficit could prompt Newsom to veto AI bills with high price tags. The newly introduced bills have not yet been given cost estimates, but any requirements to hire people to implement the legislation or other related costs will likely run up against the reality of a massive budget deficit. The nonpartisan Legislative Analyst’s Office has estimated the state faces a $68 billion shortfall this year, which will likely require Newsom to propose cuts when he releases his budget plan next week.
Elliott suggested AI bills with significant costs should be considered as part of the budget process, or risk what he called the “boilerplate veto language” Newsom often uses to explain why he rejects proposed laws: They are too expensive.
The governor has also signaled he might be interested in a slower approach than some of the lawmakers. He unveiled an executive order in the fall directing state agencies to evaluate the risks AI poses to California’s infrastructure, develop guidelines for the state to purchase AI technology and draft a report on AI’s potential benefits.
Elliott said the administration is talking to OpenAI and other tech companies, academics, consumer protection advocates, scientists and other experts. He said the governor intentionally didn’t propose legislation as part of the order because the process of determining whether new laws are needed is ongoing.
Newsom has told legislative leaders they should ensure any bills they send him are “cohesive,” Elliott said.
“If you get 10 different bills that pull in 10 different directions at 10 different orders of magnitude — so one is trying to set up a regulatory structure and the other is trying to legislate a very, very specific end use — that’s challenging,” he said.
In the meantime, lawmakers have already proposed many new laws, including:
• Sen. Steve Padilla, D-Chula Vista (San Diego County), on Thursday introduced bills to require the state to set safety, privacy and nondiscrimination standards for any AI technology it buys and to create an AI research hub in partnership with universities.
• Assembly Member Akilah Weber, D-La Mesa (San Diego County), said she plans to introduce legislation to set a standard for identifying generative AI in an effort to protect against deepfakes.
• Sen. Bill Dodd, D-Napa, introduced a bill Thursday to require the state to assess the potential risks and benefits of AI and to require state agencies to disclose when they use AI to communicate with the public.
• Sen. Scott Wiener, D-San Francisco, plans to introduce a bill to set industry-wide safety and transparency standards for AI.
• Assembly Member Rebecca Bauer-Kahan, D-Orinda, plans to reintroduce a bill to ban companies from using AI-powered algorithms that discriminate against people after an earlier version died last year.
• Assembly Member Ash Kalra, D-San Jose, has introduced a measure to limit the ability of movie studios and other entertainment companies to use AI-generated versions of human performers.
Padilla praised Newsom for addressing AI through his executive order, but said the Legislature also needs to pass laws to regulate the technology.
“This needs to be codified beyond any single administration,” he said. “This needs to be a legislative framework, not just dealt with in an EO.”
Padilla acknowledged that his proposals will cost some money, but he said he thinks addressing AI needs to be prioritized in the budget.
“We have to move,” he said, noting how quickly the technology is developing. “We’re behind already.”
Some academic experts caution that for AI regulation to be effective, government needs to hire more people with specific expertise in AI, which is a challenge.
A recent paper published by Stanford researchers argues low salaries for government workers, as well as long, burdensome hiring processes, are causing an “impending workforce crisis that undermines efforts to address climate change, cybersecurity, and tech governance.”
Daniel Ho, who co-authored the paper and also serves as an AI adviser to the Biden administration, said the staffing problems threaten state government’s ability to regulate AI.
“The talent pipeline into government is probably the single most important thing to get right to craft effective forms of AI regulation,” he told the Chronicle. “Government cannot govern AI if it does not understand AI.”
Ho said creating partnerships in which academics already working at colleges and universities do rotations or fellowships in government could help address that issue without costing the state more. He also said creating a public reporting system for problems that arise from AI, similar to systems the government already uses to monitor health issues, could help keep the government informed about issues related to the new technology that it should respond to.
Bauer-Kahan said she hopes the state will find a solution but said she is worried the budget deficit will make it harder for the state to hire the people it needs to regulate the emerging industry.
“We need new talent in our agencies to really get this issue right, and we’re hampered in our ability to do that,” she said. “I’m hopeful that we can partner with the folks that will help us figure this out, and at the same time hire folks who can get this right.”