The Guardian (USA)

Georgia lawmakers are using an AI deepfake video to try to ban political deepfakes

- George Chidi in Atlanta

When wrangling legislation, sometimes it's best to sound out a problem in front of you.

In Georgia, it sounds like the state senator Colton Moore. But it only sounds like Colton Moore.

Todd Jones, a Republican state representative who chairs the Georgia house committee on technology and infrastructure innovation, has proposed legislation outlawing the use of artificial intelligence deepfakes in political communication. To illustrate the point, Jones presented a deepfake video to the judiciary committee using an AI image and audio of Moore and Mallory Staples, a former Republican congressional candidate who now runs a far-right activist organization, the Georgia Freedom caucus.

The video uses an AI tool to impersonate the voices of Moore and Staples falsely endorsing passage of the bill. The video carries a continuous disclaimer at the bottom citing the text of the bill.

Moore and Staples oppose the legislation.

The AI impersonation of Moore says: "I would ask the committee: how is using my biometric data, like my voice and likeness, to create media supporting a policy that I clearly don't agree with the first amendment right of another person?"

The video continues: "The overwhelming number of Georgians believe the use of my personal characteristics against my will is fraud, but our laws don't currently reflect that. If AI can be used to make Colton Moore speak in favor of a popular piece of legislation, it can be used to make any one of you say things you've never said."

Brad Thomas, the Republican cosponsor of the bill and co-author of the video, said he and his colleagues used commonly available tools to create the video.

“The particular one we used is, like, $50. With a $1,000 version, your own mother wouldn’t be able to tell the difference,” he said.

The pace of advancement of visual AI generative tools is years ahead of the legislation needed to prevent abuses, Thomas said: "Cinematography-style video. Those individuals look absolutely real, and they're AI-generated."

The bill passed out of committee on an 8-1 vote.

Moore is not popular in Georgia's legislative circles. His peers in the state senate threw him out of the Republican caucus in September, accusing him of making false statements about other conservatives while he was advocating fruitlessly for a special session to remove the Fulton county prosecutor Fani Willis from office.

Last week, Moore was permanently barred from the Georgia house chamber after rhetorically attacking the late speaker at a memorial service being held on the house floor.

Through the Georgia senate press office, Moore declined to comment.

In social media posts, Moore has voiced opposition to the bill, calling it an attack on "memes" used in political discourse and arguing that satire is protected speech.

Staples, in newsletters to her supporters, cited the federal conviction of Douglass Mackey last year as an example of potential harms. Mackey, also known as the alt-right influencer "Ricky Vaughn", sent mass text messages in November 2016 encouraging Black recipients to "vote by text" instead of casting a real vote, with the texts claiming they had been paid for by the Clinton campaign.

Federal judges rejected Mackey's first amendment arguments on the ground that the communications amounted to acts of fraud which were not constitutionally protected. Mackey was sentenced in October to serve seven months.

House bill 986 creates the crimes of fraudulent election interference and soliciting fraudulent election interference, with penalties of two to five years in prison and fines up to $50,000.

If, within 90 days of an election, a person publishes, broadcasts, streams or uploads materially deceptive media – defined as media appearing to depict a real individual's speech or conduct that did not occur in reality and that would appear to a reasonable person to be authentic – they would be guilty of a felony, as long as the media in question significantly influences the chances for a candidate or referendum to win, or confuses the administration of that election. The bill would thus also criminalize using deepfakes to cast doubt on the results of an election.

Deepfakes entered the 2024 election at its start, with an AI-generated audio call featuring Joe Biden telling New Hampshire voters not to vote. After the call, the Federal Communications Commission announced a ban on robocalls that use AI audio. But the Federal Election Commission has yet to put rules in place for political ads that use AI, something watchdog groups have been demanding for months. Regulations are lagging behind the reality of AI's capabilities to mislead voters.

In the absence of federal election rules for AI content, states have stepped in, filing and, in several instances, passing bills that typically require labels on political ads that use AI in some way. Under most of the bills filed in states, AI-generated content in political ads without these labels is considered illegal.

Experts say AI audio, in particular, has the ability to trick voters because a listener loses the context clues that might tip them off that a video is fake. Audio deepfakes of prominent figures, such as Trump and Biden, are easy and cheap to make using readily available apps. Even for less well-known people, anyone who speaks publicly often leaves a large volume of voice samples, like speeches or media appearances, which can be uploaded to train a deepfake clone of that person's voice.

Enforcement of the Georgia law might be challenging. Lawmakers struggled to find ways to rein in anonymous flyers and robocalls spreading misinformation and fraud ahead of elections long before the emergence of AI.

"I think that's why we gave concurrent jurisdiction to the attorney general's office," Thomas said. "One of the other things we've done is allow the [Georgia bureau of investigation] to investigate election issues. Between the horsepower of those two organizations, we have the highest likelihood of figuring out who did it."

Lawmakers are only just starting to get at the implications of AI. Thomas expects more legislation to emerge over the next few sessions.

“Fraud is fraud, and that’s what this bill is coming down to,” Thomas said. “That’s not a first amendment right for anyone.”

The Georgia capitol in Atlanta. Photograph: Bloomberg/Getty Images
