Texarkana Gazette

Experts warn of misinformation impact on elections

By Ali Swenson and Christine Fernando

NEW YORK — Nearly three years after rioters stormed the U.S. Capitol, the false election conspiracy theories that drove the violent attack remain prevalent on social media and cable news: suitcases filled with ballots, late-night ballot dumps, dead people voting.

Experts warn it will likely be worse in the coming presidential election contest. The safeguards that attempted to counter the bogus claims the last time are eroding, while the tools and systems that create and spread them are only getting stronger.

Many Americans, egged on by former President Donald Trump, have continued to push the unsupported idea that elections throughout the U.S. can’t be trusted. A majority of Republicans (57%) believe Democrat Joe Biden was not legitimately elected president.

Meanwhile, generative artificial intelligence tools have made it far cheaper and easier to spread the kind of misinformation that can mislead voters and potentially influence elections. And social media companies that once invested heavily in correcting the record have shifted their priorities.

“I expect a tsunami of misinformation,” said Oren Etzioni, an artificial intelligence expert and professor emeritus at the University of Washington. “I can’t prove that. I hope to be proven wrong. But the ingredients are there, and I am completely terrified.”

AI DEEPFAKES GO MAINSTREAM

Manipulated images and videos surrounding elections are nothing new, but 2024 will be the first U.S. presidential election in which sophisticated AI tools that can produce convincing fakes in seconds are just a few clicks away.

The fabricated images, videos and audio clips known as deepfakes have started making their way into experimental presidential campaign ads. More sinister versions could easily spread without labels on social media and fool people days before an election, Etzioni said.

“You could see a political candidate like President Biden being rushed to a hospital,” he said. “You could see a candidate saying things that he or she never actually said. You could see a run on the banks. You could see bombings and violence that never occurred.”

High-tech fakes already have affected elections around the globe, said Larry Norden, senior director of the elections and government program at the Brennan Center for Justice. Just days before Slovakia’s recent elections, AI-generated audio recordings impersonated a liberal candidate discussing plans to raise beer prices and rig the election. Fact-checkers scrambled to identify them as false, but they were shared as real across social media regardless.

These tools might also be used to target specific communities and hone misleading messages about voting. That could look like persuasive text messages, false announcements about voting processes shared in different languages on WhatsApp, or bogus websites mocked up to look like official government ones in your area, experts said.

Faced with content that is made to look and sound real, “everything that we’ve been wired to do through evolution is going to come into play to have us believe in the fabrication rather than the actual reality,” said misinformation scholar Kathleen Hall Jamieson, director of the Annenberg Public Policy Center at the University of Pennsylvania.

Republicans and Democrats in Congress and the Federal Election Commission are exploring steps to regulate the technology, but they haven’t finalized any rules or legislation. That’s left states to enact the only restrictions so far on political AI deepfakes.

A handful of states have passed laws requiring deepfakes to be labeled or banning those that misrepresent candidates. Some social media companies, including YouTube and Meta, which owns Facebook and Instagram, have introduced AI labeling policies. It remains to be seen whether they will be able to consistently catch violators.

SOCIAL MEDIA GUARDRAILS FADE

It was just over a year ago that Elon Musk bought Twitter and began firing its executives, dismantling some of its core features and reshaping the social media platform into what’s now known as X.

Since then, he has upended its verification system, leaving public officials vulnerable to impersonators. He has gutted the teams that once fought misinformation on the platform, leaving the community of users to moderate itself. And he has restored the accounts of conspiracy theorists and extremists who were previously banned.

The changes have been applauded by many conservatives who say Twitter’s previous moderation attempts amounted to censorship of their views. But pro-democracy advocates argue the takeover has shifted what once was a flawed but useful resource for news and election information into a largely unregulated echo chamber that amplifies hate speech and misinformation.

Twitter used to be one of the “most responsible” platforms, showing a willingness to test features that might reduce misinformation even at the expense of engagement, said Jesse Lehrich, co-founder of Accountable Tech, a nonprofit watchdog group.

“Obviously now they’re on the exact other end of the spectrum,” he said, adding that he believes the company’s changes have given other platforms cover to relax their own policies. X didn’t answer emailed questions from The Associated Press, only sending an automated response.

In the run-up to 2024, X, Meta and YouTube have together removed 17 policies that protected against hate and misinformation, according to a report from Free Press, a nonprofit that advocates for civil rights in tech and media.

X, Meta and YouTube also have laid off thousands of employees and contractors since 2020, including some content moderators.

The shrinking of such teams, which many blame on political pressure, “sets the stage for things to be worse in 2024 than in 2020,” said Kate Starbird, a misinformation expert at the University of Washington.

THE TRUMP FACTOR

Trump’s front-runner status in the Republican presidential primary is top of mind for misinformation researchers who worry that it will exacerbate election misinformation and potentially lead to election vigilantism or violence.

The former president still falsely claims to have won the 2020 election.

“Donald Trump has clearly embraced and fanned the flames of false claims about election fraud in the past,” Starbird said. “We can expect that he may continue to use that to motivate his base.”

Without evidence, Trump has already primed his supporters to expect fraud in the 2024 election, urging them to intervene to “guard the vote” to prevent vote rigging in diverse Democratic cities. Trump has a long history of suggesting elections are rigged if he doesn’t win and did so before voting in 2016 and 2020.

That continued wearing away of voter trust in democracy can lead to violence, said Bret Schafer, a senior fellow at the nonpartisan Alliance for Securing Democracy, which tracks misinformation.

“If people don’t ultimately trust information related to an election, democracy just stops working,” he said. “If a misinformation or disinformation campaign is effective enough that a large enough percentage of the American population does not believe that the results reflect what actually happened, then Jan. 6 will probably look like a warm-up act.”

ELECTION OFFICIALS RESPOND

Election officials have spent the years since 2020 preparing for the expected resurgence of election denial narratives. They’ve dispatched teams to explain voting processes, hired outside groups to monitor misinformation as it emerges and beefed up physical protections at vote-counting centers.

In Colorado, Secretary of State Jena Griswold said informative paid social media and TV campaigns that humanize election workers have helped inoculate voters against misinformation.

“This is an uphill battle, but we have to be proactive,” she said. “Misinformation is one of the biggest threats to American democracy we see today.”

(AP photo/John Froschauer) Oren Etzioni poses for photos at the Allen Institute for Artificial Intelligence, where he serves as advisor and board member. Experts are warning that the spread of misinformation could get worse in the coming presidential election contest. The safeguards that attempted to counter the bogus claims the last time are eroding, while the tools and systems that create and spread them are only getting stronger.
