Business World

Elections and disinformation to collide like never before in 2024


BILLIONS of people will vote in major elections this year — around half of the global population, by some estimates — in one of the largest and most consequential democratic exercises in living memory. The results will affect how the world is run for decades to come.

At the same time, false narratives and conspiracy theories have evolved into an increasingly global menace.

Baseless claims of election fraud have battered trust in democracy. Foreign influence campaigns regularly target polarizing domestic challenges. Artificial intelligence has supercharged disinformation efforts and distorted perceptions of reality. All while major social media companies have scaled back their safeguards and downsized election teams.

“Almost every democracy is under stress, independent of technology,” said Darrell M. West, a senior fellow at the Brookings Institution think tank. “When you add disinformation on top of that, it just creates many opportunities for mischief.” It is, he said, a “perfect storm of disinformation.”

More than 80 countries in nearly every part of the world have major votes scheduled in 2024; some, like Taiwan, have already gone to the polls this month. India, the world’s largest democracy, is grappling with misleading AI-generated content before its general elections this spring. The 27 member countries of the European Union, where a law to combat corrosive online content recently took effect, will hold parliamentary elections in June.

The stakes are enormous. Democracy, which spread globally after the end of the Cold War, faces mounting challenges worldwide — from mass migration to climate disruption, from economic inequities to war. The struggle in many countries to respond to such tests has eroded confidence in liberal, pluralistic societies, opening the door to appeals from populists and strongman leaders.

Autocratic countries, led by Russia and China, have seized on the currents of political discontent to push narratives undermining democratic governance and leadership, often by sponsoring disinformation campaigns. If those efforts succeed, the elections could accelerate the recent rise in authoritarian-minded leaders.

Fyodor Lukyanov, an analyst who leads a Kremlin-aligned think tank in Moscow, the Council on Foreign and Defense Policy, argued recently that 2024 “could be the year when the West’s liberal elites lose control of the world order.”

The political establishment in many nations, as well as intergovernmental organizations like the Group of 20, appears poised for upheaval, said Katie Harbath, founder of the technology policy firm Anchor Change and a former public policy director at Facebook who managed elections work there. Disinformation — spread via social media but also through print, radio, television and word-of-mouth — risks destabilizing the political process.

“We’re going to hit 2025, and the world is going to look very different,” she said.

AGGRESSIVE STATE OPERATIVES

Among the biggest sources of disinformation in election campaigns are autocratic governments seeking to discredit democracy as a global model of governance.

Russia, China and Iran have all been cited in recent months by researchers and the US government as likely to attempt influence operations to disrupt other countries’ elections, including this year’s US presidential election. The countries see the coming year as “a real opportunity to embarrass us on the world stage, exploit social divisions and just undermine the democratic process,” said Brian Liston, an analyst at Recorded Future, a digital security company that recently reported on potential threats.

The company also examined a Russian influence effort that Meta first identified last year, dubbed “Doppelgänger,” which impersonated international news organizations and created fake accounts to spread Russian propaganda in the United States and Europe. Doppelgänger appeared to have used widely available artificial intelligence (AI) tools to create news outlets dedicated to American politics, with names like Election Watch and My Pride.

The false narratives volleying around the world are often shared by diaspora communities or orchestrated by state-backed operatives. Experts predict that election fraud narratives will continue to evolve and reverberate, as they did in the United States and Brazil in 2022 and then in Argentina in 2023.

POLARIZATION AND EXTREMISM

An increasingly polarized and combative political environment is breeding hate speech and misinformation, pushing voters further into silos. A motivated minority of extreme voices, aided by social media algorithms that reinforce users’ biases, is often drowning out a moderate majority.

“We are in the middle of redefining our societal norms about speech and how we hold people accountable for that speech, online and offline,” Ms. Harbath said. “There are a lot of different viewpoints on how to do that in this country, let alone around the globe.”

Some of the most extreme voices seek one another out on alternative social media platforms, like Telegram, BitChute and Truth Social. Calls to preemptively stop voter fraud — which historically is statistically insignificant — recently trended on such platforms, according to Pyrra, a company that monitors threats and misinformation.

The “prevalence and acceptance of these narratives is only gaining traction,” even influencing electoral policy and legislation, Pyrra found in a case study.

“These conspiracies are taking root amongst the political elite, who are using these narratives to win public favor while degrading the transparency, checks and balances of the very system they are meant to uphold,” the company’s researchers wrote.

AI’S RISK-REWARD PROPOSITION

AI “holds promise for democratic governance,” according to a report from the University of Chicago and Stanford University. Politically focused chatbots could inform constituents about key issues and better connect voters with elected officials.

The technology could also be a vector for disinforma­tion. Fake AI images have already been used to spread conspiracy theories, such as the unfounded assertion that there is a global plot to replace white Europeans with nonwhite immigrants.

Lawrence Norden, who runs the elections and government program at the Brennan Center for Justice, a public policy institute, said that AI could imitate large amounts of material from election offices and spread it widely. Or it could manufacture late-stage October surprises, like the audio with signs of AI intervention that was released during Slovakia’s tight election this fall.

“All of the things that have been threats to our democracy for some time are potentially made worse by AI,” Norden said while participating in an online panel in November. (During the event, organizers introduced an artificially manipulated version of Norden to underscore the technology’s abilities.)

Some experts worry that the mere presence of AI tools could weaken trust in information and enable political actors to dismiss real content. Others said fears, for now, are overblown.

AI is “just one of many threats,” said James Lindsay, senior vice president at the Council on Foreign Relations think tank.

“I wouldn’t lose sight of all the old-fashioned ways of sowing misinformation or disinformation,” he said.
