Saturday Star

Deepfakes ring election alarm bells around the world

- EILEEN CULLOTY

DISINFORMATION caught many people off guard during the 2016 Brexit referendum and US presidential election. Since then, a mini-industry has developed to analyse and counter it.

Yet despite that, we have entered 2024 – a year of more than 40 elections worldwide – more fearful than ever about disinformation. In many ways, the problem is more challenging than it was in 2016.

Advances in technology since then are one reason for that, in particular the development of synthetic media, otherwise known as deepfakes. It is increasingly difficult to know whether media has been fabricated by a computer or is based on something that happened.

We’ve yet to really understand how big an impact deepfakes could have on elections. But a number of examples point the way to how they may be used. This may be the year when lots of mistakes are made and lessons learned.

Since 2016, researchers have produced countless books and papers, journalists have retrained as fact-checking and verification experts, and governments have participated in “grand committees” and centres of excellence. Additionally, libraries have become the focus of resilience-building strategies, and a range of new bodies has emerged to provide analysis, training and resources.

This activity hasn’t been fruitless. We now have a more nuanced understanding of disinformation as a social, psychological, political and technological phenomenon. Most notably, major tech companies no longer pretend to be neutral platforms.

In the meantime, some policymakers have rediscovered their duty to regulate technology in the public interest.

Regulatory discussions have taken on added urgency now that AI tools to create synthetic media – media partially or fully generated by computers – have gone mainstream. These deepfakes, which can imitate the voice and appearance of real people, are impressively realistic and do not require much skill or many resources to produce.

The digital revolution and successive technologies have made high-quality content production accessible to almost anyone. In contrast, regulatory structures and institutional standards for media were mostly designed in an era when only a minority of professionals had access to production.

Political deepfakes can take different forms. The recent Indonesian election saw a deepfake video “resurrecting” the late President Suharto. This was ostensibly to encourage people to vote, but it was accused of being propaganda because it was produced by the political party that he led.

Perhaps a more obvious use of deepfakes is to spread lies about political candidates.

For example, fake AI-generated audio released days before Slovakia’s parliamentary election in September last year attempted to portray Progressive Slovakia leader Michal Simecka as having discussed with a journalist how to rig the vote.

Aside from the obvious effort to undermine a political party, it is worth noting how this deepfake, whose origin was unclear, exemplifies wider efforts to scapegoat minorities and demonise mainstream journalism: the fabricated conversation implicated a prominent journalist and referred to buying votes from the marginalised Roma minority.

Fortunately, in this instance, the audio was not high-quality, which made it quicker and easier for fact checkers to confirm its inauthenticity. However, the integrity of democratic elections cannot rely on the ineptitude of the fakers.

Deepfake audio technology is at a level of sophistication that makes detection difficult. Deepfake videos still struggle with certain human features, such as hands, but the technology is still young.

It is also important to note that the Slovak audio was released during the final days of the election campaign. This is a prime time to launch disinformation and manipulation attacks, because the targets and independent journalists have their hands full and therefore have little time to respond.

If it is also expensive, time-consuming and difficult to investigate deepfakes, then it’s not clear how electoral commissions, political candidates, the media, or indeed the electorate should respond when potential cases arise. A false accusation that something is a deepfake can be as troubling as an actual deepfake.

Another way deepfakes could affect elections can be seen in how they are already widely used to harass and abuse women and girls. This kind of sexual harassment fits an existing pattern of abuse that limits political participation by women.

The difficulty is that it’s not yet clear exactly what impact deepfakes could have on elections. It’s very possible we could see other, similar uses of deepfakes in upcoming elections this year. And we could even see deepfakes used in ways not yet conceived of.

But it’s also worth remembering that not all disinformation is high-tech. There are other ways to attack democracy. Rumours and conspiracy theories about the integrity of the electoral process are an insidious trend. Electoral fraud is a global concern, given that many countries are democracies in name only.

Clearly, social media platforms enable and drive disinformation in many ways, but it is a mistake to assume the problem begins and ends online. One way to think about the challenge of disinformation during upcoming elections is to consider the strength of the systems that are supposed to uphold democracy.

Is there an independent media system capable of providing high-quality investigations in the public interest? Are there independent electoral administrators and bodies? Are there independent courts to adjudicate if necessary?

And is there sufficient commitment to democratic values, over self-interest, among politicians and political parties? In this year of elections, we may well find out the answers to these questions. | The Conversation

Culloty is an assistant professor at the School of Communications at Dublin City University

THE leader of the centrist Progressive Slovakia political party and European Parliament vice-president Michal Simecka was the target of a particularly high rate of online disinformation attacks during the country’s elections last year. His party lost the election. A strong independent media is vital to counter what is expected to be an onslaught of deepfakes as 40 countries hold elections this year. | AFP
