The Indian Express (Delhi Edition)

HOW TO SPOT AI-POWERED POLITICAL DISINFORMATION IN POLL SEASON

- ANKITA KISHOR DESHKAR

THE first phase of voting for the Lok Sabha elections is on Friday. Over the past few weeks, there has been a deluge of disinformation and manipulated media online.

Two videos of actor Aamir Khan went viral this week. Both were manipulated versions of a promo for Khan's popular TV show, Satyamev Jayate. In one, Khan appears to be explicitly supporting the Congress party; in the other, he is seen speaking about nyay (justice), a key Congress talking point in recent years and the title of its manifesto (Nyay Patra or 'Document [for] Justice').

Recently, actor Ranveer Singh too was a victim of deepfake technology, when a manipulated video of him criticising Prime Minister Narendra Modi on the issues of unemployment and inflation was widely shared. In the original clip, however, Ranveer was actually praising the prime minister.

Here is how these deepfake videos are made — and how you can spot them.

Voice swap technology

Itisaar.ai, an AI detection tool developed in collaboration with IIT Jodhpur, shows that these videos were generated using 'voice swap' technology.

As the name suggests, this refers to the process of using an AI algorithm to either alter or mimic an individual's voice. The technology also allows creators to change the characteristics of a voice, such as accent, tone, pitch, and speech patterns, to make the videos more realistic.

Currently, several easy-to-use AI voice swap tools are available for free. The creator simply has to upload or record the audio sample that she wants to replace, and then customise the settings to make the uploaded sample sound as realistic as possible.

Spotting deepfakes

While it is not easy to spot well-produced deepfakes, here are some tips to keep in mind while scrolling through social media, especially during election time.

Be cautious of audio or video content from unfamiliar sources, especially if it seems controversial or sensational. Verify the authenticity of any suspicious post by cross-referencing it with reliable sources and trustworthy media organisations.

Deepfake audio may exhibit subtle anomalies, such as an unnatural tenor to the voice, slightly robotic speech, and irregular pauses. Listen closely for these telltale signs of manipulated or synthetic speech.

Deepfake audio is often accompanied by manipulated visual content, such as altered video footage. Check both the audio and visual elements for any discrepancies or inconsistencies. For instance, if lips do not move in sync with the speech, the video you are seeing may be manipulated.

Staying updated about day-to-day news and events is key to recognisin­g the risks associated with deepfakes. It is harder to fool people who have general awareness of what is happening around them.

A few AI detectors, such as Optic's 'AI or Not', are available for free. You can upload any suspicious audio or video to such detectors, which will assess the authenticity of the content.

YouTube screengrab: A manipulated version of this original Satyamev Jayate video has appeared online
