GQ (South Africa)

Deepfakes: genius or terrifying?

- Words by Shannon Manuel

We take a deep dive into deepfakes, the unsettling technology with a high potential to deceive. Is this new frontier in artificial intelligence a feat of genius, a terrifying global disaster in the making, or simply the beginning of a never-ending, real-life Black Mirror episode?

Tom Cruise looks at the camera. ‘I’m going to show you some magic,’ he says, holding up a coin. ‘It’s the real thing,’ the Hollywood actor insists, giving his trademark laugh and making the coin disappear. ‘It’s all the real thing.’

Earlier this year, a viral video series on the social media app TikTok appearing to show the actor playing golf, telling jokes and performing magic tricks made headlines. Millions watched it, many of whom initially wondered whether the 59-year-old Mission: Impossible star had joined the video-sharing platform, and the account gained hundreds of thousands of followers after posting just a few videos. Except it wasn’t Cruise but a deepfake – a video generated by artificial intelligence (AI).

Belgian visual effects artist Chris Ume had used a Tom Cruise impersonator combined with machine-learning software to achieve a technological trickery that had experts impressed by its quality. While the clips may seem harmless, they sparked concerns about how far the technology has evolved in just a few years.

Deepfake technology has been put to a broad range of uses that have, for the most part, been for entertainment purposes. Star Wars fans modified footage from 2018’s Solo: A Star Wars Story so lead actor Alden Ehrenreich resembled a young Harrison Ford. A Back to the Future deepfake replaced Michael J. Fox’s face with that of Avengers star Tom Holland, with Robert Downey Jr standing in for Christopher Lloyd. There’s even a digitally altered version of the Queen’s Speech in which, instead of her usual message of goodwill, Her Majesty makes snide jokes about the year’s events before jumping onto her desk for a festive boogie. And as for the many, many Nicolas Cage face-swapping videos, well, talk about Face/Off!

It’s good fun, not to mention impressive. Equally remarkable applications in the pipeline could make quick work of once-painstaking tasks: filling in gaps and scratches in damaged images or videos; turning satellite photos into maps; creating realistic videos of streetscapes to train autonomous vehicles; giving a natural-sounding voice to those who’ve lost theirs; turning Hollywood actors into their older or younger selves; and more. Many people have labelled deepfakes the future of content creation, and a range of industries are already using the technology commercially.

Last year, an AI-powered virtual news anchor resembling a real-life female presenter appeared on the South Korean TV channel MBN. The virtual anchor was created using 10 hours of footage of the channel’s star anchorwoman, Kim Ju Ha. The virtual Kim lookalike debuted as an “AI-caster” during a news programme last November, and the differences between the two Kims’ voices, facial expressions and gestures were barely discernible. The broadcaster reportedly hopes the avatar will be particularly useful during disaster emergencies, when it’d be difficult to quickly assemble the presenters, staff and equipment needed to produce a breaking-news segment. The virtual anchor doesn’t need makeup, outfits or rehearsals, and it takes just a minute to produce virtual-anchor footage for a news piece of up to 1 000 words.

The site myheritage.com allows you to bring the dead back to life for 10 seconds using its Deep Nostalgia feature. The remarkable technology for animating photos was licensed by MyHeritage from D-ID, a company that specialises in video re-enactment using deep learning, and integrated to animate the faces in historical photos and create high-quality, realistic video footage. Deep Nostalgia uses several drivers prepared by MyHeritage, each a video consisting of a fixed sequence of movements and gestures. The feature can accurately apply a driver to the face in your still photo, creating a short video you can share with friends and family. The driver guides the movements in the animation, so you can see your ancestors smile, blink and turn their heads.

Another memorable example is the David Beckham deepfake used in the “Malaria No More” campaign, created by British company Synthesia. In the video, the football icon “speaks” nine languages as he invites people to add their voices to help end one of the world’s deadliest diseases. Thanks to emerging AI video-synthesis technology, the viewer hears, via Beckham, the voices of real men and women, including malaria survivors and doctors fighting the disease, from the UK to China and Nigeria.

And who can forget the infamous 2018 BuzzFeed video of former US President Barack Obama that appeared to show him cursing and insulting his successor, Donald Trump? The clip was fabricated using then-emerging deepfake technology as a public service announcement about online video manipulation.

People are concerned about how deepfake technology is applied, and with reason. It can be used for the greater good: to raise awareness of important issues, elevate industries, or entertain with alternative versions of movies featuring your favourite celebrities. But the same AI can also produce perfectly realistic fake video and audio of real people, and that’s where the danger lies. After all, let’s not forget that deepfakes originated from the use of AI to superimpose female celebrities’ faces onto porn performers’ bodies. The questions being raised carry significant weight: what happens if people use deepfakes to create disinformation? What if they use them »
