CBC Edition

Fake photos, but make it fashion. Why the Met Gala pics are just the beginning of AI deception

- Natalie Stechyson

Actor Jared Leto carrying around his own head as an accessory? Real. Rapper Lil Nas X, painted head to toe in silver, his body encrusted with pearls and crystals, wearing only a metallic Dior thong? It happened. Actor and singer Billy Porter, wearing a catsuit, carried into the event by six shirtless men in gold pants? Yes.

If there's any event where it might be difficult to discern reality from fantasy, it's the Met Gala, where Grimes once brandished a sword and Lady Gaga once stripped through four different outfits until she was wearing only a black lingerie set and go-go boots, pulling a pink wagon behind her on the carpet.

But this year, people weren't tripped up by the fashion choices (which were relatively tame, naked dresses aside). Instead, they were confused about which celebrities were actually there, thanks to AI-generated images during fashion's biggest night.

And while the AI photos swirling online of celebrities like Katy Perry and Rihanna might seem harmless, experts note that each instance of people being misled by generative AI underlines growing concerns around the misuse of this technology.

It's particularly concerning when it comes to disinformation and the potential to carry out scams, identity theft or propaganda, and even election manipulation, they said.

"It used to be that seeing is believing, and now seeing is not believing," Cayce My‐ ers, a professor and director of graduate studies at Vir‐ ginia Tech's School of Com‐ munication, told the Associ‐ ated Press.

"[If] even a mother can be fooled into thinking that the image is real, that shows you the level of sophistica­tion that this technology now has."

WATCH | Can you ever really verify a photo?:

Katy Perry wasn't at the Met Gala

AI-generated images depicting a handful of big names, including Perry and Rihanna, at the Metropolitan Museum of Art's annual fundraiser quickly spread online Monday and early Tuesday. Perry and Rihanna didn't attend the gala, and it's unclear where exactly the photos of them originated from.

Perry re-posted two of the images to her Instagram Monday night, writing that she "couldn't make it to the Met, had to work." She also included a video of herself in the studio, and a screenshot of a text from her mom complimenting her outfit.

"lol mom the AI got to you too, BEWARE!" Perry respon‐ ded in the text screenshot.

By Wednesday morning, the post contained a warning from fact-checker PolitiFact that the Met photo was AI-generated.

Fake photos of other celebrities from the gala circulated online, as well, including one of Rihanna in a white gown, and another of singer Jason Derulo falling down a flight of stairs. On X, formerly Twitter, a warning was added to the photos of Rihanna, saying they're AI.

The fact-checking website VerifyThis confirmed Derulo did not fall down the stairs; the photo shared isn't of him, and was in fact taken at the Cannes Film Festival in 2011.

But first, the pictures and others like them were shared and liked across social media platforms. An influencer on TikTok even gave a glowing review of a fake outfit on Kim Kardashian. (As Forbes pointed out, Kardashian was at the event, but wearing something else.) The video has 9.6 million views.

In a follow-up video, the influencer reacted to Perry's fake image, saying "the fact that her own mother, her own blood, thought this was Katy Perry ... the entire internet was fooled."

The problematic rise of photo manipulation and AI

This is far from the first time generative AI, a branch of AI that can create something new, has been used to produce phony content.

Image, video and audio deepfakes of prominent figures, from Pope Francis to Taylor Swift, have gained loads of traction online before.

It's also far from the first photo manipulation that has had people questioning what's real and media outlets looking at photos with renewed scrutiny.

In March, for instance, two images released by Kensington Palace were found to have been digitally altered: the now-infamous photo of Catherine, the Princess of Wales, surrounded by her smiling children; and a 2022 image of Queen Elizabeth with her grandchildren and great-grandchildren.

But AI has taken these concerns to the next level.

"The implicatio­ns here go far beyond the safety of the individual - and really does touch on things like the safety of the nation, the safety of [our] whole society," David Broniatows­ki, an asso‐ ciate professor at George Washington University and lead principal investigat­or of the Institute for Trustworth­y AI in Law & Society at the school, told the Associated

Press.

Earlier this year, sexually explicit and abusive fake images of Swift, for example, began circulating online. Research also shows that explicit AI-generated material overwhelmingly harms women and children - including cases of AI-generated nudes circulating through high schools.

WATCH | Making an AI-generated image takes seconds:

In March, the Center for Countering Digital Hate (CCDH), a U.K. non-profit that monitors online hate speech, released a report showing how AI image generators can threaten election integrity and democracy.

The centre used generative AI tools to create images of U.S. President Joe Biden lying in a hospital bed and election workers smashing voting machines, raising worries about potential falsehoods ahead of the U.S. presidential election in November.

"The potential for such AIgenerate­d images to serve as 'photo evidence' could exac‐ erbate the spread of false claims, posing a significan­t challenge to preserving the integrity of elections," CCDH researcher­s said in the re‐ port.

And that's where the fake Met Gala photos are making some people nervous.

"The AI generated fake photos from the Met Gala are a low-stakes prelude for what's going to happen be‐ tween now and the elec‐ tions," a user wrote on X.

"Watching everyone get fooled in real-time by the AI Rihanna Met Gala look should make us all quake in fear about the upcoming election coverage. I'm so tired already," wrote another.

