Jamaica Gleaner

Taylor Swift fans fight explicit deepfake images

Rep. Yvette D. Clarke among lawmakers calling for more to be done


A SCOURGE of pornographic deepfake images generated by artificial intelligence and sexualising people without their consent has hit its most famous victim, singer Taylor Swift, drawing attention to a problem that tech platforms and anti-abuse groups have struggled to solve.

Sexually explicit and abusive fake images of Swift began circulating widely this week on the social media platform X.

Her ardent fanbase of ‘Swifties’ quickly mobilised, launching a counteroffensive on the platform formerly known as Twitter and a #ProtectTaylorSwift hashtag to flood it with more positive images of the pop star. Some said they were reporting accounts that were sharing the deepfakes.

The deepfake-detecting group Reality Defender said it tracked a deluge of non-consensual pornographic material depicting Swift, particularly on X. Some images also made their way to Meta-owned Facebook and other social media platforms.

“Unfortunately, they spread to millions and millions of users by the time that some of them were taken down,” said Mason Allen, Reality Defender’s head of growth.

The researchers found at least a couple of dozen unique AI-generated images. The most widely shared were football-related, showing a painted or bloodied Swift that objectified her and, in some cases, inflicted violent harm on her deepfake persona.

Researchers have said the number of explicit deepfakes has grown in the past few years, as the technology used to produce such images has become more accessible and easier to use. In 2019, a report released by the AI firm DeepTrace Labs showed these images were overwhelmingly weaponised against women. Most of the victims, it said, were Hollywood actors and South Korean K-pop singers.

Brittany Spanos, a senior writer at Rolling Stone who teaches a course on Taylor Swift at New York University, says Swift’s fans are quick to mobilise in support of their artiste, especially those who take their fandom very seriously, and particularly in situations of wrongdoing.

“This could be a huge deal if she really does pursue it to court,” she said.

Spanos says the deepfake pornography issue aligns with others Taylor has had in the past, pointing to her 2017 lawsuit against a radio station DJ who allegedly groped her; jurors awarded Swift US$1 in damages, a sum her attorney, Douglas Baldridge, called “a single symbolic dollar, the value of which is immeasurable to all women in this situation” in the midst of the MeToo movement. (The US$1 lawsuit became a trend thereafter, as in Gwyneth Paltrow’s 2023 countersuit against a skier.)

When reached for comment on the fake images of Swift, X directed The Associated Press to a post from its safety account that said the company strictly prohibits the sharing of non-consensual nude images on its platform. The company has also sharply cut back its content-moderation teams since Elon Musk took over the platform in 2022.

“Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them,” the company wrote in the X post early Friday morning. “We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed.”

Meanwhile, Meta said in a statement that it strongly condemns “the content that has appeared across different internet services” and has worked to remove it.

“We continue to monitor our platforms for this violating content and will take appropriate action as needed,” the company said.

A representative for Swift didn’t immediately respond to a request for comment Friday.

Allen said the researchers are 90 per cent confident that the images were created by diffusion models, which are a type of generative artificial intelligence model that can produce new and photorealistic images from written prompts. The most widely known are Stable Diffusion, Midjourney and OpenAI’s DALL-E. Allen’s group didn’t try to determine the provenance.

Microsoft, which offers an image-generator based partly on DALL-E, said Friday it was in the process of investigating whether its tool was misused. Much like other commercial AI services, it said it doesn’t allow “adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service”. Midjourney, OpenAI and Stable Diffusion-maker Stability AI didn’t immediately respond to requests for comment.

Federal lawmakers who’ve introduced bills to restrict or criminalise deepfake porn said the incident shows why the US needs to implement better protections.

“For years, women have been victims of non-consensual deepfakes, so what happened to Taylor Swift is more common than most people realise,” said US Representative Yvette D. Clarke, a Democrat from New York who’s introduced legislation that would require creators to digitally watermark deepfake content.

“Generative AI is helping create better deepfakes at a fraction of the cost,” Clarke said.

US Representative Joe Morelle, another New York Democrat pushing a bill that would criminalise sharing deepfake porn online, said what happened to Swift was disturbing and has become more and more pervasive across the internet.

“The images may be fake, but their impacts are very real,” Morelle said in a statement. “Deepfakes are happening every day to women everywhere in our increasingly digital world, and it’s time to put a stop to them.”


AP: Taylor Swift wears a Kansas City Chiefs tight end Travis Kelce jacket as she arrives before an NFL wild-card playoff football game between the Chiefs and the Miami Dolphins on Saturday, January 13, 2024, in Kansas City, Missouri.
FILE: US Representative Yvette D. Clarke
