States move to ban deepfake nudes
Caroline Mullet, a ninth grader at Issaquah High School near Seattle, went to her first homecoming dance last fall, a James Bond-themed bash with blackjack tables attended by hundreds of girls dressed up in party frocks.
A few weeks later, she and other female students learned that a male classmate was circulating fake nude images of girls who had attended the dance, sexually explicit pictures that he had fabricated using an artificial intelligence app designed to automatically “strip” clothed photos of real girls and women.
Mullet, 15, alerted her father, Mark, a Democratic Washington state senator. Although she was not among the girls in the pictures, she asked if something could be done to help her friends, who felt “extremely uncomfortable” that male classmates had seen simulated nude images of them.
Soon, Mark Mullet and a colleague in the state House proposed legislation to prohibit the sharing of AI-generated sexually explicit depictions of real minors.
“I hate the idea that I should have to worry about this happening again to any of my female friends, my sisters or even myself,” Caroline Mullet told state lawmakers during a hearing on the bill in January.
The state Legislature passed the bill without opposition. Gov Jay Inslee, a Democrat, signed it last month.
States are on the front lines of a rapidly spreading new form of peer sexual exploitation and harassment in schools.
Boys across the United States have used widely available “nudification” apps to surreptitiously concoct sexually explicit images of their female classmates and then circulated the simulated nudes via group chats on apps like Snapchat and Instagram.
Now, spurred in part by troubling accounts from teenage girls like Mullet, federal and state lawmakers are rushing to enact protections in an effort to keep pace with exploitative AI apps.
Since early last year, at least two dozen states have introduced bills to combat AI-generated sexually explicit images — known as deepfakes — of people under 18, according to the National Center for Missing & Exploited Children, a nonprofit organisation. And several states have enacted such measures.
Among them, South Dakota this year passed a law that makes it illegal to possess, produce or distribute AI-generated sexual abuse material depicting real minors. Last year, Louisiana enacted a deepfake law that criminalises AI-generated sexually explicit depictions of minors.
“I had a sense of urgency hearing about these cases and just how much harm was being done,” said Rep Tina Orwall, a Democrat who drafted Washington state’s explicit-deepfake law after hearing about incidents like the one at Issaquah High.
Some lawmakers and child protection experts say such rules are urgently needed because the easy availability of AI nudification apps is enabling the mass production and distribution of false, graphic images that can potentially circulate online for a lifetime, threatening girls’ mental health, reputations and physical safety.
“One boy with his phone in the course of an afternoon can victimise 40 girls, minor girls,” said Yiota Souras, chief legal officer for the centre, “and then their images are out there.”
Over the past two months, deepfake nude incidents have spread in schools — including in Richmond, Illinois, and Beverly Hills and Laguna Beach, California.
The US legal code prohibits the distribution of computer-generated child sexual abuse material depicting identifiable minors engaged in sexually explicit conduct.
Last month, the FBI issued an alert warning that such illegal material included realistic child sexual abuse images generated by AI.
Now states are working to pass laws to halt exploitative AI images. This month, California introduced a bill to update a state ban on child sexual abuse material to specifically cover AI-generated abusive material.
And Massachusetts lawmakers are wrapping up legislation that would criminalise the non-consensual sharing of explicit images, including deepfakes.
It would also require a state entity to develop a diversion programme for minors who shared explicit images to teach them about issues like the “responsible use of generative artificial intelligence.”
Punishments can be severe. Under the new Louisiana law, any person who knowingly creates, distributes, promotes or sells sexually explicit deepfakes of minors can face a minimum prison sentence of five to 10 years.