If it can happen to Taylor Swift …
Some days it’s just hard to get ahead of the bad guys — but it’s time for Massachusetts law to at least catch up with them. Even as Massachusetts lawmakers are wrestling with legislation to recognize revenge porn as a crime — something that is taking far too long for this state to do — the bad guys are using the latest in artificial intelligence to get another step ahead.
In the run-up to this year’s Super Bowl, one of the unfortunate sidelights was the treatment of Travis Kelce’s girlfriend — yes, that would be the Taylor Swift — who became the victim of AI-generated sexually explicit images on social media. She’s hardly the first celebrity to be subjected to such abuse. But as the technology gets better, so do the opportunities for exploitation. One deepfake image of Swift was reportedly viewed over 45 million times on X before the platform removed the content and took “appropriate actions against the accounts responsible for posting them.”
OK, but that’s because it’s Taylor Swift. What about mere mortals? And what about explicit images, including real ones, that were willingly shared with onetime romantic partners — never meant for the eyes of strangers — now made public as an act of revenge?
Massachusetts has the dubious distinction of being one of only two states (the other is South Carolina) where revenge porn is not in and of itself a crime.
And too many women in Massachusetts have already paid a price for that gaping hole in state law.
This week the Senate gets a chance to put its stamp on a bill to provide an enforceable ban on revenge porn, increase penalties for criminal harassment, and provide diversion programs for teen sexting and educational efforts aimed at prevention.
“So many people who have been victims of this, who have been survivors of this, will feel that they’ve been heard, and that going forward, when this does happen, there will be recourse,” state Senator John Keenan, a sponsor of the bill, told State House News Service. “Hopefully, it will also have a deterrent effect. People [will] understand the gravity of this type of behavior and know that it’s criminal behavior in certain circumstances.”
The Senate bill that emerged from committee last week is similar in its most important aspects to House-generated legislation, which passed unanimously in that branch last January. It establishes a new criminal offense for the distribution without consent of “visual material” — including “visual material produced by digitization” — depicting another person who is nude, partially nude, or engaged in sexual conduct, and makes it punishable by up to 2½ years in prison, a fine of up to $10,000, or both. Subsequent convictions would bring longer prison terms and larger fines.
It also adds the concept of “coercive control” to the definition of abuse for purposes of obtaining a domestic violence restraining order in court. That new definition includes the threat of sharing sexually explicit images.
The Senate bill differs from the House version in giving primary responsibility for juvenile diversion programs — aimed at providing an off-ramp for youthful offenders accused of “sexting” — to the Office of the Child Advocate rather than the attorney general’s office. It requires only that the Office of the Child Advocate consult with the AG, district attorneys, and the Department of Youth Services.
The two chambers will have to hash out those slightly different approaches. The sad fact is, however, that this legislation has already been the subject of too much Beacon Hill gamesmanship — which killed it two years ago. The House passed a similar bill in May 2022, but the Senate failed to act on it until the waning days of the session, and it died in conference committee.
That lapse was unfortunate then. It would be shameful in 2024.