FACING ONLINE NIGHTMARE
Poet Helen Mort is among the women sharing their experiences of image-based abuse to highlight the loopholes leaving victims in limbo. Sarah Wilson reports.
My first instinct was to hide but I wanted to put my name to this experience – I want to say this did happen to me and I shouldn’t feel ashamed. The person who did it should feel ashamed. Helen Mort, on why she decided to speak out after fake graphic images of her were put online.
WHEN AN acquaintance of Helen Mort’s said they had some bad news to deliver late last year, the Sheffield-based poet envisioned all manner of worst-case scenarios.
“I got really panicky... I remember thinking, ‘Maybe something’s happened to my son, maybe something really dreadful has happened’,” she recalls.
The revelation, however, turned out to be something Helen could never have imagined: the acquaintance had seen what looked like explicit images of Helen on the internet.
“My first reaction was, ‘That’s impossible’,” she explains. “I have never ever shared any kind of intimate photo of myself anywhere or with anyone – so how could I possibly have ended up on a porn site?”
When she finally plucked up the courage to look for herself, Helen discovered that someone posing as her “boyfriend” had taken images of her face from various social-media accounts and superimposed them on to explicit images, saying he wanted to see Helen “abused and humiliated” and inviting others to make further “fakes”.
“Some of them were almost comically bad,” she recalls. “But there were a few which looked much more realistic. Those were the most disturbing of all.”
Helen had unknowingly become a victim of “deepfake porn”, an emerging form of image-based abuse in which a person’s photograph is manipulated into explicit, sometimes violent, videos or stills. The discovery led to extreme anxiety, stress and paranoia, with Helen suffering recurring nightmares about the images and feeling temporarily unable to leave the house.
Yet when she approached the police, they were unable to take any action. Currently no law exists against making or sharing deepfake images, no matter how realistic.
Helen’s story is just one example of the myriad ways current legislation is failing victims of image-based abuse, a behaviour now so endemic that the UK’s Revenge Porn Helpline recorded an average of nine reports per day in 2020 – marking their busiest year on record.
So-called “revenge porn”, which entails the sharing of private sexual photos or videos without consent, was outlawed in England and Wales in 2015.
The very first conviction, however, exposed glaring holes in the new legislation: namely that revenge porn is classed as a communications offence, not a sexual offence, meaning victims are not granted an automatic right to anonymity.
The first man to be convicted of “revenge porn” in England was charged in Yorkshire, and his victim thus saw her name and details of the case emerge in the press.
“Her name and situation were spread all over the national media, causing significant distress and anxiety,” explains Julia Mulligan, the North Yorkshire Police and Crime Commissioner who subsequently spearheaded the #NoMoreNaming campaign, calling for anonymity for victims.
In some cases, says Dr Kelly Johnson, an Assistant Professor at Durham University and expert in image-based abuse, this lack of anonymity can retraumatise victims if cases go to court.
“There have been examples where cases have gone to court and a victim’s intimate images have gone viral again, because all you need to know is the victim’s name and you can search and access them online”.
Some victims of “revenge porn” are unable even to get this far, says Dr Johnson, thanks to the law’s focus on the perpetrator’s motivations for sharing the intimate images.
“The law has a motivation requirement attached to it, meaning you have to prove the perpetrator was motivated by causing the victim distress,” she explains.
“Police have been unable to take cases forward because the perpetrator will say ‘I didn’t do it to upset her’ or ‘I did it for lad points’ and that’s a viable defence – because of the motivation requirement.”
For people like Kitty Wenham, this focus on the perpetrator’s motivations, rather than the victim’s distress, has prevented any form of criminal justice for image-based abuse. Earlier this year, Kitty’s mental health was left in tatters after intimate images of her were shared across social-media platforms without her consent.
When she reported the incident to police, however, she was told there was little they could do. “They basically said it didn’t count as revenge porn because the person posting them wasn’t asking for anything, wasn’t blackmailing me,” she explains.
Aside from being unable to escalate her case, Kitty was further discouraged by the fact that the police seemed inadequately trained to deal with victims like herself, with officers “not seeming to understand how all the social-media sites worked. I had to explain Instagram at one point”.
She adds that, after supplying screenshots showing the non-consensually shared images to the police, she suspected they were “looked at in a room with other people in it... that made me really uncomfortable”.
Victims of image-based abuse report a “mixed bag” of experiences in reporting to the police, says Dr Johnson, with negative experiences often leading, as in Kitty’s instance, to victims dropping their case.
Only 24 per cent of victims even report incidents in the first place. Negative experiences are often driven by a lack of training among police, says Dr Johnson, with a 2017 survey revealing 95 per cent of police had no training on “revenge porn” legislation whatsoever.
In some cases, there simply isn’t legislation to be trained on: no law is in place against the “deepfake porn” Helen was a victim of, nor against threats to share intimate images online, something one in seven British women aged 18 to 34 has experienced.
“The law simply isn’t fit for purpose,” Julia Mulligan explains, citing the loopholes and omissions of current legislation as the reason why, in spite of rising cases, prosecutions under the legislation have fallen dramatically.
Dr Johnson says a huge cultural shift is required. “We have to invest in good education programmes and preventative programmes, particularly for things like sexual ethics, victim blaming... too much education out there focuses on the taking of the images rather than the sharing without consent.”
It’s a sentiment Helen echoes, saying that old-school misogyny plays a significant role in driving incidents and attaching shame to victims.
“My first instinct was to hide, to think ‘Maybe I shouldn’t have had a presence on social media’ – but that’s exactly the same logic as telling rape victims ‘You shouldn’t have gone out dressed like that’.
“That’s why I wanted to put my name to this experience. I want to say, ‘Actually, this did happen to me and I shouldn’t feel ashamed about it. The person who did it should feel ashamed’.”