Yorkshire Post

FACING ONLINE NIGHTMARE

Poet Helen Mort is among the women sharing their experiences of image-based abuse to highlight the loopholes leaving victims in limbo. Sarah Wilson reports.

■ Email: sarah.wilson1@ypn.co.uk ■ Twitter: @sarahirwilson2

My first instinct was to hide but I wanted to put my name to this experience – I want to say this did happen to me and I shouldn’t feel ashamed. The person who did it should feel ashamed. Helen Mort, on why she decided to speak out after fake graphic images of her were put online.

WHEN AN acquaintance of Helen Mort’s said they had some bad news to deliver late last year, the Sheffield-based poet envisioned all manner of worst-case scenarios.

“I got really panicky... I remember thinking, ‘Maybe something’s happened to my son, maybe something really dreadful has happened’,” she recalls.

The revelation, however, turned out to be something Helen could never have imagined: the acquaintance had seen what looked like explicit images of Helen on the internet.

“My first reaction was, ‘That’s impossible’,” she explains. “I have never ever shared any kind of intimate photo of myself anywhere or with anyone – so how could I possibly have ended up on a porn site?”

When she finally plucked up the courage to look for herself, Helen discovered that someone posing as her “boyfriend” had taken images of her face from various social-media accounts and superimposed them on to explicit images, saying he wanted to see Helen “abused and humiliated” and inviting others to make further “fakes”.

“Some of them were almost comically bad”, she recalls. “But there were a few which looked much more realistic. Those were the most disturbing of all.”

Helen had unknowingly become a victim of “deepfake porn”, an emerging form of image-based abuse in which a person’s photograph is manipulated into explicit, sometimes violent, videos or stills. The discovery led to extreme anxiety, stress and paranoia, with Helen suffering recurring nightmares about the images and feeling temporarily unable to leave the house.

Yet when she approached the police, they were unable to take any action. Currently no law exists against making or sharing deepfake images, no matter how realistic.

Helen’s story is just one example of the myriad ways current legislation is failing victims of image-based abuse, a behaviour now so endemic that the UK’s Revenge Porn Helpline recorded an average of nine reports per day in 2020 – its busiest year on record.

So-called “revenge porn”, which entails the sharing of private sexual photos or videos without consent, was outlawed in England and Wales in 2015.

The very first conviction, however, exposed glaring holes in the new legislation: namely that revenge porn is classed as a communications offence, not a sexual offence, meaning victims are not granted an automatic right to anonymity.

The first man to be convicted of “revenge porn” in England was charged in Yorkshire, and his victim thus saw her name and details of the case emerge in the press.

“Her name and situation were spread all over the national media, causing significant distress and anxiety”, explains Julia Mulligan, the North Yorkshire Police and Crime Commissioner who subsequently spearheaded the #NoMoreNaming campaign, calling for anonymity for victims.

In some cases, says Dr Kelly Johnson, an Assistant Professor at the University of Durham and expert in image-based abuse, this lack of anonymity can retraumatise victims if cases go to court.

“There have been examples where cases have gone to court and a victim’s intimate images have gone viral again, because all you need to know is the victim’s name and you can search and access them online”.

Some victims of “revenge porn” are unable even to get this far, says Dr Johnson, because of the law’s focus on the motivations of the perpetrator in sharing the intimate images.

“The law has a motivation requirement attached to it, meaning you have to prove the perpetrator was motivated by causing the victim distress,” she explains.

“Police have been unable to take cases forward because the perpetrator will say ‘I didn’t do it to upset her’ or ‘I did it for lad points’ and that’s a viable defence – because of the motivation requirement.”

For people like Kitty Wenham, this focus on the perpetrator’s motivations, rather than the victim’s distress, has prevented any form of criminal justice for image-based abuse. Earlier this year, Kitty’s mental health was left in tatters after intimate images of her were shared across social-media platforms without her consent.

When she reported the incident to police, however, she was told there was little they could do. “They basically said it didn’t count as revenge porn because the person posting them wasn’t asking for anything, wasn’t blackmailing me,” she explains.

Aside from being unable to escalate her case, Kitty was further discouraged by the fact that the police seemed inadequately trained to deal with victims like herself, with officers “not seeming to understand how all the social-media sites worked. I had to explain Instagram at one point”.

She adds that, after supplying screenshots showing the nonconsensually shared images to the police, she suspected they were “looked at in a room with other people in it... that made me really uncomfortable”.

Victims of image-based abuse report a “mixed bag” of experiences when reporting to the police, says Dr Johnson, with negative experiences often leading, as in Kitty’s case, to victims dropping their complaints.

Only 24 per cent of victims even report incidents in the first place. Negative experiences are often driven by a lack of training among police, says Dr Johnson, with a 2017 survey revealing 95 per cent of police had no training on “revenge porn” legislation whatsoever.

In some cases, there simply isn’t legislation to be trained on: no law is in place against the “deepfake porn” of which Helen was a victim, nor against threats to share intimate images online, something one in seven British women aged 18 to 34 has experienced.

“The law simply isn’t fit for purpose,” Julia Mulligan explains, citing the loopholes and omissions of current legislation as the reason why, in spite of rising cases, prosecutions have fallen dramatically.

Dr Johnson says a huge cultural shift is required. “We have to invest in good education programmes and preventative programmes, particularly for things like sexual ethics, victim blaming... too much education out there focuses on the taking of the images rather than the sharing without consent.”

It’s a sentiment Helen echoes, saying that old-school misogyny plays a significant role in driving incidents and attaching shame to victims.

“My first instinct was to hide, to think ‘Maybe I shouldn’t have had a presence on social media’ – but that’s exactly the same logic as telling rape victims ‘You shouldn’t have gone out dressed like that’.

“That’s why I wanted to put my name to this experience. I want to say, ‘Actually, this did happen to me and I shouldn’t feel ashamed about it. The person who did it should feel ashamed’.”

PICTURES: JONATHAN GAWTHORPE. FIGHTING BACK: Helen Mort, top, has spoken about being a victim of image-based abuse – an issue Julia Mulligan, above, has campaigned on.
