Glamorgan Gazette

Social media firms ‘must get a grip’ on abuse images

- JEMMA CREW Press Association newsdesk@walesonline.co.uk

SOCIAL media companies must not encrypt messages unless they can guarantee they can keep platforms free of illegal content, an inquiry has warned.

Chaired by Ogmore MP Chris Elmore, the All-Party Parliamentary Group (APPG) on Social Media is calling for companies to step up and do more to protect children from online grooming and sexual abuse.

It launched its inquiry into the “disturbing” rise of so-called “self-generated” child sexual abuse material last November.

The cross-party MPs say the Home Office must review legislation to ensure it is as easy as possible for children to have their images removed from the internet.

Self-generated content can include material filmed using webcams, very often in the child’s own room, and then shared online.

In some cases, children are groomed, deceived or extorted into producing and sharing a sexual image or video of themselves.

The report, entitled Selfie Generation: What’s Behind The Rise Of Self-Generated Indecent Images Of Children?, says the trend “seems to have been exacerbated by the Covid-19 crisis”.

Experts believe an increase in the number of offenders exchanging child sex abuse material during lockdowns may stimulate demand beyond the pandemic.

The MPs say many witnesses “raised very real concerns” about the impact of encryption on child protection, saying it could “cripple” the ability of programs to detect illegal imagery.

They write: “The APPG believes it is completely unacceptable for a company to encrypt a service that has many child users.

“Doing this would do so much damage to child protection.

“We recommend that technology companies do not encrypt their services until a workable solution can be found that ensures equivalency with the current arrangements for the detection of this imagery.”

Labour MP Mr Elmore said social media companies must be more proactive in rooting out abusive images, and be clear to young users how they can complain about them.

He said: “It’s high time that we take meaningful action to fix this unacceptable mess.

“Children are daily at real risk of unimaginable cruelty, abuse and, in some instances, death.

“Social media companies are fundamentally failing to discharge their duties, and simply ignoring what should be an obvious moral obligation to keep young users safe.

“They need to get a grip, with institutional redesign, including the introduction of a duty of care on the part of companies toward their young users.”

The term “self-generated” should “not be taken to imply that such children have any share in the moral responsibility for their abuse”, he added.

Among 10 recommendations, the report says the term should be replaced by “first-person-produced imagery” to avoid inadvertent victim-blaming.

Susie Hargreaves, director of the UK Safer Internet Centre, said: “We see the fallout of abuse and, when children are targeted and made to abuse themselves on camera by criminal adult predators, it has a heartbreaking effect on children and their families.

“There is hope, and there are ways for children and young people to fight back. The Report Remove tool we launched this year with Childline empowers young people to have illegal images and videos of themselves removed.”

She added: “New legislation will also help make a difference, and the forthcoming Online Safety Bill is a unique opportunity to make the UK a safer place to be online, particularly for children.”

Ogmore MP Chris Elmore
