The Guardian (USA)

Meta’s new parental tools will not protect vulnerable children, experts say

- Katie McQue and Mei-Ling McNamara

Social media giant Meta this week introduced new parental supervision tools, but child protection and anti-sex-trafficking organizations say the new measures offer little protection to the children most vulnerable to exploitation, and shift the responsibility for keeping users safe away from the company.

On Tuesday, Meta launched new features aimed at increasing parents’ awareness of their children’s activities on its platforms. For Messenger, its private message service, parents can now view and receive updates on their child’s contacts list and monitor who views any stories their child posts. On Instagram, the company has introduced a new notice to alert parents if their child has blocked somebody.

But safety features that rely on engaged families may not protect children who lack consistent supervision from a parent or guardian, such as those in the child welfare system or living in group homes, experts warn.

“An approach to safety that puts the onus on parents and carers is not enough on its own. Many young people may not be able to speak to a parent about online concerns, particularly children in care,” said Rani Govender, senior child safety online policy officer at the National Society for the Prevention of Cruelty to Children (NSPCC), a UK-based child protection charity. “Many parents will not have the technical knowledge or time to supervise their child’s social media use.”

A 2020 report from the Human Trafficking Institute (HTI), which includes the most recent child trafficking statistics across social media, found Facebook to be the site most often used to recruit and groom child trafficking victims (65%), with Instagram and Snapchat ranking second and third. Child sex trafficking is defined as the sexual exploitation of a child specifically as part of a commercial transaction, and under US law minors under 18 cannot consent to their own exploitation.

“Exploiters look for children online. In earlier days, they would look for them at the mall, but now they are looking for them on social media. Then they target that person and build a relationship with them,” said Lisa Goldblatt Grace, co-founder and director of My Life My Choice, a Boston-based non-profit organization supporting survivors of child sex trafficking.

In 2022, 84% of the trafficked children the organization served were in the care of the child welfare system, she says. Across the US, the National Foster Youth Institute estimates that as many as 60% of child sex trafficking victims have been in foster care or other group homes.

“When it comes to commercial sexual exploitation of children, we know that young people who do not have safe and invested parents are disproportionately at risk,” said Goldblatt Grace.

A Guardian investigation in April revealed how Meta is failing to detect or report the use of Facebook and Instagram for child trafficking, and uncovered how Messenger is being used as a platform for traffickers to communicate about buying and selling children.

“These new tools assume they have a parent or guardian watching them on social media,” said Tina Frundt, the founder of Courtney’s House, an organization supporting minority victims of child sex trafficking in Washington, DC. “Sex traffickers have groups all over Instagram and this is how they find kids. These are kids who are the most vulnerable in society, who may have a lack of parental support, mental health issues or little self-esteem.”

In June, Meta disclosed it had set up a taskforce to investigate Instagram’s role in the distribution and sale of child sexual abuse material.

However, Meta has undergone several rounds of layoffs since November amid plans to eliminate about 21,000 jobs to cut costs. Some of these cuts hit the company’s content moderation teams, whose employees are tasked with detecting and reporting child sexual abuse material and other graphic and abusive content on its platforms.

“Harms happening on digital platforms are continuously being framed as problems to be solved through increased user responsibility and parental intervention, rather than through meaningful systemic change,” said Lianna McDonald, executive director at the Canadian Centre for Child Protection, a charity focused on child safety.

The Canadian Centre for Child Protection and the NSPCC have repeatedly called for governments to introduce regulations that address the online safety of children.

“Meta has a fundamental responsibility to look at their sites and the algorithms that they use. Child safety online can feel like an uphill battle for even the most present of parents,” said Goldblatt Grace. “Meta has a responsibility to make its social media platforms safer for kids.”

In response to the Guardian’s request for comment, Sophie Voegel, a spokesperson for Meta, said: “The exploitation of children is a horrific crime – we don’t allow it and we work aggressively to fight it on and off our platforms. We proactively aid law enforcement in arresting and prosecuting the criminals who perpetrate these grotesque offenses.” She added that Meta had removed “over 34m pieces of child exploitation and trafficking content between October and December 2022 and have also reported tens of thousands of accounts of suspected traffickers over many years to the National Center for Missing and Exploited Children, which has repeatedly recognized us as an industry leader in the fight to keep young people safe online”.

“Far from replacing them, our parental supervision tools are intended to complement our existing safeguards to help protect teens from unwanted contact,” Meta also said. “These include defaulting teens into private accounts when they sign up to Instagram, preventing people over 19 from sending private messages to teens who don’t follow them and preventing adults who have shown potentially suspicious behaviour from finding, following and interacting with teen accounts.”

• In the US, call or text the Childhelp abuse hotline on 800-422-4453. In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.

Meta this week introduced new parental supervision tools. Photograph: Rachel Torres/Alamy
