Press-Telegram (Long Beach)

Report documents how easily children can access graphic images

- By Tiffany Hsu

Violent, distressing imagery related to the conflict between Hamas and Israel, including graphic posts showing dead children and adults, is easily accessible to young users on platforms such as Instagram, researchers have found.

The Institute for Strategic Dialogue, a research group that studies online platforms, created accounts on Instagram, TikTok and Snapchat under the guise of British 13-year-olds. Within a 48-hour period from Oct. 14 through 16, the researchers said, they found more than 300 problematic posts. More than 78% of the posts were on Instagram, and about 5% were on Snapchat. The figures were released in a report Wednesday.

The researchers said they switched on Instagram's Sensitive Content Control feature and TikTok's Restricted mode — which are meant to shield young users from potentially risky material — before running their searches.

Despite policies and features meant to protect increasingly online youth, the researchers found that grisly content was not difficult to find: 16.9% of the posts that surfaced when searching for the “Gaza” hashtag on Instagram were graphic or violent, compared with 3% on TikTok and 1.5% on Snapchat. TikTok's search function was sometimes automatically populated with phrases like “Gaza dead children” and “dead woman Gaza,” the researchers found.

“In times of conflict, where misinformation and disinformation run rampant, it becomes even more critical to safeguard young people from the potential emotional impact of such material, and provide the support necessary to process and contextualize this type of content,” Isabelle Frances-Wright, an author of the report, said in an emailed statement.

Meta, which owns Instagram, addressed its efforts to balance safety and speech in a blog post about the war on Friday. It noted that it established a special operations center with expert monitors working in Hebrew and Arabic, who removed or flagged more than 795,000 pieces of harmful content in the first three days of the conflict. The company also said that Instagram allows users to control how much sensitive content they are recommended.

In its own blog post last weekend, TikTok said it had also opened a command center and added more Arabic- and Hebrew-speaking moderators, removing more than 500,000 videos and closing 8,000 livestreams since Hamas' attack on Oct. 7. The platform said it is automatically detecting and removing graphic and violent content, placing opt-in screens over disturbing images and adding restrictions to its livestreaming function amid the hostage situation.

Snapchat's parent company, Snap, said in a statement that it is "continuing to rigorously monitor" the platform and "determining any additional measures needed to mitigate harmful content." The platform does not have an open newsfeed or livestreaming abilities, which limits harmful content from going viral, the company said.

Amid a flood of posts about the war, some schools have urged parents to delete their children's online accounts to shield them from Hamas' attempts at psychological warfare. (Hamas accounts have been blocked by platforms like Instagram and TikTok but remain active on Telegram.) The CEO of parental app BrightCanary told USA Today that online searches for hostages among users between ages 9 and 13 surged 2,800% in recent days.
