The Guardian Australia

What TikTok does to your mental health: ‘It’s embarrassing we know so little’

- Kari Paul

In the few years since its launch, TikTok has already altered the face of the social media landscape, attracting more than 1 billion users and leading competitors to replicate some of its most distinctive features.

The impact of that explosive growth, and of the ‘TikTok-ification’ of the internet at large, on social media users remains little understood, experts warn, deepening existing concerns about social media’s effects on our habits and mental health.

“It’s embarrassing that we know so little about TikTok and its effects,” said Philipp Lorenz-Spreen, a research scientist at the Max Planck Institute for Human Development in Berlin. “Research often lags behind industry, and this is an example of an instance where that could become a big problem.”

The lack of understanding of how TikTok affects its users is particularly concerning given the app’s massive popularity among young people, experts say. Increasingly called “the TikTok generation”, Gen Z prefers the platform to other social media, with nearly six in 10 teenagers counting themselves as daily users. The majority of US teens have accounts on TikTok, with 67% saying they have used the app at some point and 16% saying they use it “almost constantly”.

“We owe it to ourselves and to the users of these platforms to understand how we are changed by the screens we use and how we use them,” said Michael Rich, a pediatrician who studies the impact of technology on children at Boston Children’s hospital.

“We need more information to make informed decisions on how we’re going to help younger people understand how to use them thoughtfully and mindfully – or not use them at all.”

What makes TikTok different

Concerns about the mental health impacts of social media activity are longstanding, and have only intensified in recent years. In 2021, for example, internal research at Instagram made public by the whistleblower Frances Haugen showed the drastic mental health impacts of the photo app on teen users – including increased rates of eating disorders among teen girls – and sparked widespread calls for stronger regulation.

But TikTok hosts similarly harmful content, and experts warn that a host of the platform’s innovative features raise unique concerns of their own.

TikTok largely optimizes content for minutes and hours of view time, internal documents leaked in 2021 showed, rather than prioritizing metrics such as clicks and engagement favored by earlier social media platforms. To do so, the company has deployed a unique algorithm and a landing page that mark the most extreme departure yet from a chronological to an algorithmic feed.

“What that does to the brain, we don’t know,” said Lorenz-Spreen.

Studies show that when chronological feeds are discarded in favor of suggested content, the algorithm frequently gives rise to more extreme views. One 2021 report found that more than 70% of extremist content viewed on YouTube had been recommended to users by the algorithm. The design also incentivizes users to share attention-grabbing content that gets picked up by the feed.

In recent years, TikTok has faced intense scrutiny for dangerous challenges the algorithm has given rise to. The “Benadryl challenge”, wherein participants took a large amount of antihistamines in an attempt to produce hallucinogenic effects, led to at least one death. A new lawsuit claims the “blackout challenge” led to the deaths of several young girls.

“Compared to other social media sites, TikTok is uniquely performative,” said Rich, the pediatrician. “This leads to both interesting content, and some edgy ways of seeking attention that are less healthy.”

TikTok also appears to be “faster than any other platform at detecting interest”, said Marc Faddoul, co-director of Tracking Exposed, a digital rights organization investigating TikTok’s algorithm. The app’s For You Page seems to know its users’ desires and interests so well it has sparked memes and articles such as The TikTok Algorithm Knew My Sexuality Better Than I Did and ‘Why is My TikTok For You Page All Lesbians?’ Asks Woman Who is About to Realize Why.

Researchers are still parsing what that uncanny tailoring means for users, particularly as it relates to targeted content around mental illness and other sensitive issues.

“The app provides an endless stream of emotional nudges, which can be hard to recognize and really impact users in the long run,” Faddoul said. “It’s not going to make anyone depressed overnight, but hours of consumption every day can have a serious impact on your mental health.”

These concerns are particularly pronounced in the realm of ADHD content, where users have reported being diagnosed by medical professionals after seeing videos about their symptoms. But while the prevalence of the #ADHD hashtag has brought increased awareness of the condition, experts have warned of unintended negative effects, including medical misinformation, especially as the platform accepts advertising money from a number of for-profit mental health startups such as Cerebral.

TikTok declined to comment on criticisms relating to health misinformation and users self-diagnosing based on content seen on the app. It also declined to comment on its partnership with the mental health startup Cerebral or its policies on medical information used in advertisements.

The algorithm may replicate existing inequalities that heighten mental health concerns for minority groups, researchers say. Black content creators on TikTok have long complained about their content being “shadow-banned”, or demoted by the algorithm, and in 2019 TikTok admitted to censoring videos from users it identified as disabled, overweight or LGBTQ+ in a misguided attempt to crack down on bullying.

“People of color on TikTok are constantly having to think about the ways in which the algorithm is surveilling them,” said Chelsea Peterson-Salahuddin, an internet researcher at the University of Michigan School of Information. “Putting the onus on marginalized people to constantly monitor themselves is very mentally and emotionally taxing.”

‘It creates a replacement for social interaction’

Researchers say the Covid-19 pandemic has illustrated the impact of the platform on users’ lives, especially young ones. When Covid-19 hit and the world went into lockdown, TikTok use exploded.

The app was flooded with young people posting about the ways in which the pandemic was upending their lives. What has resulted is a very young user base taking advantage of the app to connect with one another during a very vulnerable time, said Yim Register, a researcher who studies mental health and social media.

“The largest effect of the pandemic is being faced with large uncertainty, and under uncertainty our brains want to reduce uncertainty and make sense of the world,” Register said. “We want to be able to accurately predict what’s going to happen and we turn to social media to sense-make collectively.”

Register said that ethos had contributed to TikTok’s unique “platform spirit”, a term coined by researcher Michael Ann Devito to characterize the nature of content and communication on a given app.

“The platform spirit of TikTok seems to be about posting very loudly about very intimate and intense things,” Register said. “And people are encouraged to be vulnerable to fit that spirit.”

This has given rise to viral videos using a wry, ironic tone to share often devastating personal stories. “Things people on the internet have said to me since my sister passed away from addiction,” says one video, with 3.5m views, featuring a user dancing to upbeat music and lights. “Things my ex-boyfriend said to me as I held my lifeless babies,” the caption on another video using the same music and dancing reads.

Backlash has already emerged on the platform itself over the increasingly personal nature of the app. “I truly believe years from now people will deeply regret trauma dumping on TikTok,” a user says in one viral video, adding that such content is less likely to be shared on Facebook and YouTube. “What is it about TikTok that drives people to reveal their deepest, dirtiest secrets?”

Experts agree, saying that while these kinds of videos can offer support and a creative way to deal with grief, they can also lead to additional trauma.

“For many people, disclosing abuse or mental health issues can be traumatic and harmful,” said Rich, the children’s mental health expert. “In clinical work, we have systems in place for if a disclosure occurs – there is a safety net to catch them. And that does not exist in a social media environment.”

The dangers are heightened by the anonymous nature of TikTok, whose feed differs from those of earlier social media, researchers say. While apps such as Facebook historically offered a feed of personal content primarily from friends and family, on TikTok most of the people who see a user’s videos are strangers.

“With TikTok in particular, because of its large user base and the way its algorithm works, videos have the potential to get very big very fast, and not everyone is prepared for that,” Register said. “There are serious consequences to going viral.”

Often commenters will demand more engagement on viral TikToks, with a common refrain of “story time?” encouraging the original poster to elaborate on the traumatic share. Register said issues like these have led more researchers to call for better protections of users.

“Most computing is not trauma informed, and when social media is not trauma informed it can exacerbate trauma,” Register said. “When I look at social media, the question is not how it affects your mental health, but how do mental health issues you already have get exacerbated by its design?”

TikTok in March 2021 introduced new tools “to promote kindness” on the app, allowing users to more easily filter spam and offensive comments. It also added an automatic pop-up prompt for users leaving potentially violating comments, asking them “to reconsider”.

“Our goal is to promote a positive environment where people support and lift each other up,” said Tara Wadhwa, director of US policy at TikTok.

Meanwhile, TikTok’s opaque algorithm is slowly being cracked open. In August, Chinese regulators required TikTok to open up its algorithms for review, and around the same time the company began to allow Oracle to audit its content moderation models. Rich said this was just the beginning, and more transparency was needed.

“Legislators and these companies need to invest more in really understanding this interface between human nature and these platforms,” he said.


TikTok users have reported self-diagnosing mental health issues based on content seen on the app. Photograph: Peter Cripps/Alamy
