Business Day

How TikTok keeps pushing suicide to vulnerable kids

• Platform’s algorithm barrages depressed users with negative content — even when they are dead

- Olivia Carville

TikTok’s algorithm does not know Chase Nasca is dead. More than a year after Nasca killed himself at age 16, his account remains active.

Scroll through his For You feed, and you see an endless stream of clips about unrequited love, hopelessness, pain and what many posts glorify as the ultimate escape: suicide.

“Take the pain away. Death is a gift,” says one video pushed to the account in February 2023, days before the first anniversary of Nasca’s death. In another, a male voice says, “I’m going to put a shotgun in my mouth and blow the brains out the back of my head,” and a female voice responds: “Cool.”

The feed looked much the same in the days before Nasca died. On February 13 2022, it surfaced a video of an oncoming train with the caption “went for a quick lil walk to clear my head”. Five days later, Nasca stopped at the Long Island Rail Road tracks that run through the hamlet of Bayport, New York, just under a kilometre from his house. He leaned his bike against a fence and stepped onto the track at a blind curve his parents had warned him about since he was old enough to walk. He sent a message to a friend: “I’m sorry. I can’t take it anymore.” A train rounded the bend, and he was gone.

It is impossible to know why Nasca ended his life. There are often multiple factors leading to suicide, and he left no note. But two weeks after his death, his mother, Michelle, started searching his social media accounts, desperate for answers. When she opened the TikTok app on his iPad, she found a library of more than 3,000 videos her son had bookmarked, liked, saved or tagged as a favourite. She could see the terms he had searched for: batman, basketball, weightlifting, motivational speeches. And she could see what the algorithm had brought him: many videos about depression, hopelessness and death.

Since TikTok exploded into popular culture in 2018, people have been trying to understand the short-form video platform and its effect on children. Owned by Chinese internet company ByteDance, the app reached 1-billion downloads faster than any previous social media product.

Its success stems from its stickiness. The algorithm underlying its recommendation engine delivers a carousel of riveting user-created content to keep people staring at their screens. TikTok has become so popular — used by 150-million Americans, according to the company — that Silicon Valley rivals are trying to mimic it.

Politicians are stoking fears that it could be used as a disinformation tool by the Chinese government. In March, US President Joe Biden’s administration threatened to ban the app — as did the Trump administration — if ByteDance does not sell its stake.

As the political debate carries on, researchers and child psychologists are watching with increasing alarm. Surveys of teens have revealed a correlation between social media use and depression, self-harm and suicide. Centres for Disease Control and Prevention data show nearly one in four teens said they had considered killing themselves in 2021, nearly double the level a decade earlier. The American Psychological Association and other authorities pin the blame partly on social media.

At a congressional hearing in March, a representative brought up Nasca’s death, showing TikTok CEO Shou Chew some of the clips the app had sent the boy and asking if Chew would let his own children watch such content. That same month, Nasca’s parents filed a wrongful-death lawsuit in New York state court against TikTok, ByteDance and the railroad.

TikTok says it cannot comment on pending litigation, but a spokesperson, Jamie Favazza, says the company is committed to the safety and wellbeing of its users, especially teens. “Our hearts break for any family that experiences a tragic loss,” she says. “We strive to provide a positive and enriching experience and will continue our significant investment in safeguarding our platform.”

TikTok’s original recommendation algorithm was designed by a team of engineers in China, working for ByteDance. But while the app was made in China, it is used almost everywhere except China. It cannot even be downloaded in its homeland.

TikTok says its algorithm is now maintained by engineers around the world, with teams based in North America, Europe and Asia contributing. But more than a dozen former employees from the company’s trust and safety team who were interviewed by Bloomberg say executives and engineers in Beijing still hold the keys.

The trust and safety team designs features and policies to keep TikTok users safe. Based in the US, Ireland and Singapore, it moderates the billions of videos uploaded to the platform every day and is responsible for safety issues such as content that sexualises minors and viral challenges that encourage children to take part in dangerous dares.

Team members remove posts that violate standards and create tools to help users filter out harmful material.

But the former employees, who spoke on condition of anonymity because they signed nondisclosure agreements, say that they had little influence over the algorithm that drives the For You feed and that their requests for information about how it works were often ignored. They insist that they were set up to fail — asked to enhance the safety of an app whose underpinnings they were unable to comprehend.

Michelle still recalls exactly what the first video she saw after gaining access to her son’s account said: “I’m caught in a life I didn’t ask to be in.” She watched Chase’s For You feed for more than an hour and could not understand why there were no happy or funny videos, which is what she thought TikTok was about. She asked one of Chase’s two older brothers why he had made his account so dark.

“Chase didn’t do that, Mom,” her son replied. “That’s coming from the algorithm.”

In a world of infinite information, algorithms are rules written into software that help sort out what might be meaningful to a user and what might not. TikTok’s algorithm is trained to track every swipe, like, comment, rewatch and follow and to use that information to select content to keep people engaged. Greater engagement, in turn, increases advertising revenue. The company has fine-tuned its recommendation system to such a degree that users sometimes speculate that the app is reading their minds.
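
The loop that paragraph describes can be sketched in a few lines of Python. Everything in the sketch is hypothetical (the weights, the topics and the ToyRecommender class are illustrative assumptions, not TikTok's actual system): each engagement signal nudges a per-topic score, and the feed is simply whatever ranks highest, whether the topic is cheerful or bleak.

```python
# Hypothetical illustration of an engagement-driven recommender loop.
# None of these names, weights or topics come from TikTok; they are assumptions.
from collections import defaultdict

# How strongly each kind of engagement signal counts toward a topic score.
ENGAGEMENT_WEIGHTS = {"rewatch": 3.0, "like": 2.0, "comment": 2.5, "follow": 4.0, "skip": -1.0}


class ToyRecommender:
    def __init__(self):
        # Learned affinity of one user for each content topic.
        self.topic_affinity = defaultdict(float)

    def record(self, topic: str, signal: str) -> None:
        # Every swipe, like, comment, rewatch or follow nudges the profile.
        self.topic_affinity[topic] += ENGAGEMENT_WEIGHTS.get(signal, 0.0)

    def rank(self, candidates: list[dict]) -> list[dict]:
        # Serve whatever the profile predicts will keep the user watching,
        # with no notion of whether a topic is uplifting or harmful.
        return sorted(candidates, key=lambda v: self.topic_affinity[v["topic"]], reverse=True)


feed = ToyRecommender()
feed.record("sad", "rewatch")   # the user lingers on one sad clip
feed.record("sad", "like")
feed.record("sports", "skip")

candidates = [
    {"id": 1, "topic": "sports"},
    {"id": 2, "topic": "sad"},
    {"id": 3, "topic": "comedy"},
]
print([v["id"] for v in feed.rank(candidates)])  # -> [2, 3, 1]: sad content now ranks first
```

In this toy version, two lingering interactions with sad clips are enough to push that topic to the top of the feed, which is the same dynamic researchers describe at far larger scale.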

Other social media platforms employ similar recommendation engines. TikTok’s is distinguished by its reach, according to Guillaume Chaslot, a French data scientist who worked on YouTube’s algorithm and now consults with his country’s government on its efforts to regulate online platforms. His experience in the field suggests to him that TikTok’s algorithm controls a greater share of the content reaching a user’s feed than those of most other social media platforms. “When depressive content is good for engagement, it is promoted by the algorithm,” he says.

Concerns about TikTok’s recommendation engine have been raised internally since at least 2020. That was when Charles Bahr, a former advertising sales manager in TikTok’s office in Germany, says he warned his superiors the algorithm was sending Generation Z users endless streams of depressing and suicide-glorifying videos.

Bahr spent a year and a half with the company, joining in July 2020, at age 18. He had founded two tech start-ups as a teenager and was advising politicians and businesses on how to master TikTok when he was hired.

When he first started using the app, he says, his For You feed was amusing and fun. He loved the product and was proud to wear his TikTok T-shirt. Once he started posting videos identifying himself as an employee, though, many in his growing following began to forward him disturbing videos that violated TikTok’s rules, urging him to remove them.

One of the first scary videos he remembers being sent was of a man shooting himself in the head. As Bahr watched clips like this, sometimes passing them to the trust and safety team for help, his feed began to warp. “More and more depression, suicide and self-harm content came on,” he says. Some days it brought him to tears.

Bahr’s feed made selling ads tricky. He regularly held workshops with prospective clients, and many asked to see how the app worked. He could not show his own For You page, he says, because he feared it would scare them off. “Every time I entered a workshop, I switched from my sad, dark account to a second demo account that had quite a normal feed,” he says. “It took me a long time to realise that maybe it’s not only me that has a feed that’s so extreme.”

When Bahr was invited to speak at an annual meeting of TikTok’s European communications team in November 2020, he saw it as an opportunity to raise the issue. In a PowerPoint presentation reviewed by Bloomberg, he told the group that TikTok should make it a mission to listen to its users, especially those struggling with mental health issues.

“Even though we inspire young people to be their most creative selves on TikTok, there is an endless community of young people not knowing where to go,” one of his slides said. He then showed three posts from young users struggling with depression. Bahr says he recommended that the app not censor such content but instead elevate more positive clips for younger users.

Seven months later, the Wall Street Journal published an investigation that involved monitoring more than 100 automated accounts to track how TikTok’s algorithm works. Within 36 minutes, the newspaper reported, a bot programmed to engage with videos about depression was fed a stream of content that was 93% about sad topics. TikTok said at the time that the Journal’s bots were not representative of human behaviour.

The Journal’s experiment prompted Bahr to conduct one of his own. He opened a new TikTok account and made a screen recording of himself as he engaged with sad content to see how long it would take for his feed to become negative. It took 17 minutes.

Bahr says he raised his concerns on an internal messaging system with the algorithm strategy team in Europe but got no response. A few months later he was fired for alleged expense account fraud and misuse of company tools. Bahr, who maintains his innocence, sued the company for wrongful dismissal. TikTok did not pursue its claims against Bahr and settled out of court.

The company did not respond to requests for comment about Bahr’s 2020 presentati­on and said it could not respond to his general criticisms or to the concerns he raised internally, which it said it “can’t validate”.

Psychologists say it is more difficult for teens to withstand the addictive properties of algorithms, because their prefrontal cortexes, responsible for decision-making, judgment and impulse control, are not fully developed. Two-thirds of US teens use TikTok every day, according to a 2022 Pew Research Centre survey, with 16% saying they are on the platform almost constantly.

A majority of those surveyed said that they had a positive experience on social media in general and that it gave them a sense of belonging. But almost half said they felt overwhelmed by the drama they found there, and more than a quarter said social media made them feel worse about their lives.

Social media is a fiercely competitive industry, dependent on a young and fickle audience. Companies rely on algorithms to keep their platforms cool in the eyes of teen users, and they guard this intellectual property closely. The lack of transparency has limited academic research and given rise to conflicting claims. On one hand, the platforms provide crucial opportunities for connection among teens. On the other, they encourage children to compare themselves to others, to become addicted to the technology and to discover content that glamorises harmful behaviour.

Former members of TikTok’s trust and safety team say they feared their platform was having a negative effect on teens and did not understand why the company was not hiring child psychologists to work on algorithm design.

Many read the documents leaked in 2021 by Frances Haugen, then a data scientist at Facebook, which showed the company was aware its products were harming children. The former TikTok employees say they believed their app’s harms could be worse than Facebook’s, but did not have the power to deal with the problem, or even to study it.

While practically all tech companies are secretive about their data, these insiders, who also had experience working for Google, Meta Platforms and Twitter, cast TikTok as Fort Knox by comparison. The secrecy was especially pronounced when it came to the algorithm. Former trust and safety members say that they were never privy to information about how it worked, how it was weighted and how it could be changed and that team leaders could not get answers from the engineers who designed it.

More than a dozen people, some of whom were still with TikTok as recently as 2022, say they were stonewalled when they tried to access basic information about the algorithm. One safety leader in Europe says he asked the Beijing-based head of engineering to host a meeting with the broader trust and safety team so they could ask questions and better understand the recommendation engine. The leader says the request was ignored.

TikTok says that it takes concerns voiced by employees seriously, that members of the trust and safety team work directly with engineering and that anyone who left the company before 2021 would not be familiar with changes made since then. It also says that all important documents about algorithm changes and most important commentary accompanying its code are written in English, but that some notes in the code can contain other languages used by its engineers in Asia and elsewhere.

All social media platforms have been criticised for pumping addictive content to teens, for harming their mental health and for surreptitiously gathering data on them. But when a Chinese-owned company does these things, US legislators tend to paint it in a sinister light.

Fears that China’s government could require TikTok to hand over its user data or to have the app’s recommendation engine favour Chinese interests have prompted the EU, the US, Canada and other countries to ban the app from government-issued devices. Some US college campuses have blocked it from their Wi-Fi networks, and in April, Montana’s legislature became the first to pass a bill blocking the app on all personal devices.

At the March congressional hearing, Chew, the CEO, was hammered with questions about TikTok’s connection to Beijing and its relationship with the Chinese Communist Party. His attempts to parry the blows — “this is American data on American soil by an American company overseen by American personnel” — were sceptically received.

At one point, Florida Republican Gus Bilirakis introduced Chase Nasca’s parents, who were in the audience. “Mr Chew, your company destroyed their lives,” Bilirakis said. “Would you share this content with your two children?” He then played a 30-second screen recording from Nasca’s TikTok account, after which he asked Chew whether he took responsibility for the app’s algorithm, yes or no. Chew started to answer, but Bilirakis cut him off. “Yes or no?” he demanded.

“We do provide resources for anyone who types in anything like suicide,” Chew said before Bilirakis interrupted again, declaring, “I see you’re not willing to answer the question or take any responsibility for your parent company, the technology and the harm it creates.”

TikTok says it has been working in recent months to remove some of the mystery around the app and to update the platform to make it safer for children. In 2020 it made it possible for parents to check on what their children are doing on TikTok. The next year it added stronger privacy measures to accounts of users under 16, setting them to private by default.

Then, in February, it announced it would grant US researchers access to some data on accounts and content, though it requires them to send their findings to TikTok before publication. In March, the company added a feature that notifies users under 18 when they have been on the platform for more than an hour and makes them enter a passcode to remain signed in. It also began allowing users to reset their For You recommendations as though they had just signed up for a new account if they felt the content they saw was too dark.

Ryn Linthicum, head of mental health policy at TikTok, says that the company forbids posts glorifying subjects such as suicide and self-harm and that it trains its artificial intelligence systems and 40,000 human moderators to remove them. In the last quarter of 2022, according to company data, TikTok filtered out more than 97% of such content before other users saw it, among the more than 85-million videos it took down overall.

Moderation is a continuous challenge, says Linthicum, who uses the pronoun “they” and has been in their role for a year. It is hard to train an algorithm to distinguish between crying from sadness and crying from joy, or between a video that raises awareness of depression or anorexia and one that encourages those conditions.

“This is an incredibly complex space,” Linthicum says. “What may be harmful or hurtful to one person is not necessarily what is going to be harmful or hurtful to another.”

In their view, some content should not be censored, because it can help vulnerable teens feel less stigmatised. “People go through ups and downs,” Linthicum says. “They have sad life experiences.”

More than 200 lawsuits have been filed against social media platforms since the start of 2022, many of them arguing that, even if the companies cannot be held liable for the content posted on their platforms, they should be accountable for harms built into their products.

The suits include at least two dozen filed by school districts against some combination of Instagram, Snapchat, TikTok, YouTube and their parent companies, alleging that they are responsible for the country’s youth mental health crisis.

Many of the others have been filed by the Social Media Victims Law Centre, the Seattle-based firm representing the Nasca family. In more than 65 cases, the centre alleges that social media products have caused sleep deprivation, eating disorders, drug addiction, depression and suicide.

Laura Marquez-Garrett, one of the centre’s attorneys, says the lawsuits against TikTok argue that its algorithm is designed to target vulnerabilities. “There’s a really dark side of TikTok that most adults don’t see,” she says. “You could have a child and a parent in the same room, together watching TikTok on their phones, and they’d be seeing an entirely different product.”

The Nasca family decided to sue TikTok after talking to the Social Media Victims Law Centre. “People need to know about the dangers of social media,” says Chase’s father, Dean. “Chase wasn’t bullied. He had a great group of friends, and he was excelling academically and athletically. He wasn’t doing drugs, he wasn’t drinking, so what was the variable that was introduced that put him over the edge? I can only conclude it’s this constant barrage of TikTok videos.”
