Las Vegas Review-Journal

Wasn’t TikTok supposed to be fun for its users?

By Shira Ovide, The New York Times Company

There is a predictable trajectory for social media apps. Many of them start out as helpful or even pure fun. But when they get popular enough, just about every app becomes a place for consequential discussions about politics and social issues, too. And with that come both meaningful conversations and a litany of nastiness.

This reality has come for TikTok. An app better known for viral dance videos has become a significant source of political and social misinformation, as my colleague Tiffany Hsu detailed in a recent article.

Ahead of Kenya’s recent presidential election, a widely shared TikTok post showed an altered, violent image of one of the candidates with a caption that described him as a murderer. (The post was eventually removed.) Falsehoods about diets and school shootings easily spread in the app, Tiffany reported, as have variations on the Pizzagate conspiracy.

And on the serious, even if not terrible, side, American politicians and their allies are embracing TikTok to spread their campaign messages and promote policies such as the Child Tax Credit.

This may not be exactly what TikTok has in mind. Executives have continued to describe TikTok as an entertainment app. And sure, most people use TikTok, Facebook, Pinterest, Nextdoor, YouTube and Twitch in fun, productive and informative ways.

But it is inevitable that apps must plan for what will go wrong when online conversations eventually encompass the full scope of human interest. That will include political information and social activist movements, as well as nasty insults and even incitements to violence and hawking of bogus products for financial gain.

“It’s the life cycle of a user-generated content platform that once it reaches a critical mass, it runs into content moderation problems,” said Evelyn Douek, an assistant professor at Stanford Law School whose research focuses on online speech.

The tricky part, of course, is how to manage apps that evolve from “We’re just for fun!” to “We take our responsibility seriously.” (TikTok said that almost verbatim in its blog post last week.)

Pinterest is best known for pretty posts for wedding planning or meal inspiration, but it also has policies to weed out false information about vaccines and steers people to reliable sources when they search for terms related to self-harm. Roblox is a silly virtual world, but it also takes precautions, such as exhorting people to “be kind,” in case children and young adults want to use the app to do harmful things such as bullying someone.

TikTok knows that people use the app to discuss politics and social movements, and with that come potential risks. On Wednesday, TikTok laid out its plans to protect the 2022 U.S. elections from harmful propaganda and unsubstantiated rumors.

Maybe more so than other apps, TikTok doesn’t start with a presumption that each post is equally valid or that what becomes popular should be purely the will of the masses. TikTok creates trending hashtags, and reporters have found that the app may have tried to direct people away from some material, such as posts about Black Lives Matter protests.

(TikTok is owned by the Chinese technology company ByteDance. And posts on Douyin, ByteDance’s version of TikTok in China, are tightly controlled, as all sites in China are.)

Whether TikTok is more or less effective at managing humans than Facebook or YouTube is open to debate. So is the question of whether Americans should feel comfortable with an app owned by a Chinese company influencing people’s conversations.

To put it frankly, it stinks that all apps must plan for the worst of the human condition. Why shouldn’t Twitch just be a place to enjoy watching people play video games, without fans abusing the app to stalk its stars? Why can’t neighbors coordinate school bus pickups on Nextdoor without the site also harboring racial profiling or vigilantism? Can’t TikTok just be for fun?

Sorry, no. Mixing people with computerized systems that shine attention on the most compelling material will amplify our best and our worst.

I asked Douek how we should think about the existence of rumors and falsehoods online. We know that we don’t believe every ridiculous thing we hear or see, whether it’s in an app or in conversations at our favorite lunch spot. And it can feel exhausting and self-defeating to cry foul at every manipulated video or election lie online. It’s also counterproductive to feel so unsure about what to believe that we don’t trust anything. Some days it all feels awful.

Douek talked me out of that fatalism and focused on the necessity of a harm reduction plan for digital life. That doesn’t mean our only choices are either every single app becoming full of garbage or Chinese-style government control of internet content. There are more than two options.

“As long as there have been rules, people have been breaking them. But that doesn’t mean platforms shouldn’t try to mitigate the harm their services contribute to and try to create a healthier, rather than unhealthier, public sphere,” Douek said.
