Las Vegas Review-Journal

Facebook wrestles with the features it used to define social networking

By Mike Isaac

SAN FRANCISCO — In 2019, Facebook researchers began a new study of one of the social network’s foundational features: the Like button.

They examined what people would do if Facebook removed the distinct thumbs-up icon and other emoji reactions from posts on its photo-sharing app Instagram, according to company documents. The buttons had sometimes caused Instagram’s youngest users “stress and anxiety,” the researchers found, especially if posts didn’t get enough Likes from friends.

But the researchers discovered that when the Like button was hidden, users interacted less with posts and ads. At the same time, hiding the button did not alleviate teenagers’ social anxiety, and young users did not share more photos as the company had thought they might, leaving a mixed bag of results.

Mark Zuckerberg, Facebook’s chief executive, and other managers discussed hiding the Like button for more Instagram users, according to the documents. In the end, a larger test was rolled out in just a limited capacity to “build a positive press narrative” around Instagram.

The research on the Like button was an example of how Facebook has questioned the bedrock features of social networking. As the company has confronted crisis after crisis on misinformation, privacy and hate speech, a central issue has been whether the basic way that the platform works has been at fault — essentially, the features that have made Facebook be Facebook.

Apart from the Like button, Facebook has scrutinized its share button, which lets users instantly spread content posted by other people; its groups feature, which is used to form digital communities; and other tools that define how more than 3.5 billion people behave and interact online. The research, laid out in thousands of pages of internal documents, underlines how the company has repeatedly grappled with what it has created.

What researchers found was often far from positive. Time and again, they determined that people misused key features or that those features amplified toxic content, among other effects. In an August 2019 internal memo, several researchers said it was Facebook’s “core product mechanics” — meaning the basics of how the product functioned — that had let misinformation and hate speech flourish on the site.

“The mechanics of our platform are not neutral,” they concluded.

The documents — which include slide decks, internal discussion threads, charts, memos and presentations — do not show what actions Facebook took after receiving the findings. In recent years, the company has changed some features, making it easier for people to hide posts they do not want to see and turning off political group recommendations to reduce the spread of misinformation.

But the core way that Facebook operates — a network where information can spread rapidly and where people can accumulate friends and followers and Likes — ultimately remains largely unchanged.

Many significant modifications to the social network were blocked in the service of growth and keeping users engaged, some current and former executives said. Facebook is valued at more than $900 billion.

“There’s a gap between the fact that you can have pretty open conversations inside of Facebook as an employee,” said Brian Boland, a Facebook vice president who left last year. “Actually getting change done can be much harder.”

The company documents are part of the Facebook Papers, a cache provided to the Securities and Exchange Commission and to Congress by a lawyer representing Frances Haugen, a former Facebook employee who has become a whistleblower. Haugen earlier gave the documents to The Wall Street Journal. This month, a congressional staff member supplied the redacted disclosures to more than a dozen other news organizations, including The New York Times.

In a statement, Andy Stone, a Facebook spokesman, criticized articles based on the documents, saying that they were built on a “false premise.”

“Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own commercial interests lie,” he said.

He said Facebook had invested $13 billion and hired more than 40,000 people to keep people safe, adding that the company has called “for updated regulations where democratic governments set industry standards to which we can all adhere.”

In a post this month, Zuckerberg said it was “deeply illogical” that the company would give priority to harmful content, because Facebook’s advertisers don’t want to buy ads on a platform that spreads hate and misinformation.

“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” he wrote.

The foundations of success

When Zuckerberg founded Facebook 17 years ago in his Harvard University dorm room, the site’s mission was to connect people on college campuses and bring them into digital groups with common interests and locations.

Growth exploded in 2006 when Facebook introduced the News Feed, a central stream of photos, videos and status updates posted by people’s friends. Over time, the company added more features to keep people interested in spending time on the platform.

In 2009, Facebook introduced the Like button. The tiny thumbs-up symbol, a simple indicator of people’s preferences, became one of the social network’s most important features. The company allowed other websites to adopt the Like button so users could share their interests back to their Facebook profiles.

That gave Facebook insight into people’s activities and sentiments outside of its own site, so it could better target them with advertising. Likes also signified what users wanted to see more of in their News Feeds so people would spend more time on Facebook.

Facebook also added the groups feature, where people join private communication channels to talk about specific interests, and pages, which allowed businesses and celebrities to amass large fan bases and broadcast messages to those followers.

Another innovation was the share button, which people used to quickly share photos, videos and messages posted by others to their own News Feed or elsewhere. An automatically generated recommendations system also suggested new groups, friends or pages for people to follow, based on their previous online behavior.

But the features had side effects, according to the documents. Some people began using Likes to compare themselves to others. Others exploited the share button to spread information quickly, so false or misleading content went viral in seconds.

Facebook has said it conducts internal research partly to pinpoint issues that can be tweaked to make its products safer. Adam Mosseri, the head of Instagram, has said that research on users’ well-being led to investments in anti-bullying measures on Instagram.

Self-examination

As Facebook’s researchers dug into how its products worked, the worrisome results piled up.

In a July 2019 study of groups, researchers traced how members in those communities could be targeted with misinformation. The starting point, the researchers said, was people known as “invite whales,” who sent invitations out to others to join a private group.

These people were effective at getting thousands to join new groups so that the communities ballooned almost overnight, the study said. Then the invite whales could spam the groups with posts promoting ethnic violence or other harmful content, according to the study.

Another 2019 report looked at how some people accrued large followings on their Facebook pages, often using posts about cute animals and other innocuous topics. But once a page had grown to tens of thousands of followers, the founders sold it. The buyers then used the pages to show followers misinformation or politically divisive content, according to the study.

As researchers studied the Like button, executives considered hiding the feature on Facebook as well, according to the documents. In September 2019, the company removed Likes from users’ Facebook posts in a small experiment in Australia.

The company wanted to see if the change would reduce pressure and social comparison among users. That, in turn, might encourage people to post more frequently to the network.

But people did not share more posts after the Like button was removed. Facebook chose not to roll the test out more broadly, noting, “Like counts are extremely low on the long list of problems we need to solve.”

Last year, company researchers also evaluated the share button. In a September 2020 study, a researcher wrote that the button and so-called reshare aggregation units in the News Feed, which are automatically generated clusters of posts that have already been shared by people’s friends, were “designed to attract attention and encourage engagement.”

But left unchecked, the features could “serve to amplify bad content and sources,” such as bullying and borderline nudity posts, the researcher said.

That’s because the features made people less hesitant to share posts, videos and messages with one another. In fact, users were three times more likely to share any kind of content from the reshare aggregation units, the researcher said.

One post that spread widely this way was an undated message from an account called “The Angry Patriot.” The post notified users that people protesting police brutality were “targeting a police station” in Portland, Ore. After it was shared through reshare aggregation units, hundreds of hate-filled comments flooded in.

It was an example of “hate bait,” the researcher said.

A common thread in the documents was how Facebook employees argued for changes in how the social network worked and often blamed executives for standing in the way.

In an August 2020 internal post, a Facebook researcher criticized the recommendation system that suggests pages and groups for people to follow and said it can “very quickly lead users down the path to conspiracy theories and groups.”

“Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms,” the researcher wrote. “During the time that we’ve hesitated, I’ve seen folks from my hometown go further and further down the rabbit hole” of conspiracy theory movements like QAnon and anti-vaccination and COVID-19 conspiracies.

The researcher added, “It has been painful to observe.”

ILLUSTRATION BY MEL HAASCH; PHOTOGRAPHY BY TOM BRENNER / THE NEW YORK TIMES: Likes and shares made Facebook what it is. Now, company documents show, it’s struggling to deal with their effects.
JEFF CHIU / AP FILE (2020): The thumbs-up “Like” logo is shown on a sign April 14, 2020, at Facebook headquarters in Menlo Park, Calif. Complaints whistleblower Frances Haugen filed with the SEC, along with redacted internal documents obtained by media outlets, show a troubled, internally conflicted company, where data on the harms it causes is abundant, but solutions are halting at best.
