Guymon Daily Herald

People or profit? Facebook papers show deep conflict within

By BARBARA ORTUTAY, AP Technology Writer

Facebook the company is losing control of Facebook the product — not to mention the last shreds of its carefully crafted, decade-old image as a benevolent company just wanting to connect the world.

Thousands of pages of internal documents provided to Congress by a former employee depict an internally conflicted company where data on the harms it causes is abundant, but solutions, much less the will to act on them, are halting at best.

The crisis exposed by the documents shows how Facebook, despite its regularly avowed good intentions, appears to have slow-walked or sidelined efforts to address real harms the social network has magnified and sometimes created. They reveal numerous instances where researchers and rank-and-file workers uncovered deep-seated problems that the company then overlooked or ignored.

Final responsibility for this state of affairs rests with CEO Mark Zuckerberg, who holds what one former employee described as dictatorial power over a corporation that collects data on and provides free services to roughly 3 billion people around the world.

"Ultimately, it rests with Mark and whatever his prerogative is — and it has always been to grow, to increase his power and his reach," said Jennifer Grygiel, a Syracuse University communications professor who's followed Facebook closely for years.

Zuckerberg has an ironclad hold on Facebook Inc. He holds the majority of the company's voting shares, controls its board of directors and has increasingly surrounded himself with executives who don't appear to question his vision.

But he has so far been unable to address stagnating user growth and shrinking engagement for Facebook the product in key areas such as the United States and Europe. Worse, the company is losing the attention of its most important demographic — teenagers and young people — with no clear path to gaining it back, its own documents reveal.

Young adults engage with Facebook far less than their older cohorts, seeing it as an "outdated network" with "irrelevant content" that provides limited value for them, according to a November 2020 internal document. It is "boring, misleading and negative," they say.

In other words, the young see Facebook as a place for old people.

Facebook's user base has been aging faster, on average, than the general population, the company's researchers found. Unless Facebook can find a way to turn this around, its population will continue to get older and young people will find even fewer reasons to sign on, threatening the monthly user figures that are essential to selling ads. Facebook says its products are still widely used by teens, although it acknowledges there's "tough competition" from TikTok, Snapchat and the like.

So it can continue to expand its reach and power, Facebook has pushed for high user growth outside the U.S. and Western Europe. But as it expanded into less familiar parts of the world, the company systematically failed to address or even anticipate the unintended consequences of signing up millions of new users without also providing staff and systems to identify and limit the spread of hate speech, misinformation and calls to violence.

In Afghanistan and Myanmar, for instance, extremist language has flourished due to a systemic lack of language support for content moderation, whether that's human or artificial intelligence-driven. In Myanmar, it has been linked to atrocities committed against the country's minority Rohingya Muslim population.

But Facebook appears unable to acknowledge, much less prevent, the real-world collateral damage accompanying its untrammeled growth. Those harms include shadowy algorithms that radicalize users, pervasive misinformation and extremism, facilitation of human trafficking, teen suicide and more.

Internal efforts to mitigate such problems have often been pushed aside or abandoned when solutions conflict with growth — and, by extension, profit.

Backed into a corner by hard evidence from leaked documents, the company has doubled down on defending its choices rather than trying to fix its problems.

"We do not and we have not prioritized engagement over safety," Monika Bickert, Facebook's head of global policy management, told The Associated Press this month following congressional testimony from whistleblower and former Facebook employee Frances Haugen. In the days since Haugen's testimony and appearance on "60 Minutes" — during which Zuckerberg posted a video of himself sailing with his wife Priscilla Chan — Facebook has tried to discredit Haugen by repeatedly pointing out that she didn't directly work on many of the problems she revealed.

"A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us," Facebook tweeted from its public relations "newsroom" account earlier this month, following the company's discovery that a group of news organizations was working on stories about the internal documents.

"At the heart of these stories is a premise which is false. Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or wellbeing misunderstands where our own commercial interests lie," Facebook said in a prepared statement Friday. "The truth is we've invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook."

Statements like these are the latest sign that Facebook has developed what Sophie Zhang, a former Facebook data scientist, described as a "siege mentality." Zhang last year accused the social network of ignoring fake accounts used to undermine foreign elections. With more whistleblowers — notably Haugen — coming forward, it's only gotten worse.

"Facebook has been going through a bit of an authoritarian narrative spiral, where it becomes less responsive to employee criticism, to internal dissent and in some cases cracks down upon it," said Zhang, who was fired from Facebook in the fall of 2020. "And this leads to more internal dissent."

"I have seen many colleagues that are extremely frustrated and angry, while at the same time, feeling powerless and (disheartened) about the current situation," one employee, whose name was redacted, wrote on an internal message board after Facebook decided last year to leave up incendiary posts by former President Donald Trump that suggested Minneapolis protesters could be shot. "My view is, if you want to fix Facebook, do it within."

This story is based in part on disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen's legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

They detail painstakingly collected data on problems as wide-ranging as the trafficking of domestic workers in the Middle East, an over-correction in crackdowns on Arabic content that critics say muzzles free speech while hate speech and abuse flourish, and rampant anti-vaccine misinformation that researchers found could have been easily tamped down with subtle changes in how users view posts on their feed.

The company insists it "does not conduct research and then systematically and willfully ignore it if the findings are inconvenient for the company." This claim, Facebook said in a statement, can "only be made by cherry-picking selective quotes from individual pieces of leaked material in a way that presents complex and nuanced issues as if there is only ever one right answer."

Haugen, who testified before the Senate this month that Facebook's products "harm children, stoke division and weaken our democracy," said the company should declare "moral bankruptcy" if it is to move forward from all this.

At this stage, that seems unlikely. There is a deep-seated conflict between profit and people within Facebook — and the company does not appear to be ready to give up on its narrative that it's good for the world even as it regularly makes decisions intended to maximize growth.

"Facebook did regular surveys of its employees — what percentage of employees believe that Facebook is making the world a better place," Zhang recalled.

"It was around 70 percent when I joined. It was around 50 percent when I left," said Zhang, who was at the company for more than two years before she was fired in the fall of 2020.

Facebook has not said where the number stands today.
