The Atlanta Journal-Constitution

A look into how Facebook works

Secret algorithms are just the beginning, as the reporter tells it.

By Jacob Silverman

For all the public scrutiny heaped on tech companies in recent years, few people know how Facebook really works. Certainly not lawmakers and, sometimes, not even Facebook itself. The company that shapes the informational diet and worldviews of billions of people is a behemoth of growing complexity, a knotwork of automated systems and carefully constructed algorithms that exist behind a scrim of corporate secrecy.

The average observer tends to glimpse Facebook piecemeal, finding a privacy scandal here, intrusive advertising there, perhaps some hate speech in the timeline, all of it forming an incomplete mental image of how the platform operates and why users see what they see.

“Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets,” the new book by Wall Street Journal reporter Jeff Horwitz, tries to give us the whole elephant. An extension of the Facebook Files, a series of prizewinning articles that Horwitz and his colleagues based on more than 20,000 screenshots of Facebook documents from Frances Haugen, who worked as a product manager at Facebook before becoming disillusioned, “Broken Code” offers a comprehensive, briskly reported examination of key systems governing the platform and their many failings.

Combining Haugen’s access to original sources with interviews with Facebook insiders, Horwitz sets out to demonstrate that Facebook is perhaps less deliberately malevolent and more casually destructive than previously thought — rending the social fabric, funneling its customers into extremist groups, catalyzing political polarization, flooding the infoscape with disinformation, and providing tools that inadvertently facilitate human trafficking and other varieties of exploitation and fraud.

According to Horwitz, Facebook has long known what’s wrong with its platform — the company employs a boatload of researchers — but it would rather not know, so reports and memos sometimes get buried. As Horwitz chronicles, a reporting system might be redesigned to simply produce fewer user reports. A request to tweak an algorithm to reduce the spread of fake news was approved, but Meta CEO Mark Zuckerberg ordered that its impact be reduced by 80%, lest it affect growth or anger power users. Zuckerberg then told the manager who proposed the change, “Don’t bring me something like this again.”

“Efforts to engineer growth had inadvertently rewarded political zealotry,” Horwitz writes. “And the company knew far more about the negative effects of social media usage than it let on.”

In Horwitz’s telling, Facebook’s leadership suffered from the egoism of the mission-driven corporation, believing that it could do little wrong and that the success of the platform was an inherent good. They were, after all, linking the globe. Their ambitions were utopian — and deeply lucrative. “They had never considered the possibility that it could do harm on the same scale,” Horwitz writes.

By the 2016 U.S. presidential election, Facebook, with its sophisticated tools for targeting people according to narrow demographic data and interests, had become a key tool for political campaigns. Facebook offered help to both Donald Trump’s and Hillary Clinton’s campaigns. Only Trump accepted, and so a Facebook staffer went to San Antonio to embed himself in the office from which Brad Parscale directed Trump’s 2016 digital efforts. Facebook’s targeting tools were also credited with aiding in the election of Rodrigo Duterte in the Philippines and Narendra Modi in India.

“The truth was that it had no idea what was happening on its platform in most countries,” Horwitz writes. Non-Western countries were consigned to the category “Rest of World.” Despite being a dominant communications and broadcast medium in dozens of countries, Facebook often had no native speakers, no policy experts and no real on-the-ground staff in these places. “Most of our integrity systems are less effective outside of the United States,” noted one employee.

In Horwitz’s account, Facebook is constantly working to compensate for its own inherent structural flaws. As a force for content distribution and for exciting people’s worst emotional appetites, Facebook is unparalleled. Its libidinal appeal is potentially endless. Some of Facebook’s most problematic users — the ones who spread colossal amounts of racist content, for example — are those who post obsessively, thousands of times per day. Wary of outright censorship and bad press, the company instead finds ways to suppress the reach of these users’ content or channel their behaviors to less destructive ends. “If people had to be bigots, the company would prefer they be bigots on Facebook,” observes a member of the Integrity team.

“Broken Code” is light on summary conclusions, and that’s for the best. Too many tech books offer 11 chapters of doom and gloom, diagnosing our dire predicament of mass surveillance and exploitative automated systems, and then follow all that up with one chapter of false solace where everything, in a brushstroke (or maybe a mouse click), is easily solved. “Broken Code” is something better. It’s a smartly reported investigation into the messy internal machinations of one of the world’s most important and least understood companies. Horwitz emerges with the company’s dirty secrets but no pat conclusions. That’s left to the reader, who might decide that all of this has to go.

