Arkansas Democrat-Gazette

Role of Facebook in ’20 debated

Social network insists it has learned its lesson since 2016

- BARBARA ORTUTAY AND DAVID KLEPPER

Ever since Russian agents and other opportunists abused its platform in an attempt to manipulate the 2016 U.S. presidential election, Facebook has insisted that it's learned its lesson and is no longer a conduit for misinformation, voter suppression and election disruption.

But it has been a long and halting journey for the social network. Critical outsiders, as well as some of Facebook's own employees, say the company's efforts to revise its rules and tighten its safeguards remain wholly insufficient to the task, despite it having spent billions on the project.

“Am I concerned about the election? I’m terrified,” said Roger McNamee, a Silicon Valley venture capitalist and an early Facebook investor turned vocal critic. “At the company’s current scale, it’s a clear and present danger to democracy and national security.”

The company's rhetoric has certainly gotten an update. Chief Executive Officer Mark Zuckerberg now casually references possible outcomes that were unimaginable in 2016 — among them, possible civil unrest and potentially a disputed election that Facebook could easily make even worse — as challenges the platform now faces.

"This election is not going to be business as usual," Zuckerberg wrote in a September Facebook post in which he outlined Facebook's efforts to encourage voting and remove misinformation from its service. "We all have a responsibility to protect our democracy."

While other platforms like Twitter and YouTube have also struggled to address misinformation and hateful content, Facebook stands apart for its reach and scale and, compared to many other platforms, its slower response to the challenges identified in 2016.

In the immediate aftermath of President Donald Trump’s election, Zuckerberg pushed back on the idea that “fake news” spread on Facebook could have influenced the 2016 election, calling it “a pretty crazy idea.” A week later, he walked back the comment.

Since then, Facebook has issued a stream of mea culpas for its slowness to act against threats to the 2016 election and promised to do better. "I don't think they have become better at listening," said David Kirkpatrick, author of a book on Facebook's rise. "What's changed is more people have been telling them they need to do something."

The company has hired outside fact-checkers, added multiple restrictions on political advertisements and taken down thousands of accounts, pages and groups it found to be engaging in what it calls "coordinated inauthentic behavior."

It has also started adding warning labels to posts that contain misinformation about voting and has, at times, taken steps to limit the circulation of misleading posts. In recent weeks the platform also banned posts that deny the Holocaust and joined Twitter in limiting the spread of an unverified political story about Hunter Biden, son of Democratic presidential candidate Joe Biden, published by the New York Post.

In the four years since the last election, Facebook’s earnings and user growth have soared. This year, analysts expect the company to rake in profits of $23.2 billion on revenue of $80 billion, according to FactSet. It currently boasts 2.7 billion users worldwide, up from 1.8 billion at this time in 2016.

Facebook faces a number of government investigations into its size and market power, including an antitrust probe by the U.S. Federal Trade Commission. An earlier FTC investigation hit Facebook with a $5 billion fine but did not require any further changes.

"Their No. 1 priority is growth, not reducing harm," Kirkpatrick said. "And that is unlikely to change."

Facebook insists it takes the challenge of misinformation seriously — especially when it comes to the election.

"Elections have changed since 2016, and so has Facebook," the company said in a statement laying out its policies on the election and voting. "We have more people and better technology to protect our platforms, and we've improved our content policies and enforcement."

Kirkpatrick notes that board members and executives who have pushed back against Zuckerberg — a group that includes the founders of Instagram and WhatsApp — have left the company.

"He is so certain that Facebook's overall impact on the world is positive" and that critics don't give him enough credit for that, Kirkpatrick said of Zuckerberg. As a result, the Facebook CEO isn't inclined to take constructive feedback. "He doesn't have to do anything he doesn't want to. He has no oversight," Kirkpatrick said.

The federal government has so far left Facebook to its own devices, a lack of accountability that has only empowered the company, according to Rep. Pramila Jayapal, D-Wash., who grilled Zuckerberg during a July Capitol Hill hearing.

Warning labels are of limited value if the algorithms underlying the platform are designed to push polarizing material at users, she said. "I think Facebook has done some things that indicate it understands its role. But it has been, in my opinion, far too little, too late."


(AP file photo) "We all have a responsibility to protect our democracy," Facebook CEO Mark Zuckerberg said in a September post.
