San Francisco Chronicle

Facebook did well, but can’t ease up on vigilance

By Kevin Roose

After an Election Day largely free of viral social media misinformation, and with little trace of the kind of Russian troll stampede that hit its platform in 2016, executives at Facebook may be tempted to take a victory lap. That would be a mistake. It’s true that Facebook and other social media companies have made strides toward cleaning up their services in the past two years. The relative calm we saw on social media Tuesday is evidence that, at least for one day, in one country, the forces of chaos on these services can be contained.

But more than anything, this year’s midterm election cycle has exposed just how fragile Facebook remains.

Want a disaster-free Election Day in the social media age? You can have one, but it turns out that it takes constant vigilance from law enforcement agencies, academic researchers and digital security experts for months on end.

It takes an ad hoc “war room” at Facebook headquarters with dozens of staff members working round-the-clock shifts. It takes hordes of journalists and fact checkers willing to police the service for false news stories and hoaxes so that they can be contained before spreading to millions. And even if you avoid major problems from bad actors domestically, you might still need to disclose, as Facebook did late Tuesday night, that you kicked off yet another group of what appeared to be Kremlin-linked trolls.

I’ve experienced Facebook’s fragility firsthand. Every day for the past several months, as I’ve covered the midterms through the lens of social media, I’ve started my day by looking for viral misinformation on the service. (I’ve paid attention to Twitter, YouTube and other social networks, too, but Facebook is the 800-pound gorilla of internet garbage, so it got most of my focus.)

Most days, digging up large-scale misinformation on Facebook was as easy as finding baby photos or birthday greetings. There were doctored photos used to stoke fear about the caravan of Latin American migrants headed toward the United States border. There were easily disprovable lies about the women who accused Justice Brett Kavanaugh of sexual assault, cooked up by partisans with bad-faith agendas. Every time major political events dominated the news cycle, Facebook was overrun by hoaxers and conspiracy theorists, who used it to sow discord, spin falsehoods and stir up tribal anger.

Facebook was generally responsive to these problems after they were publicly called out. But its scale means that even people who work there are often in the dark. Some days, while calling the company for comment on a new viral hoax I had found, I felt like a college RA telling the dean of students about shocking misbehavior inside a dorm he’d never visited. (“The freshmen are drinking what?”)

Other days, combing through Facebook falsehoods has felt like watching a nation poison itself in slow motion. A recent study by the Oxford Internet Institute, a department at the University of Oxford, found that 25 percent of all election-related content shared on Facebook and Twitter during the midterm election season could be classified as “junk news.” Other studies have hinted at progress in stemming the tide of misinformation, but the process is far from complete.

A Facebook spokesman, Tom Reynolds, said that the company had improved since 2016 but that there was “still more work to do.”

“Over the last two years, we’ve worked hard to prevent misuse of Facebook during elections,” Reynolds said. “Our teams worked round the clock during the midterms to reduce the spread of misinformation, thwart efforts to discourage people from voting and deal with issues of hate on our services.”

Facebook has framed its struggle as an “arms race” between itself and the bad actors trying to exploit its services. But that mischaracterizes the nature of the problem. This is not two sovereign countries locked in battle, or an intelligence agency trying to stop a nefarious foreign plot. This is a rich and successful corporation that built a giant machine to convert attention into advertising revenue, made billions of dollars by letting that machine run with limited oversight, and is now frantically trying to clean up the mess that has resulted.

As the votes were being tallied Tuesday, I talked to experts who have paid close attention to Facebook’s troubles over the past several years. Most agreed that Election Day itself had been a success but said the company still had plenty to worry about.

“I give them better marks for being on the case,” said Michael Posner, a professor of ethics and finance at New York University’s Stern School of Business. “But it’s yet to be seen how effective it’s going to be. There’s an awful lot of disinformation still out there.”

“On the surface, for Facebook in particular, it’s better because some of the worst content is getting taken down,” said Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University. Albright, who has found networks of Russian trolls operating on Facebook in the past, has written in recent days that some of the company’s features — in particular, Facebook groups that are used to spread misinformation — are still prone to exploitation.

“For blatantly false news, they’re not even close to getting ahead of it,” Albright said. “They’re barely keeping up.”

Jennifer Grygiel, an assistant professor at Syracuse University who studies social media, said that Facebook’s pattern of relying on outside researchers and journalists to dig up misinformation and abuse is worrying.

“It’s a bad sign that the war rooms, especially Facebook’s war room, didn’t have this information first,” Grygiel said.

It’s worth asking, over the long term, why a single American company is in the position of protecting free and fair elections all over the world. But that is where things stand today, and we now know that Facebook’s action or inaction can spell the difference between elections going smoothly and democracies straining under a siege of misinformation and propaganda.

To Facebook’s credit, it has become more responsive in recent months, including cracking down on domestic disinformation networks, banning particularly bad actors such as Alex Jones of Infowars, and hiring more people to deal with emerging threats.

But Facebook would not have done this on its own. It took sustained pressure from lawmakers, regulators, researchers, journalists, employees, investors and users to force the company to pay more attention to misinformation and threats of election interference.

Facebook has shown, time and again, that it behaves responsibly only when placed under a well-lit microscope. So as our collective attention fades from the midterms, it seems certain that outsiders will need to continue to hold the company accountable, and push it to do more to safeguard its users — in every country, during every election season — from a flood of lies and manipulation.

Facebook CEO Mark Zuckerberg’s company showed more vigilance toward limiting false information than it did during the 2016 campaign. (Photo: Tom Brenner / New York Times)
