Facebook did well, but can’t ease vigilance

San Francisco Chronicle Late Edition - BUSINESS REPORT - By Kevin Roose

After an Election Day largely free of viral social media misinformation, and with little trace of the kind of Russian troll stampede that hit its platform in 2016, executives at Facebook may be tempted to take a victory lap. That would be a mistake. It’s true that Facebook and other social media companies have made strides toward cleaning up their services in the past two years. The relative calm we saw on social media Tuesday is evidence that, at least for one day, in one country, the forces of chaos on these services can be contained.

But more than anything, this year’s midterm election cycle has exposed just how fragile Facebook remains.

Want a disaster-free Election Day in the social media age? You can have one, but it turns out that it takes constant vigilance from law enforcement agencies, academic researchers and digital security experts for months on end.

It takes an ad hoc “war room” at Facebook headquarters with dozens of staff members working round-the-clock shifts. It takes hordes of journalists and fact checkers willing to police the service for false news stories and hoaxes so that they can be contained before spreading to millions. And even if you avoid major problems from bad actors domestically, you might still need to disclose, as Facebook did late Tuesday night, that you kicked off yet another group of what appeared to be Kremlin-linked trolls.

I’ve experienced Facebook’s fragility firsthand. Every day for the past several months, as I’ve covered the midterms through the lens of social media, I’ve started my day by looking for viral misinformation on the service. (I’ve paid attention to Twitter, YouTube and other social networks, too, but Facebook is the 800-pound gorilla of internet garbage, so it got most of my focus.)

Most days, digging up large-scale misinformation on Facebook was as easy as finding baby photos or birthday greetings. There were doctored photos used to stoke fear about the caravan of Latin American migrants headed toward the United States border. There were easily disprovable lies about the women who accused Justice Brett Kavanaugh of sexual assault, cooked up by partisans with bad-faith agendas. Every time major political events dominated the news cycle, Facebook was overrun by hoaxers and conspiracy theorists, who used it to sow discord, spin falsehoods and stir up tribal anger.

Facebook was generally responsive to these problems after they were publicly called out. But its scale means that even people who work there are often in the dark. Some days, while calling the company for comment on a new viral hoax I had found, I felt like a college RA telling the dean of students about shocking misbehavior inside a dorm he’d never visited. (“The freshmen are drinking what?”)

Other days, combing through Facebook falsehoods felt like watching a nation poison itself in slow motion. A recent study by the Oxford Internet Institute, a department at the University of Oxford, found that 25 percent of all election-related content shared on Facebook and Twitter during the midterm election season could be classified as “junk news.” Other studies have hinted at progress in stemming the tide of misinformation, but the process is far from complete.

A Facebook spokesman, Tom Reynolds, said that the company had improved since 2016 but that there was “still more work to do.”

“Over the last two years, we’ve worked hard to prevent misuse of Facebook during elections,” Reynolds said. “Our teams worked round the clock during the midterms to reduce the spread of misinformation, thwart efforts to discourage people from voting and deal with issues of hate on our services.”

Facebook has framed its struggle as an “arms race” between itself and the bad actors trying to exploit its services. But that mischaracterizes the nature of the problem. This is not two sovereign countries locked in battle, or an intelligence agency trying to stop a nefarious foreign plot. This is a rich and successful corporation that built a giant machine to convert attention into advertising revenue, made billions of dollars by letting that machine run with limited oversight, and is now frantically trying to clean up the mess that has resulted.

As the votes were being tallied Tuesday, I talked to experts who have paid close attention to Facebook’s troubles over the past several years. Most agreed that Election Day itself had been a success but that the company still had plenty to worry about.

“I give them better marks for being on the case,” said Michael Posner, a professor of ethics and finance at New York University’s Stern School of Business. “But it’s yet to be seen how effective it’s going to be. There’s an awful lot of disinformation still out there.”

“On the surface, for Facebook in particular, it’s better because some of the worst content is getting taken down,” said Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University. Albright, who has found networks of Russian trolls operating on Facebook in the past, has written in recent days that some of the company’s features — in particular, Facebook groups that are used to spread misinformation — are still prone to exploitation.

“For blatantly false news, they’re not even close to getting ahead of it,” Albright said. “They’re barely keeping up.”

Jennifer Grygiel, an assistant professor at Syracuse University who studies social media, said that Facebook’s pattern of relying on outside researchers and journalists to dig up misinformation and abuse was worrying.

“It’s a bad sign that the war rooms, especially Facebook’s war room, didn’t have this information first,” Grygiel said.

It’s worth asking, over the long term, why a single American company is in the position of protecting free and fair elections all over the world. But that is the case, and we now know that Facebook’s action or inaction can spell the difference between elections going smoothly and democracies straining under a siege of misinformation and propaganda.

To Facebook’s credit, it has become more responsive in recent months, including cracking down on domestic disinformation networks, banning particularly bad actors such as Alex Jones of Infowars, and hiring more people to deal with emerging threats.

But Facebook would not have done this on its own. It took sustained pressure from lawmakers, regulators, researchers, journalists, employees, investors and users to force the company to pay more attention to misinformation and threats of election interference.

Facebook has shown, time and again, that it behaves responsibly only when placed under a well-lit microscope. So as our collective attention fades from the midterms, it seems certain that outsiders will need to continue to hold the company accountable, and push it to do more to safeguard its users — in every country, during every election season — from a flood of lies and manipulation.

Photo: Facebook CEO Mark Zuckerberg’s company showed more vigilance toward limiting false information than it did during the 2016 campaign. (Tom Brenner / New York Times)
