Facebook’s fake news problem: What’s its responsibility?

Kuwait Times - TECHNOLOGY

NEW YORK: Facebook is under fire for failing to rein in fake and biased news stories that some believe may have swayed the presidential election. Its predicament stems from this basic conundrum: It exercises great control over the news its users see, but it declines to assume the editorial responsibility that traditional publishers do. On Monday, Facebook took a minor step to address the issue, clarifying its advertising policy to emphasize that it won’t display ads on sites that run information that is “illegal, misleading or deceptive, which includes fake news.”

The company said it was merely making explicit a policy that was already implied. Its move followed a similar step by Google earlier on Monday, after the search giant acknowledged that it had let a false article about the election results slip into its list of recommended news stories. For both companies, the aim is to discourage fake-news sites by depriving them of revenue. Facebook is also said to be facing brewing internal turmoil over its influence and what it can and should do about it.

Employees have expressed concern over Facebook’s role in spreading misinformation and racist memes largely associated with the alt-right, according to The New York Times and BuzzFeed. Some have reportedly formed an unofficial task force to investigate the role the company played in the election. Founder and CEO Mark Zuckerberg, however, insists that Facebook remains a neutral technology platform where its users can share anything they want, with only a tiny fraction of it fake or problematic. Last week, Zuckerberg called the idea that voters might have been influenced by what they saw on Facebook - fake, uber-partisan stories, such as a false one about Pope Francis endorsing Donald Trump for president - “pretty crazy.”

People post to Facebook so frequently that the company has no choice but to filter what everyone sees in their news feeds - the main artery through which users see posts from the friends, family, businesses, news sources and celebrities they follow. The company’s secret algorithms are designed to deliver the posts from friends and other sources that will draw people in and lead them to read and click and “like” and share - “maximizing their engagement,” in Facebook’s jargon. Facebook frequently tweaks its algorithm to improve engagement. Various changes have been aimed at shutting out sites that promote clickbait and other garbage that users say they don’t want to see, even as they click on it and share away. When users are surrounded by posts they want to see, they’re more likely to stick around.
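Facebook does not disclose how its ranking actually works, but the general idea of sorting a feed by predicted engagement can be illustrated with a short, purely hypothetical sketch in Python. The Post fields, the weights and the scoring formula below are invented for illustration and are not Facebook’s.

# Hypothetical sketch of "ranking by predicted engagement." Nothing here
# reflects Facebook's real system; the fields and weights are invented.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_clicks: float   # imagined outputs of an engagement-prediction model
    predicted_likes: float
    predicted_shares: float

def engagement_score(post: Post) -> float:
    # Arbitrary weighted sum: shares count more than likes, likes more than clicks.
    return post.predicted_clicks + 2.0 * post.predicted_likes + 3.0 * post.predicted_shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # The most "engaging" posts surface first, whether or not they are true;
    # that is the incentive problem critics describe.
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    Post("a friend", "Family photo", 0.2, 0.5, 0.1),
    Post("a partisan site", "Sensational (and false) claim", 0.9, 0.7, 0.8),
]
for post in rank_feed(feed):
    print(f"{post.author}: {post.text}")

Run as written, the sensational post outranks the family photo, which is the dynamic the quote below points at.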

That’s key to Facebook’s advertising business. But it can be problematic when it comes to false but highly interesting posts. Facebook’s news feed “maximizes for engagement. As we’ve learned in this election, bulls-t is highly engaging,” former Facebook product designer Bobby Goodlatte wrote in an Election Day post. “Highly partisan, fact-light outlets” on both the right and the left, he wrote, “have no concern for the truth, and really only care for engagement. ... It’s now clear that democracy suffers if our news environment incentivizes bulls-t.”

Social media companies today have to acknowledge that they are news organizations, said Jeffrey Herbst, president and CEO of the Newseum, a journalism museum in Washington. “Not like news companies of the 20th century,” he added. “But not just pipes where people get their news. They determine what is news.” In a post Saturday night, Zuckerberg rejected that idea. “News and media are not the primary things people do on Facebook, so I find it odd when people insist we call ourselves a news or media company in order to acknowledge its importance,” he wrote. “Facebook is mostly about helping people stay connected with friends and family.”

Facebook as newspaper?

Back in 2013, Zuckerberg said he wanted Facebook to be people’s “own personal newspaper,” one that delivers the stories most interesting and important to them. That’s still the company’s goal - though minus any reference to itself as a media company of any kind. Of course, fake stuff existed on the internet long before Facebook. And under the law, Facebook is no more responsible for what appears on its site than “the paper mills that print newspapers are responsible for their content,” said Steve Jones, a professor at the University of Illinois at Chicago who studies communication technology. At the same time, Jones said he thinks the broader issue of Facebook’s responsibility is one that’s going to be “debated forever.” “Even the notion of truth is something that’s highly contested at this point,” he said.
