Lofty aspirations for Facebook groups risk linking wrong people

The Daily Telegraph - Business - Technology Intelligence - By Laurence Dodds in San Francisco

It was half-time, February 2020, and the San Francisco 49ers were tied 10-10 with the Kansas City Chiefs; the biggest prize in American football was still all to play for. While the teams regrouped, Facebook was making its own big play. In its first-ever Super Bowl advert, the globe-spanning social network showed off a medley of Facebook groups relating to the word “rock”: extreme climbers in Utah, replica Stonehenge builders and a Rocky Balboa fan club gatecrashed by Sylvester Stallone. “Whatever you rock”, the closing titles promised, “there is a group for you”, before flashing the slogan: “More together.”

It was the culmination of a three-year plan by chief executive Mark Zuckerberg to fix the problems that had sown such chaos in the 2016 US election. Back then, Facebook’s passion-seeking content selection algorithms had filled voters’ feeds with unchecked partisan hoaxes and Russian trolling. Zuckerberg’s answer was to rebuild the news feed around personal connections such as friends, family and groups. If Facebook could shift 900m more users into “meaningful” groups, he declared, it would “strengthen our social fabric and bring the world closer together”. What could go wrong? The answer may not surprise you. The “boogaloo” militia subculture, linked to at least seven terror plots and attempts at violence; the QAnon conspiracy movement; an armed confrontation in Wisconsin that killed two people; rampant false wildfire rumours that overloaded first responders on the US west coast; and widespread Covid-19 denialism. Each of these fissures has, or is alleged to have, roots in Facebook groups.

“I’m trying not to overstate this, but I’m more nervous about Facebook groups than any other function on any other social media platform when it comes to this election,” says Jesse Lehrich, who managed the Hillary Clinton campaign’s response to Russian meddling in 2016 and now works for Accountable Tech. “When I have my nightmares of all the worst-case scenarios around election day, or while the vote counting is happening, Facebook groups are the worst vector of disinformation and incitement.”

Previously, Facebook had shut down groups it considered dangerous or whose members had repeatedly broken its rules, and restricted those that hosted too much fake news.

Last week, however, it launched a more radical crackdown, removing all health groups from its automatic recommendations; de-prioritising content from potentially dangerous groups in users’ news feeds; and threatening to put any member who breaks a rule within a group on a kind of 30-day cyber probation.

The problem, Lehrich argues, is that this depends on Facebook catching rule-breakers in the first place. The company does much of its policing via artificial intelligence, which peers inside righteous and wicked groups alike and collared almost 12m pieces of hate speech and hate group propaganda over the past year. But Facebook does not release any estimates of how much it might have missed. Compared to ordinary users, groups are highly resistant to censorship.

Many watch and learn from Facebook’s interventions, constantly varying their code words and swapping tips on how to avoid the all-seeing algorithm.

Some also use their join-up questionnaires to interrogate new users about their beliefs, or even make them pledge never to report group content to Facebook. Once inside, safe in a bubble of people who think like they do, members can fall foul of age-old group dynamics: confirmation bias, group-think and “group-shift”, in which group opinions become progressively more extreme over time. All of which leaves outside researchers in the dark.

“From the stories we have, the private groups seem to be particularly problematic, but we can’t measure that as researchers because we don’t have any access to them,” says Kate Starbird, a professor at the University of Washington in Seattle.

Facebook itself has often helped extremist groups grow via its own recommendation algorithms.

One internal study in 2016 found that the company’s suggestions were responsible for 64pc of the intake of extremist groups. During the pandemic, as millions mouldered at home glued to their screens, these systems forged connections not only within movements but between them, linking anti-vaccine groups with QAnon and with boogaloo.

It is an example, says Starbird, of how Facebook’s algorithms do not just recognise a pre-existing social reality but actually create new ones.

That is the crowning irony of Facebook groups. Zuckerberg has long proclaimed it his mission to “connect the world”. Now his service appears to be connecting like-minded paranoids en masse. “It’s not just about the ideology, it’s about the community and the identity,” says Starbird, who began her career studying social media users who responded to crises, and who sees many similarities in online cults such as QAnon. “They feel like they’re doing something good for the world; that they are helping. This is a mission of theirs, and they’re doing it together.”

Or, as Facebook might put it, more together.

A QAnon protester; the conspiracy movement is alleged to have roots in Facebook groups. Below, Mark Zuckerberg
