Pasatiempo - IN OTHER WORDS - Custodians of the Internet

288 pages

Who are the custodians of the internet — the men, women, and algorithms who determine the content we see on social media? What are they hiding and removing, and what do they base their judgments on? The answers are rarely, and never thoroughly, forthcoming from the social media companies themselves. “When they acknowledge moderation at all, platforms generally frame themselves as open, impartial, and noninterventionist — in part because their founders fundamentally believe them to be so, and in part to avoid obligation or liability,” writes Tarleton Gillespie, a principal researcher at Microsoft Research New England, in his scholarly new book, Custodians of the Internet.

But, Gillespie argues, content moderation is not an auxiliary feature of social media platforms. Instead, “moderation is, in many ways, the commodity that platforms offer.” It is “a key part of what social media platforms do that is different, that distinguishes them from the open web: they moderate (removal, filtering, suspension), they recommend (news feeds, trending lists, personalized suggestions), and they curate (featured content, front-page offerings).” Understanding moderation, then, is essential not just to understanding why a particular Instagram post was taken down. It is essential to discerning how what we are allowed to read and publish online is determined. Beyond their effects on information sharing, those determinations ripple across social norms, values, and politics. “Platforms may not shape public discourse by themselves, but they do shape the scope of public discourse,” Gillespie writes. “And they know it.”

All stakeholders in content moderation — from the Silicon Valley companies that set the rules to the end user who comes across a personally offensive post — have an all-but-impossible task. (The stakeholders who have the worst of it are undoubtedly the men and women around the world whose job it is to decide in mere seconds whether, for instance, a post is describing sexual violence or condoning it.) At every step of the moderation process, judgments must be made about what is acceptable versus what is pornography, hate speech, obscenity, abuse, or the promotion of self-harm; the boundaries must be reviewed and perhaps redrawn each time a user introduces another potentially actionable offense. Rules and standards must be applied across topics whose cultural valuations vary widely.

Gillespie takes it as a given that social media platforms need content moderation, considering the proliferation of horrors and harms that would otherwise unfold. The question, then, is not whether platforms should moderate, but how they can do it best, or at least in the least objectionable way. Because the companies’ moderation processes are not public, a complete review of the hows and whys is impossible. But Gillespie works around this significant hurdle by thoroughly researching
