Editorials: Resisting the blue pencil

Facebook must face down attempts to restrict free speech

The Washington Times Daily - COMMENTARY

James Lankford, the Republican senator from Oklahoma, was rightly incensed Monday by an ABC News online headline reading "Jeff Sessions addresses 'anti-LGBT hate group' but DOJ won't release his remarks."

The item, more rant than news, echoed the Southern Poverty Law Center's designation of the Alliance Defending Freedom as a "hate group" in its account of the attorney general's speech to the alliance, which litigates religious-liberty cases. "In this country," the senator told the president of the network, "we have the freedom to disagree. However, disagreement is … not the same as hate."

That's a distinction with a significant difference, and it's one that Mark Zuckerberg would do well to contemplate as Facebook weighs how it can do a better job of policing so-called "hate speech" on his social network. Facebook employs about 4,500 "content moderators," including third-party contractors, and has promised to hire 3,000 more this year. Even with that many monitors, policing content is a daunting task for a site that says it has nearly 2 billion users.

Even with a 15,000-word internal manual to go by, the content moderators still have to make subjective judgment calls. Whether something qualifies as "hate speech" isn't always clear. Because of the sheer volume, content moderators don't have the luxury of looking for nuance or judging the redeeming social value of a post that sits on the margin.

Facebook deletes about 288,000 posts each month as "hate speech," but critics call its standards arbitrary and capricious. Those who have had posts taken down for violating those standards describe them as unclear and inconsistently applied.

That's often unavoidable, because posts that some people consider offensive or otherwise objectionable aren't offensive to others. "Snowflakes" among racial, religious and sexual minorities, who have raised umbrage-taking to an art form, complain the loudest. Freedom of speech, our most precious right, should not be routinely silenced on Facebook (or anywhere else) merely at snowflake behest, as on many college campuses.

Facebook in the Internet age is similar to the Founders' "public square," and must be kept open to all but the most obviously extreme speech, nudity (to protect children who use Facebook), violent live videos and recruitment and incitement to terrorism.

But where to draw the line? What constitutes online "shouting 'fire' in a crowded theater"? It's an argument that has been simmering, and sometimes raging, in the United States for 240 years. Minority groups say they're disproportionately censored when they use Facebook to call out racism, often using racist anti-white rhetoric to do so, or anything they regard as anti-LGBT or anti-Islam. But Facebook doesn't offer specific explanations of why posts are pulled down, or make data public on what gets excised.

Facebook is understandably wary of being the arbiter or gatekeeper of the public discourse on its site, and technology companies that host such speech aren't legally responsible for the content posted by third parties.

Courts have made that clear in rulings favoring those sued for posting scathing reviews on sites such as Yelp or TripAdvisor about bad experiences with businesses' goods and services. Some states even have laws against attempts to silence opinions others don't agree with.

Mr. Zuckerberg should take his cue from Supreme Court Justice Louis Brandeis' admonition 90 years ago that the remedy for "falsehoods and fallacies" is "more speech, not enforced silence." Right on.
