The Week (US)

Elections: Facebook and Twitter face misinformation flood


Anticipating the potential for a rapid spread of misinformation as the November election nears, Twitter is making some of its most significant changes to date, said Shirin Ghaffary in Vox.com. The platform announced last week it is taking explicit steps to reduce the transmission of “viral” posts, first by directing users to a screen that will encourage them to add commentary before they can retweet another post. Another screen will provide credible information if a user tries to retweet a post that Twitter has identified as misleading. The company also plans to apply more aggressive “warning labels on misleading posts by politicians and accounts with more than 100,000 followers.” The moves are all about adding “friction,” or “nudging users to think twice before sharing misleading content.”

Facebook, too, has made election-minded changes, said Casey Newton in Platformer.news. The most proactive is banning political ads after polls close on Nov. 3. Most of the time, content moderation is “working to fix something after it has already broken.” But President Trump and his supporters “have given us advance notice of their intention” to declare fraud if he loses, and “we know giant pools of dark money are standing by to flood available channels with advertising insisting” that a coup is underway. Facebook has identified “the most obvious routes of attack” and taken action. It also deserves credit for moving aggressively to cull QAnon conspiracy theories. Ruling that QAnon is a “militarized social movement,” Facebook is now eliminating QAnon-linked pages and groups without waiting for them to directly advocate violence.

Just don’t think this is the end of conspiracy mongering on Facebook, said Kate Cox in ArsTechnica.com. On the contrary, some changes that Facebook has planned will “create even more fertile ground for the spread of extremism and misinformation.” Chief among them: Facebook has tweaked its algorithm to promote posts from groups that you’re not subscribed to but could be interested in, a formula for driving toxic viral content. Also, don’t expect Facebook to expand its limited ban on post-election political ads, said Ari Levy in CNBC.com. In most quarters, political ads account for only about 0.5 percent of the company’s revenue. But political spending has been much higher in the run-up to the election: in the third quarter, it made up 3 percent of Facebook’s revenue, and that share is sure to grow.

The spending spree has included an anti-Trump super PAC, Defeat by Tweet, which, contrary to its name, hasn’t spent a dollar on Twitter. Since Google recently “limited the ability for campaigns to target users,” and Twitter banned political ads altogether last year, Facebook is “the only game in town.”

Twitter: Slowing down viral posts
