USA TODAY International Edition

Judgment day for Facebook and Twitter

After 2016, platforms have to prove they’ve learned lessons

- Jessica Guynn

Social media companies are on alert for attempts to delegitimize the election results.

For social media, it wasn’t Election Day. It was judgment day.

All the preparations of the past four years to protect the election have come down to this. On high alert, Facebook, Twitter and Google-owned YouTube pulled on their battle fatigues to quickly spot any effort to destabilize the election or delegitimize the results.

This heightened state is the result of the 2016 presidential election, when the major online platforms were caught with their guard down as Russians inflamed the electorate with divisive messages, and falsehoods and hoaxes ran rampant.

This time, the stakes are even higher. The full-blown partisan warfare of presidential campaigning that ripped through social media in recent months may lead to an unprecedented torrent of misinformation, voter suppression efforts and even fomenting of violence, observers say.


It began on election eve when Facebook and Twitter posted a warning label on a President Donald Trump post. Twitter said his assertion that a recent Supreme Court decision could lead to problems and even violence in the Pennsylvania election was misleading, and Twitter users were prevented from liking or replying to the tweet.

Slamming on the brakes quickly slowed its spread, according to misinformation researchers at the Election Integrity Partnership, but the tweet already had been retweeted more than 55,000 times and favorited more than 126,000 times.

Facebook also fact-checked the president, with a label that says voting fraud is “extremely rare.” The result of a policy announced in September, the Facebook label is slapped on posts that seek to delegitimize the outcome of the election or discuss the legitimacy of voting methods.

More tweets were flagged on Election Day. Another by the GOP alleging voting irregulari­ties in Philadelph­ia soon sported this warning: “Some or all of the content shared in this Tweet is disputed and might be misleading about an election or other civic process.”

“Platforms know this is a referendum on their futures and how they’ll be regulated. The public is aware of the risks and social media companies know they need to demonstrate they are trying,” Jennifer Grygiel, a communications professor at Syracuse University who studies social media, told USA TODAY.

Facebook

Facebook says it has invested billions and assigned more than 35,000 people to fight harmful content, from “coordinated inauthentic behavior” (accounts that work together to spread misinformation) to foreign interference to election-related misinformation.

“Our Election Operations Center will continue monitoring a range of issues in real time – including reports of voter suppression content. If we see attempts to suppress participation, intimidate voters, or organize to do so, the content will be removed,” the company said Monday night.

The team staffing that center also will track other issues, such as the swarming of Joe Biden campaign buses over the week, Facebook said.

“We are monitoring closely and will remove content calling for coordinated harm or interference with anyone’s ability to vote,” the company said.

If a presidential candidate or party declares premature victory before the race is called by major media outlets, Facebook has said it will add labels on candidates’ posts and will put a notification at the top of News Feed to alert voters that no winner has been projected.

After polls close, Facebook has said it will run a notification at the top of Facebook and Instagram and label voting-related posts from everyone, including politicians, with a link to its Voting Information Center giving the latest state-by-state results for president, the Senate and the House.

A voting alerts tool will allow state and local election authorities to reach constituents with notifications on Facebook. The voting alerts also will appear in the Voting Information Center.

Facebook has readied “break-glass” tools to deploy if election-related violence erupts.

Facebook stopped accepting new political ads a week before the election and plans to suspend all political ads on Facebook and Instagram after the polls close Tuesday.

It also has temporarily limited popular features, turning off political and social group recommendations, removing a feature in Instagram hashtag pages and restricting the forwarding of messages in its WhatsApp messaging app.

Twitter

Before the election, Twitter made aggressive changes to curb misinformation by labeling tweets on mail-in voting and COVID-19, even from prominent political figures including Trump.

On Election Day, the company said it will take action against any tweet that claims victory before the race is called by state election officials or projected by authoritative national news outlets.

Tweets that include premature claims of victory will be labeled and will direct users to Twitter’s election page. The company says it may add a warning label or remove tweets that incite people to interfere with the election or encourage violence.

Twitter said it would prioritize labeling tweets about the presidential race and any other races “where there may be significant issues with misleading information.”

When people try to retweet a tweet with a misleading information label, they’ll see a prompt pointing them to credible information before they are able to amplify it.

During the election and at least through election week, Twitter will try to slow down the spread of misinformation by encouraging users to add commentary before amplifying content (in other words, prompting them to “Quote Tweet” instead of retweeting someone else’s post).

Users will have to tap through a warning to see tweets with misleading information from U.S. political figures, U.S.-based accounts with more than 100,000 followers or accounts that get a lot of engagement, Twitter says. Users will not be able to retweet those posts but can quote tweet them. Likes, retweets and replies also will be turned off.

In September, Twitter launched an election hub and banned political ads before the election. Its stand: political reach “should be earned, not bought.”

YouTube

On Election Day, YouTube, owned by Google, said it would prominently display election results in an information panel at the top of search results for a range of queries related to the election and under election videos.

The panel will warn users that the results may not be final and will link to a Google election page that will track results in real time based on data from The Associated Press.

As polls close, YouTube said it would point users to live streams of election coverage from authoritat­ive news outlets. YouTube says it does not allow videos that mislead voters or that urge people to interfere in the election and will take down violating videos.

Google said it would temporaril­y suspend election ads on Google and YouTube after the polls close.
