The Guardian (USA)

Coronavirus: Facebook will start warning users who engaged with 'harmful' misinformation

- Julia Carrie Wong in San Francisco

Facebook will begin showing notifications to users who have interacted with posts that contain “harmful” coronavirus misinformation, the company announced on Thursday, in an aggressive new move to address the spread of false information about Covid-19.

The new policy applies only to misinformation that Facebook considers likely to contribute to “imminent physical harm”, such as false claims about “cures” or statements that physical distancing is not effective. Facebook’s policy has been to remove those posts from the platform.

Under the new policy, which will be rolled out in the coming weeks, users who liked, shared, commented on or reacted with an emoji to such posts before they were deleted will see a message in their news feed directing them to a “myth busters” page maintained by the World Health Organization (WHO).

“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” said Guy Rosen, Facebook’s vice-president of integrity, in a blogpost.

Facebook does not take down other misinformation about Covid-19, such as conspiracy theories about the virus’s origins, but instead relies on its third-party factchecking system. If a factchecker rates a claim false, Facebook then adds a notice to the post, reduces its spread, alerts anyone who shared it, and discourages users from sharing it further.

The announcement coincides with the release of a new report by the online activist group Avaaz that highlights Facebook’s shortcomings in counteracting the coronavirus “infodemic”. The study found examples of coronavirus misinformation remaining on the platform even after third-party fact checks had been completed. Avaaz also cited delays in Facebook applying factchecking labels to posts.

Facebook challenged the methodology of Avaaz’s report but said it “appreciated their partnership in developing the notifications we’ll now be showing people”.

Avaaz celebrated Facebook’s decision to show notifications to users exposed to misinformation.

“We’ve been calling on Facebook to take this step for three years now,” said Fadi Quran, Avaaz’s campaign director. “It’s a courageous step by Facebook. At the same time, it’s not enough.”

The group wants Facebook’s notifications to be more explicit about the misinformation that the user was exposed to, and it wants the notification shown to any user who saw the misinformation in their news feed, regardless of whether they interacted with the post.

“We think that correcting the record retroactively … will make people more resilient to misinformation in the future, and it will disincentivize malicious users,” said Quran.

But Facebook cited concerns that more explicit messages could do more harm than good.

Misinformation researchers at First Draft News, for example, have long counseled that repeating a false claim, even to debunk it, can help to reinforce it in a person’s mind.

Claire Wardle, the US director of First Draft News, said that she generally shared Facebook’s concern that repeating the misinformation in a notification could have a negative effect. Wardle usually advises debunkers of misinformation to “lead with the fact”, but she noted that such a rule is difficult to follow in relation to the pandemic, where scientific understanding of the virus is continuously evolving.

Wardle welcomed the move by Facebook as a signal that the company was trying to be “innovative” and “braver”, while warning that there could be “unintended consequences”. Among the potential pitfalls she flagged is the fact that people in different countries have different levels of trust in the WHO.

“I like this and want to support it, but also want to recognize that we know so little that this could go horribly wrong,” she said. “What I hope is that they are testing this with some independent academics.”

The new policy only applies to misinformation Facebook believes is likely to contribute to ‘imminent physical harm’. Photograph: Josh Edelson/AFP via Getty Images
