Facebook Moves to Curb Fake Content
Menlo Park: Facebook has built on its campaign to prevent the platform from being used to spread dangerous misinformation, saying it will remove bogus posts likely to spark violence.
The new tactic, outlined by the world’s leading online social network on Wednesday, has been tested in Sri Lanka, which has recently been rocked by inter-religious riots fuelled by false information posted on Facebook.
The company hopes to soon introduce the new rules in Myanmar, before expanding elsewhere. “There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down,” said a Facebook spokesperson after a briefing on the policy at the company’s campus in Silicon Valley.
“We will begin implementing the policy during the coming months.” For example, Facebook may remove inaccurate or misleading content, such as doctored photos, created or shared to inflame volatile situations in the real world.
The social network said it is partnering with local organisations and authorities adept at identifying when posts are false and likely to prompt violence. Misinformation removed in Sri Lanka under the new policy included content falsely claiming that Muslims were poisoning food given or sold to Buddhists.
In an interview published on Wednesday by the technology news site Recode, Mark Zuckerberg, Facebook’s chief executive, attempted to explain how the company distinguishes between offensive speech, such as Holocaust denial, which he cited as an example, and false posts that could lead to physical harm.
Facebook has been widely seen as a vehicle for spreading false information in recent years.