Facebook admits it could do better
UNITED STATES: Facebook has been taking a long, hard look at how it is affecting democracy – and the social media giant doesn’t like everything it sees in the mirror.
Yesterday, the company admitted it took too long to recognise how its site was being abused to spread misinformation or sow division during the 2016 US presidential election.
‘‘In 2016, we at Facebook were far too slow to recognise how bad actors were abusing our platform. We’re working diligently to neutralise these risks now,’’ wrote Samidh Chakrabarti, Facebook’s product manager of civic engagement, in a blog post.
Company CEO and co-founder Mark Zuckerberg initially dismissed the notion that Facebook influenced the election as a ‘‘pretty crazy idea’’. Since then, Facebook has been trying to understand the social network’s good and bad effects on democracy.
On one hand, social media has made it easy for people worldwide to voice their political opinions, get information quickly and speak directly to politicians. On the other, Facebook has also been abused to spread misinformation or divide the American public.
‘‘Facebook was originally designed to connect friends and family – and it has excelled at that. But as unprecedented numbers of people channel their political energy through this medium, it’s being used in unforeseen ways, with societal repercussions that were never anticipated,’’ Chakrabarti said.
Last year, Facebook found 80,000 posts from accounts linked to a Russian entity that reached about 126 million people in the US from 2015 to 2017. But by that time, the presidential election was already over.
‘‘This was a new kind of threat that we couldn’t easily predict, but we should have done better,’’ Chakrabarti said.
There are other issues that Facebook has concerns about, including fake news, echo chambers, political harassment, and unequal participation from certain groups.
Facebook, which has more than 2 billion users worldwide, admits it doesn’t have all the answers.
‘‘I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t,’’ Chakrabarti said.
‘‘That’s why we have a moral duty to understand how these technologies are being used, and what can be done to make communities like Facebook as representative, civil and trustworthy as possible.’’
– TNS