Facebook admits it took too long to recognize harm to democracy
MENLO PARK >> Facebook has been taking a long, hard look at how it’s affecting democracy, and the social media giant doesn’t like everything it sees in the mirror.
On Monday, the company admitted it took too long to recognize how its site was used to spread misinformation or sow division during the 2016 U.S. presidential election.
“In 2016, we at Facebook were far too slow to recognize how bad actors were abusing our platform. We’re working diligently to neutralize these risks now,” wrote Samidh Chakrabarti, Facebook’s product manager for civic engagement, in a blog post.
Company CEO and co-founder Mark Zuckerberg initially dismissed the notion that Facebook influenced the election as a “pretty crazy idea.”
Since then, Facebook has been trying to understand the social network’s effects on democracy.
On one hand, social media has made it easy for people worldwide to voice their political opinions, get information quickly and speak directly to politicians. On the other, Facebook has been used to spread misinformation or divide the American public.
“Facebook was originally designed to connect friends and family — and it has excelled at that. But as unprecedented numbers of people channel their political energy through this medium, it’s being used in unforeseen ways with societal repercussions that were never anticipated,” Chakrabarti said.
Last year, Facebook found 80,000 posts from accounts linked to a Russian entity that reached around 126 million people in the United States from 2015 to 2017.
But by that time, the U.S. presidential election was already over.
“This was a new kind of threat that we couldn’t easily predict, but we should have done better,” Chakrabarti said.
Facebook also voiced concerns about other issues, including fake news, echo chambers, political harassment and unequal participation from certain groups.
In a blog post, Harvard Law School Professor Cass Sunstein compared social media to the use of cars. Automobiles allow people to get from one place to another, but there are thousands of crashes every year.
“For social media and democracy, the equivalents of car crashes include false reports (‘fake news’) and the proliferation of information cocoons — and as a result, an increase in fragmentation, polarization and extremism,” Sunstein wrote.
“I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t,” Chakrabarti said. “That’s why we have a moral duty to understand how these technologies are being used and what can be done to make communities like Facebook as representative, civil and trustworthy as possible.”