SOCIAL MEDIA COMPANIES GOING TOO FAR
Social media companies may have finally crossed the Rubicon.
In the past few weeks these companies took the ultimate step of banning the president of the United States from their platforms. In effect, they signaled that they are more powerful than any president, now or in the future.
But aside from power, and whether their decision is right or wrong, they are now wading deeper into political waters. That is where sharks swim, and they may rue the day they did it.
Social media companies were originally given immunity not just from libel but from any liability for what others post on their websites, on the theory that they were merely community bulletin boards where anybody could post anything and everything in the spirit of free speech.
Some people post controversial and revolutionary things on bulletin boards. It was done 500 years ago when Martin Luther challenged the Catholic Church by posting his 95 Theses on the Castle Church door in Wittenberg, Germany.
In those days, church doors were used as community bulletin boards. Martin Luther’s 95 Theses launched a religious revolution that became the Protestant Reformation. It’s hard to think of anything more revolutionary than that.
Back in the early days of America, even before we were a country, the content of newspapers could be ugly, vile and defamatory. Some of it was just as awful as what you can see today on Facebook, Twitter and YouTube.
But over time, Americans came up with a solution: libel laws. The courts made newspapers and their owners responsible for everything they printed. If a paper published defamatory and malicious information, or showed reckless disregard for the truth, the newspaper was responsible, could be sued, and had to pay damages to those it harmed.
Today we have that same defamatory, often vile, content on social media. But the major difference is that these new companies are not responsible for it. In fact, they have a virtual "get out of jail free" card in the now-famous Section 230, a provision of the federal Communications Decency Act of 1996, which means they can't be sued for libel over what others say on their platforms.
Suppose Section 230 were abolished, making these social media companies responsible for what others post. It is unlikely they could even comply. Facebook has more than two billion users worldwide; there is simply no way to monitor the postings of more than two billion people.
What is the solution? The way the social media companies are making decisions — banning American citizens and politicians while still allowing dictators around the world to keep their accounts — suggests that self-governance is not a workable one.
Another option would be government regulation. That is probably the worst of all, and could easily result in authoritarian control by whatever party holds power. It would be a complete abdication of freedom of speech and freedom of the press.
So who should regulate them? What about the users themselves? If these companies are going to truly be community bulletin boards, even for a world community, why not allow users to both post and remove comments?
How would this work in the real world? If QAnon or antifa made some outrageous post, it could be taken down immediately. If someone across town posted a malicious lie that damaged your reputation, you could take it down yourself rather than bear the expense of a libel lawsuit.
In such a scenario, these companies should keep all the protections of Section 230, since they would not be responsible for the content. They would lose much of their control, and maybe not make as much money.
But it would solve a lot of their headaches, since they could no longer be accused of banning anyone. It would get them out of politics. And like anyone else, they could still remove content, and should when it is the equivalent of yelling "fire" in a crowded theater, or otherwise poses a danger.
That is probably not too much to ask of companies with the largest platforms for communication in world history.