Wanting it both ways
Facebook will accept regulation so long as it can set the terms
Facebook founder and chief executive Mark Zuckerberg made a perplexing request of global lawmakers this month in Munich: When it comes to speech, don’t treat social media sites like newspapers and don’t treat them like telecommunications companies, but treat them like they’re “somewhere in between.”
What does that mean? The company released a white paper days later that offers some answers — and raises many more questions along the way.
Facebook has become eager for regulation just as regulation has become inevitable. Member states of the European Union have already begun to concoct frameworks for forcing companies to moderate content more aggressively, and now the bloc as a whole is seeking a unified set of rules.
Meanwhile, officials in the United States are looking askance at firms’ traditional protections from liability for users’ actions. The debate isn’t about whether platforms must carry everything. It’s about whether they may — or whether these sites ought to have stronger obligations to police material that’s illegal, or even just harmful.
Facebook’s answer is basically “yes, but.” The site wants rules, but it prefers that those rules focus on the monitoring and removal mechanisms firms must put in place, rather than on restrictions against carrying specific types of speech. Firms, in other words, ought simply to enforce fully the terms of service they already have.
The proposal is useful in that it would at least theoretically hold Facebook and its cohort to public account for doing, or not doing, what they say. But it’s also quite similar to the status quo. To really encourage the more aggressive enforcement that legislators want to see — not only from Facebook but also from the scrappier and often scarier sites on the internet — a bigger stick might be necessary.
Certainly, punishing companies for individual pieces of offending content would encourage over-censorship. But governments could tell companies the categories of content they’re supposed to be policing and then certify whether their efforts are adequate to the job.
In Europe, these categories of content look likely to stretch beyond the merely illegal to the “harmful” — a risky proposition that could be mitigated by regulators clearly defining what they expect of firms. In the United States, illegality is the word, and though internet sites’ traditional protection from liability for what users post shouldn’t be removed altogether, there is room for revision: Protect good Samaritan companies that have reasonable systems in place for detecting and scrubbing illegal content, and don’t protect those that refuse to try.
Any of these rules would tell companies to make tricky and sensitive decisions about expression — outsourcing a role usually performed by the courts to a private actor. Of course, that’s also what is happening already.
Key to any regime for online content regulation, then, is transparency into what companies’ policies are, how they carry them out and why they make the decisions they do. Key also will be avenues for appeal that are just as open.
Facebook doesn’t want to be treated like a newspaper, or like a telecommunications company, yet right now it looks a lot like a government.