Facebook and the future we don't want to live in
It has been a difficult week for Facebook. On Monday, its family of platforms - Facebook, WhatsApp, Instagram, Messenger and Oculus - suffered its largest service interruption to date. For six hours, all of the platforms were offline because a routine maintenance process went awry.
Billions of users were unable to access their services while company staffers were virtually and physically locked out of the systems they needed to fix the issue.
Then on Tuesday, former Facebook employee and whistleblower Frances Haugen testified before the United States Congress that the company deliberately puts profit over protecting people by allowing harm to children particularly, and democracy more broadly.
Despite efforts from Facebook to counter Haugen's testimony on various outlets, her account was devastating for the company, and it came as Congress deliberates the possibility of legal or regulatory action against it. The day before Haugen testified, the world had received an inadvertent reminder of why this is an urgent issue.
If these two things seem disconnected, it is because you have not been paying attention to Facebook's growing market dominance as a social networking platform and as a communications provider.
Today, an estimated two billion people in more than 180 countries use the WhatsApp messaging platform, while at least 3.5 billion people use Facebook. Instagram, while not as popular as these two platforms, is increasingly important for small businesses in several countries, which use it to build and manage their client bases in lieu of building their own websites.
These platforms are unambiguously important to the global digital society because of their sheer size, and that means that small internal decisions to look the other way when people misuse them are greatly amplified, and easily transmitted across international borders. Positive nudges on Facebook drive people to the polls, but misinformation on the same platform drives people to drink horse medicine.
Devastating revelations about how the company thinks about its responsibility towards users coming on the heels of a service failure of this scale raise a simple yet fundamental question: Is Facebook ready for the future it is building and are we prepared to live in it?
From the way Facebook has handled Haugen's testimony, as well as the service interruption, it is evident that it does not fully understand the behemoth that it has constructed.
A simple layman's account of the service interruption is that, because of a software update, Facebook essentially locked itself out of the backend systems that govern not only how each of its platforms functions, but also how the company itself runs.
If, between Facebook and WhatsApp alone, there are at least five billion individual accounts, you have to wonder why anyone thought it was a good idea to centralise all of that information in such an elementary way. It is the kind of over-centralisation that gives competition lawyers heartburn and that compels governments to intervene and stop companies from getting too big.
If Facebook were merely a large company that people depended on to communicate, that would be bad enough. But it is a large company that people depend on to communicate that also collects, monetises and transforms the personal data people provide to it for that communication, and then holds it in opaque systems that are always two steps behind critical political developments. This perhaps explains the simple question that Congress asked Haugen: Is it time to break up Facebook?
The pure economic argument is that as long as the company is growing, it should be allowed to keep growing; after all, it is creating jobs and growing economies. But jobs and economies do not exist outside social and political contexts and will mean nothing if societies collapse.
The justification for allowing indefinite growth is feeble, particularly when the evidence Haugen provided suggests that the company is unwilling to change course even when presented with proof that it harms societies.
The company's policies for dealing with the sociology and moral economy created by the unprecedented concentration of data in its hands are wanting. It seems unable or unwilling to understand that making communication easier also makes it easier for people of bad intent to communicate.