Houston Chronicle

Facebook’s problem lies in business model

By Moshe Y. Vardi

Facebook CEO Mark Zuckerberg’s recent call for increased regulation of the internet sidestepped the biggest questions of all: Is Facebook’s business model the real problem, and if so, is it redeemable?

Over the last couple of years, Facebook has rolled from scandal to scandal. Consider this small sample of Facebook scandals from 2018:

• Cambridge Analytica harvested personal data from millions of Facebook profiles without users’ consent and used it for political purposes.

• Facebook gave big companies greater access to its users’ data without users’ permission.

• U.K. lawmakers published internal Facebook emails, including some that involved Zuckerberg, which paint a picture of a company aggressively hunting for ways to make money from the reams of personal information it was collecting from users.

• It was also disclosed that a Facebook software bug may have affected close to 7 million people who used a Facebook login and gave permission to third-party apps to access their photos.

Facebook’s reputation has taken quite a beating.

In a March 30 Washington Post op-ed, Zuckerberg called for increased regulation of the internet in four areas: harmful content, election protection, effective privacy and data protection, as well as data portability. Given Facebook’s shoddy reputation, it’s hard to take these policy recommendations at face value, regardless of their merits.

After all, until 2014, Facebook’s motto was, “Move fast and break things. Unless you are breaking stuff, you are not moving fast enough.” In “Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy,” Jonathan Taplin argues that Silicon Valley increasingly resembles “some kind of nightmarish children’s playground, populated by overgrown babies with no idea of the consequences of their actions.”

Breaking things can have profound societal consequences. Given recent political developments around the world, Taplin’s assertion that technology undermined democracy does not sound too batty. It’s now quite clear that “frictionless sharing” on social media gave rise to the fake-news phenomenon. It’s also now widely accepted that this had a serious impact on both the 2016 U.K. Brexit referendum and the 2016 U.S. presidential election. It’s this cavalier attitude about breaking things that led Wall Street Journal columnist Peggy Noonan to describe Silicon Valley executives as “moral Martians.” Facebook’s refusal to accept responsibility for harm in the wake of the recent Christchurch attack led New Zealand’s privacy commissioner to describe the company as “morally bankrupt.”

Zuckerberg also seems oblivious to the possibility that Facebook’s biggest problem may be its advertising-based business model. Where do Facebook’s hefty profits come from? “Not from me,” you may say. “The advertisers pay to advertise.”

But advertising is just a cost of doing business, and advertisers pass that cost on to consumers in the price of their goods and services. The result is an opaque market in which consumers support internet companies through what is, essentially, an invisible tax.

Ethan Zuckerman, director of the MIT Center for Civic Media, called the advertising-based business model the “original sin of the internet.” But market opaqueness is just one problem. As we now know, internet advertisers require data to ensure effective delivery of ads, so we not only pay for “free information” with an invisible tax, but we also pay by providing our personal information. Thanks to the success of “surveillance capitalism,” the internet has become a huge surveillance machine.

Most curiously, Zuckerberg manages to discuss internet regulation without once mentioning Section 230 of the Communications Decency Act of 1996, a fundamental piece of U.S. legislation that provides immunity from liability for providers and users of an “interactive computer service” who publish information provided by third-party users. The law states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

By allowing Facebook and other internet companies to operate as platforms rather than as publishers, Section 230 frees them from liability for the content that they publish. The explosive growth of social media platforms would not have been possible without Section 230.

Yet this explosive growth has led to widespread manipulation. By co-opting social media platforms, unscrupulous actors ranging from disgruntled individuals to state-run intelligence operations have found a ready way to distribute false, misleading and harmful content to millions.

This proliferation of “bad speech” on social media has become politically untenable, and now all social media platforms are actively fighting “fake news.” Recently, for example, social media platforms banned the conspiracy theorist Alex Jones for violating their “abusive behavior” policy. Thus, in spite of Section 230, social media platforms seem to be accepting responsibility for the content they publish. In other words, they are starting to behave with some restraint, like publishers rather than platforms.

It is not at all clear, however, whether a platform such as Facebook, with more than 2 billion active users, can behave like a traditional publisher. First, there is the difficulty of vetting content from so many users. With fewer than 40,000 employees, Facebook clearly cannot have people review all its content; algorithmic filtering is a must.

But if we have learned anything over the last few years, it is how good people are at outsmarting algorithms. Facebook removed 1.5 million videos of the Christchurch attacks within 24 hours, yet many archived versions remain available.

More fundamentally, do we really want Facebook to regulate the speech of more than 2 billion people? The Washington Post, in response to Zuckerberg’s op-ed, called that “the dark side of regulating speech on Facebook.” Traditional publishers regulate speech on their platforms, but there are numerous such platforms. In contrast, there is only one Facebook.

So the fundamenta­l question is: Are social media redeemable? We now know that the utopia of frictionle­ss sharing leads to filter bubbles, fake news and extreme content. Is allowing Facebook to act as the global censor the only answer? Is there a middle path between these two extremes?

Those are the questions Zuckerberg should be addressing. Facebook users, investors and lawmakers need to know.

Vardi is a university professor and professor of computer science in the Brown School of Engineering at Rice University, where he directs both the Ken Kennedy Institute for Information Technology and the Rice Initiative on Technology, Culture and Society.

Facebook CEO Mark Zuckerberg’s call for regulations is hard to take seriously. (Josh Edelson / AFP / Getty Images)
