Financial Mirror (Cyprus)

Facing Facebook’s responsibility


When Facebook went public in May 2012, its capacity for effective corporate governance was already in doubt. Fast-forward six years, and Facebook has accumulated massive power, access, and influence – and, in many ways, proved the doubters right.

The doubters were no small minority. On the contrary, it was the general consensus among investors and advisers that Facebook was too large, with too much potential for growth and not nearly enough capacity to protect adequately the personal information of the platform’s millions of users.

As I put it at the time, “Facebook swims against the tide of a global movement toward transparency, engagement, and checks and balances. It feels as if we’ve all stepped into a time machine and none of the past couple of years of governance lessons – including the failures of boards in the banking-sector crisis – ever happened.”

But, as is so often the case, euphoria got the best of investors. For those who threw in their lot with Facebook, watching CEO Mark Zuckerberg testify before the US Congress in early April – following the revelation that nearly 90 million users’ personal data was harvested by the political consultancy Cambridge Analytica – must have been a rude awakening.

Zuckerberg’s testimony was punctuated by apologies. But, though he technically claimed responsibility for Facebook’s failure to protect against “fake news, foreign interference in elections, and hate speech” or to preserve data privacy, he portrayed Facebook as an “idealistic” company focused on “connecting people.”

This echoed Zuckerberg’s earlier attempts to paint himself, when convenient, as a wide-eyed young leader. In an interview with CNN, he declared that he had taken companies like Cambridge Analytica at their word when they told Facebook that they didn’t keep any Facebook data. When challenged by CNN as to why no audit had been performed, he responded with a snide edge, “I don’t know about you, but I’m used to when people legally certify that they are going to do something, that they do it.”

Zuckerberg’s apologies to Congress ring all the more hollow, given that they are hardly the first Facebook has had to issue. Last October, following the revelation that Russian-linked groups had purchased more than $100,000 worth of ads on the platform to influence the 2016 presidential election, the company sent its COO, Sheryl Sandberg, to Washington, DC, to conduct damage control.

Meeting with various elected leaders – from the Congressional Black Caucus to lawmakers investigating Russian election meddling – Sandberg repeatedly pledged to “do better,” presumably meaning that Facebook would invest in rooting out fake news and vetting advertisers more closely. But, by treating a failure of governance as a corporate communications crisis, Facebook allowed its real problems to continue to grow.

Some argue that Facebook users can blame only themselves for privacy breaches. After all, they signed up for a free platform, and willingly provided their data. It isn’t Facebook’s fault if they failed to read the fine print.

Yet the expectation of reasonable consumer protection is built into our economies. If a company sells you a car that, say, is not adequately tested, resulting in injury, the company pays a price. The same goes for virtually any other consumer-facing business, from airlines to food suppliers. A restaurant cannot evade responsibility for serving expired food simply by posting a sign saying, “Customers Beware.”

When it comes to Facebook, moreover, users are not just passive consumers, given that the company traffics in their data. (It is worth noting that, as Zuckerberg admitted before Congress, Facebook collects data even from people who don’t have an account, through their friends and their browsers, though the company wouldn’t be able to sell this data.)

Facebook users are essentially labourers being subcontracted to manufacture the product (data) that the company sells. And we do, to some extent, hold companies to account for their subcontractors’ working conditions. At the very least, we subject them to regulation and oversight.

So, Facebook owes its users protections, in their capacity as both consumers and producers. The question is how to get the company to fulfill that obligation.

With Zuckerberg maintaining most of the voting power, Facebook’s board has little ability to make change without his assent. At the company’s annual stockholder meeting last year, five proposals for how to begin addressing some of Facebook’s weaknesses were voted down.

That included proposals to publish a report on gender pay equity, and one on the public-policy issues associated with managing fake news and hate speech, including the impact on the democratic process, free speech, and a cohesive society. There was also a proposal for Facebook fully to disclose its spending on political lobbying. And there were proposals to nominate an independent board chair and change the shareholder-voting structure to reduce Zuckerberg’s influence.

It is a cliché that with great power comes great responsibility. But it is one that Zuckerberg should take to heart. He is the CEO of a hugely influential company, on the back of which an entirely new industry is being built: according to a 2017 report from Deloitte, Facebook enabled $227 bln of economic activity and contributed to the creation of 4.5 million jobs globally in 2014. Given the company’s reach, and the fact that the platform is notoriously difficult to opt out of, wide-eyed apologies will no longer cut it.

Facebook needs to take responsibility for its behavior in a way befitting its influence, by changing its governance and operational behavior. The challenge runs far deeper than whether users click “Agree” on a new set of “Terms and Conditions.” It goes to the heart of how Facebook is run.

