The News Herald (Willoughby, OH)

Facebook’s ‘privacy cop’ is doomed

- Bhaskar Chakravorti

In late July, the Federal Trade Commission issued its largest-ever fine, $5 billion, against Facebook for violating a 2011 privacy settlement. But the amount is only about a month's worth of the company's revenue, suggesting that the fine, while it seems large, is in fact rather modest.

More significantly, Facebook is required to have an "outside assessor" – a sort of privacy cop – to monitor the company's handling of user data, along with meeting a few other corporate procedural requirements. That assessor could address the fundamental problems with the way Facebook operates – but as a scholar of technology companies' business practices, I'm worried that this potentially all-important role is set up for failure.

In my opinion, to be effective, the FTC's newly designated cop would need to look out for three main privacy-related concerns: the potential for genuine violations of users' privacy; the targeted spread of harmful content, especially when it results in election manipulation and ethnic violence; and instances of collecting and harvesting far more data than is warranted to provide services to users.

An independent assessor will lack the standards, regulatory and legal guidelines, and the insight needed to actually monitor how Facebook handles those three issues.

Facebook’s history of privacy violations extends well beyond the most publicized ones, like letting Cambridge Analytica access the personal data of 50 million users to craft micro-targeted political ad campaigns.

Facebook has secretly shared data with other companies for years, without notifying the users. That practice, as well as the function that lets users sign in to other websites and apps with their Facebook login, has helped advertisers follow their targets around the internet. The company has also used its trove of user data to gain a competitive advantage in business negotiations, boosting its own profits without compensating the users themselves.

The FTC ruling gives the privacy cop no clear guidance on which data-sharing or data-withholding arrangements between Facebook and other companies are legitimate and where they cross a line. This is because there are still no internationally agreed-upon data protection rules, and few clear regulations in the U.S. to compare Facebook's actions against.

Facebook's business model uses its treasure trove of user data to target advertising, the source of almost all the company's revenue. An outsider will be unable to tell the difference between legitimate business practices that harvest user data to increase profits and problematic abuses that violate users' privacy. In fact, FTC Commissioner Rohit Chopra, who dissented from the decision, declared that the new settlement still "allows Facebook to decide for itself how much information it can harvest from its users and what it can do with that information."

Facebook has struggled to limit harmful content on its networks, such as material that fed ethnic violence, distributed misinformation or facilitated election interference.

The outside assessor will be focused on privacy, which means that identifying, verifying and policing content will be beyond the assessor's mandate. Ironically, steps to enhance privacy, such as ensuring end-to-end encryption across all of Facebook's messaging platforms – as Mark Zuckerberg intends to do – would help in protecting the identity of the spreaders of harmful messages, rather than exposing them and their actions.

Access to Facebook seems free, because it costs no money, but users pay with their data. The assessor should ask if the users are being charged fairly, in privacy terms, for the service they’re receiving.

Normally, price is set by a competitive market, where customers can choose from a range of service providers. Not so on Facebook, where there are high costs – again, not financial, but in terms of time and effort – to leaving, and no other option offering equivalent services.

A social science phenomenon called the "network effect" means that any network is increasingly valuable as more people join it – but that means it's also increasingly hard to leave. There are now more than 2.3 billion Facebook users around the world.

It's hard to leave Facebook, and not only because there are so many users. Many people use their Facebook logins on other apps and services; if they delete their Facebook accounts, they lose access to those other apps too. Worse still, Facebook has bought up many of its competitors. Lots of people who quit Facebook shift over to Instagram – which is owned by Facebook.

Looking to the future, the company is making the price of leaving Facebook even higher, by planning to consolidate its data-collection power by integrating its various apps, including Facebook Messenger, Instagram and WhatsApp – as well as through a proposed digital currency for transactions conducted on Facebook platforms. All of these create a playing field that is tilted in favor of an all-encompassing single parent company, limiting users' choices and making switching difficult. No assessor can remedy the inherent unfairness of that imbalance.

Far more than the fine, the centerpiece of the FTC deal is the outside assessor. If properly designed, this role could be truly game-changing – one of a forceful privacy cop setting the standards for how the power of big technology firms is managed from here on. But the fine is a slap on the wrist, and the cop's arms are tied and don't reach far enough. This sets a very bad precedent: Both the FTC and Facebook can declare a victory of sorts, while the consumer loses.

The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.
