The Morning Journal (Lorain, OH)

Stay on Facebook; don’t trust it

- By Denise Anthony and Luke Stark. The Conversation is an independent, nonprofit source of news, analysis and commentary from academic experts.

Is it time to give up on social media? Many people are thinking about that in the wake of revelations regarding Cambridge Analytica’s questionable use of personal data from over 50 million Facebook users to support the Trump campaign. Not to mention the troubles with data theft, trolling, harassment, the proliferation of fake news, conspiracy theories and Russian bots.

The real societal problem might be Facebook’s business model. Along with other social media platforms, it makes money by nudging users to provide their data (without understanding the potential consequences), and then using that data in ways well beyond what people may expect.

As researchers who study social media and the impact of new technologies on society in both the past and the present, we share these concerns. However, we’re not ready to give up on the idea of social media just yet. A main reason is that, like all forms of once “new” media, social media has become an essential conduit for interacting with other people. We don’t think it’s reasonable for users to be told their only hope of avoiding exploitation is to isolate themselves.

As individuals, and society as a whole, come to better understand the role social media plays in life and politics, they’re wondering: Is it possible - or worthwhile - to trust Facebook?

Of course, social media platforms don’t exist without users. Facebook has grown from its origins serving only college students by exploiting the network effect: If all your friends are socializing on the site, it’s tempting to join yourself.

However, now that Facebook and its ilk are under fire, those network effects could unravel in the other direction: Facebook’s number of active users continued to rise in 2017, but in the final three months of the year, its growth showed signs of slowing. If all your friends are leaving Facebook, you might go with them.

The design of social media platforms like Facebook - and many other common apps, such as Uber - is intentionally engrossing. Some scholars go so far as to call it “addictive,” but we’re uncomfortable using the term so broadly in this context. Nevertheless, digital designers manipulate users’ behavior with a wide array of interface elements and interaction strategies, such as nudges and cultivating routines and habits, to keep users’ attention.

To attract users, keep them engaged and ensure they want to come back, companies manipulate the details of visual interfaces and user interaction. For example, the ride-sharing app Uber shows customers phantom cars to trick them into thinking drivers are nearby. The company uses similar psychological tricks when sending drivers text messages encouraging them to stay active.

This manipulation is particularly effective when app developers set default options for users that serve the company’s needs. For example, some privacy policies make users opt out of sharing their personal data, while others allow users to opt in. This initial choice affects not only what information users end up disclosing, but also their overall trust in the online platform. Some of the measures announced by Facebook CEO Mark Zuckerberg in the wake of the Cambridge Analytica revelations - including tools showing users which third parties have access to their personal data - could further complicate the design of the site and discourage users even more.

Was users’ trust in Facebook misplaced in the first place? Unfortunately, we think so. Social media companies have never been transparent about what they’re up to with users’ data. Yet neither regulations nor third-party institutions exist to ensure that social media companies are trustworthy.

This is not the first time new technologies created social change that disrupted established mechanisms of trust. For example, during the Industrial Revolution, new forms of organization like factories, and major demographic shifts from migration, increased contact among strangers and across cultures.

People could no longer rely on interpersonal trust. Instead, new institutions arose to establish systematic rules for transactions, standards for product quality and professional training. They also offered accountability if something went wrong.

There are not yet similar standards and accountability requirements for 21st-century technologies like social media.

There is plenty of demand for more supervision of social media platforms. Several existing proposals could regulate and support trust online.

Other countries have rules, such as the EU’s General Data Protection Regulation and Canada’s Personal Information Protection and Electronic Documents Act. However, in the U.S., technology companies like Facebook have actively blocked and resisted these efforts, while policymakers and other tech gurus have convinced people they’re not necessary.

Facebook has the technical know-how to give users more control over their private data, but has chosen not to - and that’s not surprising. No laws or other institutional rules require it, or provide the necessary oversight to ensure that it does. Until a major social media platform like Facebook is required to reliably and transparently demonstrate that it is protecting the interests of its users - as distinct from its advertising customers - the calls to break the company up and start afresh are only going to grow.
