Manawatu Standard

You should be afraid of Facebook

Leonid Bershidsky says the social media giant has spawned anxieties and addictions that detract from a user’s life satisfaction.


Facebook founder Mark Zuckerberg’s manifesto, penned clearly in response to accusations levelled at the social network in the wake of the bitter US election campaign, is a scary, dystopian document.

It shows that Facebook – launched, in Zuckerberg’s own words five years ago, to ‘‘extend people’s capacity to build and maintain relationships’’ – is turning into something of an extraterritorial state run by a small, unelected government that relies extensively on privately held algorithms for social engineering.

In 2012, Zuckerberg addressed future Facebook investors in a letter attached to the company’s initial public offering prospectus. Here’s how he described the company’s purpose: ‘‘People sharing more – even if just with their close friends or families – creates a more open culture and leads to a better understanding of the lives and perspectives of others.’’

Whatever those beliefs were based on, they have largely failed the test of time. Instead of creating stronger relationships, Facebook has spawned anxieties and addictions that are the subject of academic studies from Portugal to Australia. Some studies have determined that using Facebook detracts from a user’s life satisfaction.

A Danish experiment in 2015, involving people weaned from Facebook for a week and a control group that kept using it, showed that people on the social network are 55 per cent more likely to feel stressed; one source of that stress is envy of the glossified lives reported by other users.

Users’ well-being, research has shown, tends to increase only when they have meaningful interactions – such as long message exchanges – with those who are already close to them. In his latest manifesto, Zuckerberg uses parenting groups as an example of something his company does right.

But recent research shows that some new mothers use Facebook to obtain validation of their self-perception as good parents, and failing to get enough such validation causes depressive symptoms.

As for the ‘‘rewired’’ information infrastructure, it has helped to chase people into ideological ‘‘silos’’ and feed them content that reinforces confirmation biases.

Facebook actively created these silos by fine-tuning the algorithm that lies at its centre – the one that forms a user’s news feed.

The algorithm prioritises what it shows a user based, in large measure, on how many times the user has recently interacted with the poster and on the number of ‘‘likes’’ and comments the post has garnered.

In other words, it stresses the most emotionally engaging posts from the people to whom you are drawn – during an election campaign, a recipe for a filter bubble and, what’s more, for amplifying emotional rather than rational arguments.

Bragging in his new manifesto, Zuckerberg writes: ‘‘In recent campaigns around the world – from India and Indonesia across Europe to the United States – we’ve seen the candidate with the largest and most engaged following on Facebook usually wins.’’

His algorithmic interference in what people can see on his network has created a powerful tool for populists.

Zuckerberg doesn’t want to correct this mistake and stop messing with what people see on the social network.

Instead, the new manifesto talks about Facebook as if it were a country or a supranational bloc rather than just a communication-enabling technology.

Zuckerberg describes how Facebook sorts groups into ‘‘meaningful’’ and, presumably, meaningless ones.

Instead of facilitating communication among people who are already part of social support groups offline, he wants to project Facebook relationships into the real world. Clearly, that’s a more effective way of keeping competitors at bay.

The Facebook chief executive says his team is working on artificial intelligence that will be able to flag posts containing offensive information – nudity, violence, hate speech – and pass them on for final decisions by humans.

If past experience is any indication, the overtaxed humans will merely rubber-stamp most decisions made by the technology, which Zuckerberg admits is still highly imperfect. Zuckerberg also suggests enabling every user to apply the filters provided by this technology.

The real-life effect will be that most users, too lazy to muck around with settings, will accept the ‘‘majority’’ standard, making it even less likely that anything they see would jar them out of their comfort zone.

Those who use the filters won’t be much better off: they’ll have no idea what is being filtered out because Facebook’s algorithms are a black box. Zuckerberg casts Facebook as a global community that needs better policing, governance and nudging toward better social practices.

He’s willing to allow some democracy and ‘‘referendums’’, but the company will make the ultimate decision on the types of content people should see based on their behaviour on Facebook.

Ultimately, this kind of social engineering affects people’s moods and behaviours. It can drive them toward commercial interactions or stimulate giving to good causes, but it can also spill out into the real world in more troubling ways.

It’s absurd to expect humility from Silicon Valley heroes.

But Zuckerberg should realise that by trying to shape how people use Facebook, he may be creating a monster.

His company’s other services – Messenger and WhatsApp – merely allow users to communicate without any interference, and that’s great.

People are grateful for tools that help them work, study, do things together – but they respond to shepherding in unpredictable ways.

‘‘Virtual identity suicide’’ is one such response; the trend doesn’t show up in Facebook’s reported usage numbers, but that might be because a lot of the ‘‘active users’’ the company reports are actually bots.

If you type ‘‘how to leave’’ into the Google search window, ‘‘how to leave Facebook’’ will be the first suggestion. – Washington Post

REUTERS: Facebook actively created ‘‘silos’’ by fine-tuning the algorithm that lies at its centre – the one that forms a user’s news feed.
