The National - News

FACEBOOK MAY BE READY TO GIVE THE THUMBS DOWN

In a poisonous social-media atmosphere, Rhodri Marsden asks whether it is sensible to give people the opportunity to downvote the opinions of those they disagree with


Gesturing with our thumbs to show approval or disapproval is a human habit stretching back centuries. The understanding that thumbs-up means good, and thumbs-down bad, is also reflected across the internet, where we’re encouraged to upvote things we like and downvote things we don’t. Facebook, however, has always shied away from giving its two billion users the opportunity to express displeasure with a single click. “Likes” are Facebook’s currency, and “dislikes” just don’t exist. But in the last six months, Facebook has begun trialling downvoting in parts of the US, Australia and New Zealand, allowing people to demonstrate instant disapproval of other people’s contributions. “Support comments that are thoughtful,” suggests the window that pops up during the trial, “and demote ones that are uncivil or irrelevant.”

Facebook users have long demanded a feature that lets them give the thumbs-down to belligerent, rude or bullying comments. In 2015, Facebook chief executive Mark Zuckerberg noted that “people have asked about the dislike button for many years”. He added that “today is the day I can say we’re working on it and shipping it”. However, said button never materialised. Internal sources confirmed that it was rejected for fear of sowing “too much negativity”.

But negativity continues to run rampant across the platform. Back in April, representatives of civil groups in Myanmar wrote an open letter to Zuckerberg, expressing deep concern about the way Facebook was being used in the country to incite violence. In a statement, the company admitted its failings, and promised to “improve our technology and tools to detect and prevent abusive, hateful or false content”. It’s part of a broader plan: in recent months, the company has sought to reverse declining user numbers in certain demographics by trying to make Facebook a more pleasant experience – reducing the number of news stories, cutting back on advertisements and encouraging people to spend time interacting with each other.

But nothing derails those interactions quite like rudeness. Facebook currently employs just one staff member per 100,000 users to deal with safety and security, so self-policing has to play a big part in keeping things civil. The system now being trialled relies on users to moderate discussions; up and down arrows sit next to each comment, along with a running vote count showing how valuable the community has deemed that particular contribution. Algorithms then use the voting data to re-order the debate. Facebook believes these algorithms will “push thoughtful and engaging comments to the top of the discussion, and move down the ones that are simply attacks or filled with profanity”.
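Facebook has not published the details of how the trial ranks comments, but the basic idea it describes – re-ordering a discussion by community votes – can be illustrated with a minimal sketch. The field names and the simple net-score rule below are assumptions for illustration only, not Facebook’s actual algorithm.

```python
# Hypothetical sketch of vote-based comment re-ordering.
# Field names and the net-score rule are assumptions, not Facebook's code.

def rank_comments(comments):
    """Sort comments so those with the highest net vote score appear first."""
    return sorted(
        comments,
        key=lambda c: c["upvotes"] - c["downvotes"],
        reverse=True,
    )

comments = [
    {"text": "Thoughtful, on-topic reply", "upvotes": 42, "downvotes": 3},
    {"text": "Profanity-laden attack", "upvotes": 1, "downvotes": 57},
    {"text": "Off-topic aside", "upvotes": 5, "downvotes": 12},
]

for c in rank_comments(comments):
    print(c["text"], c["upvotes"] - c["downvotes"])
```

Even in this toy version, the weakness critics point to is visible: the score only records how many people clicked each arrow, not whether they were judging civility or simply punishing an opinion they dislike.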

Not everyone, however, believes that two negatives necessarily make a positive. “I think that upvoting and downvoting is a really bad way to do this,” says Joseph Reagle, author of a book entitled Reading the Comments: Likers, Haters, and Manipulators at the Bottom of the Web. “It prompts a kind of gamification. I don’t imagine that users are going to be sensitive to all of the semantics of these things and know how to use them appropriately.”

Reagle is alluding to the thorny question of whether upvotes and downvotes have the effect that’s intended. The psychological theory of “operant conditioning” – that our behaviour is linked to the punishments or rewards we have received in the past – dates from the 1930s, and underlies the design of many social-media platforms. But in 2014, data scientist Justin Cheng – who now works at Facebook – completed a piece of research based on analysing 42 million online comments, and concluded that downvoting caused a spiral of negative behaviour. Downvoted authors, he found, go on to produce posts “of lower quality… We find that negative feedback leads to significant behavioural changes that are detrimental to the community.”

It’s not surprising, perhaps, that reducing human emotions to a binary choice has unintended consequences. Reddit, the online community that’s perhaps most associated with up/down voting systems, has frequently become a battlefield of voting contests that have little to do with the quality of people’s contributions and much more to do with differences of opinion. Indeed, the practice of mass downvoting even has a name on the site: “brigading”. “You see a similar thing on Amazon,” Reagle says, “where people don’t like the e-book version of a classic piece of literature, so they give it one star.” Online commenting system Disqus, which also features up and down votes, recently polled users to find out why they downvoted comments. The most common reason, by some distance, was that they disagreed with the opinion being expressed – nothing to do with civility or abuse whatsoever.

“This surprised us,” Disqus’s Tony Hue admitted in a blog post – but it shows that downvoting systems designed to tackle bullying could end up facilitating it instead. A journalist for Slate, Rachel Withers, found herself included in one of the Facebook trials, and concluded that it’s “the perfect feature for trolls and bots, lefties and conservatives… to silence opinions through effective organising and well-policed echo chambers”. Still, as Reagle points out, Facebook must see some benefit in persisting with a system whose downsides are so well documented. “I can only presume it’s serving Facebook somehow,” he says. “The most paranoid theory would be that this is a honeypot! They create this mechanism, knowing that people are going to abuse it, and then they’re better able to spot the fraudulent accounts as a result.”

It’s a problem unique to the age we’re living in. Never before have we had to consider how to behave and interact with thousands of strangers with whom we have fundamentally differing views. Nor do we understand the toll that systems of this kind may be taking on our mental health, with inflammatory exchanges and exaggerated reactions affecting our sense of self-worth and diminishing our sense of empathy. It may have come to the point where Facebook has no choice but to use downvoting as a way to help us to get along, but it’s also possible that the company’s mission – to connect all the citizens of the world – may be a fundamentally flawed idea.
