Birmingham Post

Algorithm that changed society may come back to bite Facebook

- Chris Bucktin

THIS week Britain was provided with a worrying insight into the practices of Facebook as a former manager from the social media giant began her European tour.

The US has already been left reeling by Frances Haugen’s revelations, but this week she arrived in the UK to tell Parliament what everyone – even those like me without a Facebook account – should be concerned about.

Since its creation by Mark Zuckerberg in 2004, Facebook has been no stranger to controversy.

For years, many of its practices were unknown to those who entrusted every aspect of their lives to it.

But following Haugen’s US and UK testimony, which includes thousands of pages of documents, it’s not just the whistle but Facebook’s roof that has been blown.

She told how in 2016, Facebook gave its users five new ways to react to a post in their news feed – ‘wow, love, sad, haha, and angry’.

It is alleged that Facebook developed the algorithm that decides what individuals see in their news feeds to use these reaction emojis as signals to promote more emotional and provocative material, including content likely to anger them.

According to internal documents, Facebook’s ranking system began treating emoji reactions as five times more significant than “likes” in 2017.

The notion was simple: posts that elicited a lot of reply emojis kept people engaged longer, and keeping users engaged was crucial to Facebook’s success.
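The weighting the documents describe can be sketched in a few lines of code. This is purely illustrative: the function name, the weights and the numbers are assumptions chosen to match the “five times a like” claim above, not Facebook’s actual ranking code.

```python
# Illustrative sketch of the reported weighting: reaction emojis
# (wow, love, sad, haha, angry) counted five times as heavily as
# "likes" when scoring a post for the news feed. All names and
# values here are assumptions, not Facebook's real implementation.

LIKE_WEIGHT = 1
REACTION_WEIGHT = 5

def engagement_score(likes: int, reactions: int) -> int:
    """Score a post by weighted engagement signals."""
    return likes * LIKE_WEIGHT + reactions * REACTION_WEIGHT

# A provocative post with few likes but many angry reactions
# outranks a well-liked post with no emoji reactions at all.
provocative = engagement_score(likes=10, reactions=20)   # 110
popular = engagement_score(likes=100, reactions=0)       # 100
print(provocative > popular)  # True
```

Under this kind of scoring, anything that reliably provokes a reaction – including outrage – rises in the feed, which is exactly the dynamic Haugen’s documents describe.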

Researchers at Facebook were quick to notice a serious issue. Favouring “controversial” posts, such as those that irritate users, might “inadvertently open the door to additional spam/abuse/clickbait”, according to one internal memo.

“It’s possible,” a colleague is said to have replied.

In 2019, the company’s researchers discovered that posts eliciting angry reaction emojis were more likely to include disinformation, toxicity, and low-quality content.

It suggests Facebook has been amplifying some of the platform’s worst features for the past three years, making them more prominent in users’ feeds and thereby disseminating them to a far larger audience.

It is said the strength of this algorithmic promotion hampered the work of Facebook’s content moderators and integrity teams, who were up against a mountain of dangerous information.

Haugen’s disclosures have rightfully strengthened political support for more significant legislation in the States, UK and Europe. Some have even demanded Zuckerberg resign as Facebook’s CEO.

The social network has, of course, hit back.

Spokesman Mitch Henderson defended its practices and said it had spent £9.5 billion and hired 40,000 people to work on safety issues.

“Contrary to what was discussed at the hearing, we’ve always had the commercial incentive to remove harmful content from our sites,” he said.

“People don’t want to see it when they use our apps, and advertisers don’t want their ads next to it.”

Whichever way you cut it though, Haugen’s testimony is worrying.

On the evidence she provides, she is right to believe that Facebook’s products harm children, stoke division, weaken our democracy and much more.

Remember when, if people didn’t have anything nice to say, they kept their opinions to themselves, or, God forbid, actually called or went to see their friends rather than detailing every aspect of their lives online for the world to see?


> Frances Haugen in Parliament

> Mark Zuckerberg
