How Facebook can make us more narrow-minded
IT distracts us from work and throws up pictures of our ex when we’re least expecting it.
Now it seems there might be another unfortunate side effect to using Facebook – becoming more narrow-minded.
According to researchers, people who use the social network suffer an ‘echo chamber’ effect, in which their views are reinforced by peers who hold the same beliefs.
This is because people tend to form groups of shared interest online, meaning any bias they hold is simply repeated back to them – rather than being challenged.
As a result, controversial theories – such as the causes of autism or misinformation about epidemics – can be given more weight than serious academic research.
‘Users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation and polarisation,’ said the paper published in the journal Proceedings of the National Academy of Sciences.
‘This comes at the expense of the quality of information and leads to proliferation of biased narratives fomented by unsubstantiated rumours, mistrust, and paranoia.’
They added that while the phenomenon can be found across the web, it is likely to be exaggerated on Facebook because of the way the platform’s ‘algorithms’ work. The firm has invested heavily in software that highlights the articles most likely to interest each user.
The researchers – from Boston University in the US, Sapienza University in Rome and several other Italian institutes – analysed Facebook data about the topics people discussed on the social network in 2010 and 2014.
They found that once users accepted a piece of information as fact, it spread rapidly throughout that particular online ‘community’. They were able to point to a number of such claims which travelled quickly – despite having no proven basis in science.
These included the contentious claim that vaccines cause autism. The effect also caused confusion during the recent Ebola crisis, as people shared incorrect information about the disease.
‘Whether a news item, either substantiated or not, is accepted as true by a user may be strongly affected by... how much it coheres with the user’s system of beliefs,’ the scientists added.
‘Such a phenomenon is particularly evident [online] where users – embedded in homogenous clusters – process information through a shared system of meanings, and trigger collective framing of narratives that are often biased towards self-confirmation.’
According to the researchers, the problem of unreliable information going ‘viral’ online is now so serious it is classed as one of the biggest social threats.
They said: ‘Massive digital misinformation is becoming pervasive in online social media to the extent that it has been listed by the World Economic Forum as one of the main threats to our society.’
Nearly 1.6 billion users log in to Facebook at least once a month, and more than 1 billion of these access the network every day.
The company does not disclose how many users it has in the UK specifically, but in its last official figures it had 315 million users across Europe every month.