Danger of filter bubbles and echo chambers
ABOUT a decade ago, the internet activist Eli Pariser coined the term “filter bubble” to describe a state of “intellectual isolation” in which a search engine algorithm selectively personalises results, basing the information a user sees on a profile of that user, such as preferences, location, past click-behaviour and search history.
A filter bubble therefore leaves users with significantly less exposure to opposing perspectives and contradicting viewpoints, isolating them intellectually in their own cultural and ideological bubble. Filter bubbles distort our thinking and our understanding of the world, and hinder our ability to make balanced decisions.
Typical examples of filter bubbles are Google’s personalised search, Google News and Facebook’s personalised news stream. But they are not limited to these large technology companies. A newspaper company can, with the help of artificial intelligence (AI), print a unique copy of the newspaper for each of its subscribers based on their digital profile and preferences. Many websites offer personalised content selections, based on a person’s browsing history, age, gender, location, and other data.
AI- and algorithm-driven news, web searches and websites ensure that a person sees only “relevant” results. No two people see the same results when they do a search, nor do they see the same news from their news curation or online newspaper apps.
A simple Google search of the same word or phrase by different people yields vastly different results, depending on the profile and history of each user. According to Pariser, the computer screen becomes a one-way mirror that reflects the interests of users, while the algorithmic observers avariciously watch what they click.
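The mechanism behind this is straightforward to illustrate. The following is a minimal, hypothetical sketch, not the algorithm any real platform uses: each article is scored against a user’s interest profile, so two users given the same pool of stories see different orderings. All names and weights here are invented for illustration.

```python
# Hypothetical sketch of profile-based ranking. Each user has a profile
# mapping topics to interest weights; an article's score is the sum of
# the user's weights for the topics it covers.

def personalised_ranking(articles, profile):
    """Rank articles by how strongly they overlap the user's interests."""
    def score(article):
        return sum(profile.get(topic, 0.0) for topic in article["topics"])
    return sorted(articles, key=score, reverse=True)

articles = [
    {"title": "Markets rally", "topics": ["finance"]},
    {"title": "Climate summit opens", "topics": ["environment", "politics"]},
    {"title": "New phone released", "topics": ["technology"]},
]

# Two users, same articles, different profiles -> different "front pages".
investor = {"finance": 0.9, "technology": 0.3}
activist = {"environment": 0.8, "politics": 0.6}

print([a["title"] for a in personalised_ranking(articles, investor)])
print([a["title"] for a in personalised_ranking(articles, activist)])
```

Run repeatedly, a scheme like this feeds each user more of what they already engage with, which is precisely how the one-way mirror forms.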
The problem is that filter bubbles create “echo chambers” (a situation in which beliefs are reinforced by repetition inside a closed system), resulting in the assumption that every person thinks the same and that alternative perspectives do not exist. Eventually, a new “reality” without any cognitive dissonance is created, and people do not even realise that what they see is being filtered.
Fake news and filtering are, however, only part of the problem. There is a hidden and more dangerous one. According to some, this isolation of individuals and lack of exposure to contradicting views has led to today’s deep-seated biases, the polarisation of our societies, a lack of tolerance for opposing views and a general vulnerability to, and trust in, fake news. We have all seen polarisation, intolerance and violence intensify in recent times, during elections, political rivalries and sectarian protests all over the world.
It is not difficult to see why the phenomenon of filter bubbles and echo chambers has caused widespread concern. In our current volatile, uncertain, complex and ambiguous world, we need understanding, clarity and adaptability. True democracy requires citizens to be able to understand and accommodate the viewpoints of others, yet many people enjoy the comfort and security of their own bubble. Democracy depends on shared facts, yet we live in parallel, separate universes. However convenient personalisation may be, it promotes auto-propaganda and indoctrinates us with our own ideas.
Even if only a third of Facebook’s 2.7 billion users, or a small percentage of the estimated 300 million Google News readers, treat these platforms as their main news source, that is cause for serious concern. Even more so because not even search engine optimisation experts know exactly how search rankings are determined. Nor do we know what information search engines and social platforms collect to build our digital profiles.
Privacy and ethical controversies over the use of AI algorithms to filter online content have raised awareness of the issue and led many technology companies to alter their practices. Governments are also increasingly formulating regulations to exercise more control over the collection and mining of user data by large technology companies.
But as long as access to these platforms is free and they are financed by digital ads, which are driven by engagement and sharing rather than the accuracy or value of the news, regulations will not be enough. Users will have to do their part by using incognito browsing, deleting search histories, deleting or blocking cookies, and using ad-blockers.