Daily News

Danger of filter bubbles and echo chambers

PROFESSOR LOUIS FOURIE

Professor Louis CH Fourie is a futurist and technology strategist.

ABOUT a decade ago, the term “filter bubble” was coined by the internet activist Eli Pariser to refer to a state of “intellectual isolation” in which a search engine algorithm selectively personalises web searches, basing the information a user sees on a profile of that user, such as preferences, location, past click behaviour and search history.

A filter bubble therefore often results in users having significantly less contact with opposing perspectives and contradictory viewpoints, causing them to become intellectually isolated in their own cultural and ideological bubble. Filter bubbles distort our thinking and our understanding of the world, and hinder our ability to make balanced decisions.
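
To make the mechanism concrete, the sketch below (in Python) shows a deliberately simplified, hypothetical personalisation function; it illustrates the principle only and is not any platform’s actual algorithm. Articles are ranked purely by how well their topics match the user’s recorded interests, so items that contradict the profile never surface.

    def personalised_feed(articles, profile, k=2):
        # Score each article by how strongly its topics match the
        # user's recorded interests, then keep only the top k items.
        def score(article):
            return sum(profile.get(topic, 0.0) for topic in article["topics"])
        return sorted(articles, key=score, reverse=True)[:k]

    articles = [
        {"title": "Party A rally draws crowds",  "topics": ["party_a"]},
        {"title": "Party B unveils policy plan", "topics": ["party_b"]},
        {"title": "Party A wins local poll",     "topics": ["party_a"]},
    ]

    # A user who has only ever clicked on Party A stories...
    profile = {"party_a": 1.0, "party_b": 0.0}

    # ...is shown only Party A content; the opposing view is filtered out.
    for item in personalised_feed(articles, profile):
        print(item["title"])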

Typical examples of filter bubbles are Google’s personalised search, Google News and Facebook’s personalised news stream. But they are not limited to these large technology companies. A newspaper company can, with the help of artificial intelligence (AI), print a unique copy of the newspaper for each of its subscribers based on their digital profile and preferences. Many websites offer personalised content selections based on a person’s browsing history, age, gender, location and other data.

AI- and algorithm-driven news, web searches and websites ensure that a person sees only “relevant” results. No two people see the same results when they do a search, nor do they see the same news from their news curation or online newspaper apps.

A simple Google search of the same word or phrase by different people yields vastly different results, depending on the profile and history of the user. According to Pariser, the computer screen becomes a one-way mirror that reflects the interests of users, while the algorithmic observers avariciously watch what they click.

The problem is that filter bubbles create “echo chambers” (a situation in which beliefs are reinforced by repetition inside a closed system), resulting in the assumption that every person thinks the same and that alternative perspectives do not exist. Eventually, a new “reality” without any cognitive dissonance is created, and people do not even realise that what they see is being filtered.
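
The reinforcing loop is easy to simulate. Continuing the hypothetical sketch above, the toy code below lets each click strengthen the matching interest in the profile, so the feed narrows with every round; again, this is an illustration, not a real recommender system.

    def personalised_feed(articles, profile, k=1):
        # Same toy ranking as before: score by overlap with the profile.
        score = lambda a: sum(profile.get(t, 0.0) for t in a["topics"])
        return sorted(articles, key=score, reverse=True)[:k]

    articles = [
        {"title": "Party A rally draws crowds",  "topics": ["party_a"]},
        {"title": "Party B unveils policy plan", "topics": ["party_b"]},
    ]

    profile = {"party_a": 0.2, "party_b": 0.1}  # a slight initial lean

    for step in range(5):
        clicked = personalised_feed(articles, profile)[0]  # user clicks the top item
        for topic in clicked["topics"]:                    # the click reinforces the profile
            profile[topic] += 0.5
        print(step, clicked["title"], profile)

    # The slight lean towards "party_a" is amplified on every round, and
    # the "party_b" article never reaches the top of the feed: beliefs
    # are "reinforced by repetition inside a closed system".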

Fake news and filtering are, however, only part of the problem. There is a hidden and more dangerous problem. According to some, this isolation of individuals and lack of exposure to contradictory views has led to the current deep-seated biases, the polarisation of our societies, a lack of tolerance for opposing views, and a general vulnerability to and trust in fake news. We have all seen polarisation, intolerance and violence intensify in recent times during elections and political rivalries, as well as in sectarian protests all over the world.

It is not difficult to see why the phenomenon of filter bubbles and echo chambers has caused widespread concern. In our current volatile, uncertain, complex and ambiguous world, we need understanding, clarity and adaptability. True democracy requires citizens to be able to understand and accommodate the viewpoints of others, yet many people enjoy the comfort and security of their own bubble. Democracy depends on shared facts, but we live in parallel, separate universes. However convenient personalisation may be, it promotes auto-propaganda and indoctrinates us with our own ideas.

Even if only a third of the 2.7 billion Facebook users (roughly 900 million people), or a small percentage of the estimated 300 million Google News readers, consider these platforms their main news source, it is cause for serious concern. Even more so because not even search engine optimisation experts know exactly how search rankings are organised. Neither do we know what information search engines and social platforms collect to build our digital profiles.

Privacy and ethical controversies over the use of AI algorithms to filter online content have raised awareness of the issue and resulted in many technology companies altering their practices. Governments are also increasingly formulating regulations to exercise more control over the collecting and mining of user data by large technology companies.

But as long as access to these platforms is free and they are financed by digital ads, which are driven by engagement and sharing rather than the accuracy or value of the news, regulations will not be enough. Users will have to do their part by using incognito browsing, deleting search histories, deleting or blocking cookies, and using ad-blockers.

ARTIFICIAL intelligence- and algorithm-driven news, web searches and websites ensure that a person sees only “relevant” results, says the writer. | EPA
