Daily Sabah (Turkey)

CONTROLLING OUR NEWS FEEDS: ‘ECHO CHAMBERS’ AND ‘FILTER BUBBLES’

As we increasingly surround ourselves with arguments that support our own opinions alone, we inevitably face the risk of being stuck in an echo chamber and making ourselves more susceptible to fake news

- İbrahim Altay

Whether it is about making a quick buck from advertising revenues or furthering personal or collective political goals through unending and inaccurate propaganda, fake news continues to be a major problem in the age of social media.

Although the issue is well-covered in the media, modern society seems nowhere near finding a solution. One of the main reasons for this is that fake news manages to trigger either self-assurance or justified anger in readers, becoming a desirable alternative for those who find accurate news disagreeable.

On Nov. 10, as Turkey was commemorating the anniversary of the death of its founder, Mustafa Kemal Atatürk, fake news sources in the country managed to do exactly this: stir up controversy with retouched photos, false claims and similar propagation of disinformation that was sure to galvanize the public. Two such examples were caught by teyit.org, a social enterprise that fact-checks widely circulated claims on social media.

Turkey has not been the only victim of this disturbing new phenomenon, as false quotes, fabricated news and fake photos also flooded social media feeds in the U.S. ahead of Veterans’ Day on Nov. 11.

The pattern of triggering happiness or anger resembles the methods of running a con, which is not an inaccurate comparison to make with fake news. In the examples seen out of Turkey, the driving emotion behind the method is anger. With echo chambers in place, that anger only grows, and by the time a level-headed look is taken at the true nature of the news in question, it is already too late.

What is an echo chamber, though? In this case, there is one definition that matters: in news and media, an echo chamber is a situation where only sources that support the beliefs of a group are accepted, amplified and reinforced through repetition, presenting a subjective worldview or argument as absolute fact.

When it comes to social media, the definition of an echo chamber changes slightly, but its core remains the same. The most important difference is that it becomes far easier to form an echo chamber, in which only information we approve of or beliefs we support can exist, than a genuine public sphere.

After all, haven’t you ever encountered posts from friends on your social media feeds that compelled you to consider unfriending them over subjective political disagreements? Have you unfriended someone for such disagreements, or been unfriended yourself for similar reasons? All of these actions pave the way for the formation of echo chambers and make us more vulnerable to fake news.

This is especially true of fake news that feeds self-assuredness. After all, it is even more difficult to cast doubt on a piece of news that confirms or supports your political argument while vilifying those that do not. In the homogeneous sphere of social interaction created by these echo chambers, we are much less likely to find someone critical of, and therefore suspicious of, the information a fake news piece provides.

Research published on Oct. 31 delved into this issue, as well as the larger scope of fake news and similar problems in the media, combining all under the umbrella of “information disorder.” Regarding the subject of echo chambers, the research indicated that “The ‘public sphere’ is the shared spaces, either real, virtual or imagined, whereby social issues are discussed and public opinion is formed. This theory was first shared by German sociologist and philosopher Jürgen Habermas, who argued that a healthy public sphere is essential for democracy and must be inclusive, representative and characterized by respect for rational argument. The most significant challenge to any theory of a shared public sphere is that humans, when given the choice of who to connect with or who not to connect with, tend to establish and continue relationships with people who have views which are similar to our own. We are programmed to enjoy spending time in ‘echo chambers,’ as it requires less cognitive work.”

Prepared by Claire Wardle, Ph.D., and Hossein Derakhshan, with research support from Anne Burns and Nic Dias, the study in question was published under the title “Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking” by the Council of Europe (CoE), with the support of the Shorenstein Center on Media, Politics and Public Policy at Harvard Kennedy School and First Draft.

Another important conclusion that can be drawn from the research is that digital and social media present another fundamental roadblock to any effort to diversify these echo chambers: filter bubbles. As you certainly already know, the content you see on the web varies depending on your habits and personal history. This is especially the case on social media, which takes stock of your interests, hides some of the content that falls outside of them and promotes specific types of information in order to keep you engaged. This, of course, raises important ethical considerations even if we set aside individuals’ privacy concerns.
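To make that mechanism concrete, here is a minimal, hypothetical sketch of interest-based feed filtering. The data, function names and threshold are invented for illustration and do not describe any platform’s actual algorithm; they only show the general logic the paragraph above describes.

```python
# A toy filter bubble: a user's past engagement becomes a crude interest profile,
# and candidate posts outside that profile are silently dropped from the feed.
from collections import Counter

def build_interest_profile(engagement_history):
    """Count how often the user engaged with each topic."""
    return Counter(item["topic"] for item in engagement_history)

def rank_feed(candidate_posts, profile, hide_below=1):
    """Rank posts by how well they match the inferred interests.

    Posts on topics the user has never engaged with fall below the
    threshold and never reach the feed at all.
    """
    visible = [p for p in candidate_posts if profile[p["topic"]] >= hide_below]
    return sorted(visible, key=lambda p: profile[p["topic"]], reverse=True)

history = [{"topic": "politics_a"}, {"topic": "politics_a"}, {"topic": "sports"}]
posts = [
    {"id": 1, "topic": "politics_a"},  # matches the dominant interest
    {"id": 2, "topic": "politics_b"},  # opposing viewpoint, never engaged with: hidden
    {"id": 3, "topic": "sports"},
]

print(rank_feed(posts, build_interest_profile(history)))
# Only posts 1 and 3 survive; the unfamiliar viewpoint is never shown.
```

Even this toy version shows the trade-off: the more reliably the feed matches inferred interests, the less likely an unfamiliar viewpoint is ever to be surfaced.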

These algorithms are now commonplace on nearly every website whose structure supports them, from e-commerce sites to video-sharing platforms. Even the “related articles” section seen at the bottom of news articles forms a filter bubble, despite being based on subjects rather than your actual activity on the website.
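A subject-based recommender of that kind can be sketched just as briefly. Again, the names and data below are hypothetical, but the example shows how even a widget that ignores the reader’s personal history still narrows what gets surfaced to more of the same subject.

```python
# A toy "related articles" widget: recommendations come from tag overlap with the
# current article, not from the reader's behavior, yet unrelated subjects never appear.
def related_articles(current, archive, limit=3):
    """Return archive articles sharing the most tags with the current article."""
    def overlap(other):
        return len(set(current["tags"]) & set(other["tags"]))
    candidates = [a for a in archive if a["id"] != current["id"] and overlap(a) > 0]
    return sorted(candidates, key=overlap, reverse=True)[:limit]

current = {"id": 10, "tags": ["elections", "fake-news"]}
archive = [
    {"id": 11, "tags": ["elections", "polling"]},
    {"id": 12, "tags": ["fake-news", "fact-checking"]},
    {"id": 13, "tags": ["economy"]},  # unrelated subject: never recommended here
]

print(related_articles(current, archive))
```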

Regardless, social media can be considered one of the worst offenders when it comes to the implementation of filter bubbles, as it also manages to tap into society’s tendency to form a controlled environment made up only of like-minded individuals.

According to the said research, “The fundamental problem is that ‘filter bubbles’ worsen polarization by allowing us to live in our own online ‘echo chambers,’ leaving us only with the opinions that validate, rather than challenge, our own ideas. While confirmation bias occurs offline and the term ‘selective exposure’ has been used by social scientists for decades to describe how information-seekers use only those certain sources that share or purport their views, social media is designed to take advantage of this innate bias.”

Since the latest presidential election results in the U.S., social media, and especially Facebook, have come under fire amid suspicions that the high concentration of fake news originating on social media was one of the chief architects of the Trump victory.

In January, Facebook stated that it had removed personalized content from its “Trending Topics” section. Facebook also decided to rework the “Related Articles” section attached to news articles shared on its platform, moving beyond the old method of simply showing similar articles. Under the new approach, this section aims to bring together varying viewpoints on the same subject in order to break the “echo chambers.”

If we are to properly understand the origins of information disorder and our inability to deal with fake news in a concrete manner, we must first understand why fake news remains popular despite the constant guidelines, criticism and steps being taken to combat it.

Hopefully, this research will be an important step toward realizing that the problem with fake news does not lie solely with readers, the mainstream media or social media, but rather involves all three. It will take active effort from all three to stop the spread of misinformation.
