Echo Chambers

(Read time: 4 minutes)

As our world becomes ever more interconnected, creating an abundance of opportunities to build bridges and unite communities, there is a simultaneous risk that we fragment into divided groups, collectively coalescing around shared views and screening out unwanted information. In a research paper released in 1997, Marshall Van Alstyne and Erik Brynjolfsson explained the downsides of this shift to global interconnectedness:

“With the customized access and search capabilities of IT, individuals can focus their attention on career interests, music and entertainment that already match their defined profiles, or they can read only news and analysis that align with their preferences. Individuals empowered to screen out material that does not conform to their existing preferences may form virtual cliques, insulate themselves from opposing points of view, and reinforce their biases. Internet users can seek out interactions with like-minded individuals who have similar values, and thus become less likely to trust important decisions to people whose values differ from their own.”

We know how beneficial it can be to consider multiple sources of information whilst making a decision. Who doesn’t filter through the negative reviews of an Amazon product before buying it? Yet how often do people actively seek out the entire spectrum of facts in their decision-making? More importantly, how often do they actively seek out a counter-opinion? The result of this one-sidedness is an Echo Chamber: an environment in which an individual mostly encounters information that reflects and reinforces their own opinions. There are two major issues with living in an Echo Chamber: firstly, they are very tricky to recognise; secondly, completely shutting out opposing viewpoints reduces the likelihood of open dialogue.

Naturally, some form of bias is unavoidable; however, other factors may be at play. In a TED Talk published in 2011, titled “Beware Online Filter Bubbles”, Eli Pariser says:

“…your Filter Bubble is your own personal unique universe of information that you live in online and what’s in your filter bubble depends on who you are and it depends on what you do. But the thing is that you don’t decide what gets in and more importantly, you don’t actually see what gets edited out…”

Some have argued that the Filter Bubble simply doesn’t exist. Nonetheless, the fact that a “Recommended for you” feature is built into platforms such as Facebook, YouTube, Netflix and Instagram suggests otherwise. Pariser argues that algorithms can dictate what we are exposed to; consequently, the well-intentioned practice of serving us tailored content could be deepening our Echo Chambers and Filter Bubbles. This becomes even more problematic as social media becomes the leading source of information.
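To make the mechanism concrete, here is a toy sketch of preference-based recommendation. The catalogue, topic tags and scoring rule are all hypothetical illustrations, not any real platform’s algorithm: each round, the system recommends the unseen item sharing the most topics with what the user has already clicked, and the user’s reading history narrows accordingly.

```python
def recommend(history, catalog, k=3):
    """Naive recommender: rank unseen items by how many topic tags
    they share with the user's click history (hypothetical scoring)."""
    clicked_topics = {topic for item in history for topic in catalog[item]}
    unseen = [item for item in catalog if item not in history]
    # Higher overlap with past clicks => ranked higher.
    scored = sorted(unseen,
                    key=lambda item: len(catalog[item] & clicked_topics),
                    reverse=True)
    return scored[:k]

# Hypothetical catalogue: article -> set of topic tags.
catalog = {
    "a1": {"politics-left"}, "a2": {"politics-left"},
    "a3": {"politics-left", "economy"},
    "b1": {"politics-right"}, "b2": {"politics-right", "economy"},
    "c1": {"sport"},
}

history = ["a1"]  # the user starts with a single left-leaning click
for _ in range(2):
    picks = recommend(history, catalog)
    history.append(picks[0])  # the user clicks the top recommendation

print(history)  # → ['a1', 'a2', 'a3']: only like-minded articles surface
```

After two rounds of “click the top recommendation”, the simulated user has read only articles matching their initial preference; the right-leaning and unrelated items never rank first. Real systems are far more sophisticated, but the feedback loop has the same shape.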

The idea behind Echo Chambers and Filter Bubbles conforms to the human tendency toward confirmation bias. According to Psychology Today, confirmation bias occurs when a desired belief influences one’s thinking, causing a person to believe something simply because they would like it to be true. As a result, individuals stop gathering information at the point where their biases are confirmed.

Whilst staying in our chambers may be comforting, it also breeds confusion when others don’t resonate with our thought process. This is not to say our beliefs are wrong; rather, seeing as the easiest person to fool is oneself, perhaps we should all be more questioning of our own confirmation bias.

Interested in seeing other posts? Please see the related articles section and the share links below: