Thoughts on echo chambers
Erin Kernohan-Berning
11/27/2024 · 4 min read
The term “echo chamber” has been thrown around a lot online and in the media lately, often in response to online disagreements where neither party manages to reach any kind of agreement. These disagreements are often centred on politics and human rights issues, which are big, complex, and difficult subjects.
An echo chamber, literally speaking, is a hollow space where sound bounces off the interior to create a reverberation. Acoustic echo chambers make for great-sounding music (the kind of sound you get in churches, music halls, and your shower). An echo chamber, figuratively speaking, is a virtual space in which only one set of ideas or opinions is allowed to circulate, to the exclusion of rebuttal or contradiction, potentially resulting in confirmation bias.
Confirmation bias is an error in which, when testing a hypothesis, evidence that supports the hypothesis is accepted while evidence that contradicts it is rejected. All humans are susceptible to confirmation bias. Because of this, people who work in fields where objectivity is important, such as science and journalism, are trained to actively guard against confirmation bias in the course of their work.
Because social media is a very new iteration of the World Wide Web, research on echo chambers is in its infancy. There is disagreement on what constitutes an echo chamber on social media or even what kind of real-world consequences such echo chambers might have. In public discourse, the worry with echo chambers is that by providing personal moderation tools for people to filter and block certain users and commentary, social media platforms are creating an environment uniquely primed for confirmation bias – for people to tailor their social media feeds so that they only have to hear what they want to hear.
One issue with this commentary is that it neglects to acknowledge how little agency users have when it comes to algorithmic social media feeds. Most social media algorithms push content that has high engagement to the surface, regardless of how that engagement happens. Content that fuels discord usually has very high engagement and is rewarded by reaching more devices, eyes, and brains. This sets up a situation where there’s an incentive for people to behave poorly online, and the only recourse a person has to avoid such behaviour is to employ the aforementioned personal moderation tools.
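To make that incentive concrete, here is a deliberately simplified sketch in Python (my own illustration, not any platform’s actual ranking code) of a feed that sorts posts by raw engagement. Notice that hostile replies and pile-ons count exactly the same as supportive likes.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    replies: int

    def engagement(self) -> int:
        # Raw engagement: angry replies count just as much as supportive likes.
        return self.likes + self.shares + self.replies

posts = [
    Post("Calm, nuanced explainer", likes=40, shares=5, replies=10),
    Post("Inflammatory hot take", likes=30, shares=60, replies=200),
]

# The feed surfaces whatever generated the most engagement,
# so the divisive post rises to the top.
for post in sorted(posts, key=lambda p: p.engagement(), reverse=True):
    print(post.engagement(), post.text)

In this toy example the inflammatory post wins handily, which is the whole point: a ranking signal that is blind to how engagement happens rewards content that provokes.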
Echo chamber discourse also doesn’t take into account the fallacy of false equivalence. This fallacy assumes that in any particular argument there are two equally supported perspectives. In many cases this is not true. For instance, there is plenty of evidence that the world is round and none that it is flat. Yet according to the echo chamber discourse of our current online moment, treating all arguments as equal regardless of factual merit is considered a virtue, and people declaring the world is flat should get equal airtime to belabour the point.
In a white paper commissioned by the Knight Foundation, the authors argue that even if people seek out content on social media that fits their worldview, they are still regularly exposed to opposing viewpoints elsewhere. They further argue that there is potentially more risk of encountering an echo chamber in an offline environment. Take this column for instance. If you only get your technology opinions from me, then you might be setting up an echo chamber for yourself. This is why when I post my column to my website at ErinKernohan.ca, I include links to the information I consumed, so that you might draw your own conclusions about the same topic.
Right now, “echo chamber” is more of an accusation than a phenomenon, an assertion that no matter how much you don’t want to hear my point of view, you should have to hear it anyway. Ironically, the term has been used to reject perspectives from those with different lived experiences, particularly people who are marginalized and whose human rights are currently at risk, by those who want to spread anti-trans rhetoric, misogyny, and racism freely and without rebuttal.
Asking social media users to stand in the firehose of every opinion on the planet will not solve the problem of confirmation bias. Rather, encouraging people to be curious and to seek a diversity of opinion that isn’t limited to hot takes in under 280 characters can help. Listening to people who cite their sources, acknowledge their own limitations, and correct their mistakes is also beneficial. If we all hold our opinions to a higher standard and allow our opinions to change when better information is presented to us, we can all better come to understand our world and each other.
Learn more
The History of Echo (Echo) Chambers (Chambers). 2011. Geoffrey Granka. (Audio Geek Zine) Last accessed 2024/11/27.
The Myth of the Online Echo Chamber. 2018. David Robson. (BBC) Last accessed 2024/11/27.
Avoiding the Echo Chamber about Echo Chambers. 2024. Andrew Guess et al. (ResearchGate) Last accessed 2024/11/27.
Resist the Coarse Filter. 2024. Eryk Salvaggio. Last accessed 2024/11/27.
Echo chambers, filter bubbles, and polarisation: a literature review. 2022. Dr Amy Ross Arguedas et al. (Reuters Institute) Last accessed 2024/11/27.
On the impossibility of breaking the echo chamber effect in social media using regulation. 2024. Chen Avin et al. (Nature) Last accessed 2024/11/27.
The echo chamber effect on social media. 2021. Matteo Cinelli et al. (Proc Natl Acad Sci USA) Last accessed 2024/11/27.
TikTok vs. Democracy. 2024. Philosophy Tube. (YouTube) Last accessed 2024/11/27.
Do Blocklists Work? 2024. Steve Boots. (YouTube) Last accessed 2024/11/27.
Populism, Media Revolutions, and Our Terrible Moment. 2024. Hank Green. (YouTube) Last accessed 2024/11/27.
Correction log
Nothing here yet.