When I reflect on my time at university, I certainly can’t recall every detail I was taught, but key gems of information have stayed with me, well and truly cemented in my brain. I took a journalism class called ‘Innovation Cultures – Perspectives on Science and Technology’. It involved many heated discussions, often from a sociological angle, about the future of technology and how the accumulation of data would shape society.
One of our assignments focused on the shift in journalism culture, comparing citizen journalism (e.g. online platforms, technological developments, and social media) with traditional broadcast journalism (e.g. television, newspapers, and radio). It sparked some key reflections that still resonate with me today, particularly around the concepts of echo chambers and filter bubbles.
In essence, technology has enabled personalised algorithms, built on datasets gathered over time, to shape the digital media and news shown to each of us. While this makes content more engaging and helps us feel we can truly relate to what is on our screens, a downside is the creation of echo chambers, where users are only exposed to content that reinforces their existing views.
The metaphor of an acoustic echo chamber (where sounds reverberate in a hollow enclosure) illustrates how this occurs online. A filter bubble, the product of this algorithmic filtering, limits the information a user encounters, perpetuating echo chambers, distorting perspectives, and often helping misinformation spread widely.
Although intended to help users manage the plethora of information and data that exists on the internet, algorithms that are ‘learning’ from a user’s digital footprint end up determining what is seen and heard.
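The feedback loop behind this is simple enough to sketch in a few lines of Python. Everything here is illustrative: the topic list, the engagement-weighted recommender, and the user who always clicks the top item are toy assumptions, not a description of how any real platform works. The point is that even a tiny ‘learning’ rule, left to feed on its own output, narrows the feed over time.

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical catalogue: each article belongs to one topic.
TOPICS = ["politics", "sport", "science", "arts"]

def recommend(click_history, n=10):
    """Naive engagement-driven recommender: the more a user has clicked
    a topic, the more of that topic it serves. Plus-one smoothing keeps
    every topic at least faintly possible early on."""
    counts = Counter(click_history)
    weights = [counts[t] + 1 for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n)

# Simulate a user who always clicks the first item in their feed.
history = []
for _ in range(200):
    feed = recommend(history)
    history.append(feed[0])  # each click feeds back into the next feed

top_topic, top_clicks = Counter(history).most_common(1)[0]
share = top_clicks / len(history)
print(f"Dominant topic after 200 clicks: {top_topic} ({share:.0%} of the feed)")
```

Run it a few times with different seeds and a single topic typically comes to dominate the history: the algorithm has not discovered the user’s ‘true’ interest, it has manufactured one by reinforcing early clicks.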
Since I studied this topic, technology has advanced even further, and personalised algorithms have become even more sophisticated. It raises the question: where are we headed, and what must we consider in order to prevent filter bubbles and echo chambers from distorting the digital space further?
We need to focus on creating digital spaces that expose us to a diversity of ideas.
Algorithms shouldn’t just reflect our past behaviour but should also challenge our thinking and broaden our perspectives. By being transparent about how content is recommended and making sure that the data is varied and diverse, we can create more inclusive, open online environments.
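One simple way to build that breadth in, again as a hedged toy sketch rather than any platform’s actual method, is to reserve a fixed share of every feed for the topics a user has engaged with least, an exploration step alongside the usual engagement-weighted recommendations:

```python
import random
from collections import Counter

random.seed(1)

# Hypothetical catalogue of topics, as in any toy recommender.
TOPICS = ["politics", "sport", "science", "arts"]

def recommend_with_exploration(click_history, n=10, explore=0.3):
    """Fill most of the feed by past engagement, but reserve a share
    of slots for the user's least-clicked topics, so the feed broadens
    perspectives instead of only mirroring past behaviour."""
    counts = Counter(click_history)
    n_explore = int(n * explore)
    # Exploit: weight topics by past clicks (plus-one smoothing).
    weights = [counts[t] + 1 for t in TOPICS]
    feed = random.choices(TOPICS, weights=weights, k=n - n_explore)
    # Explore: fill the reserved slots with the least-seen topics.
    least_seen = sorted(TOPICS, key=lambda t: counts[t])
    feed += [least_seen[i % len(TOPICS)] for i in range(n_explore)]
    random.shuffle(feed)
    return feed

# A user who has only ever clicked politics still sees other topics.
feed = recommend_with_exploration(["politics"] * 50)
print(Counter(feed))
```

The exploration fraction is the transparency lever: a platform could state it openly (‘30% of your feed is chosen to widen your reading’), making the trade-off between relevance and diversity visible rather than hidden inside the algorithm.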
Whether businesses are building digital platforms or people are simply interacting with information online, our goal as consumers and creators should be to engage with a wide range of content, challenge assumptions, and make informed decisions. In doing so, we can ensure that technology remains a tool for learning and growth—one that expands our viewpoints rather than merely reflecting them back to us.