The Internet: Our Problematic Public Arena

The internet can be a great thing. I’d know, I live half my life inside it. From a seemingly infinite scroll of information at your fingertips, to social media debates on platforms like Twitter and Reddit, to blogging, the internet appears (and is often painted) as a democratic, utopian public space. But it’s not. The internet is not the arena for healthy public conversation it was designed to be, and that’s disturbing. Let me unpack this.

The various algorithms holding the internet together are sophisticated. In marketing academia, we learn that this is a great thing, because it allows for highly efficient, highly personalised targeting. Mass customisation is the term thrown around, and it’s a deeply impersonal form of individualised targeting. Everything we do on the internet, everywhere we walk or drive with Google Maps activated, and perhaps even everything we say aloud within hearing range of our smartphones (looking at you, Facebook Messenger) is collected. This gives internet companies, and the marketing companies they sell that data to, a scarily accurate silhouette of our personas IRL, and a strong, vested interest in what we are exposed to in our daily online ventures.

These algorithms also shape our preferences by targeting what we see online. If we search for conspiracy theories on YouTube, a variety of other theories will appear in the recommended viewing sidebar, or even load automatically to play next. If we click on a few videos citing US presidential conspiracies (always a hot topic), the recommendation algorithm will learn even more about us and target us with this more detailed niche. This occurs on most platforms, and it’s a clever way to filter the abundance of information online: the algorithm selects and presents whatever is most likely to hold our attention, keeping us on the platform or site for longer.
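To make that feedback loop concrete, here is a toy sketch in Python. It is not YouTube’s actual recommender or anything close to it; the topics, weights and click behaviour are all invented for illustration. The point is simply that when every click feeds back into what gets shown next, a handful of interests tends to take over the feed.

```python
# Toy sketch of an interest-reinforcing recommender.
# Purely illustrative -- not any platform's real algorithm. It just shows the
# feedback loop: clicks nudge the weights, and the weights decide what is
# shown next.
import random
from collections import Counter

TOPICS = ["conspiracy", "cooking", "politics", "music", "sport"]

def recommend(weights, k=3):
    """Pick k topics to show, biased toward whatever has been clicked before."""
    return random.choices(TOPICS, weights=[weights[t] for t in TOPICS], k=k)

def simulate(clicks=100):
    weights = Counter({t: 1.0 for t in TOPICS})   # start out neutral
    for _ in range(clicks):
        shown = recommend(weights)
        clicked = random.choice(shown)            # the user clicks one suggestion
        weights[clicked] += 1.0                   # and that click feeds back in
    return weights

print(simulate())  # the final weights are usually heavily skewed toward a few topics
```

Run it a few times and the final spread is rarely even: whatever got clicked early keeps getting shown, and keeps getting clicked. That is the spiral described next.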

This spiral continues indefinitely, and what we’re each left with is an ‘echo chamber’: the amplification of our own online opinions. We believe we are surfing a broad internet full of varied viewpoints and free information. But these algorithms have calculated our preferences to the point that we’re simply sifting through directly targeted content that reinforces our own ideologies, even ones that began as a mild curiosity. This becomes especially problematic alongside statistics suggesting that a growing majority of us get most of our news via social media platforms.

The results of echo chamber culture are horrific. A healthy society fosters friendly debate and opposition on issues. When a person is wrapped up in echoes of their own opinion, with little or no opposition, those opinions appear infallible. But they’re not. On a large scale, with more internet users than ever, what we’re seeing as a result of the echo chamber paradigm is a rise in hate speech. Religious and political extremism is gaining ground. Nationalism is on the rise across the West as our political views grow more polarised. Even the existence of climate change, which there is no valid reason to dispute, has become a political issue, so we can’t move forward on it. Anti-vax groups used to be a weak minority; now they are a strong force in many societies, weakening herd immunity at a cost already counted in lives. The combination of echo chamber culture, spam bots and fake news carries dire implications of its own; several theories draw on exactly this paradigm to explain the shock result of the 2016 US presidential election.

This paradigm is unlikely to change anytime soon. What we really need is for major social media platforms, such as Facebook and Twitter, to systemically work against echo chamber culture. That would require evolving the algorithms so that they invite the typical internet user to notice and understand diverse viewpoints, rather than keeping them confined in their own opinion, ricocheting off the corners of the internet like a ping pong ball. Recommended content algorithms need to be pulled back and rethought. However, this is unlikely to happen on a large scale. The internet’s core structure is deeply embedded in neoliberal values and corporate ownership, which profits handsomely from the marketing potential of sophisticated algorithms that exploit the attention economy so effectively. Still, some small steps have been taken to mitigate the effects of echo chamber culture. Twitter, for example, has proposed taking some responsibility for, and action toward, the ‘health’ of public conversation online.

I don’t think solving this issue is as simple as preaching ‘get off the internet’ (although a decent break every once in a while would probably benefit us all). What’s more viable (and important) is to shift the internet’s design into something that does not resemble the society George Orwell envisioned in 1984. Or perhaps these structures have been undermining our society for years, and the internet is simply what has made them visible. Regardless, we need to be wary, and we need to push for change. Social media platforms have the capacity to provide a strong framework for healthy, democratic conversation, and we need to push them harder to do so.