YouTube conspiracy

YouTube’s algorithm shift highlights the issue with modern conspiracies and social progress

YouTube streams more than one billion hours of video each day. The platform is driven by a recommendation algorithm, which controls which videos are automatically loaded into a user’s queue and which appear under the heading “Recommended for you”. This algorithm has been heavily criticised in the past for recommending videos containing hateful content and falsehoods – and for over-promoting conspiracy videos to users who wouldn’t normally come across them via organic searches. A 2016 study found that more than 50% of American adults believed the government was withholding information, or being deliberately misleading, about the events of 9/11 – though we may never know whether the online circulation of conspiracy theories is what drives this statistic.


For most of us, conspiracy theories are just a bit of fun. Others take them much more seriously, but on the surface, believing that the US government plays host to aliens, or that HIV is a population-control mechanism, looks pretty harmless. However, this may not be the case.

Psychological research has suggested that the circulation of conspiracy theories can have detrimental effects on how a society functions. For example, there is a plethora of conspiracies surrounding the US government, many of them negative. American citizens exposed to these theories are less likely to turn out to vote at elections. This shows that conspiracy theories can compromise democracy – at least in countries where voting is non-compulsory. Similarly, people exposed to climate change conspiracies are less likely to make a genuine effort to reduce their carbon footprint.


As much fun as it can be to discuss some conspiracy theories – Mark Zuckerberg being a secret lizard, the world being flat, Beyoncé being in the Illuminati, Avril Lavigne being a clone – the conspiracies that have trended on YouTube in recent years have been hugely problematic.

For example, the school shooting massacre at Sandy Hook Elementary in the US spawned a variety of conspiracy theories. As news of the devastating shooting spread around the globe, so did theories that the event was staged to make a statement about gun control – some even claiming that the distressed children interviewed by reporters were crisis actors. Not only has this conspiracy been described as “absurd”, it also distracts from actual news reporting and is hugely insensitive and disrespectful to the victims and their families. Some conspiracy content has also been found to encourage racist and antisemitic attitudes.

I stand by my view that conspiracy theories can be a healthy avenue for scepticism; it isn’t always good to believe everything you hear, and questioning the status quo should be encouraged. However, there’s a fine line: world experts on climate change and vaccinations, for example, had to work insanely hard to convince the world they were correct in order to initiate policies that would save lives and preserve the environment – and to an extent, they still are. What some conspiracy theories do is undermine our confidence in these experts, so that the agendas for climate action and child vaccination are called into question, and necessary measures are delayed or avoided altogether.


Are you a hardcore hobby conspiracist? Well, you don’t need to pack away your tinfoil hat just yet. YouTube, striving to balance freedom of speech against its responsibility to its user base, is not deleting conspiracy-related videos, but merely adjusting the algorithm so that only users who subscribe to conspiracy-based channels, or who regularly search for these videos organically, will be guided to more.