FOR the past three years or so, my kids and I have been engaged in a game of sorts. What started off as an amusing pastime has become steadily more unnerving.

We’ve been spotting flat Earth graffiti on our travels around the country. This week I got a belter. I was sitting in the mouth of the fast flume at Coatbridge’s Time Capsule waterpark. Just as I was about to release myself into the tunnel, I spotted the words painstakingly scratched into the guard rail: “THE EARTH IS FLAT”.

As I dropped down into the darkness, carried along by the water, the unmistakable effects of gravity pulling me towards the plunge pool, I had a realisation: these people are not as fringe as I once believed. Despite advances in technology giving all of us unprecedented access to science, verifiable information and evidence, some people are retreating into ignorance and unhealthy scepticism. And social media isn’t just pointing us to the rabbit hole – it’s beckoning us in.

You might be thinking that believing in a conspiracy is relatively benign. You might even think it’s amusing, like I did. And then you hear your nine-year-old twins discussing the Illuminati, Aleister Crowley and the Second World War, things gleaned from seemingly family-friendly YouTubers, and you realise we have a problem.

Given how many people are increasingly looking to social media for their news, it’s important to remember one thing: YouTube is a business. It wants to make money, and to do that it needs your views and your clicks. Nothing drives that behaviour like sensationalist content. It’s like junk food – your brain loves it.

What may seem like a harmless indulgence – watching a video out of interest, then clicking on another – is feeding the algorithm data. Every time you watch, you are teaching it what to serve up next. You might watch David Icke talking about lizard people, or a conspiracy theory about Area 51, because you find it funny; YouTube learns from your clicks and can serve up the same quack content to someone looking at it for entirely different reasons. Regardless of your motivations, it will continue to offer you more extreme content, because that is what will hook your interest and keep you on the platform longer.

I decided to conduct an experiment. I typed “women in sports”, picked a video and let the autosuggest feature do the rest. From there, we got to trans athletes, then a video on the problems with fat positivity. Next was a video of Piers Morgan ranting about the death of chivalry, then on to “gay rights of no importance in Kenya”, before I was taken to a live LBC stream with Nigel Farage. The first comment underneath was “FREE TOMMY!!!”. I clicked back a few videos to choose an alternative path from the suggested videos below, and ended up on “feminists destroyed”, then on to George Galloway talking about anti-Semitism and right back to Farage talking about the benefits of Brexit.

Looking at this pathway, it’s easy to see how users can be led astray, even when they showed no interest in that subject matter to begin with.

The platform is a hot mess. It’s not difficult to comprehend how young people or disaffected adults can find themselves drawn into a world like this, one that feeds off their attention and fills their heads with an alternative reality that’s easier, or more interesting, to believe in than the real one. There has always been a tendency towards alternative explanations in popular and political culture, but never before has there been an engine so willing to nurture that nascent scepticism and steer that curiosity in the direction of conspiracy and paranoia.

I think it’s time we started asking questions about the role ubiquitous platforms are playing in shaping values and beliefs. It is not a stretch to see how sites like YouTube are playing a role in the radicalisation of those most vulnerable to extremist narratives.

What begins as just asking questions can quickly spiral into outright denialism and eventually extremism, even for the most passive of viewers. The algorithmic pipeline carries users deeper into harmful ideologies, normalising fringe views through repeated exposure and incremental escalation.

YouTube does not consider itself to be a media platform, but given how its content is used, it is no longer conscionable for it not to have an editorial policy. The platform has become central to our lives, and particularly the lives of young people. It has become as recognisable as Nike, as addictive as Coke, and it has the potential to be far more damaging than any of us realise. This is a cautionary tale about what happens when you let a line of code do a job that needs the human touch.