Youth Radicalisation on YouTube Shorts: a live experiment 

Aruna Anderson, Rys Farthing, & Alice Dawkins

Reset.Tech Australia was recently invited to present research to a Ministerial Summit on youth radicalisation. Our remarks centred on the role of social media companies as content distributors, and on the relative opacity of their systems, such as content recommender systems (commonly called algorithms).

Counter-radicalisation efforts often involve hundreds if not thousands of hours of investigative time. We ran a live experiment to demonstrate just how quickly the seeds of radicalisation can be planted. 

At the start of the conference, we set up a YouTube account for a 17-year-old Australian. We then primed this account by ‘liking’ approximately 50 pieces of misogynistic content on YouTube Shorts. Over the next 30 minutes, we watched 100 videos on YouTube Shorts to see how rapidly the account fell into a ‘manosphere’ rabbit hole. The answer: very rapidly. Of the final 10 videos before we stopped, 9 were overtly misogynistic or even defended white supremacy. After just half an hour, the account was well and truly in a deep rabbit hole.

The proliferation of manosphere content on social media platforms is worrying for several reasons. Ideological extremism, such as ‘incel’ beliefs, has been linked to terrorist attacks. The Australian Security Intelligence Organisation has remarked publicly that the proportion of young people engaged in ‘ideologically motivated extremism’ is significantly on the rise (see the Director-General’s 2022 Threat Assessment). Our live experiment confirmed just how easily and quickly young people can fall into ideologically extremist filter bubbles online.

In other words, the algorithms work as expected. The findings of this live experiment were not shocking to us. Reset.Tech Australia has published research demonstrating how YouTube’s algorithms contribute to promoting misogynistic, anti-feminist and other extremist content to Australian boys and young men (read our report here).

Youth radicalisation is one of a growing number of complex policy issues where a sharp appreciation of the digital world and its many risky dynamics is paramount. Importantly, the call to action here should not be one of government surveillance of young people as a preventative measure. Rather, radicalisation is one case study among many for why social media companies should be under legislatively mandated requirements to assess the risks arising from all their systems, including content recommender systems, and to proactively mitigate those risks. Combined with meaningful transparency and accountability, these measures might address a whole range of online harms, including radicalisation.