Report: Recommender Systems and Political Content
Digital platforms are shaping the landscape of Australian political discourse. While significant attention is rightly paid to how platforms influence that discourse through content moderation and policies on misinformation and disinformation, platforms also shape it through the algorithms they develop and deploy in content recommender systems.
Recommender systems can distort political debate by promoting extremist or dangerous content, but they can also shape debate by pushing one-sided or partisan content to users. This is often described as the ‘filter bubble’ effect, or as social media ‘rabbit holes’; both reduce the plurality of the content people consume.
This research explores how social media algorithms promoted political content about the Voice referendum in Australia. We set up sock puppets (or ‘fake accounts’) on TikTok and X (formerly Twitter) to observe the rate at which these accounts fell into ‘Yes’ or ‘No’ filter bubbles.
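To illustrate the kind of measurement this methodology involves, the sketch below shows one simple way to flag a filter bubble: label each recommended item by stance (‘yes’, ‘no’, or ‘other’) and watch for a sliding window of recommendations dominated by one side. This is a hypothetical illustration, not the study’s actual code; the labelling scheme, window size, and 60% threshold are all assumptions.

```python
from collections import deque

def detect_filter_bubble(labels, window=50, threshold=0.6):
    """Scan a sequence of stance labels ('yes', 'no', 'other') assigned
    to recommended items, in the order they were served.

    Returns (index, side) for the first 1-based position at which the
    last `window` items are at least `threshold` one side, else None.
    """
    recent = deque(maxlen=window)
    for i, label in enumerate(labels, start=1):
        recent.append(label)
        if len(recent) == window:
            for side in ("yes", "no"):
                if recent.count(side) / window >= threshold:
                    return i, side
    return None

# Hypothetical feed that starts neutral and drifts toward 'No' content.
feed = ["other"] * 100 + ["no", "other"] * 30 + ["no"] * 60
```

On this synthetic feed, the detector fires only once the run of ‘no’ items pushes the window past the threshold; a balanced or neutral feed returns `None`. Real measurement would of course require hand-coding or classifying each video or post before applying any such rule.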
Our findings include the following:
- On TikTok: We primed four sock puppet accounts. Two of them fell into strong ‘No’ filter bubbles within 400 videos. One fell into a ‘Yes’ filter bubble within 250 videos, and one failed to fall into a filter bubble.
- On X (formerly Twitter): We primed two sock puppet accounts, with one falling into a ‘No’ filter bubble after around 300 posts (tweets) and the other into a ‘Yes’ filter bubble after around 200 posts.
The existence of ‘Yes’ and ‘No’ filter bubbles, which can appear rapidly, suggests that platforms’ recommender systems could play a role in dividing Australian political discourse. This division could deepen the polarisation of Australian political debate.
Despite this, algorithms and content recommender systems remain largely invisible to Australian researchers, as platforms’ ‘transparency tools’ (APIs) are being shut down, moved behind paywalls, or restricted to European or American researchers. As the Government considers the next steps regarding the Exposure Draft Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill and reviews the Online Safety Act 2021, consideration must be given to ensuring that independent oversight of algorithms is possible for regulators and researchers.