Presentation to the Senate Inquiry into the Administration of the Voice

  • Mis- and Disinformation

In May 2023, Reset Tech Australia was invited to give evidence to the Senate Inquiry into the Administration of the referendum on an Aboriginal and Torres Strait Islander Voice. A summary of our presentation is included below.

  • Our submission highlights significant capacity gaps in mitigating electorally relevant mis- and disinformation. We demonstrate a clear role for Government to act, as prominent democracies have done elsewhere, by enforcing regulatory requirements that hold platforms accountable for the promotion of mis- and disinformation, coupled with transparency requirements that enable effective independent oversight.
  • Issues of mis- and disinformation may seem to fall naturally within the policy domains of communications and broadcasting. Yet while public media is subject to independent oversight mechanisms, digital media distributors are not. The market dynamics of digital media distribution – namely, profit incentives driving the proliferation of mis- and disinformation – must be included in any analysis. That is why a comprehensive policy response is important: questions of online safety, electoral integrity, and digital platform market power are all connected.
  • Europe has paved the way for digital platform accountability and independent oversight mechanisms with the Digital Services Act. Australia is at least five years behind: like Europe in 2017, we use a self- or co-regulatory model that is precariously reliant on platform self-reporting. Europe has since left this model behind and adopted a robust, comprehensive digital regulatory framework that combines hefty penalties for breaches with ex ante (before the fact) risk mitigation incentives.
  • We will be running a project on disinformation monitoring and analysis throughout the referendum, generously supported by the Susan McKinnon Foundation. We will draw upon international best-practice risk mitigation metrics, currently in use in the European Union.
  • In Australia's flawed era of self- and co-regulation, genuine proactive measures and meaningful precautionary investment by the platforms to act in the democratic interest are few and far between. The burden unfortunately falls to Government to engage digital platforms proactively to protect the integrity of the referendum process. A process with integrity means that search and social media services avoid a situation where digital media feeds are flooded with extreme views on extraneous issues that distort the central question of Yes or No.
  • We have drafted a set of questions that Government could ask of digital platforms in the lead-up to the referendum, grouped into the typical policy categories for disinformation monitoring and action. We recommend that engagement with digital platforms comprehensively cover all of these areas, as each is relevant to effective mitigation strategies. We have made these questions available to the Committee.
    1. Risk mitigation and safety measures
    2. Content moderation
    3. Notice and take-down
    4. Third-party fact checking and transparency
    5. Ensuring integrity in the promotion of political ads
    6. Algorithmic prioritisation of quality news content over disinformation