Misinformation and disinformation will not be combatted with industry codes


This briefing has been prepared in anticipation of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024. Reset.Tech Australia supports enhanced regulatory powers to tackle misinformation and disinformation, but is deeply concerned that the process outlined is captive to an ineffective and hostile ‘industry codes’ process that lets Big Tech off easy and stymies public accountability.

Reset.Tech Australia has undertaken extensive work testing various large platforms’ policies and responses to misinformation and disinformation in Australia, and this briefing builds on that work. Our most recent research, Functioning or Failing? An evaluation of the efficacy of the Australian Code of Practice on Disinformation and Misinformation, draws on experimental research conducted in 2023 that shows severe shortcomings in the outcomes of industry’s Australian Code of Practice on Disinformation and Misinformation (the ‘Code’). That report concludes that the Code is not working and that mechanisms for public accountability are effectively non-existent.

This briefing goes a step further and considers the Code in a legal context. It concludes:

  1. The ‘transparency’ mechanisms under the Code, which require platforms to self-publish ‘Transparency Reports’ each year, are worryingly poor. A Transparency Report may comply with the requirements of the Code while simultaneously constituting misleading or deceptive conduct for the purposes of the Australian Consumer Law.

  2. The ‘accountability’ mechanisms under the Code, namely an Independent Review Process and a ‘public’ Complaints Model, are defective:

    i. The Independent Review Process simply cannot incentivise best practice and compliance in reporting, because its scope is confined to publicly verifiable claims; any claim that cannot be verified from public sources escapes review entirely. In other words, platforms can freely mislead the public in their reports without facing the fact-checking their users are subjected to on their services.

    ii. The Complaints Model severely disincentivises public complaints against Code signatories, because of:

    1. The absence of any mandated access to platform data about the representations contained in Transparency Reports,
    2. A burden on complainants to satisfy a ‘materially false’ threshold, which arguably imposes a higher standard of accuracy on complainants than the standard signatories must meet when composing Transparency Reports, and
    3. A generally perilous environment for organisations collecting evidence about misinformation and disinformation risks on platforms, where routine social media research techniques can expose researchers to legal action from platforms.

    Combined, these defects create a hostile environment for public accountability.

Our key recommendations for the Bill mirror our 2023 feedback on the Exposure Draft, namely:

  1. ACMA should be immediately empowered to bypass industry codes and set a standard. The Bill anticipates, as its primary route, that the ACMA will supervise an industry code-making process. Evidence and experience show this will replicate the mistakes of the past. Put simply, industry has had several years to get code-making right and has failed, despite persistent feedback from both the ACMA and civil society. The Bill currently treats regulator standards-setting as a ‘last resort’, but the threshold for ‘last resort’ has evidently already been crossed.
  2. Such a standard could, for example, establish a digital platform public transparency framework, as proposed in Achieving Digital Platform Public Transparency.
  3. The Bill also envisages future ‘digital platform rules’ to be set by the ACMA with parliamentary oversight. It would be prudent for parliament to provide an indication of intent at the Bill stage, such as a commitment to public accountability and public transparency that includes access to platform data to genuinely permit independent scrutiny. ‘Transparency’ will not be achieved by platforms simply narrating their policies, and ‘risk assessment reports’ cannot be verified without data access.
  4. Accredited researchers and research organisations should be given legislated protections to access platform data, in order to tackle the existing public accountability challenges with the industry code and to insulate public interest research from severe legal risks.