Submission: Online Safety Act Statutory Review

Overall, Reset Tech Australia warmly welcomes the review of the OSA, including:

  • The comprehensive nature of the review and the commitment to ensuring an effective, future-proofed regulatory framework.
  • The breadth of the review, including its references to enhancing accountability (including through a duty of care), the introduction of the children’s best interests requirement, the space to draw on international models of best practice, and the intent to investigate effective enforcement. These are necessary and important topics, and they appear to signal a move towards a systemic approach to managing digital risks rather than addressing specific online harms one by one.

Background – Australia proudly holds ‘early-mover’ status on online safety, beginning with its online safety commissioner model, realised through the Enhancing Online Safety for Children Act 2015, the forerunner to the OSA. Our online safety framework revolves around mandatory notice-and-take-down remedies, delivered through the complaints schemes and content-based removal notice schemes. While these remedies are key for certain categories of harm, they are palliative in nature, operating downstream from risks.

Palliative to Preventative – Online harms scholarship has progressed greatly in the intervening years, and a key outcome is the realisation that digital platforms could, and should, do more, preventatively and at a systemic level, to reduce harm to users. There are gestures towards systemic protections in the OSA, via the Basic Online Safety Expectations (BOSE), and the Issues Paper also signals an intent to move towards a more systemic approach. However, the desire for a systemic model comes up against some tensions:

  • A systemic model cannot neatly rest on a bedrock of 3C/4C harms typologies. The 3C and 4C typologies (content, contact, conduct and contract) foreground risks that are not systemic in nature, so we recommend using them with caution. The most effective digital regulations have emerged where the focus has remained on the systemic risks that platforms themselves create, and can therefore straightforwardly mitigate.
  • Drafters should be wary of relying on defined content types and conduct risks. This sets up protracted battles with platforms over what is in and out of scope. It also ensures that the OSA is effectively out of date the moment it is published (because digital risks are inherently dynamic and move at pace), and it creates a list of disjointed obligations beset by contested definitional issues. The preferred, future-proofed alternative is to work back from an enforceable obligation on platforms to identify and manage risks.

5 Building Blocks for Online Safety – We are calling for a systemic and comprehensive model, comprising five key ‘building blocks’:

  • An overarching duty of care (in the singular, not the plural);
  • Requirements for risk assessments, including requirements to address specific risks such as, but not limited to, content risks and broader societal risks;
  • Requirements for risk mitigations, where platforms must identify the mitigation measures they will take. Regulators must have the power to review and investigate mitigation measures and to compel platforms to improve them where they are deemed inadequate;
  • Requirements for meaningful public transparency (such as public-facing risk assessment summaries, transparency reports, independent audits, public data repositories and researcher access), alongside strong investigatory powers for regulators;
  • Requirements for strong enforcement, including resources for regulators, meaningful fining powers and the ability to ‘turn off’ services as a last resort.