Address to Tech Policy Futures

On 28 March 2023, Reset.Tech was invited to present at the Tech Policy Futures forum at Parliament House, speaking alongside regulators, Members of Parliament, and industry leaders.

Good evening. My name is Alice, and I lead Reset.Tech’s efforts in Australia. We are a global policy initiative working with experts, regulators, and policy decision-makers to build accountability and encourage proper regulation in digital markets.

I’m really just talking about two words tonight, words we hear in the supervision of almost every other significant market: safety and enforcement. With all the exciting tech policy movement going on, it’s important we don’t forget them.

Safety is, generally, about imposing transparency obligations on firms before harm occurs.

Enforcement is what happens when things go wrong.

Name any other market with a hefty R&D component and intense consumer interaction, and it will almost certainly have a safety regulator. Aviation, food, pharmaceuticals, and financial services are all examples.

In tech, safety policy is almost always reactive. Take content policy, for example: self-regulation and co-regulation frameworks rely on user reporting and mostly internal platform decision-making. I’m sure the eSafety Commissioner would agree with me that it would be extremely helpful to invoke safety as a more proactive mechanism, one that prevents, rather than responds to, harms.

Just as we need a proactive vision of safety, we also need a comprehensive one. Safe products are rights-respecting and privacy-preserving, and they don’t push harmful content. Safe conditions are created through mechanisms like risk assessment and mitigation measures.

Now, at this point, industry may say that transparency exposes trade secrets to competition. That argument is approaching its expiry date, I would say, and it’s an issue that is well managed in many other innovative markets.

We are already seeing frameworks for independent safety mechanisms in Europe, in at least two places: independent, expert testing centres for algorithms under the EU’s AI efforts, and accredited researcher and civil society access to platform data under the Digital Services Act.

Now, over to enforcement. As we know, that’s for when things go wrong. But enforcement is also a deterrent, and an important one.

In Australia, we have some examples of enforcement cases. But our regulators, where empowered, suffer from significant resource constraints. There is understandable anxiety about launching enforcement action against some of the wealthiest companies in the world. But it must be done when misconduct happens. So I urge the government: invest in positive tech futures by funding your regulators!

To recap, safety and enforcement measures support healthy industries. Proactive safety mechanisms help prevent harms and mitigate risk, and robust enforcement provides remedies and deterrents. 

In my final seconds, I have four suggestions:

  1. Resist exceptionalism arguments. There is nothing really that magical about data or digital information; it is part of an industry like any other and should be supervised like any other. Let this be the end of calling something a tech company simply to attract special or gentle treatment.
  2. Test statements about ‘innovation’. Innovation is almost always invoked as the reason for softening anticipated regulation. If this balancing act is to persist, we need a clear set of criteria for what desirable innovation is and what it achieves.
  3. Ask more questions about job creation and economic growth. Like the gloss of innovation, lofty promises of big boosts to household income and national revenue have too often distracted policy decision-makers. Insist on evidence, not only pitches.
  4. Be ready to build something better. Australia is a tech policy trailblazer: we gave the world the eSafety Commissioner model and the News Media Bargaining Code. We have moved first on these issues, and we should continue to do so.