Troll hunting isn't solution to social media's hate problem

  • Media Release

The federal government’s plans to expose online trolls on social media won’t resolve social media’s online hate problem, says Reset Australia.

What is needed is greater transparency over how the platforms work, and in particular accountability for how their engagement algorithms amplify more extreme content, rather than a focus on individual users, said Reset Australia, the local think tank arm of the global initiative tackling digital threats to democracy.

“Social media companies promote, amplify and profit from hate - catching trolls won’t end online hate,” said Chris Cooper, executive director of Reset Australia.

“The most pressing problem here is not trolls, it is the disproportionate reach of their content enabled by the algorithms of social media companies that prioritise sensational, outrageous and conspiratorial content - the form which defamatory content usually takes.

“Forcing social media companies to cough up the identity of individuals does not hold the platforms accountable for the profit-making amplification that enables that content to go viral.

“Online anonymity does protect trolls from accountability, but it is also an important tenet of a free and open internet, protecting critics of the powerful who hold leaders accountable.

“We cannot throw away anonymity and the protection it provides vulnerable communities, for the sake of reining in trolls who mostly are only able to cause harm because of social media platforms that profit from amplifying their content.”

Internationally there is a shift towards legally enforceable regulation, such as the EU’s Digital Services Act. Reset Australia says a similarly tough regulatory approach should be adopted in Australia.

The top three policy directions Reset Australia is calling for are:

1. Increased transparency, so evidence-based solutions can be found. This would include the introduction of “live lists” of the top trending issues during contentious periods, such as pandemics and elections.

2. A shift towards systemic issues, rather than focusing on content takedowns and user identification. Design features and algorithms that promote harmful content are at the root of the problem, and need to be tackled.

3. A ‘Black Letter Law’ by default approach: robust, legislated regulation written by policymakers, not the industry, and enforced by independent arbitrators.

“We need to compel social media platforms to operate in line with public expectations. To do this we must hold them accountable for the harm they cause, not the anonymous users who take advantage of the unregulated space.

“If we are serious about protecting our democracy and social cohesion from the ravages of misinformation, disinformation, hate speech and polarisation then we urgently need to move away from the current self-regulation model.”

More comment: Chris Cooper 0403 353 621
