Submission to Joint Standing Committee on Electoral Matters

Reset Australia formerly operated under the name Responsible Technology Australia.

To: Parliament of Australia - Joint Standing Committee on Electoral Matters
From: Responsible Technology Australia

Responsible Technology Australia (RTA) would like to thank the Committee for the opportunity to comment on the 2019 Federal Election. We welcome and agree with Senator McGrath’s comments on the “rise of social media manipulation and disinformation campaigns” as a threat to our democracy.

Our submission intends to add to this growing dialogue on how digital platforms are adversely affecting our democratic process. We look forward to working with the Joint Standing Committee on protecting Australian society.

Who we are

RTA is an independent organisation committed to ensuring a just digital environment. We seek to ensure the safety of Australian citizens online whilst advocating for a free business ecosystem that values innovation and competition. In particular, we are concerned with the unregulated environment in which digital platforms currently operate, and we advocate for a considered approach that addresses issues of safety and democracy while ensuring economic prosperity.

1.0 EXECUTIVE SUMMARY OF RESPONSIBLE TECHNOLOGY AUSTRALIA’S VIEWS

  1. Agree that the rise of social media manipulation and disinformation campaigns poses a threat to democracy
  2. Acknowledge the investigation by the Joint Standing Committee on Electoral Matters into democracy and disinformation, reviewing the potential threats, the possible solutions and the measures other countries are taking to reduce this threat
  3. View that whilst there were a few incidents of disinformation in the 2019 Federal election, the potential for harm in future Australian elections is what the Committee should concentrate on
  4. Recommend several measures to reduce the threat of these platforms interfering in our democracy (to be considered alongside those made in the ACCC Digital Platforms Inquiry), including:
    1. Extending political advertising blackout laws to social media platforms and digital streaming services
    2. Ensuring greater transparency of political advertising on social media
    3. Initiating an Inquiry into the potential harm that advertisers could cause to individuals, society and/or the democratic process in Australia through the use of the digital platforms
    4. Empowering an independent regulator to proactively audit the type and magnitude of amplified content that platform algorithms are serving to Australian audiences, focusing particularly on divisive content and/or disinformation with the potential to influence political outcomes

2.0 CONTEXT - THE DIGITAL THREATS TO DEMOCRACY

2.1 Context

The business model of digital platforms like Facebook and YouTube is built on the capitalisation of user data and attention. As consumers, our data and attention are captured by these platforms and subsequently sold to advertisers, resulting in what is known as the ‘attention economy’.

Through this currently unregulated and limitless data collection, digital platforms have built profiles on their users that include their interests, their vices and their vulnerabilities. This information is then used to target users intimately, serving them content geared to keep them engaged. The architecture of these digital platforms, combined with our personal information, leaves us vulnerable to forms of manipulation that threaten our democratic process and Australian society at large.

This manipulation is facilitated by the digital platforms in two ways:

  1. Targeted Advertising | The unfettered approach to data collection has amassed history’s largest data set, allowing advertisers to push beyond normal constraints to deliver direct and granular targeting of consumers. This micro-targeting uses key emotional trigger points and personal characteristics to drive outcomes.
  2. Algorithmic Curation | As the primary aim of these platforms is to maximise user time spent on them (to increase their advertising revenue potential), the algorithms are incentivised to serve material calculated to keep users engaged. This content tends to be more extremist, sensationalist or untrue, as it has been shown to be more captivating. A purely illustrative sketch of this engagement-driven ranking follows below.
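
To make this incentive concrete, the sketch below ranks candidate posts purely by predicted engagement. It is a minimal illustration under assumed inputs: every class, field name and weight here is hypothetical, and no platform's actual ranking system is represented.

```python
# Purely illustrative sketch of engagement-driven ranking.
# All names, fields and weights are hypothetical assumptions;
# no platform's actual system is represented.
from dataclasses import dataclass

@dataclass
class Post:
    id: str
    predicted_watch_time: float  # seconds a model expects the user to spend
    predicted_shares: float      # expected shares, a proxy for virality
    outrage_score: float         # 0..1, how emotionally charged the content is

def engagement_score(post: Post) -> float:
    # The objective rewards attention, not accuracy: a sensational or
    # divisive post that holds users longer outranks a sober one.
    return (post.predicted_watch_time
            + 30.0 * post.predicted_shares
            + 20.0 * post.outrage_score)

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Serve the highest-scoring posts first.
    return sorted(candidates, key=engagement_score, reverse=True)
```

Nothing in an objective of this kind rewards accuracy or balance, which is why content optimised for engagement tends to drift towards the divisive and the sensational.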

2.2 Effects on Democracy and Society

Foreign and malicious actors have been able to exploit both of these mechanisms to achieve their intended outcomes. The effects of this manipulation have already begun to be seen in Western democracies around the world, with our personal information weaponised to drive division and to interfere for geopolitical or financial gain.

In particular, the capacity for micro-targeting on the digital platforms is unprecedented, exacerbating disinformation in political advertising whilst also making it much harder to regulate.

‘unlike heritage media, digital and social… can be done in the “dark,” so your opponents may not even be aware of the message you are pushing out’.

This was clearly seen in:

  • the intentional Russian interference in the 2016 US Presidential election, with purchased ads designed to exploit division in society for political gain
  • the Cambridge Analytica scandal, which leveraged user data to serve curated Brexit messaging
  • the recent Federal Election, where false and/or exaggerated political ads (in particular from the United Australia Party) were spread over social media.

Algorithmic curation of engaging (often extremist) content also has the potential to undermine our democracy, perhaps in even more sinister ways. Because divisive, sensationalist clickbait has been shown to spread faster online, foreign actors have been able to ‘game’ this system, peddling mass amounts of content with the intention of driving polarisation.

This has been seen with:

  • Actors from Kosovo pushing a range of extremist political content to Australians, driven by the financial incentive to profit from manipulating public sentiment
  • Untrue ‘death tax’ claims during the recent Federal election that were circulated and amplified on social media, prompting the ACCC’s Rod Sims to rebuke Facebook for not doing more to curb misinformation.

3.0 RECOMMENDATIONS

The digital platforms have both re-written the rules of engagement during elections and revealed vulnerabilities in our democratic process. We recognise that the journey to combat these harms and ensure a robust and functioning Australian society is complex and nuanced.

Our view is that future regulatory approaches should be harmonised and able to adapt to the rapidly evolving landscape of these platforms. In particular, the findings and recommendations of the Final Report of the ACCC Digital Platforms Inquiry should be incorporated into the Committee’s considerations.

In particular, the Final Report posited that specific processes and research were needed to build an evidence base on how these platforms might harm Australian society. Our recommendations build upon this and encourage a proactive and expedited approach to inform future regulation. This approach should grant regulators the oversight needed to understand the ways in which algorithms and ad systems amplify or incentivise problematic content, thus ensuring that our democracy and society are protected.

RECOMMENDATION 3.1: Expanding election advertising blackout laws to include digital platforms and digital streaming services.

With nearly 37% of Australians turning to social media as their main source of news, a comprehensive approach to political advertising blackouts in the lead-up to elections should be implemented.

These restrictions are in place on all forms of heritage media (television, radio and print) to give Australians a grace period to decide how they will cast their vote. These laws, however, carry little effect, as the digital platforms are exempt, meaning that political advertising continues unabated across all social media channels during elections. A cohesive approach to regulation will ensure that the original intentions of the advertising ban are upheld.

RECOMMENDATION 3.2: Increasing transparency of political advertising on digital platforms to improve literacy.

Efforts should be made to increase the transparency of digital campaigning. This is a particularly important first step in ensuring digital literacy and protecting our democratic process.

Such measures have already begun to appear in the US and the UK, with Facebook creating an in-platform tool that allows users to see who is paying for political ads. Australia should consider adopting the same measures, whilst also expanding similar transparency requirements to other digital platforms (such as Twitter and YouTube). Additionally, an independent body (most probably the AEC) should be empowered with specific responsibilities to ensure fair conduct on the digital platforms, such as (a possible record format is sketched after this list):

  • All digital election ads to be clearly labelled with information about who paid for them, how much was spent and the demographic they were targeting
  • Digital platforms to have publicly available online databases of political ads that are currently and have been previously run
  • Increased transparency on how spending on digital election ads is reported to the AEC
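
To illustrate how these three measures could fit together, the sketch below shows one possible record format for a public political-ad database. Every field name is an assumption made for illustration; it does not reflect any existing platform or AEC schema.

```python
# Hypothetical record for a public political-ad database, combining
# the transparency measures listed above. Field names are assumptions,
# not any platform's or the AEC's actual schema.
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class PoliticalAdRecord:
    ad_id: str
    platform: str              # e.g. "Facebook", "Twitter", "YouTube"
    paid_for_by: str           # who funded the ad
    spend_aud: float           # amount spent, reportable to the AEC
    target_demographic: str    # the audience the ad was targeted at
    first_shown: str           # ISO date the ad began running
    last_shown: Optional[str]  # None if the ad is still running

record = PoliticalAdRecord(
    ad_id="example-001",
    platform="Facebook",
    paid_for_by="Example Party Ltd",
    spend_aud=12500.00,
    target_demographic="Women 35-54, Queensland",
    first_shown="2019-04-11",
    last_shown=None,
)

# Publishing such records as JSON would let researchers, journalists
# and the AEC query both current and historical political ads.
print(json.dumps(asdict(record), indent=2))
```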

RECOMMENDATION 3.3: Conducting an Inquiry into the potential harm that advertisers could cause to individuals, society and/or the democratic process in Australia, through the use of the digital platforms.

This Inquiry should explore several key areas, such as:

  • Reviewing the potential for social, emotional and political manipulation via digital platforms’ ad systems, including investigating previous and existing international cases (e.g. Brexit), as well as understanding the scale and depth of the data points available to advertisers for targeting
  • Determining the level of risk Australia faces when advertisers leverage user data to manipulate public sentiment and influence political outcomes, as demonstrated through the Cambridge Analytica scandal
  • Reviewing the data sets of Digital Platforms’ advertising partners (such as Experian and Quantium) including the data points and customer segments made available to advertisers, as well as the volume of Australians on their lists
  • Recommending proposed changes to the advertising process to minimise the potential for harms

How this could be implemented:

A taskforce established within an independent regulator, such as the ACMA, would be briefed to conduct an investigation over several months into the potential mechanisms by which the advertising functions of digital platforms could be exploited by malicious actors.

This would require the cooperation of both the digital platforms and their advertising partners to identify vulnerabilities, including the extent of data targeting available, the identity verification process, the advertising content checks and restrictions, and other procedures put in place by digital platforms and their partners. The outcome of the Inquiry would be a set of recommendations for specific platforms to strengthen their advertising systems, with the intention of informing future regulation.

RECOMMENDATION 3.4: Ongoing and proactive auditing of the content that algorithms amplify to users, focusing on the spread of harmful or divisive content that has the potential to influence political outcomes.

These expanded responsibilities would allow the above regulator to begin building an evidence base on how algorithms prioritise and distribute certain content, and on the impact this has on the democratic process, in order to inform future regulation. This should focus on (but not be limited to) the following:

  • Investigating the nature of algorithmic delivery of content deemed to be fake news, propaganda or disinformation
  • Auditing the effect of algorithmic delivery on the diversity of content served to any given user, to investigate the impact of ideological filter bubbles
  • Auditing the amplification of polarising and extremist political content by these algorithms

How this could be implemented:

Algorithmic audits of these platforms would need to be determined in collaboration with the relevant companies. One model would have the platforms self-publish, in real time, what content their algorithms are amplifying and serving in Australia, allowing regulators to focus on the outcomes of the algorithms rather than the design of the algorithms themselves.
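
The sketch below illustrates the kind of self-published record such a scheme might produce. It is a hypothetical format only: the field names and values are assumptions for illustration, not any platform's actual reporting.

```python
# Sketch of a real-time record a platform might self-publish so a
# regulator can audit outcomes rather than code. Every field name and
# value here is a hypothetical assumption for illustration only.
import json
from dataclasses import dataclass, asdict

@dataclass
class AmplificationEvent:
    content_id: str
    content_type: str        # e.g. "video", "post", "ad"
    topic_labels: list[str]  # the platform's own classification of the content
    times_recommended: int   # how often the algorithm surfaced it in Australia
    audience_reach: int      # unique Australian users it was shown to

event = AmplificationEvent(
    content_id="example-video-123",
    content_type="video",
    topic_labels=["politics", "immigration"],
    times_recommended=48_000,
    audience_reach=31_000,
)

# A regulator ingesting a stream of such events could measure, for
# example, how heavily disinformation-labelled content is amplified,
# without ever inspecting the proprietary ranking code.
print(json.dumps(asdict(event), indent=2))
```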

An outcomes-focused approach of this kind would allow for the mitigation of two key concerns:

  1. The inherent sensitivity of allowing external scrutiny of algorithms that are the intellectual property of private enterprises and represent significant trade secrets
  2. The limited value of inspecting the algorithms’ code itself, which is unlikely to provide insight into the type of content being surfaced; focusing on the outcomes of algorithmic curation avoids this problem

This mechanism would allow an independent regulator to gather the evidence required to assess whether the news and other content being recommended by algorithms is in line with societal expectations, and whether the platforms need to take action to tweak their algorithms to ensure content appropriateness, quality and diversity in line with our media regulation frameworks. One example of how algorithmic auditing might be modelled is Algo Transparency, which provides a snapshot of the videos recommended on YouTube.

4.0 CONCLUSIONS

RTA acknowledges the scale of the task ahead to begin to adequately regulate these digital platforms and mitigate the societal harms they inadvertently cause. We look forward to working together to bring about the best outcomes for businesses, consumers and society at large.
