Open letter to Meta: 24 questions about safeguarding the integrity of our Federal election
The 2022 Federal Election is now underway. It is imperative that Australian democracy is protected from mis- and disinformation and hate speech that polarises and divides our communities, and alters voter behaviour. The role of disinformation in influencing elections catapulted into public consciousness through Brexit, the 2016 US Presidential election, and the 2019 federal election in Australia.
We, the undersigned experts from academic and research institutions, write to express our concern that your election preparations are not commensurate with what is at stake: the integrity of the Australian election. While Meta has stated that it has a comprehensive strategy, information to support this assertion is not available to the public, lawmakers or the Australian Electoral Commission. We therefore ask for a set of straightforward factual clarifications on the public record:
Moderation of harmful and misleading content
1. How many dedicated human content moderators will be bolstering your AI-enabled system specifically for the Australian election?
2. What languages do these content moderators speak? (For instance, the five most common languages spoken in Australian homes other than English are Cantonese, Mandarin, Italian, Arabic and Greek.)
3. Where are the content moderators dedicated to the Australian election based? If not in Australia, can their security and integrity be ensured during this time of geopolitical instability?
4. How has your content moderation system taken into account the nuances of Australian English slang? (Using Australian slang is a common strategy for those seeking to evade detection by content moderation systems on social media.)
5. Who has been consulted in the development of election-related content moderation policies? How will you ensure these policies are adaptive and responsive to the events of the election?
6. What AI-enabled content moderation systems will be deployed during the election, including image recognition technology? What are their error rates?
7. What provisions have been made to protect communities from foreign interference?
8. What avenues are in place to enable civil society organisations to flag harmful and false content (beyond the reporting mechanism of the eSafety Commissioner)?
9. In the United States you have shared data about removed coordinated inauthentic behaviour networks with independent researchers. Why have you not implemented this in Australia? (Ideally, all content and accounts removed under election-related policies should be stored and shared for post-election scrutiny.)
10. During the last United States election you labelled posts from media outlets believed to be state-controlled. Why have you not implemented this in Australia?
Increasing transparency over third-party fact checking
11. Given that over 20% of Australians speak a language other than English at home, what languages will the third-party fact checks be translated into?
12. What non-English language publications will third-party fact checks be provided to?
13. How will the speed of fact checking be measured during the election?
14. How will the reach of fact checking be measured during the election (i.e. how many people viewed the content, and their demographics)?
15. How will your response to fact checking be measured during the election? (i.e. content takedown, or reporting to the appropriate authority)
Algorithmic prioritisation of quality news content over disinformation
16. During the last United States election, Facebook’s algorithm was adapted to reduce the distribution of sensational and misleading material, prioritising content from authoritative sources. Why has this measure not been implemented in Australia?
17. During the last United States election, the distribution of live videos related to the election was limited. Why has this measure not been implemented in Australia?
Reducing the influence of social issue, electoral and political ads
18. Google has restricted the targeting for election ads in Australia. Has Meta considered this? If so, why has it not been implemented?
19. During the last United States election, Meta implemented changes to ensure fewer people saw social issue, electoral and political ads that had a “paid for by” disclaimer. Why has this measure not been implemented in Australia?
20. During the last United States election, the creation of new ads about social issues, elections or politics in the last few days of the election was blocked. Why has this measure not been implemented in Australia?
21. What measures do you have in place to screen the placement of ads to ensure all political ads are properly identified and labelled by the advertiser?
22. Beyond the Ad Library, will Meta be making available a comprehensive public archive of all sponsored political content, including targeting data and aggregated engagement statistics by target audiences (accessible by API)?
‘Break glass’ (or emergency) safety measures
23. What ‘break glass’ measures are on standby during the election?
24. What type of event (in terms of reach and impact) would trigger the implementation of ‘break glass’ measures?
Ultimately, upstream regulation that assesses the risks and harms of platforms’ systems and processes (i.e. algorithms and platform design features) is necessary for protecting our democracy in the longer term. Many current strategies, which attempt to regulate content and users downstream, are not enough. Given that adequate regulatory frameworks are not yet in place, we urge you to take seriously your responsibility to the Australian public by implementing more comprehensive measures to limit the spread of disinformation and hate speech.
We are aware of a persistent pattern of under-investment in safety and integrity measures, a pattern that is heightened in jurisdictions outside of the US. The Federal election requires prioritising public interest over engagement metrics and profits. As such, we intend to publish your responses to the above questions in a publicly available report.
Dhakshayini Sooriyakumaran, Director of Tech Policy, Reset Australia / PhD Scholar, ANU School of Regulation and Global Governance (RegNet)
Dr Rys Farthing, Director of Data Policy, Reset Australia & Research Associate IALS, University of London
Julia Powles, Director, Minderoo Tech & Policy Lab / Associate Professor, Law and Technology, UWA Law School
Professor Rebecca Giblin, Professor of Law / ARC Future Fellow / CREATe Fellow / Director – Intellectual Property Research Institute of Australia (IPRIA)
Professor Kathryn (Kate) Henne, Director, ANU School of Regulation and Global Governance (RegNet) / Professor and Research Lead, ANU Justice and Technoscience Lab (JusTech)
Dr Jake Goldenfein, Senior Lecturer, Melbourne Law School, University of Melbourne and Associate Investigator, ARC Centre of Excellence for Automated Decision-Making and Society
Professor Kimberlee Weatherall, The University of Sydney Law School/ARC Centre of Excellence for Automated Decision-Making and Society
Rita Jabri-Markwell, Advisor to Australian Muslim Advocacy Network (AMAN)
Jenna Imad Harb, PhD Scholar, Justice and Technoscience Lab (JusTech), ANU School of Regulation and Global Governance (RegNet)
Kirsty Anantharajah, PhD Scholar, Justice and Technoscience Lab (JusTech), ANU School of Regulation and Global Governance (RegNet)
Professor Tama Leaver, Professor of Internet Studies / Chief Investigator in the ARC Centre of Excellence for the Digital Child, Curtin University
Professor Axel Bruns, Australian Laureate Fellow and Professor in the Digital Media Research Centre at Queensland University of Technology / Chief Investigator in the ARC Centre of Excellence for Automated Decision-Making and Society
Dr Timothy Graham, Senior Lecturer, School of Communication | Academic Lead (Research) and ARC DECRA Fellow, Faculty of Creative Industries, Education and Social Justice | Queensland University of Technology