Submission on Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020
Who we are
Reset Australia is an independent, non-partisan organisation committed to driving public policy advocacy, research, and civic engagement to strengthen our democracy within the context of technology. We are the Australian affiliate of Reset, the global initiative working to counter digital threats to democracy. As the Australian partner in Reset’s international network, we bring a diversity of new ideas home and provide Australian thought-leaders access to a global stage.
We commend the Government for putting forward a proposal which seeks to rein in the influence of the digital platform companies, especially within a sector which is vital for a well-functioning democracy. Our core recommendations can be found in our prior submission to the consultation run by the ACCC on the final draft of this proposed amendment in 2020. In summary, our primary recommendations are as follows:
1. Work to clearly define within the legislative text:
   a) The scope and types of data required under Section 52R, Subsection (3)(a)
   b) The measure of ‘significant effect’ under Sections 52S-U, Subsections (1)(c)
2. Ensure a rights-based framework of user protections, similar to the EU GDPR, prior to the passage of wholesale data sharing provisions outlined under Section 52R
3. Institute an audit authority under an independent regulator, both to verify the provisions set out under the Minimum Standards of this Code and to investigate/audit the impact of algorithmic amplification on Australian society
4. Ensure proper regulatory oversight and guidance for how the information from the data sharing and advance notification provisions under the Minimum Standards should be used
5. Impose specific investigative powers on the independent regulator to conduct inquiries into the market implications of new products and/or services
6. Conduct an annual comprehensive assessment of how this Code has impacted the information and news media landscape
Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, Submission 54
We must recognise that, whilst the stated purpose of this amendment is to address bargaining power and competitive imbalances between media companies and the digital platforms, the true impact of this legislation will be changes to the news, media and journalism landscape in Australia. In order for us to strengthen our democracy, we must ensure that this impact is positive, with a goal of promoting greater diversity and pluralism within our media landscape.
This goal must be the guiding principle for both the final version of this amendment, and any iterations and complementary pieces of legislation that might be built from this Code into the future.
We implore the Government not to lose sight of the real goal: of ensuring a diverse and pluralistic media landscape.
This submission will focus on recommendations aimed at rectifying market power imbalances between the platforms and media companies; however, we stress that Big Tech sits at the centre of a complex web of issues, and a singular commercial lens for a single industry seems short-sighted. The ‘harmonised framework’ of the ACCC Digital Platforms Inquiry must be expanded beyond just the media to recognise the integral and expansive ways the digital platforms shape our lives.
1.0 Primary Recommendations - Minimum Standards
Our recommendations centre on the provisions set out under Division 4, Subdivision B - Minimum Standards.
1.1 Vague Definitions
Currently the data explanation and provision (Section 52R) requirements and advance algorithmic notification (Section 52S, 52T and 52U) requirements have impossibly vague definitions that, depending on interpretation, have far-reaching and significant implications for data rights and privacy, as well as limiting the ability for this Code to be meaningfully implemented.
The specific passages are:
Section 52R, Subsection (3)(a) - [that lists and explanations of data must be given] that relates to interactions of users of the designated digital platform service with covered news content made available by the designated digital platform service.
Whilst some efforts have been made to explain what ‘interactions’ means, both in Section 52C of the Bill and under point 1.107 of the Explanatory Materials, the wording remains vague. This opens up a spectrum of interpretation with significant downstream implications. All interactions of users related to news media content? Profile data of users who engage with this type of content? If a user shares a news link with another user, is that profiling data also shared?
For comments around why this level of transparency without the appropriate safeguards is concerning and further recommendations, please see Section 1.2 of this document.
Section 52S, 52T and 52U, Subsection (1)(c) - Changes to algorithm or practice will bring about significant effect on the respective components under each subsection.
Similarly, the term ‘significant effect’ has not been adequately defined, despite attempts to clarify under Section 52W of the Bill. Whilst the Explanatory Materials point 1.127 mentions that a significant effect constitutes a 20% or more change in referral traffic, as this is not reflected in the Bill text, it is again open to a risk of differing interpretation. Furthermore, there is no guidance around what ‘significant’ means in relation to Section 52U and changes to the distribution of advertising.
Recommendation: Work to clearly define within the legislative text:
- The scope and types of data required under Section 52R, Subsection (3)(a)
- The measure of ‘significant effect’ under Sections 52S-U, Subsections (1)(c)
1.2 Section 52R - Explanation and provision of data and implied data privacy risk
Whilst we support the intention behind the data sharing and explanation requirements detailed under Section 52R, we are concerned about the lack of a rights-based user protections framework that will support these provisions.
The EU’s proposal of the Digital Markets Act (DMA) has provided pathways for business users to gain access to the data generated from the usage of the digital platform’s services. This provision is a good step in balancing market imbalances and we are tentatively supportive of the Government’s proposal to incorporate similar measures within this Code. Additional stipulations should also be considered to clarify that platforms must not impose barriers and must facilitate this release of data, such as mandating that they provide high-quality APIs free of charge.
However, our support for this section is conditional on several fundamental changes to ensure that user protections are guaranteed.
Firstly - addressing vague definitions outlined under Section 1.1 of this submission.
Secondly - this provision must be supported by a rights-based framework to data privacy.
Whilst the EU DMA sits under protections afforded by the General Data Protection Regulation (GDPR), there is no Australian equivalent protective framework. The lack of end user rights around consent, data processing, erasure, and automated individual decision-making (profiling), amongst others, is especially concerning when taken in concert with the vague definition of scope. Therefore, we believe that the passage of this Section in its current form is untenable until we update our privacy framework to recognise data rights. We respect that this is a current and ongoing process, and you can find more of our comments within our submission to the Privacy Act Review.
Recommendation: Ensure a rights-based framework of user protections, similar to the EU GDPR prior to the passage of wholesale data sharing provisions outlined in this Code.
1.3 Audit Authority - The need for verification and algorithmic audits
Ensuring that the Minimum Standards would work: No matter how the provisions under the Minimum Standards are interpreted, under the current Code there is no way to verify if the information provided by the digital platforms is accurate. The digital media platforms operate with near-monopoly status and hold a tight grip of control over how they use their data. Whilst this Bill orders them to share information, how will the media companies ascertain whether this data is meaningful, even if it is ‘accurate’ under the proposed law?
This entire section of the Bill becomes redundant with no verification measures. Thus there is an integral need for an audit authority to be instituted sitting under the independent regulator, most likely the ACCC.
The case for a broader remit and algorithmic audits: Whilst verification provides a clear cut case for instituting an audit authority, a discussion must also be had on how this authority might work to ensure that data transparency measures serve broader public interests. We recognise that the harms caused by the digital platforms, ranging from foreign interference to disinformation, need a holistic approach, and the remit of this authority should expand to provide insights into bigger questions - such as how platform curation algorithms open up risk and create harm to the public. Importantly, this isn’t to the exclusion of the platform/publisher content visibility issues remedied by this Bill, merely an expansion that might provide a systematic legislative approach, rather than one focusing on a specific sector.
A purely commercial lens to the data sharing and advance notification provisions (particularly 52R, 52S and 52T) completely misses the systematic impacts of algorithmic amplification: that is, the promotion/demotion of content that is currently dictated by the digital platform’s internal algorithmic processes. It is an issue that goes far beyond traffic and advertising revenue, and requires an expansive remit to address. Whilst market imbalances are important and must be addressed (and they would be, under this authority), unilateral algorithmic curation and amplification has an outsized impact on harming the Australian public and our democracy. Whilst this is most clearly seen within the news media sector, and as such this Bill provides the perfect springboard to enact this kind of reform, we must not ignore that these harms go far beyond just news content.
Information on these harms is held solely by the digital platforms, who do not make it available for transparent independent review under any circumstances. It seems extraordinary that the digital platform companies have all the data and tools needed to track, measure and evaluate these harms - indeed these tools are a core part of their business, but they make nothing available for public oversight, even as they avoid all but the most basic interventions to protect the public from harm.
Without mandated access, regulators are forced to rely on the companies to police themselves through inefective codes of conduct. This failed approach has been seen overseas and yet is still being tried here in Australia.
This is not an impossible suggestion as the digital platforms might make you believe. Algorithmic audits have been specifically proposed in the EU Digital Services Act (DSA), and represent a clear model to emulate here in Australia. Our legislative approach must be as flexible and encompassing as the harms we seek to address.
Recommendation: Institute an audit authority under an independent regulator to both verify the provisions set out under the Minimum Standards of this Code, and empowered to investigate/audit the impact of algorithmic amplification on Australian society.
What would an algorithmic audit authority do?
An audit authority under an independent regulator (most likely the ACCC) must have the tools and powers to verify the actions of the digital platforms, test the operation of algorithms and to undertake inspections themselves.
Its responsibilities with respect to this Bill would be:
- Verification: This authority must be empowered not just to oversee but to verify that the obligations of the digital platforms under this Code are being met. Its expanded (and, in our opinion, necessary) responsibility would be the holistic investigation of how algorithmic curation systems impact wider society.
- Algorithmic Audits: An algorithmic audit is a review process by which the outputs of algorithmic systems (in this case the curation systems of the digital platforms which display news media content) can be assessed for unfavourable, unwanted and/or harmful results. In addition to assessing if design decisions within the digital platform algorithms are actively anti-competitive, this process can also be used to assess numerous online harms to wider society and democracy such as disinformation and foreign interference.
How would an audit authority work?
The authority must have the ability to carry out an algorithm inspection with the consent of the digital platform company; or if the company doesn’t provide consent, and there are reasonable grounds to suspect they are failing to comply with requirements, to use compulsory audit powers. The resourcing to carry out these investigations could sit within the ACCC, but they should also have the power to instruct independent experts to undertake an audit on their behalf. Examples for how this might be structured can be seen in multiple industries from aviation to drug therapeutics.
1.4 Regulatory Oversight
This Code does little to define the responsibilities of the news media companies for what happens after the data sharing and advance notification provisions are enacted. Whilst we respect that commercial entities should be free to use this information (to a certain degree) as they wish, these impacts have implications beyond commercial competitiveness, including safeguarding democracy, public health information and security. We therefore recommend that the Commission or an appropriate independent regulator be tasked with the necessary oversight, guardrails and powers to address potential issues of harm.
For example, the Explanatory Materials state that the advance notification requirements are intended to capture internal practice changes, with examples including:
- Removal of inappropriate content
- Suspending user accounts
- Rules around permitted types of advertising content
It is our opinion that the changes these advance notification provisions seek to capture, as specifically referenced by the Government, represent such a significant public and democratic interest that regulatory oversight must be incorporated.
This might include:
- clear limitations and guidance around the usage of information obtained through the Minimum Standards
- a transparent procedural route for organisations to contest decisions (such as content takedowns and user removals)
- mandated risk assessments before these provisions are enacted, to ensure that the information isn’t used to harm the public
- annual public reporting on how these provisions have been enacted and their results
Recommendation: Ensure proper regulatory oversight and guidance for how the information from the data sharing and advance notification provisions under the Minimum Standards should be used.
1.5 Market Investigations
Impose specific powers, under a standardised framework, for the independent regulator to conduct investigations into the market implications of new products and/or services. By instituting these investigative powers in the natural policy lifecycle of this Code rather than relying on sporadic investigations led by the ACCC, we can ensure proper resourcing and agile delivery, as well as engendering an iterative and future-forward approach to the implementation and evolution of this Code. These powers should allow the regulator to investigate in depth anti-competitive practices and their implications for the wider sector as the use cases for digital platforms change, and as new products and services enter the market.
This should be modelled after corresponding sections of the DMA. Chapter IV details the specific circumstances in which these investigations can be instigated; of particular relevance is Article 17, concerning new products and/or services.
Recommendation: Impose specific investigative powers on the independent regulator to conduct inquiries on market implications of new products and/or services
1.6 Monitoring and Evaluation
As mentioned in our previous submission, annual impact assessment of this Code must be resourced and undertaken to ensure that it is actively working towards increasing media diversity in Australia.
Understanding how this Code has impacted the media landscape is vital to ensuring that this legislation is appropriately iterated to adapt to the rapidly evolving information landscape.
Questions may include:
- Has this Code contributed to an increase in the number of journalists, news media companies and news innovation?
- Has this Code contributed to an increase in the quality and objectivity of reporting?
- Has this Code contributed to an increase in diversity within the Australian news media landscape?
- Has this Code inadvertently concentrated bargaining power amongst a few news media outlets?
- How has this Code affected regional, minority and independent news media companies and journalists?
Recommendation: Conduct an annual comprehensive assessment of how this Code has impacted the information and news media landscape
2.0 Future Directions
Attempts to rectify market and information imbalances between digital platforms and news media companies must not end with this Code. The impact of the digital platforms on society is expansive and emerging, and a broad outlook on reform is needed to ensure that legislation addresses these issues systematically and fairly.
1. News, Media and Journalism: Looking solely within the news and information sector, anti-competitive practices that ‘steal ad revenue’ are not the sole reason for the decline of news media organisations. From adequately resourcing public interest journalism to potentially exploring a digital ‘sin’ tax to do so, there are a suite of policy options that have not been explored to ensure a pluralistic media landscape. The Government must constantly reaffirm its goal to achieve a diverse media landscape, and work to iterate, evolve and build upon this Code to achieve this.
2. Holistic Regulation: We must move beyond just the news sector, as the digital platforms’ influence is not limited to news publishers. From commerce to small business, innovation to defence, the policy approach to regulating the digital platforms must reflect the diverse and interdependent impacts they have on our society. To achieve the core intent of this proposed Code, we must open a discussion on what a holistic framework underpinned by integral user rights would look like.
The EU’s proposals for the DSA, European Democracy Action Plan and DMA provide an example of how disparate pieces of legislation (such as this Code) might be tied together under a cohesive framework. As the Government moves forward with scheduled regulatory action (namely the Privacy Act review, Online Safety Act and Code on Disinformation) and future reform, we look forward to contributing to the work to organise this action under an appropriate framework.
Please review our previous submission, where we made additional recommendations on:
- Adequately resourcing public interest journalism
- Exploring new frameworks to holistically classify the digital platforms in order to appropriately address their impacts and ensure that they are systematically addressed
We thank the Government for providing the opportunity to share our thinking on an exciting and leading piece of legislation that will curb the over-sized influence of Big Tech in Australia. News media is a fundamental pillar of our democracy, and we look forward to working with you, the platforms and civil society to ensure that the media remains open and accessible.