Senator Jim Molan's further additional comments

I would like to begin by acknowledging the work of the committee in undertaking this important inquiry, and thank the witnesses who participated in giving evidence.
I concur with much of the report, but wish to articulate some concerns I have with regard to some of the evidence provided to the committee and proposed recommendations.
The release of this interim report occurs within a political context in which there is substantial appetite to regulate Big Tech and rein in its influence over the Australian and global information ecosystems. The Australian Competition and Consumer Commission's landmark digital platforms inquiry (section 2.30), the ongoing review of the Privacy Act 1988 (section 2.31), the new Enhancing Online Privacy Bill 2021 (section 2.33), the Australian Code of Practice on Disinformation and Misinformation (section 2.35), and the related inquiry into social media currently underway (by the House Select Committee on Social Media and Online Safety) evidence the growing momentum in this regard.
The mechanisms through which foreign interference spreads on social media (outlined in Chapter 3), the breadth of the resultant societal harms (outlined in Chapter 4), and the current inadequacy of governance arrangements (outlined in Chapter 5) mean that thoughtful consideration of these risks is required as the Federal election approaches.
Reviewing the evidence in its entirety makes it patently apparent that the business model of social media companies, and the opaque design of their algorithms and platform features, need to be incorporated into our development of governance arrangements. Fundamentally, it is the business model, based upon the capitalisation of user attention and the amplification of fringe and extremist voices that drive division, that makes social media vulnerable to exploitation by malign foreign forces. As Ms Kara Hinesley from Twitter stated, we must approach the issue as a broad geopolitical challenge, not one of content moderation.1
And yet, much of the evidence provided to the committee by social media companies revealed their reliance on automated content moderation systems and human investigators to address the threat of foreign interference on their platforms. Their testimonies largely focused on declaring the efficacy of these content moderation systems (sections 3.26 to 3.31). Despite impressive claims about the quantity of content flagged and removed, the lack of transparency and data sharing requirements (sections 4.70 to 4.79) under existing company policies and the Australian Code of Practice on Disinformation and Misinformation does not allow these statistics to be verified. We not only remain in the dark about social media companies' management of the 'downstream' aspects of suspected foreign interference (including disinformation, coordinated inauthentic behaviour, and other offensive and harmful content), but also note their silence on 'upstream' interventions (those addressing how their business model and platform features function).
We are aware that platforms are able to put in place more structural and substantive election safeguards; we have observed this in other jurisdictions. For example, ahead of the last US election, Facebook temporarily adapted its algorithm to prioritise authoritative content and dampen the spread of hyperpartisan content. Google also launched a comprehensive political ad library (which also exists for the United Kingdom, the European Union, India, Israel and New Zealand).
Beyond streamlining regulatory oversight of foreign interference through social media (as per recommendations 1, 3 and 7), increasing transparency over governance arrangements (recommendations 4 and 5), and reducing the fragmentation of reporting and communication between platforms and government departments (recommendation 6), what other mechanisms are required to ensure greater platform transparency and accountability during this critical moment for our democracy? How can we ensure that Australia is provided with the full suite of election safeguards that are known to have worked in other contexts?
The sections of the report on COVID-19 misinformation and disinformation (sections 4.80 to 4.88), and those focused on groups who are common targets of foreign interference (sections 1.11, 4.51 and 4.52), reveal the need to focus on actors and narratives being imported into Australia from the United Kingdom, Europe and the United States in ways that stoke the rise of right-wing extremism. The protection of groups that are common targets of foreign interference (recommendation 2) will not be possible without further surveillance and policing of coordinated and malicious actors based in these jurisdictions who seek to exploit this time of high anxiety and uncertainty and to instigate further civil unrest.
I expect the final report of our committee will provide an opportunity for a more detailed discussion of these challenging issues.
Senator Jim Molan AO DSC
Deputy Chair
Liberal Senator for New South Wales

1 Ms Kara Hinesley, Director of Public Policy, Australia and New Zealand, Twitter, Committee Hansard, 30 July 2021, p. 47.
