Chapter 5

Governance

5.1
Various departments across the Australian Government have responsibility for issues associated with foreign interference through social media. This chapter examines the existing legislative framework, including the new voluntary code of practice that applies to some social media platforms. It also assesses the effectiveness of the current departmental arrangements and cooperation activities occurring between social media platforms and government departments, particularly in the context of an upcoming Federal Election.

Relevant legislation

5.2
While several pieces of legislation apply to social media and the online environment more generally, such as those that seek to protect user privacy and prevent criminal activity online, none specifically addresses the problem of foreign interference through social media.
5.3
The Department of Home Affairs provided examples of legislation that regulates social media platforms, including:
Privacy Act 1988 (Privacy Act), which applies to organisations operating in Australia with an annual turnover of more than $3 million; this includes organisations such as Facebook, Instagram, Twitter, Snapchat and LinkedIn. Personal information shared on such platforms is protected by the data protection obligations under the Privacy Act.1
Enhancing Online Safety Act 2015, which established a two-tiered scheme for the removal of harassing or abusive material from participating social media services, allowing tier 1 services to participate on a cooperative basis and requiring tier 2 services to comply on a compulsory basis.2
Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Act 2018, which empowers the eSafety Commissioner to issue removal notices that require the providers of social media services, relevant electronic services, designated internet services and hosting services to take all reasonable steps to support the removal of intimate images, or to cease hosting the image.3
Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019, which requires content, internet and hosting providers, including social media platforms, to—within a reasonable time—report to the Australian Federal Police abhorrent violent conduct and remove abhorrent violent material. The Act also provides the eSafety Commissioner with the power to notify a service provider that abhorrent violent material is available on their service.4
5.4
Additionally, the Australian Parliament recently passed the Online Safety Act 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Act 2021. These acts retain and replicate certain provisions of the Enhancing Online Safety Act 2015, including:
maintaining the non-consensual sharing of intimate images scheme;
specifying basic online safety expectations;
establishing an online content scheme for the removal of certain material;
creating a complaints-based removal notice scheme for cyber-abuse being perpetrated against an Australian adult;
broadening the cyber-bullying scheme to capture harms occurring on services other than social media;
reducing the timeframe for service providers to respond to a removal notice from the eSafety Commissioner;
bringing providers of app distribution services and internet search engine services into the remit of the new online content scheme; and
establishing a power for the eSafety Commissioner to request or require internet service providers to disable access to material depicting, promoting, inciting or instructing in abhorrent violent conduct for time-limited periods in crisis situations.5

Australian Code of Practice on Disinformation and Misinformation

5.5
Many social media platforms have adopted a voluntary code of practice produced by the Australian Communications and Media Authority (ACMA) and the industry-led Digital Industry Group (DIGI), a not-for-profit organisation whose members include major social media platforms.
5.6
As noted in Chapter 1, following the Australian Competition and Consumer Commission's Digital Platforms Inquiry, ACMA was tasked with reporting to government on the adequacy of the platforms' measures and the broader impacts of disinformation by June 2021.6 This process included ACMA consulting with digital platforms, government and other relevant stakeholders to develop principles and minimum expectations for a voluntary code of practice.7
5.7
A final code of practice for social media platforms, entitled the Australian Code of Practice on Disinformation and Misinformation, was published in February 2021. The code is an opt-in framework for platforms,8 and Twitter, Google, Facebook, Microsoft, Redbubble, TikTok, Adobe and Apple have opted in.9 The code outlines several guiding principles that platforms ought to abide by, namely:
protection of freedom of expression;
protection of user privacy;
policies and processes concerning advertising placements;
empowering users;
integrity and security of services and products; and
supporting independent researchers.10
5.8
Following the adoption of this code, ACMA was required to report to government on a number of matters. These included initial compliance with the code by signatories, the state of disinformation and misinformation on the platforms, and the code’s effectiveness in responding to the problems identified by the Australian Competition and Consumer Commission's Digital Platforms Inquiry.11
5.9
ACMA provided this report to government in June 2021.12 However, this report is not publicly available13 and is currently being considered by the relevant minister.14 Although ACMA has already reported on the code's functioning over a five-month period,15 on 30 July 2021 Ms Sullivan stated that the code was in its 'early stages' and that 'we need to see how the code works over the forthcoming months'.16

Government departments

5.10
Several bodies, strategies or schemes are responsible for addressing issues relating to foreign interference through social media, as well as related misinformation and disinformation. The roles of several departments and strategies in addressing particular elements of foreign interference are described below, as is the role of the cross-departmental Electoral Integrity Assurance Taskforce.

Australian Electoral Commission

5.11
The Australian Electoral Commission (AEC) is responsible for ensuring that Australia has an impartial and independent electoral system. The AEC noted that its practices, and the Australian legislative framework for elections and electoral integrity are 'frequently cited as exemplars of global best practice'.17 However, it also noted that 'maintaining the highest levels of electoral integrity is a continually evolving challenge'.18 The AEC further noted that '[t]he broad notion of "integrity" now encompasses cyber security and disinformation, in addition to longer term issues such as physical security and sound operating procedures'.19
5.12
The AEC does not monitor electoral communications to ensure that they are accurate, as the Commonwealth Electoral Act 1918 (the Electoral Act) does not require truth in electoral communication.20 The AEC noted in its submission that the Electoral Act does, however, require electoral matter to be authorised so that voters are provided with the source of a communication; this requirement extends to electoral matter published on social media.21
5.13
In its submission, the AEC outlined measures it took leading into and throughout the 2019 Federal Election campaign, including:
formalising the Electoral Integrity Assurance Taskforce (EIAT) to address risks to the integrity of the electoral system;
the 'Stop and Consider' campaign, which encouraged voters to check the source of material they consumed to avoid being misled by potential disinformation;
engagement with social media organisations and digital platform providers to ensure that the content on these platforms complies with the relevant provisions of the Electoral Act;
countering electoral disinformation: the AEC monitored social media information online during the election and, where they could, corrected the record as it related to their organisation; the AEC also utilised explanatory infographics and animations to provide key information regarding the election process in an easily consumable format suited to digital media; and
investigating electoral communications complaints.22
5.14
While the AEC does not have a legislative role regarding the truth of electoral communications, it can and does take action on disinformation in relation to the process of administering the election.23 The AEC noted that, due to the high volume of communications during elections, the AEC 'does not proactively seek out communications that may not comply with the requirements in the Electoral Act, but rather acts on complaints and information provided to us'.24
5.15
During the 2019 Federal Election the AEC investigated 528 complaints relating to electoral communications; of these, 109 were based on social media content and 28 breaches of the Electoral Act were identified.25 The AEC stated that 'there were only eleven items of social media communication that resulted in requests by the AEC to the relevant social media company to remove the illegal communication (all of our requests were promptly responded to)'.26 The AEC stated that it is continuing to build on its relationship with social media organisations.27

Counter Foreign Interference Diplomatic Strategy and Counter Foreign Interference Coordinator

5.16
The Counter Foreign Interference Diplomatic Strategy is a pilot program that 'focuses on cooperation with regional partners to enhance their resilience, as well as efforts to build support for stronger international norms against foreign interference'.28 The strategy is led by the National Counter Foreign Interference Coordinator, which sits within the Department of Home Affairs.29 The broad goals of diplomatic action under the Counter Foreign Interference Diplomatic Strategy are to counter foreign interference activity by:
delivering clear messaging to ensure foreign actors understand what kinds of actions Australia finds unacceptable and that foreign interference is viewed as a core national security concern;
showing foreign interference actors that their actions can and will be revealed and will generate a meaningful response;
convincing foreign interference actors that their actions will have costs – and that these costs outweigh the benefits – including through international reputational damage and by underscoring both the strength of Australia’s systems and the sophistication of our detection and enforcement capabilities;
demonstrating that the opportunities for foreign interference are narrowing in Australia and the region, including by increasing regional awareness, reducing vulnerabilities and strengthening institutions; and
mobilising international collaboration to counter foreign interference and establish globally accepted norms of behaviour.30
5.17
Mr Neil Hawkins, Acting Deputy Coordinator and Acting First Assistant Secretary, National Counter Foreign Interference Coordination Centre, Department of Home Affairs, noted that the National Counter Foreign Interference Coordinator's office is 'responsible for coordinating a whole-of-government response to foreign interference'.31
5.18
Mr Lachlan Colquhoun, First Assistant Secretary, National Security Division, Department of the Prime Minister and Cabinet, described the Department of Home Affairs as the lead department and the Counter Foreign Interference Coordinator as the lead area for responding to foreign interference through social media:
The lead agency is the Department of Home Affairs—and all agencies of government are aware of that—through the office of the Counter Foreign Interference Coordinator. All responses to any matter relating to foreign interference, whether it's via social media, via activities within Australia or via activities from offshore, are the responsibility of the Commonwealth Counter Foreign Interference Coordinator.32
5.19
Additionally, the National Counter Foreign Interference Coordinator works with the Counter Foreign Interference Taskforce, which sits within the Australian Security Intelligence Organisation.33 The Counter Foreign Interference Taskforce seeks to 'investigate, disrupt and counter' foreign interference.34

International Cyber Engagement Strategy

5.20
The Department of Foreign Affairs and Trade leads the implementation of Australia’s International Cyber Engagement Strategy (ICES). Released in October 2017, the ICES seeks to 'maintain an open, free and secure cyberspace that drives economic growth, protects national security and fosters international stability'.35
5.21
The ICES has seven key themes, which are to:
maximise opportunities for economic growth and prosperity through digital trade;
foster good cyber security practices;
reduce the risk of cybercrime;
promote peace and stability in cyberspace;
advocate for multi-stakeholder Internet governance;
promote respect for human rights and democratic principles online; and
encourage the use of digital technologies to achieve sustainable development.36

Foreign Influence Transparency Scheme

5.22
The Foreign Influence Transparency Scheme Act 2018 created the Foreign Influence Transparency Scheme, which came into effect on 10 December 2018.37 The Attorney-General's Department has ownership of the scheme. Under the scheme, a person is required to register publicly if they undertake a 'registrable activity' in Australia for the purpose of political or governmental influence on behalf of a foreign principal. A registrable activity can be:
general political lobbying;
parliamentary lobbying;
communications activity; or
disbursement activity.38
5.23
Foreign principals can be 'foreign governments, foreign political organisations, foreign government related entities, and foreign government related individuals'.39
5.24
Significantly, this scheme is designed to address foreign influence—an entirely legal activity, which is regularly undertaken by many governments—as opposed to foreign interference.40 The Attorney-General's Department noted that such influence activities 'when conducted in an open and transparent manner, are a normal aspect of international relations and diplomacy and can contribute positively to public debate'.41
5.25
During election periods, the Foreign Influence Transparency Scheme imposes additional obligations. From the day the writs are issued until the last polling stations close on voting day, registrable activities must be lodged with the Attorney-General's Department within seven days rather than the usual 14. The Department must then publish those activities within 48 hours, rather than within the normal prescribed time of four weeks.42
5.26
During the 2019 Federal Election, the Attorney-General's Department received referrals from the Electoral Integrity Assurance Taskforce to 'consider whether any registrable activities were being undertaken, and whether the posts on social media needed to be registered and contain the appropriate disclosures'.43 The Attorney-General's Department described the difficulties in assessing the referral of such posts:
In making a determination about whether registration obligations would apply, there were a number of factors the department needed to take into consideration – some of which were difficult to establish with a strong degree of certainty. In particular, the number of social media posts and different platforms used in the federal election to share information and opinions on candidates was significant and it was often not clear whether the posts were on behalf of a foreign actor.44
5.27
The Attorney-General's Department further stated that, despite these difficulties, where material was identified that may not have complied with the Foreign Influence Transparency Scheme Act 2018, the department 'engaged with government counterparts and, where appropriate, social media companies to work cooperatively to assess whether the obligations under [the] scheme applied to the material'.45

Electoral Integrity Assurance Taskforce

5.28
The Electoral Integrity Assurance Taskforce (EIAT) is a cross-departmental taskforce. Prior to the 2018 by-elections and 2019 Federal Election, the EIAT was formed to 'address risks to the integrity of the electoral system'.46 The AEC stated that the taskforce 'comprised a range of Commonwealth agencies who were co-located during the federal election and provided timely guidance and expertise to the AEC on a broad range of integrity issues, including cyber security and disinformation'.47 Member agencies of the EIAT include:
Australian Electoral Commission;
Department of Finance;
Department of Prime Minister and Cabinet;
Department of Infrastructure, Transport, Regional Development and Communications;
Attorney-General’s Department;
Department of Home Affairs; and
Australian Federal Police.
5.29
The EIAT is supported by intelligence agencies where required,48 including the Office of National Intelligence, the Australian Signals Directorate and the Australian Security Intelligence Organisation.49 It is overseen by an oversight board, which comprises the same member agencies as the taskforce itself.50
5.30
The EIAT is also supported by the Electoral Integrity Intelligence Forum, which Mr Patrick Hallinan, Acting First Assistant Secretary, Counter Foreign Interference Coordination Centre, Department of Home Affairs, described as a body chaired by the Counter Foreign Interference Coordination Centre, comprised of representatives from the national security community, that 'provide[s] coordinated support and advice to the EIAT board and, through the board, to the commissioner on any national security concerns from an intelligence perspective'.51

Mandate and terms of reference

5.31
Although the EIAT was established in 2018, it has not received a formal mandate from government. Mr Lachlan Colquhoun, Department of the Prime Minister and Cabinet, described to the committee how the establishment of the EIAT was 'almost organic' and that it was not established with a set mandate.52 In response to a request for a document outlining the EIAT's role, Mr Colquhoun stated that one did not exist 'at this point' and that 'there is some work underway within government to consider how to more concretely codify the role of the task force'.53 Mr Colquhoun further stated:
The Australian Electoral Commission has started preparing a paper, basically formalising the task force and making sure that there's a common understanding of its role and remit, but that paper doesn't have any status at this point, and I wouldn't want to go too much further given it has not been put to government formally yet.54
5.32
While no mandate for the EIAT's activities exists, the EIAT operates under terms of reference, which were endorsed by the oversight board in April 2021.55 However, this document is regarded as classified material. Mr Jeff Pope, Deputy Electoral Commissioner, AEC, stated:
We do have terms of reference, but it is a classified document and, therefore, we've been limited in our ability to be able to share that more broadly.56
5.33
When asked if the terms of reference would be made public, Mr Tom Rogers, AEC, stated:
We will be releasing information about the activities of the task force, yes.57
5.34
Mr Tom Rogers, AEC, confirmed that the EIAT's primary role is as an information-sharing forum that also exists to provide advice to the Electoral Commissioner.58 Mr Jeff Pope, AEC, expanded on the role of the EIAT:
Essentially, the role of the task force is for all of the agencies, within their legislative roles and functions, to collaborate and assess information and referrals that might be referred into the task force or that they may detect in their own right; to determine whether there are any matters that may impact on the potential integrity of election processes and election results; and to provide advice to the Electoral Commissioner with respect to those matters.59
5.35
Mr Patrick Hallinan, Department of Home Affairs, described the broadness of the EIAT's remit regarding electoral integrity:
The EIAT focuses on a number of things. It focuses on, obviously, electoral integrity related matters more generally, but, to step that out for you, foreign interference is one of the elements of electoral integrity that the EIAT is concerned with, but it's also concerned with things to do with physical security, whether that's terrorism or process related activity. It's also concerned with things to do with cybersecurity. I think it's attempting to bring a broader consideration of the full range of activities which may impede the conduct of an election or otherwise affect the conduct of an election and provide that support primarily to the Electoral Commissioner.60
5.36
Mr Tom Rogers, AEC, noted that the EIAT is not 'empowered to make decisions' regarding public communications.61 Mr Rogers further stated that '[t]he task force would be a focus of discussion where agency heads that are represented by those on the task force would then make those decisions'.62 Mr Rogers confirmed that the EIAT does not have any further decision-making capacity; rather, it is primarily an information-sharing forum that also provides advice to the Electoral Commissioner.63
5.37
When asked about the EIAT's ability to brief members of cabinet, Mr Nathan Williamson, Deputy Secretary, Governance and Resource Management, Department of Finance, confirmed that the EIAT itself did not have the ability to brief members of cabinet; rather, individual departments would do so in line with their usual responsibilities.64
5.38
Additionally, should the EIAT become aware of an instance of foreign interference in an Australian election, the information would be referred to the Electoral Commissioner. Mr Patrick Hallinan, Department of Home Affairs, described how this process would occur:
I would expect that, were there to be an instance of a deliberate targeted campaign which constituted foreign interference in an electoral context, that matter would be raised with the Electoral Integrity Assurance Taskforce construct in the first instance, and that that advice would be provided to the Electoral Commissioner, and the Electoral Commissioner would be free to do whatever they so determined to do in respect of that. More generally, agencies who comprise the Electoral Integrity Assurance Taskforce construct would no doubt be providing advice within their chains as appropriate, whether it was to ministers in accordance with the caretaker conventions or at their own initiative under their statutory responsibilities.65

Preparations for the upcoming Federal Election

5.39
Officials provided evidence to the committee asserting that the role of the EIAT will increase in importance as the next Federal Election draws nearer. In preparation for the next Federal Election, Mr Peter Rush, Assistant Secretary, Parliamentary and Government Branch, Department of the Prime Minister and Cabinet, stated that the EIAT has been 'increasing its tempo' since March 2021:
Since about March this year the task force has been increasing its tempo to be prepared for the next federal election. The board has been meeting a little bit more regularly and the task force has also been getting together on a more regular basis. They've started consulting online media platforms to discuss the processes that will be in place during the electoral process.66
5.40
Further to this, Mr Lachlan Colquhoun, Department of the Prime Minister and Cabinet, noted that the role of the EIAT is limited between election cycles:
… the task force is a bit of a virtual task force. All the people who participate have day jobs. In between election cycles their involvement will be very small, and it will ramp up to being almost full-time in the lead-up to an election.67
5.41
Regarding the upcoming Federal Election, Mr Tom Rogers, AEC, stated that the EIAT would be co-located in the AEC's Command Centre, which 'is currently being constructed following the budget initiative in October last year'.68 Mr Rogers also noted that the EIAT has started meeting with social media platforms in preparation for the Federal Election.69

Cooperation between social media platforms and government

5.42
Social media platforms reported to the committee that they had interactions with various government departments and bodies. Given that social media platforms are currently the arbiters of what content remains on their platforms, government engagement with cooperative social media platforms is critical. This section describes the types of interactions that various social media platforms have with Australian government departments.
5.43
Facebook submitted that it had been working with Australian electoral authorities and that, prior to the 2019 Federal Election, it had 'established a productive working relationship with members of the Government's election integrity taskforce'70 and noted that it 'worked closely to quickly respond to all issues raised with us by Australian Government agencies'.71
5.44
Facebook added that it also works with state and territory electoral commissions to 'establish similar referral arrangements before elections in their states and territories'.72 At the committee's public hearing, Facebook also reported that it had been receiving referrals from multiple parts of the Australian government regarding inappropriate content, including the Department of Home Affairs, Department of Health, and DFAT.73
5.45
TikTok noted its work with the Australian Communications and Media Authority.74 In giving evidence to the committee, the Department of Home Affairs also noted that TikTok had provided assistance in 'a particular exercise that [the Department of Home Affairs] ran on the vector of online harmful content in terms of a terrorist incident'.75
5.46
WeChat described how it has been engaged in ongoing, cooperative relationships with Australian government agencies, including the Department of Home Affairs, the Australian Electoral Commission, and the Attorney-General's Department.76
5.47
Google stated that it works with Australia's law enforcement and intelligence community, including the National Counter Foreign Interference Coordinator.77 Mrs Lucinda Longcroft noted the wide variety of agencies that Google was working with, which included the Australian Federal Police, eSafety Commissioner, the Australian Competition and Consumer Commission, Australian Securities and Investments Commission, the Australian Taxation Office, and ACMA.78
5.48
Twitter has previously worked with the AEC, DFAT and Department of Home Affairs, as well as the EIAT. Ms Kara Hinesley, Director of Public Policy, Australia and New Zealand, Twitter, noted that Twitter worked with the EIAT during the 2019 Federal Election and has already begun 'facilitating conversations and meetings' ahead of the upcoming Federal Election.79
5.49
Government departments do occasionally contact Google to request the removal of content. However, Google noted that the vast majority of removed content is self-identified, with only 86 of the 9.6 million videos removed by Google having been flagged by the Australian Government.80
5.50
Government departments likewise described their interactions with social media platforms. The AEC noted that it has established and maintained ongoing relationships with prominent social media and digital platform providers in order to ensure the content on these platforms complies with the relevant provisions of the Electoral Act.81 The AEC described how its level of engagement with the social media platforms increased in the lead-up to the last Federal Election:
The level of engagement with these organisations was both vastly increased and improved for the 2019 federal election when compared to previous electoral events. We engaged, in person, with Facebook, Twitter, Google and WeChat in relation to the 2019 federal election in order to better understand their platforms, any relevant initiatives (e.g. political advertising transparency libraries), their policies and establish procedures to address electoral communications that breached electoral laws (e.g. was not properly authorised).82
5.51
Mr Tom Rogers, AEC, described how the EIAT is engaging with social media platforms prior to the next Federal Election:
One of the most relevant planning activities has included proactive meetings with prominent social media companies. These meetings provide an opportunity to re-establish key contact points and procedures to ensure we can respond quickly to address any issues that may emerge at election time.83
5.52
The relationship between social media platforms and governments is not always an easy one. Dr Richard Johnson, First Assistant Secretary, Social Cohesion, Department of Home Affairs, reported how the department had run into difficulties when attempting to report extremist content to social media platforms:
Some of the platforms … do not have a referral mechanism at all. Some of the offshore platforms which have built an ethos around freedom of expression et cetera, will not have a referral mechanism, so we can't refer it to them.84
5.53
There have also been historic difficulties in the AEC's attempts to engage with social media platforms. The Law Council of Australia highlighted reports that challenges have arisen for the AEC when attempting to counter unauthorised online advertising originating from overseas.85 As an example, the Law Council of Australia outlined Facebook's previous noncompliance with Australia's domestic advertising laws: it was reported in 2019 that Facebook had not adequately applied the rules set out by the Electoral Act to paid political advertising on its platform, and did not respond to AEC inquiries about the source of advertising in a timely manner.86
5.54
Additionally, there is confusion from the social media platforms' perspective regarding reporting requirements (or the lack thereof) to the Australian government. When questioned about reporting arrangements, Mr Lee Hunter, General Manager, TikTok Australia and New Zealand, noted that, should TikTok find evidence of foreign interference on its platform, it was not aware of any requirement to report it, although he noted that TikTok would voluntarily provide this information.87 Further, it was not clear to which department such a report ought to be made.88
5.55
Mr Neil Hawkins, Department of Home Affairs, when asked to which department social media platforms would report coordinated inauthentic behaviour (CIB) from a foreign state actor, stated:
I'm not aware. I don't think they would talk to us [Department of Home Affairs]. They may talk to the Australian Cyber Security Centre, but I couldn't answer that point.89
5.56
When asked if platforms ought to report such content to DFAT, Mr Robert Hawkins, Assistant Secretary, DFAT, said that the department would 'welcome that' but noted that '[w]e have engagements with them, but it's not necessarily on, for example, an alert reporting basis'.90
5.57
When asked about TikTok's statement regarding a lack of a clear reporting mechanism, Mr Hamish Hansford, First Assistant Secretary, Cyber, Digital and Technology Policy Division, Department of Home Affairs, stated that:
…it depends on the particular threat. If it's an image based abuse complaint or a particular issue on their platform, Cyber Report, through the eSafety Commissioner, is the reporting mechanism. If it's a cybersecurity incident, it's ReportCyber, through the partnership of the Australian Cyber Security Centre. There are obviously reporting mechanisms available through business liaison with the Australian Security Intelligence Organisation. So it really depends on the particular threat.91
5.58
In response to a follow-up question, which asked if any department was 'providing cogent guidance to the platforms about what their obligations are and what the appropriate communication channels are', Mr Hansford stated:
I think the answer is that there are a range of different places within government. There is no single place where social media companies can go to get comprehensive, whole-of-nation advice about each of the different vectors.92
5.59
Aside from this lack of a singular reporting mechanism, some social media platforms have specifically requested further cooperation with government in other areas. Facebook submitted that it 'believe[s] there are greater steps the Australian Government could take to engage in information-sharing with digital platforms and industry more broadly about foreign interference or influence operations'.93 Facebook added that it also works with state and territory electoral commissions to 'establish similar referral arrangements before elections in their states and territories'.94
5.60
Ms Kara Hinesley, Twitter, likewise raised the importance of further cooperation between governments and social media platforms:
What is important is to approach the issue as a broad geopolitical challenge, not one of content moderation. Removal of content alone will not address this challenge. The threat we face requires extensive partnership and collaboration with government entities, civil society, experts and industry peers. We each possess information that others do not have, and our combined efforts are more powerful together in combating these threats.95
Senator Jenny McAllister
Chair

  • 1
    Department of Home Affairs, Submission 16, p. 9.
  • 2
    Department of Home Affairs, Submission 16, p. 9.
  • 3
    Department of Home Affairs, Submission 16, p. 9.
  • 4
    Department of Home Affairs, Submission 16, p. 9.
  • 5
    The Hon. Paul Fletcher MP, Minister for Communications, Urban Infrastructure, Cities and the Arts, House of Representatives Hansard, 24 February 2021, p. 1785.
  • 6
    Australian Communications and Media Authority (ACMA), Submission 15, p. 2.
  • 7
    ACMA, Submission 15, p. 2.
  • 8
    Digital Industry Group (DIGI), Australian Code of Practice on Disinformation and Misinformation, 22 February 2021, p. 3.
  • 9
    DIGI, 'Australian Code of Practice on Disinformation and Misinformation', https://digi.org.au/disinformation-code/ (accessed 9 August 2021).
  • 10
    DIGI, Australian Code of Practice on Disinformation and Misinformation, 22 February 2021, pp. 4-5.
  • 11
    ACMA, 'Digital Platforms commit to action on Disinformation', https://www.acma.gov.au/articles/2021-02/digital-platforms-commit-action-disinformation (accessed 9 August 2021).
  • 12
    ACMA, 'Online misinformation and news quality in Australia: Position paper to guide code development', https://www.acma.gov.au/australian-voluntary-codes-practice-online-disinformation (accessed 9 August 2021).
  • 13
    As at the time of the committee's public hearing on 30 July 2021: see Mr Mike Makin, Assistant Secretary, News and Media Industry Branch, Department of Infrastructure, Transport, Regional Development and Communications, Committee Hansard, 30 July 2021, p. 38.
  • 14
    Ms Pauline Sullivan, First Assistant Secretary, Online Safety, Media and Platforms Division, Department of Infrastructure, Transport, Regional Development and Communications (DITRDC), Committee Hansard, 30 July 2021, p. 38.
  • 15
    From its publication in February 2021 to ACMA's reporting date in June 2021.
  • 16
    Ms Pauline Sullivan, First Assistant Secretary, DITRDC, Committee Hansard, 30 July 2021, p. 38.
  • 17
    Australian Electoral Commission (AEC), Submission 14, p. 1.
  • 18
    AEC, Submission 14, p. 1.
  • 19
    AEC, Submission 14, p. 1.
  • 20
    AEC, Submission 14, p. 1.
  • 21
    AEC, Submission 14, p. 1.
  • 22
    AEC, Submission 14, pp. 1-3.
  • 23
    AEC, Submission 14, p. 3.
  • 24
    AEC, Submission 14, p. 3.
  • 25
    AEC, Submission 14, p. 3.
  • 26
    AEC, Submission 14, p. 3.
  • 27
    AEC, Submission 14, p. 4.
  • 28
    Department of Foreign Affairs and Trade (DFAT), Submission 10, p. 3.
  • 29
    DFAT, Submission 10, p. 3.
  • 30
DFAT, Submission 10, p. 3.
  • 31
    Mr Neil Hawkins, Acting Deputy Coordinator and Acting First Assistant Secretary, National Counter Foreign Interference Coordination Centre, Department of Home Affairs, Committee Hansard, 11 December 2020, p. 2.
  • 32
    Mr Lachlan Colquhoun, First Assistant Secretary, National Security Division, Department of the Prime Minister and Cabinet, Committee Hansard, 30 July 2021, p. 22.
  • 33
    Mr Neil Hawkins, Department of Home Affairs, Committee Hansard, 11 December 2020, p. 7.
  • 34
    Mr Neil Hawkins, Department of Home Affairs, Committee Hansard, 11 December 2020, p. 7.
  • 35
    DFAT, Submission 10, p. 4.
  • 36
    DFAT, Submission 10, p. 5.
  • 37
    Attorney-General's Department, Submission 13, p. 4.
  • 38
    Attorney-General's Department, Submission 13, p. 8.
  • 39
    Attorney-General's Department, Submission 13, p. 8.
  • 40
    Attorney-General's Department, Submission 13, p. 8.
  • 41
    Attorney-General's Department, Submission 13, p. 8.
  • 42
Attorney-General's Department, Submission 13, p. 9.
  • 43
    Attorney-General's Department, Submission 13, p. 9.
  • 44
    Attorney-General's Department, Submission 13, p. 9.
  • 45
    Attorney-General's Department, Submission 13, pp. 9-10.
  • 46
    AEC, Submission 14, p. 2; and Department of Home Affairs, Submission 16, p. 6.
  • 47
    AEC, Submission 14, p. 2.
  • 48
    AEC, 'Electoral Integrity Assurance Taskforce', https://www.aec.gov.au/elections/electoral-advertising/electoral-integrity.htm (accessed 14 August 2021).
  • 49
    Department of Home Affairs, Submission 16, p. 6.
  • 50
    Mr Lachlan Colquhoun, Department of the Prime Minister and Cabinet, Committee Hansard, 30 July 2021, p. 19.
  • 51
    Mr Patrick Hallinan, Acting First Assistant Secretary, Counter Foreign Interference Coordination Centre, Department of Home Affairs, Committee Hansard, 30 July 2021, p. 41.
  • 52
    Mr Lachlan Colquhoun, Department of the Prime Minister and Cabinet, Committee Hansard, 30 July 2021, p. 20.
  • 53
    Mr Lachlan Colquhoun, Department of the Prime Minister and Cabinet, Committee Hansard, 30 July 2021, p. 20.
  • 54
    Mr Lachlan Colquhoun, Department of the Prime Minister and Cabinet, Committee Hansard, 30 July 2021, p. 20.
  • 55
    Attorney-General's Department, Questions on Notice, Answers to questions on notice (30 July 2021 public hearing; received 13 August 2021), p. 1.
  • 56
    Mr Jeff Pope, Deputy Electoral Commissioner, AEC, Committee Hansard, 30 July 2021, p. 25.
  • 57
    Mr Tom Rogers, AEC, Committee Hansard, 30 July 2021, p. 25.
  • 58
    Mr Tom Rogers, AEC, Committee Hansard, 30 July 2021, p. 28.
  • 59
    Mr Jeff Pope, AEC, Committee Hansard, 30 July 2021, p. 25.
  • 60
    Mr Patrick Hallinan, Department of Home Affairs, Committee Hansard, 30 July 2021, p. 42.
  • 61
    Mr Tom Rogers, AEC, Committee Hansard, 30 July 2021, p. 25.
  • 62
    Mr Tom Rogers, AEC, Committee Hansard, 30 July 2021, p. 25.
  • 63
    Mr Tom Rogers, AEC, Committee Hansard, 30 July 2021, p. 28.
  • 64
    Mr Nathan Williamson, Deputy Secretary, Governance and Resource Management, Department of Finance, Committee Hansard, 30 July 2021, p. 28.
  • 65
Mr Patrick Hallinan, Department of Home Affairs, Committee Hansard, 30 July 2021, p. 42.
  • 66
    Mr Peter Rush, Department of the Prime Minister and Cabinet, Committee Hansard, 30 July 2021, p. 20.
  • 67
    Mr Lachlan Colquhoun, Department of the Prime Minister and Cabinet, Committee Hansard, 30 July 2021, p. 19.
  • 68
    Mr Tom Rogers, AEC, Committee Hansard, 30 July 2021, p. 24.
  • 69
    Mr Tom Rogers, AEC, Committee Hansard, 30 July 2021, p. 24.
  • 70
    Facebook, Submission 27, p. 19. The organisations Facebook states that it worked with are the AEC, the National Counter Foreign Interference Coordinator, the Department of Home Affairs, and the Department of Communications and the Arts (now DITRDC).
  • 71
    Facebook, Submission 27, p. 19.
  • 72
    Facebook, Submission 27, p. 19.
  • 73
    Mr Josh Machin, Head of Policy, Australia, Facebook, Committee Hansard, 30 July 2021, p. 5.
  • 74
    TikTok, Submission 26, p. 4.
  • 75
    Mr Hamish Hansford, First Assistant Secretary, Cyber, Digital and Technology Policy Division, Department of Home Affairs, Committee Hansard, 11 December 2020, p. 16.
  • 76
    WeChat, Submission 30, p. 3.
  • 77
    Mr Richard Salgado, Director, Law Enforcement and Information Security, Google, Committee Hansard, 30 July 2021, p. 11 and p. 13.
  • 78
    Mrs Lucinda Longcroft, Director, Government Affairs and Public Policy, Australia and New Zealand, Google Australia, Committee Hansard, 30 July 2021, p. 13.
  • 79
    Ms Kara Hinesley, Director of Public Policy, Australia and New Zealand, Twitter, Committee Hansard, 30 July 2021, p. 49.
  • 80
    Mrs Lucinda Longcroft, Google Australia, Committee Hansard, 30 July 2021, p. 11.
  • 81
    AEC, Submission 14, p. 3.
  • 82
    AEC, Submission 14, p. 3.
  • 83
    Mr Tom Rogers, AEC, Committee Hansard, 30 July 2021, p. 24.
  • 84
    Dr Richard Johnson, First Assistant Secretary, Social Cohesion, Department of Home Affairs, Committee Hansard, 30 July 2021, pp. 43-44.
  • 85
    Law Council of Australia, Submission 18, p. 33.
  • 86
    Law Council of Australia, Submission 18, p. 32.
  • 87
    Mr Lee Hunter, General Manager, TikTok Australia and New Zealand, TikTok Australia, Committee Hansard, 25 September 2020, p. 11.
  • 88
Mr Lee Hunter, TikTok Australia and New Zealand, TikTok Australia, and Mr Brent Thomas, Director of Public Policy, Australia and New Zealand, TikTok Australia, Committee Hansard, 25 September 2020, pp. 11-12.
  • 89
    Mr Neil Hawkins, Department of Home Affairs, Committee Hansard, 11 December 2020, p. 5.
  • 90
    Mr Robert Hawkins, Assistant Secretary, DFAT, Committee Hansard, 11 December 2020, p. 6.
  • 91
    Mr Hamish Hansford, Department of Home Affairs, Committee Hansard, 11 December 2020, p. 17.
  • 92
    Mr Hamish Hansford, Department of Home Affairs, Committee Hansard, 11 December 2020, p. 17.
  • 93
    Facebook, Submission 27, p. 20.
  • 94
    Facebook, Submission 27, p. 19.
  • 95
    Ms Kara Hinesley, Twitter, Committee Hansard, 30 July 2021, pp. 47-48.