Chapter 1

Committee view and recommendations

Context and objectives of this inquiry

1.1
Over the coming years, Australia will have to grapple with some big policy questions. How do we respond to climate change? Where will we find our future sources of shared economic prosperity? What role should we play in our region?
1.2
These are not easy questions. They ask us to consider what type of nation we want Australia to be. Our best chance of finding good answers lies in leveraging our collective capacity for informed and inclusive public discussion. These discussions can be difficult to have at the best of times. They are next to impossible, however, if our digital public squares are crowded with disinformation and populated by inauthentic actors.
1.3
This Senate inquiry was initiated following experiences in other democratic jurisdictions that showed what such a scenario may look like. Public inquiries into the conduct of the 2016 United States Presidential Election and the 2019 United Kingdom General Election showed some of the ways foreign (or foreign-backed) actors, states and groups can use social and digital media to interfere with democratic discourse and election processes.
1.4
It would be naive to imagine that Australian elections and public debates have not been, and will not be, the subject of similar attempts. The policy challenges facing Australia over the coming years are of interest to more than just Australians. There are a range of foreign governments, organisations and individuals who stand to win or lose from Australia's political and policy decisions. Experiences from overseas show us there are some foreign actors who also seek to introduce discord and social conflict as an end in itself. Technological developments mean that these actors have more options available than ever before to influence Australia's democratic processes.
1.5
This present inquiry is not an investigation into the conduct of previous Australian elections—that task remains the responsibility of the Joint Standing Committee on Electoral Matters (JSCEM). Neither is it intended to be a definitive and exhaustive statement of the problem—much of the material the committee would need to undertake this task is not in the public domain, and the material that is available is subject to constant revision and revelations.1
1.6
Instead, this inquiry seeks to undertake a point-in-time assessment of the key risks posed by foreign interference via social media and the headline responses available.
1.7
This interim report sets out the evidence received from stakeholders about the nature of the problem facing Australia. It makes recommendations about the key steps government should urgently consider to enable our policy infrastructure to respond to the challenge in light of the impending federal election.
1.8
This report includes summaries of the evidence received about the motivations for foreign interference, the forms that it can take, the elements of social media platforms and their usage that foreign interference attempts often seek to exploit, and the social and cultural features that foreign interference attempts interface with. The committee will be calling for further submissions about the policy responses regarding social media platforms and users, civil society, news media and the information environment. Further consideration of these policy responses and relevant recommendations will be the subject of a future report by this committee.

The nature of the threat to Australian democratic processes

1.9
There has been considerable media coverage and public discussion of foreign interference efforts affecting other western democracies. It is natural for Australians to wonder: have we also been the subject of a concerted foreign interference attempt via social media?
1.10
The JSCEM concluded that there was 'limited evidence of social media manipulation within Australia, including minimal use of bots'2 during the 2019 election. The Australian Strategic Policy Institute (ASPI) observed some examples of cyber-enabled foreign interference that justified Australia's inclusion in its 2020 report on the issue.3
1.11
The Department of Home Affairs noted that it regularly observes 'campaigns unfolding on social media that involve disinformation'.4 Some have been linked to foreign state actors:
In 2017, following a terrorist attack in Brighton, Melbourne, the Department identified Tweets associated with accounts that have since been publicly attributed by Twitter to a foreign government entity.
In another Australian example from 2017, accounts linked to the same foreign government entity were involved in discussions related to a plot to bomb an Etihad airlines flight departing Sydney International Airport. One account used the disrupted plot to promote and amplify the hashtags "#MuslimBan" and "#StopImportingIslam". In this instance, hostile foreign state actors used social media to interfere in Australia's public discourse and attempt to undermine social cohesion.5
1.12
Submitters to this inquiry did not contend, however, that Australia had been the target of any large-scale, coordinated attempts.
1.13
This committee strongly believes that this is not a reason for inaction.
1.14
It is possible, if not likely, that Australia will face such an attempt in the future.
1.15
An explanation of the mechanisms by which foreign interference can be undertaken using social media is set out in chapter 2 of this report. In short, however, as Dr Wallis from ASPI noted:
Authoritarian states have identified influence operations as a cheap yet effective mechanism for influencing and weakening liberal democratic societies and regional alliances.6
1.16
Foreign interference is more complex than just trying to boost one candidate over another in an election. Submitters to this inquiry spoke of a range of actors trying to do a range of things: actively sowing misinformation about particular issues, trying to inflame existing social divisions, or just creating a general environment of distrust.7
1.17
The consequences for Australia of a serious attempt could be severe in ways that are difficult to predict. Even a clumsy, unsophisticated effort risks undermining our ability as a nation to have the public discussions we need to deal with complex issues. As seen in other jurisdictions, merely casting doubt on the authenticity of a result can be enough for confusion and disunity to follow.
1.18
This committee believes that government should adopt an approach analogous to the precautionary principle in preparing to meet this challenge.
1.19
Waiting for a serious attempt before acting would be a mistake. Examples from overseas show that malign actors often seek to exploit existing social fissures or hot-button issues. A serious, concerted attempt at foreign interference could involve the amplification or manipulation of views that already have a domestic audience. It is not inconceivable that some Australian companies, organisations or even domestic political actors could believe that they may benefit from the relevant social media activity and so become unwitting participants in a foreign interference campaign. All of this would make mounting an effective response in real time challenging.
1.20
This committee believes that government must approach the problem of foreign interference through social media with urgency and seriousness in order to create the institutional architecture needed.
1.21
Unfortunately, the government's actions so far have fallen short of this.

The adequacy of Australia's response to the challenge

1.22
Experts have been clear that what is required is a coordinated, cohesive response.
1.23
Ms Katherine Mansted told this committee:
… we need a coordinated approach, we need a strategic approach and we need one that blends issues of domestic and foreign policy, issues of traditional security and non-traditional security …
I think that what matters most is having a body that has the ability to look through multiple different sides of this problem, is resourced to do so and has both the informal and formal authority to take that step. This isn't just a national security problem either; it is also an economic problem.8
1.24
No such body has been established.
1.25
Nor has the government developed a coordinated approach. The committee was concerned by the convoluted answer to a simple question: who is in charge?
CHAIR: Mr Hawkins, do you consider that Home Affairs is in the lead for policy development on disinformation and misinformation?
Mr Hawkins: My area, at least, is in the lead for countering foreign interference. Disinformation and misinformation do not necessarily need to be foreign interference. They could be domestic. Our interests—
CHAIR: Let me be more specific, then, Mr Hawkins. Are you the policy lead for foreign interference through digitally enabled, cyber, foreign interference?
Mr Hawkins: Cyber is another area. We have a cyber team here, but there's the Australian Cyber Security Centre. We have our Ambassador for Cyber Affairs, who's here as well. But we're not responsible for the cyber elements, no.
CHAIR: No, but for disinformation and misinformation that would constitute foreign interference, you are the policy lead?
Mr Hawkins: If it's foreign interference, if it's a foreign actor and accords with those three, then, yes, we would be.9
1.26
The end result is that departments and officials are not across the work that is happening internally.
1.27
For example, the First Assistant Secretary of the Department of the Prime Minister and Cabinet's National Security Division was unaware that the COVID-19 taskforce was undertaking work to combat online disinformation and misinformation:
CHAIR: Were you not aware of that process, Mr Colquhoun?
Mr Colquhoun: I may have seen a product of it somewhere, but, other than that, I'm not involved at all, no.
CHAIR: And not aware of it?
Mr Colquhoun: No.10
1.28
Officials from the Department of Home Affairs (the supposed policy lead) were not aware of which agency social media platforms were supposed to report foreign interference attempts to:
CHAIR: I'm thinking about the somewhat proactive stance that's been taken by, for example, Facebook and Twitter in identifying coordinated inauthentic activity on their platforms. They publish regular reports about it. If they identify coordinated inauthentic activity that they attribute to a foreign state actor in the Australian context targeted at Australians, who do they talk to?
Mr Hawkins: I'm not aware. I don't think they would talk to us. They may talk to the Australian Cyber Security Centre, but I couldn't answer that point.11
1.29
The platforms themselves were confused as well. Representatives from TikTok did not know whether they were required to report any coordinated foreign interference attempts detected on their platform, let alone who they could report them to. This is unsurprising given that, at the time TikTok appeared before the committee, the government had never contacted the company about its expectations.
CHAIR: Mr Thomas, do we know who we would notify if we saw something happening?
Mr Thomas: I expect that would be some combination of DFAT, the Defence and the department of communications.
CHAIR: But no request has been made of you, or no clear instruction has been provided, about who to notify and under what circumstances?
Mr Thomas: That's correct.12
1.30
Members of the committee have examined the aims of the relevant peak documents of the Departments of Home Affairs, Foreign Affairs and Trade, Defence, and other relevant agencies and consider that there is little cohesion in how they are written and limited interaction between them.
1.31
There is also no published strategy for combatting foreign interference via social media. Australia's Counter Foreign Interference Strategy is five dot points and six supporting sentences on a webpage.13 There is nothing specific relating to foreign interference via social media.
1.32
Existing institutions for bringing together agencies to address foreign interference are unfortunately limited in scope. The Electoral Integrity Assurance Taskforce has answers to relatively simple questions, such as who should respond to a disinformation campaign about the electoral process. Its evidence to the committee showed it to be less able to answer more complex questions.
1.33
For example, it is not clear who is responsible for responding to a disinformation campaign that targets the information environment in an election period. A campaign of this kind could involve social media activity that attacks certain issues or particular participants such as unions, political parties, industry associations or ethnic groups. The Department of Home Affairs told the committee that responding to this would be the role of the Australian Electoral Commission (AEC); however, the AEC does not believe its legislation provides any basis for responding to such a campaign.
1.34
Likewise, the Taskforce does not seem to have been requested by the government to develop a clear framework to guide any response to foreign disinformation campaigns during the formal election period. The uncertainty about how the response to a disinformation campaign would interact with caretaker obligations is central to this. The decision to reveal or conceal evidence of a foreign-backed disinformation attempt is one that could have enormous implications during an election campaign. Both the Department of the Prime Minister and Cabinet and the Department of Home Affairs told the committee that while it was open to the Minister to brief the opposition, there was no obligation to do so under the caretaker conventions.
1.35
The committee believes this is not an acceptable state of affairs. Creating space for partisan decision-making about disinformation creates vulnerabilities in our institutional arrangements that malign foreign actors could exploit.
1.36
Transparency about the operations of institutions like the Electoral Integrity Assurance Taskforce would be an important first step. However, key aspects of the Taskforce are hidden. Its functions and operating procedures are unclear even to its own members, and there is a lack of certainty about responsibilities. Although members can articulate their qualifications to be on the Taskforce (for example, the Department of Communications is an expert on the social media platforms), there is no certainty about what their responsibilities and powers are, let alone the powers of others. The Taskforce is governed by terms of reference that have been kept secret from this committee and the public at large.

Next steps

1.37
Government needs to prioritise building the institutional architecture needed to respond effectively and proactively to the threat of foreign interference.

Recommendation 1

1.38
The committee recommends that the Australian Government clearly delegate lead accountability for cyber-enabled foreign interference to a single entity in government.
1.39
The committee notes the testimony of Dr Michael Jensen, referencing the comments made in the Australian Security Intelligence Organisation's Annual report for Financial Year 2020-21, which stated that 'based on current trends, we anticipate that espionage and foreign interference will supplant terrorism as Australia's principal security concern over the next five years.'14
1.40
Foreign interference is increasingly occurring online. ASPI's report Cyber-enabled foreign interference in elections and referendums found that 'the use of cyber-enabled techniques to interfere in foreign elections and referendums has increased significantly.'15
1.41
Australia is not immune from this challenge. Dr Jake Wallis and Mr Thomas Uren's submission to the committee noted that '[d]uring the 2019 Australian federal election financially-motivated actors from Kosovo, Albania and the Republic of North Macedonia used nationalistic and Islamophobic content to target and manipulate Australian Facebook users.'16

Recommendation 2

1.42
The committee recommends that the Australian Government take a proactive approach to protecting groups that are common targets of foreign interference but are not classified as government institutions.
1.43
The committee notes that there are currently no clear protections for groups that influence Australia's democracy but sit outside of government, such as diaspora groups, research institutions and political parties.
1.44
The committee considered that, in the context of high-profile attacks on these groups around the world by authoritarian countries, such as those involving Pegasus spyware, the government needs to be more proactive in protecting these groups from cyber-enabled foreign interference, including by offering tools and advice on current threats and how to mitigate them, with in-language resources.

Recommendation 3

1.45
The committee recommends that the Australian Government establish appropriate, transparent, and non-political institutional mechanisms for publicly communicating cyber-enabled foreign interference in our elections and review the processes and protocols for classified briefings for the Opposition during the caretaker period with respect to cyber-enabled foreign interference.
1.46
The committee refers to the comments of Parliamentary Joint Committee on Intelligence and Security (PJCIS) in its Advisory report on the Security Legislation Amendment (Critical Infrastructure) Bill 2020 and Statutory review of the Security of Critical Infrastructure Act 2018.
1.47
The PJCIS found that 'foreign interference, disinformation and cyber-attacks are new risks to the free and fair conduct of elections in Australia' and recommended 'that the caretaker conventions be updated to reflect this new context.'17

Recommendation 4

1.48
The committee recommends that the Australian Communications and Media Authority's report into the functioning of the Australian Code of Practice on Disinformation and Misinformation be publicly released as a matter of priority.
1.49
To date, the government has not released the report by the Australian Communications and Media Authority (ACMA) into the Australian Code of Practice on Disinformation and Misinformation, which covers the adequacy of digital platforms' measures and the broader impacts of misinformation in Australia.18 While departmental witnesses noted that the report was currently before the relevant minister,19 there is a clear public interest for the report to be released in order for the efficacy of the currently voluntary code to be assessed by policymakers and the wider research community. At present, ACMA is waiting on the Minister's feedback on the report before further regulatory activity can commence,20 despite the rapidly approaching Federal Election. ACMA's decision to 'seek to see how things progress over the next couple of months' demonstrates a lack of urgency within government on this matter.21

Recommendation 5

1.50
The committee recommends that the Australian Government publicly release the Electoral Integrity Assurance Taskforce's terms of reference.
1.51
The committee also found that departmental arrangements and the division of responsibilities around foreign interference through social media were complicated, onerous and lacking in transparency. Despite having existed since 2018, the Electoral Integrity Assurance Taskforce (EIAT) has not made its terms of reference publicly available.22 External confusion regarding different agencies' roles within the EIAT is a problem, as is the EIAT's failure to clearly communicate its approach to the upcoming Federal Election.
1.52
As the EIAT does have a clear role—or, at least, its constituent agencies do—in combatting foreign interference through social media, it is imperative that its terms of reference be released to the public before the next Federal Election. It is the committee's view that public confidence in government bodies, especially those dedicated to electoral integrity, is diminished when a taskforce's terms of reference are hidden. The Australian Government's decision not to publish the terms of reference is unfortunate.

Recommendation 6

1.53
The committee recommends that the Australian Government establish clear requirements and pathways for social media platforms to report suspected foreign interference, including disinformation and coordinated inauthentic behaviour, and other offensive and harmful content, and formalise agency remits, powers and resourcing arrangements accordingly.
1.54
Lastly, the committee was told that existing arrangements for social media platforms to engage with government were inefficient and unclear. At present, should a social media platform identify foreign interference, it is optional for it to report this to government.
1.55
While the committee notes that many departments have sought to engage with social media platforms, the lack of a clear reporting process for social media companies is a problem. The committee was concerned by TikTok's acknowledgement that it was unsure who it should inform if it detected foreign interference on its platform.23 Given the impending Federal Election, it is imperative that the government establish clear policies and procedures for social media platforms to refer potential foreign interference for consideration by the relevant government departments or entities.

Recommendation 7

1.56
The committee recommends that the Electoral Integrity Assurance Taskforce undertake an audit to assess capability relevant to detecting disinformation prior to the coming election and, further, that the Australian Government consider providing information about relevant capabilities and resourcing to this committee as appropriate to assist in our deliberations.

  • 1
    For example, just in the month prior to this report being finalised, new information surfaced regarding the operations of Clearview, and there was reporting from a trove of leaked internal Facebook documents. It seems possible, if not probable, that further revelations about various platforms will come to light that augment our understanding of their operations.
  • 2
    Joint Standing Committee on Electoral Matters (JSCEM), Report on the conduct of the 2019 federal election and matters related thereto, December 2020, p. 122.
  • 3
    See 2.13 and following in this report.
  • 4
    Department of Home Affairs, Submission 16, p. 7.
  • 5
    Department of Home Affairs, Submission 16, p. 7.
  • 6
    Dr Jake Wallis, Senior Analyst, International Cyber Policy Centre, Australian Strategic Policy Institute (ASPI), Committee Hansard, 22 June 2020, p. 10.
  • 7
    See 4.2 and following of this report.
  • 8
    Ms Katherine Mansted, Committee Hansard, 22 July 2020, p. 19.
  • 9
    Mr Neil Hawkins, Acting Deputy Coordinator and Acting First Assistant Secretary, National Counter Foreign Interference Coordination Centre, Department of Home Affairs, Committee Hansard, 11 December 2020, p. 3.
  • 10
    Mr Lachlan Colquhoun, First Assistant Secretary, Department of the Prime Minister and Cabinet, Committee Hansard, 11 December 2020, p. 3.
  • 11
    Mr Hawkins, Department of Home Affairs, Committee Hansard, 11 December 2020, p. 5.
  • 12
    Mr Brent Thomas, Director of Public Policy, Australia and New Zealand, TikTok Australia, Committee Hansard, 25 September 2020, p. 12.
  • 13
    See Department of Home Affairs, 'Australia's Counter Foreign Interference Strategy', https://www.homeaffairs.gov.au/about-us/our-portfolios/national-security/countering-foreign-interference/cfi-strategy (accessed 7 December 2021).
  • 14
    Australian Security Intelligence Organisation, 2020-21 Annual Report, 19 October 2021, p. 4.
  • 15
    Sarah O'Connor, Fergus Hanson, Emilia Currey, and Tracy Beattie (ASPI), Cyber-enabled foreign interference in elections and referendums, 28 October 2020, p. 6.
  • 16
    Dr Jake Wallis and Mr Thomas Uren, ASPI, Submission 2, p. 1.
  • 17
    Parliamentary Joint Committee on Intelligence and Security (PJCIS), Advisory report on the Security Legislation Amendment (Critical Infrastructure) Bill 2020 and Statutory review of the Security of Critical Infrastructure Act 2018, September 2021, p. 54.
  • 18
    Australian Communications and Media Authority (ACMA), 'Development of a voluntary code', https://www.acma.gov.au/online-misinformation (accessed 9 August 2021).
  • 19
    Ms Pauline Sullivan, First Assistant Secretary, Online Safety, Media and Platforms Division, Department of Infrastructure, Transport, Regional Development and Communications (DITRDC), Committee Hansard, 30 July 2021, p. 38.
  • 20
    Ms Pauline Sullivan, DITRDC, Committee Hansard, 30 July 2021, p. 38.
  • 21
    Ms Pauline Sullivan, DITRDC, Committee Hansard, 30 July 2021, p. 38.
  • 22
    Mr Jeff Pope, Deputy Electoral Commissioner, Australian Electoral Commission, Committee Hansard, 30 July 2021, p. 25.
  • 23
    Mr Lee Hunter, General Manager, TikTok Australia and New Zealand, TikTok Australia, and Mr Thomas, TikTok Australia, Committee Hansard, 25 September 2020, p. 12.