Chapter 2

Inquiry details

2.1
On 5 December 2019, the Senate resolved to establish a Select Committee on Foreign Interference through Social Media to inquire into and report on the risk posed to Australia's democracy by foreign interference through social media, with particular reference to:
(a)
the use of social media for purposes that undermine Australia's democracy and values, including the spread of misinformation;
(b)
responses to mitigate the risk posed to Australia's democracy and values, including by the Australian Government and social media platforms;
(c)
international policy responses to cyber-enabled foreign interference and misinformation;
(d)
the extent of compliance with Australian laws; and
(e)
any related matters.1

Conduct of the inquiry

2.2
In accordance with usual practice, the committee advertised the inquiry on its website and wrote to relevant individuals and organisations inviting submissions. The initial date for receipt of submissions was 13 March 2020. On 27 May 2021, the committee announced that it had reopened the submission period until 31 October 2021. Thus far, the committee has received 43 submissions, which are listed at Appendix 1.
2.3
Despite the substantial disruption caused by the ongoing COVID-19 pandemic, the committee held public hearings in Canberra on:
22 June 2020;
25 September 2020;
11 December 2020; and
30 July 2021.
2.4
A list of the organisations and witnesses that appeared at these public hearings is at Appendix 2. The public submissions, additional information received and Hansard transcripts are available on the committee's website: www.aph.gov.au/Parliamentary_Business/Committees/Senate/Foreign_Interference_through_Social_Media

Report structure

2.5
The report contains the following chapters:
Chapter 1 contains the committee's view and associated recommendations;
Chapter 2 summarises the context and administrative details of the inquiry, as well as previous domestic reports into the issue of foreign interference through social media in Australia;
Chapter 3 outlines the mechanisms that are currently being used to combat foreign interference in Australia;
Chapter 4 explores the key issues raised by submitters and witnesses regarding foreign interference through social media in Australia; and
Chapter 5 examines the current arrangements the Australian Government has in place to respond to foreign interference through social media.

Scope of the report

2.6
This interim report outlines the nature of foreign interference through social media and the challenges it presents to Australia's democracy. In doing so, the report focuses on Facebook,2 Twitter, Google,3 TikTok and WeChat, as these social media platforms have significant reach globally and within Australia. While foreign interference through social media is a global issue and international examples are referred to in this report, the committee's terms of reference necessitate a focus on the domestic context in Australia. Accordingly, the report considers the issue of foreign interference through social media in the context of the upcoming Federal Election and Australia's ongoing response to the COVID-19 pandemic. The report also considers the arrangements that the Australian Government has established to identify and respond to instances of foreign interference through social media in Australia.

Definitions

2.7
In this report, foreign interference, foreign influence, misinformation and disinformation are used to describe distinct phenomena. The Department of Home Affairs observes that these concepts cover a range of activities and that, as such, not all influence activities can be considered foreign interference. The department distinguishes the terms as follows:
Foreign Interference: Clandestine activities carried out by, or on behalf of, a foreign actor which seek to interfere in decision-making, political discourse or other societal norms. Foreign interference is coercive, covert, deceptive or corrupting and is contrary to Australia's sovereignty, values and national interests.
Foreign Influence: Overt activities to advocate for particular outcomes or shape consideration of issues important to foreign actors. When conducted in an open and transparent manner, these activities can contribute positively to public debate.
Disinformation: False information designed to deliberately mislead and influence public opinion or obscure the truth for malicious or deceptive purposes. Disinformation can be intended for financial gain (such as clickbait stories), but have an incidental effect on public opinion or debate.
Misinformation: False information that is spread due to ignorance, or by error or mistake, with good intentions and without the intent to deceive.4
2.8
The committee's terms of reference specifically refer to both foreign interference and misinformation; they do not refer to foreign influence. Foreign influence is therefore discussed in this report primarily in relation to the Foreign Influence Transparency Scheme, given the scheme's increased role during election periods and its interaction with the Electoral Integrity Assurance Taskforce.
2.9
Additionally, disinformation, while not directly anticipated by the committee's terms of reference, has many of the same malicious impacts as misinformation and is therefore considered within the report. The Australian Code of Practice on Disinformation and Misinformation, published in February 2021, describes disinformation as:
(a)
digital content that is verifiably false or misleading or deceptive;
(b)
propagated amongst users of digital platforms via inauthentic behaviours; and
(c)
the dissemination of which is reasonably likely to cause harm.5

Other inquiries and reports

2.10
A range of inquiries and reports have explored the risks posed to Australian democracy by foreign interference through social media. While there are numerous international reports that consider the issues associated with foreign interference through social media and propose solutions in these areas, the reports in this section consider Australia's specific domestic context.

The Australian Strategic Policy Institute

2.11
The Australian Strategic Policy Institute (ASPI) has undertaken a number of relevant reports into foreign interference through social media, as well as the spread of misinformation and disinformation online. A selection of ASPI's reports is outlined below.

Hacking democracies: Cataloguing cyber-enabled attacks on elections

2.12
On 17 May 2019, ASPI released the report Hacking democracies: Cataloguing cyber-enabled attacks on elections. This report sought to catalogue cyber-enabled foreign interference in elections that occurred following Russia's interference in the 2016 United States election, sorting the interference into three groups:
interference targeting voting infrastructure and voter turnout;
interference in the information environment; and
longer term efforts to erode public trust in governments, political leadership and public institutions.6
2.13
The report found that '[o]f the 97 national elections in free or partly free countries reviewed for this report during the period from 8 November 2016 to 30 April 2019, a fifth (20 countries) showed clear examples of foreign interference, and several countries had multiple examples'.7 The report recommended that nations:
recognise the source of foreign interference (namely, Russia and China);
build up their detection capabilities;
fund research to measure impact and measure the effectiveness of education campaigns to address public concerns;
publicly fund the defence of political parties;
impose costs on adversaries;
look beyond digital interference in democracies; and
look beyond other nations as sources of interference, recognising that other actors may be seeking to interfere.8
2.14
The report's appendix lists detailed examples of election interference primarily suspected of being perpetrated by Russia and China, including during the 2019 Australian Federal Election. ASPI reported that this interference involved the targeting of the Liberal, Labor and National parties and attempts to access servers located at Parliament House.9

Cyber-enabled foreign interference in elections and referendums

2.15
On 28 October 2020, ASPI released its report Cyber-enabled foreign interference in elections and referendums, which 'identified 41 elections and seven referendums between January 2010 and October 2020 where cyber-enabled foreign interference was reported'.10 The report further noted that '[d]emocratic societies are yet to develop clear thresholds for responding to cyber-enabled interference, particularly when it's combined with other levers of state power or layered with a veil of plausible deniability'.11
2.16
The report found that cyber-enabled interference had occurred on six continents, including Australia. It further observed that 33 countries—also including Australia—had experienced cyber-enabled foreign interference in at least one election cycle or referendum.12
2.17
The report's view that Australia had previously experienced foreign interference in an election was based on the 2019 Federal Election. The report stated:
According to the Sydney Morning Herald, Australian Prime Minister Scott Morrison confirmed on 18 February 2019 that a hacker group had targeted the Liberal, Labor and National parties and accessed the fileservers at Parliament House ahead of the federal election. The Prime Minister noted that the breach, which occurred on 8 February 2019, was the work of a 'sophisticated state actor' but did not make any formal attributions. A number of sources within the Australian Signals Directorate (ASD)—Australia's cyber intelligence agency—confirmed that their investigation had concluded China was responsible.13

Joint Standing Committee on Electoral Matters

2.18
The Joint Standing Committee on Electoral Matters (JSCEM) has undertaken a number of inquiries that examine issues related to foreign interference through social media in Australia, most prominently through its reporting on the conduct of the 2019 Federal Election.14
2.19
In December 2020, the JSCEM released its final report on the 2019 Federal Election, entitled Report on the conduct of the 2019 federal election and matters related thereto. The report noted that '[o]ver the past few years there has been a significant rise in the proliferation of disinformation and misinformation, particularly on social media and search platforms'.15
2.20
Ultimately, the JSCEM reported that it had not found substantive foreign interference via social media during the 2019 Federal Election:
The JSCEM also found limited evidence of social media manipulation within Australia, including minimal use of bots. However, given the significant rise in organised social media manipulation campaigns, we must remain vigilant.16
2.21
The report made a number of recommendations that are pertinent to this inquiry, including:
Recommendation 14: The Committee recommends that the current work of the Australian Competition and Consumer Commission and the Australian Communications and Media Authority to adapt regulation so it can keep pace with technological change, clearly addresses electoral and political advertising. It also recommends these agencies form a working group with the Australian Electoral Commission and other key stakeholders to ensure this important area is addressed as a priority.17
Recommendation 15: The Committee recommends that the Electoral Integrity Assurance Taskforce be engaged permanently to prevent and combat cyber manipulation and electoral/foreign interference in Australia's democratic process and to provide post-election findings regarding any pertinent incidents to the Joint Standing Committee on Electoral Matters, including through in camera and open briefing.18
2.22
At the time of writing, the Australian Government has not responded to this report.

Australian Competition and Consumer Commission

2.23
The Australian Competition and Consumer Commission (ACCC) has undertaken two inquiries into digital platforms, which are described below.

Digital platforms inquiry

2.24
On 26 July 2019, the ACCC released its final report on its digital platforms inquiry. The inquiry examined the impact of digital platforms on consumers, businesses using platforms to advertise to and reach customers, and news media businesses that also use the platforms to disseminate their content.
2.25
The report found that the ubiquity of the Google and Facebook platforms has placed them in a privileged position. Moreover, the report found that the opaque operations of digital platforms and their presence in interrelated markets mean that it is difficult to determine precisely what standard of behaviour these digital platforms are meeting.19
2.26
The ACCC noted that the collection of user data is central to the business models of most advertiser-funded platforms, as it enables digital platforms to offer advertisers highly targeted or personalised advertising opportunities and tailored products.20
2.27
The ACCC observed that Australian consumers benefit from the many 'free' services offered by digital platforms, and that most users have at least some understanding that certain types of user data and personal information are collected in return for their use of a service. However, few consumers are fully informed of, fully understand or effectively control the scope of the data collected and the bargain they are entering into with digital platforms when they sign up for or use their services.21
2.28
The ACCC expressed concern that the existing regulatory frameworks for the collection and use of data have not held up well to the challenges of digitalisation and the practical reality of targeted advertising, which relies on the monetisation of consumer data and attention.22 These concerns were not limited to digital platforms, with an increasing number of businesses across the economy collecting and monetising consumer data.
2.29
The ACCC recommended that the Privacy Act 1988 (the Privacy Act) be amended to ensure consumers are adequately informed, empowered and protected as to how their data is being used and collected. The ACCC also suggested that now is the time to consider the current and likely future issues associated with digital platforms and their business models and to put in place frameworks that enable adverse consequences to be addressed and that reduce the likelihood of new issues arising.23

Government response and ongoing activities

2.30
On 12 December 2019, the Australian Government released its response to the digital platforms inquiry's final report. The Australian Government committed to:
establishing a special unit in the ACCC to monitor and report on the state of competition and consumer protection in digital platform markets, take enforcement action as necessary, and undertake inquiries as directed by the Treasurer, starting with the supply of online advertising and ad-tech services;
addressing bargaining power concerns between digital platforms and media businesses by tasking the ACCC to facilitate the development of a voluntary code of conduct;
commencing a staged process to reform media regulation towards an end state of a platform-neutral regulatory framework covering both online and offline delivery of media content to Australian consumers; and
ensuring privacy settings empower consumers, protect their data and best serve the Australian economy by building on its commitment to increase penalties and introduce a binding online privacy code announced in the 2019-20 Budget, through further strengthening of Privacy Act protections, subject to consultation and design of specific measures, as well as conducting a review of the Privacy Act.24
2.31
The Attorney-General's Department has commenced its review of the Privacy Act. On 30 October 2020, the Department released an issues paper that outlined current privacy laws, sought feedback on potential issues relevant to reform, and set out the terms of reference for the review. The review will cover:
the scope and application of the Privacy Act;
whether the Privacy Act effectively protects personal information and provides a practical and proportionate framework for promoting good privacy practices;
whether individuals should have direct rights of action to enforce privacy obligations under the Privacy Act;
whether a statutory tort for serious invasions of privacy should be introduced into Australian law;
the impact of the notifiable data breach scheme and its effectiveness in meeting its objectives;
the effectiveness of enforcement powers and mechanisms under the Privacy Act and how they interact with other Commonwealth regulatory frameworks; and
the desirability and feasibility of an independent certification scheme to monitor and demonstrate compliance with Australian privacy laws.25
2.32
On 30 July 2021, Ms Julia Galluccio, Assistant Secretary, Information Law Branch, Attorney-General's Department, updated the committee on the progress of the review, stating that the Department had received 200 submissions in response to the issues paper and was now 'in the process of finalising a discussion paper which will, again, be released for public consultation'.26 Ms Galluccio stated that, following this discussion paper, the Department would begin its final report.27
2.33
Additionally, on 25 October 2021 the Attorney-General's Department released an exposure draft of the Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021, which seeks to strengthen the Privacy Act through 'enabl[ing] the introduction of a binding online privacy code for social media and certain other online platforms, and increases penalties and enforcement measures'.28
2.34
Another element of the Australian Government's response to the digital platforms inquiry was that the major digital platforms would put in place a voluntary code of conduct on disinformation and news quality.29 The Australian Communications and Media Authority (ACMA) was tasked with reporting to government on the adequacy of the platforms' measures and the broader impacts of disinformation by June 2021.30 This process entailed the ACMA consulting with digital platforms, government and other relevant stakeholders to develop principles and minimum expectations for a voluntary code of conduct.31 The ACMA's report is yet to be released.
2.35
On 26 June 2020, ACMA released a position paper outlining its expectations for a voluntary code or codes of practice on misinformation and news quality to be developed by digital platforms.32 A final code of practice for social media platforms was published in February 2021, entitled The Australian code of practice on disinformation and misinformation. The code has been adopted by Twitter, Google, Facebook, Microsoft, Redbubble, TikTok, Adobe and Apple.33

Digital platform services inquiry 2020-2025

2.36
Following the digital platforms inquiry, on 10 February 2020 the Australian Government directed the ACCC to conduct an inquiry into markets for the supply of digital platform services, entitled the Digital platform services inquiry 2020-2025. Matters to be considered by the inquiry include:
the intensity of competition in markets for the supply of digital platform services, with particular regard to the concentration of power, the behaviour of suppliers, mergers and acquisitions, barriers to entry or expansion and changes in the range of services offered by suppliers of digital platform services;
practices of suppliers in digital platform services markets which may result in consumer harm;
market trends that may affect the nature and characteristics of digital platform services; and
developments in markets for the supply of digital platform services outside Australia.34
2.37
Thus far, the ACCC has released two interim reports. The September 2020 interim report, which was released on 23 October 2020, examines online private messaging services in Australia, updates the ACCC's previous analysis in relation to search and social media platforms, and identifies competition and consumer issues common across these platforms.35
2.38
The March 2021 interim report, which was released on 28 April 2021, provides in-depth consideration of competition and consumer issues associated with the distribution of mobile apps to users of smartphones and other mobile devices. It specifically focuses on the two key app marketplaces used in Australia, the Apple App Store and the Google Play Store.36

News and Media Research Centre

2.39
The University of Canberra's News and Media Research Centre (NMRC) has released two digital news reports, the Digital news report: Australia 2020 (16 June 2020) and Digital news report: Australia 2021 (23 July 2021). These reports are part of a long-running international survey coordinated by the Reuters Institute for the Study of Journalism, a research centre based at the University of Oxford. The Digital news reports deliver comparative data on media usage in 40 countries across six continents.37

Digital news report: Australia 2020

2.40
Digital news report: Australia 2020 examined the domestic media context in Australia and particularly how Australians consume their news. Some of the report's key findings included that:
Australian news consumers are accessing news more frequently, but their interest in news is declining;
half of Generation Z,38 and 30 per cent of people aged 74 and over, use Facebook for news;
trust in news fell to 38 per cent in January and February of 2020, but trust in news about COVID-19 during the pandemic was much higher (53 per cent);
more than half (54 per cent) of news consumers say they prefer impartial news, but 19 per cent want news that confirms their worldview; and
more than half (58 per cent) believe tech platforms should block false political ads and 24 per cent say they should not.39
2.41
The report particularly noted that there is rising community concern about political advertising on social media:
There is more concern about political advertising on social media than on TV. Half of Australians think political ads on TV are OK, but don't feel comfortable about social media. When it comes to false ads, the majority think the tech platforms, Google and Facebook, should block them. This is particularly true of leftwing consumers. The success of the Coalition advertising and the failure of the Labour campaign during the 2019 federal election might help explain some of this difference. However, about one-quarter of Australians do not think that it is the responsibility of tech companies to decide what is true or false. In an age where political mendacity appears to be rising, these are important discussions for the public, news media and legislators to have.40

Digital news report: Australia 2021

2.42
Digital news report: Australia 2021, like its predecessor, examined news consumption behaviour in Australia. The report tracks the impact of the COVID-19 pandemic on news consumers during the 2020-21 period.41 The report's key findings were that:
trust in news increased globally over the past 12 months: in Australia, trust in news has risen (+5) to 43 per cent, close to the global average (44 per cent);
Australians' interest in news dropped during the pandemic in line with other countries, and interest in the news has been consistently declining among Australian audiences;
general concern about false and misleading information online in Australia is high (64 per cent), and much higher than the global average (56 per cent);
women, younger generations and those with low income are less likely to see themselves or their views as being fairly or sufficiently reflected in the news; and
the majority of Australians (66 per cent) are either unaware that commercial news organisations are less profitable than they were 10 years ago, or they don't know about the current financial state of the news media.42

Australian perspectives on misinformation

2.43
The November 2020 report Australian perspectives on misinformation draws on the results of two of the NMRC's previous reports, Digital news report: Australia 2020 and COVID-19: Australian news and misinformation. The report addresses the targeting of Australian society by foreign interference operations, specifically through its examination of a Russia-based Twitter campaign that occurred in the lead-up to the 2016 Federal Election.43

1  Journals of the Senate, No. 35, 5 December 2019, pp. 1128-1129.
2  Facebook also owns WhatsApp and Instagram.
3  Google also owns YouTube.
4  Department of Home Affairs, Submission 16, p. 4.
5  Digital Industry Group (DIGI), Australian Code of Practice on Disinformation and Misinformation, 22 February 2021, pp. 4-5.
6  Fergus Hanson, Sarah O'Connor, Mali Walker and Luke Courtois, Hacking democracies: Cataloguing cyber-enabled attacks on elections, Australian Strategic Policy Institute, Report No. 16/2019, p. 5.
7  Fergus Hanson, Sarah O'Connor, Mali Walker and Luke Courtois, Hacking democracies: Cataloguing cyber-enabled attacks on elections, Australian Strategic Policy Institute, Report No. 16/2019, p. 8.
8  Fergus Hanson, Sarah O'Connor, Mali Walker and Luke Courtois, Hacking democracies: Cataloguing cyber-enabled attacks on elections, Australian Strategic Policy Institute, Report No. 16/2019, pp. 17-18.
9  Fergus Hanson, Sarah O'Connor, Mali Walker and Luke Courtois, Hacking democracies: Cataloguing cyber-enabled attacks on elections, Australian Strategic Policy Institute, Report No. 16/2019, p. 26.
10  Sarah O'Connor, Fergus Hanson, Emilia Currey and Tracy Beattie, Cyber-enabled foreign interference in elections and referendums, Australian Strategic Policy Institute, Report No. 41/2020, p. 3.
11  Sarah O'Connor, Fergus Hanson, Emilia Currey and Tracy Beattie, Cyber-enabled foreign interference in elections and referendums, Australian Strategic Policy Institute, Report No. 41/2020, p. 3.
12  Sarah O'Connor, Fergus Hanson, Emilia Currey and Tracy Beattie, Cyber-enabled foreign interference in elections and referendums, Australian Strategic Policy Institute, Report No. 41/2020, p. 9.
13  Sarah O'Connor, Fergus Hanson, Emilia Currey and Tracy Beattie, Cyber-enabled foreign interference in elections and referendums, Australian Strategic Policy Institute, Report No. 41/2020, p. 25.
14  The committee has also released a report into the 2016 election: see Joint Standing Committee on Electoral Matters (JSCEM), Report on the conduct of the 2016 federal election and matters related thereto, November 2018.
15  JSCEM, Report on the conduct of the 2019 federal election and matters related thereto, December 2020, p. 105.
16  JSCEM, Report on the conduct of the 2019 federal election and matters related thereto, December 2020, p. 122.
17  JSCEM, Report on the conduct of the 2019 federal election and matters related thereto, December 2020, p. xviii.
18  JSCEM, Report on the conduct of the 2019 federal election and matters related thereto, December 2020, p. xviii.
19  Australian Competition and Consumer Commission (ACCC), Digital platforms inquiry: final report, June 2019, pp. 1 and 7.
20  ACCC, Digital platforms inquiry: final report, June 2019, p. 2.
21  ACCC, Digital platforms inquiry: final report, June 2019, p. 2.
22  ACCC, Digital platforms inquiry: final report, June 2019, p. 3.
23  ACCC, Digital platforms inquiry: final report, June 2019, p. 3.
24  Treasury, 'Government Response and Implementation Roadmap for the Digital Platforms Inquiry', Government Response, 12 December 2019.
25  Attorney-General's Department, Privacy Act review: issues paper, 30 October 2020, p. 2.
26  Ms Julia Galluccio, Assistant Secretary, Information Law Branch, Attorney-General's Department, Committee Hansard, 30 July 2021, p. 36.
27  Ms Julia Galluccio, Attorney-General's Department, Committee Hansard, 30 July 2021, p. 36.
28  Attorney-General's Department, 'Online Privacy Bill Exposure Draft', https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/ (accessed 7 December 2021).
29  ACMA, Submission 15, p. 1.
30  ACMA, Submission 15, p. 2.
31  ACMA, Submission 15, p. 2.
32  ACMA, 'ACMA releases guidance to digital platforms on voluntary misinformation and news quality code', Media release, 26 June 2020.
33  DIGI, 'Australian Code of Practice on Disinformation and Misinformation', https://digi.org.au/disinformation-code/ (accessed 9 August 2021).
34  ACCC, Digital platform services inquiry 2020-2025: September 2020 interim report, September 2020, p. 9.
35  ACCC, Digital platform services inquiry 2020-2025: September 2020 interim report, September 2020, p. 9.
36  ACCC, Digital platform services inquiry 2020-2025: March 2021 interim report, April 2021, p. 3.
37  Sora Park, Caroline Fisher, Jee Young Lee, Kieran McGuinness, Yoonmo Sang, Mathieu O'Neil, Michael Jensen, Kerry McCallum and Glen Fuller, Digital news report: Australia 2020, July 2020, p. 4.
38  Generation Z primarily includes people born from the mid-to-late 1990s to the early 2010s.
39  Sora Park, Caroline Fisher, Jee Young Lee, Kieran McGuinness, Yoonmo Sang, Mathieu O'Neil, Michael Jensen, Kerry McCallum and Glen Fuller, Digital news report: Australia 2020, July 2020, pp. 12-13.
40  Sora Park, Caroline Fisher, Jee Young Lee, Kieran McGuinness, Yoonmo Sang, Mathieu O'Neil, Michael Jensen, Kerry McCallum and Glen Fuller, Digital news report: Australia 2020, July 2020, p. 104.
41  Sora Park, Caroline Fisher, Kieran McGuinness, Jee Young Lee and Kerry McCallum, Digital news report: Australia 2021, June 2021, p. 8.
42  Sora Park, Caroline Fisher, Kieran McGuinness, Jee Young Lee and Kerry McCallum, Digital news report: Australia 2021, June 2021, pp. 9-11.
43  Dr Mathieu O'Neil and Dr Michael Jensen, Australian perspectives on misinformation, November 2020, p. 9.
