Chapter 7 - Building the capacity of civil society

7.1 Crucial to a comprehensive and coordinated approach to countering foreign interference through social media is the inclusion of, and collaboration with, civil society. A common recommendation across many submissions was the need to improve the capacity and resilience of both individuals and civil society to respond to disinformation, the degradation of public trust in democratic institutions, and foreign interference through social media.[1]

7.2 In addition to the government and platform-based solutions discussed in Chapters 4 to 6, this chapter considers the role of non-government institutions and civil society in addressing foreign interference through social media.

7.3 Initiatives to build capacity and resilience in civil society should at times be supported and/or funded by government, with the groundwork primarily undertaken by non-government bodies, specifically through:

independent fact checking, expert centre monitoring, research and solutions development;

civil litigation;

building resilience in communities through public and school education;

impartial and independent reporting through mainstream media; and

improving knowledge, capacity and community cohesion in diaspora and culturally and linguistically diverse (CALD) communities.

7.4 Nevertheless, Reset Australia warned against over-reliance on social media end-user empowerment, arguing for a more comprehensive government and platform response to foreign interference through social media rather than shifting the burden to citizens and communities.[2]

A role for civil society organisations, expert centres and researchers

7.5 Fact checkers are engaged directly by social media platforms to mitigate disinformation, as discussed in Chapter 6. However, beyond these, other civil society-based researchers provide additional independent commentary and scrutiny to help inform the public and to highlight disinformation and foreign influence.

7.6 This unique role played by independent civil society organisations (such as fact checkers and expert centres) and researchers was highlighted by submitters to the inquiry, with Reset Australia calling these bodies 'the canaries in the coal mine sounding the alarm for emerging threats to Australian society'.[3] The importance of these services was reiterated by Mr Kenny Chiu, a former Canadian politician, who recommended that:

Australia should perhaps provide a seal of approval to certain nongovernment organisations that provide fact-checking services to the constituents in their own language, be it Persian, Russian or even Chinese and other languages that are commonly exploited. This would be effective in providing a way for the diaspora community to at least realise that they are being used and manipulated and that their interests are actually being hurt by these predatory regimes that are infiltrating us.[4]

7.7 The Defence and Security Institute within the University of Adelaide suggested that civil society expert centres (such as the Defence and Security Institute) have the ability to 'play a significant role in the development of a national capability to detect, defend, and disrupt social influence operations both online and offline' and contribute to the development of policy and public education responses.[5]

7.8 As discussed in Chapter 6, social media platforms provide some data and tools to help improve their transparency. However, as highlighted in a joint submission from University technical expert centres, 'the technical capacity to detect and analyse disinformation operations is concentrated in the hands of the platforms and a few specialised research and monitoring bodies'. The submission highlighted that although this capacity is limited, it is crucially important, with the resulting information enabling civil society to 'monitor platform and government action against ongoing or real-time disinformation operations'.[6]

7.9 Professor Rory Cormac of the University of Nottingham viewed fact checking as important, but advised that it has limitations: people may not believe the fact checkers, particularly those 'who are inherently sceptical of the state'.[7]

7.10 Submitters suggested that the use of civil society-based expertise could be more inclusive and draw on a greater number of expert voices, work more innovatively across disciplines, potentially place more pressure on platforms and government to address disinformation and 'prevent the appearance of government and platforms acting like "Big Brother", secretively and only selectively cracking down on some disinformation while tolerating others'.[8]

7.11 The Information and Influence University Partnership went further, recommending the establishment of a new organisation to coordinate monitoring and operational responses to malicious information threats, drawing on a pool of expert, multidisciplinary academic and industry partners.[9]

7.12 However, several witnesses highlighted that access to platforms' data (in a way that preserves privacy) can be challenging.[10] Mr Josh Machin from Meta explained that:

… many folks within the industry are broadly supportive of transparency and working with academics, but there can be very challenging considerations to work around, particularly from a privacy perspective … it's an area in which we continue to work very hard and make more information available to support academics, experts and policy makers in scrutinising content on our services.[11]

7.13 Submitters and witnesses also advised that adequate funding is essential to enable civil society organisations, independent researchers and expert centres to effectively fulfil this monitoring, research and solutions development role.[12]

7.14 While Australian Government agencies, such as the Department of Defence and the Australian Research Council, already provide some funding to expert centres to conduct relevant research, Reset Australia contended that 'they suffer from extremely limited resourcing'.[13] CyberCX called for the Australian Government to facilitate more independent research into 'investigating, exposing and analysing social media interference and malign influence', for example through grant funding.[14]

7.15 Social media platform Meta likewise supported engagement with industry and civil society experts, as well as technical research, to improve policy and operational responses and the detection of malicious threats.[15]

Litigation

7.16 Australian Supporters of Democracy in Iran discussed the use of courts and litigation by civil society as a potential means to address disinformation campaigns and the distribution of hate material. It also noted that Australian policy and governance appears to be under-developed in this area, with civil law primarily impacting on social media 'mainly through libel and defamation cases', rather than action relating to foreign interference.[16]

Building resilience through public education

7.17 Many submitters raised the need for robust public education campaigns to counter disinformation and foreign interference through social media, with a particular focus on improving information and media literacy, and empowering individuals to manage their online interactions.[17]

7.18 Witnesses drew attention to the wider context of rising distrust in public institutions, more polarised and extreme societal and political divisions, and an increase in conspiratorial thinking.[18]

7.19 Furthermore, the Law Council of Australia highlighted the growing market for 'disinformation-for-hire', which 'differs from genuine political campaigns because it is covert, deceptive, and manipulates public discourse to undermine democratic participation'.[19]

7.20 The News and Media Research Centre of the University of Canberra called for information and media literacy measures to be considered within this bigger picture, quoting the North Atlantic Treaty Organization's Centre for Media Literacy:

"Democracy stands or falls on people. The challenge for democracies is to find ways to preserve the freedoms that come with more access to information, while protecting against the threats that come with it. The most democratic way to address this challenge is teaching [members of] society to be wiser information consumers and producers through critical thinking and a pedagogy that empowers them to evaluate, analyse, and choose critically whether to act on information. Media literacy education facilitates this critical thinking and thereby, risk management".[20]

7.21 The Department of Home Affairs (Home Affairs) outlined its current engagement and education initiatives, including in a variety of community languages, through the Australian Values social media channels, and in partnership with other government agencies, non-government organisations and community members. The Australian Cyber Security Centre also monitors threats and provides security advice to citizens and government to ensure account holders are aware of social media risks and their social media accounts are secured.[21]

7.22 Social media platforms, including Meta and TikTok, have committed to promoting and investing in digital literacy education, and provide fact checkers, tools, policies and processes to help their users make informed decisions about content, identify authentic sources of information and stay safe.[22]

7.23 CyberCX submitted that the public is not sufficiently well-informed about the risks and responses associated with foreign interference through social media:

The information available to Australian citizens regarding foreign interference orchestrated through social media is patchy and generic, despite the key role citizens, the private sector and state and local governments play as vectors and victims of interference.

Citizens and private organisations lack the expertise, resources and access to understand the risks of every social media platform they engage with. This problem is exacerbated by the opacity and complexity of platforms’ software and terms of service and, in some cases, links with authoritarian governments.[23]

7.24 Public education campaigns used overseas, as discussed in Chapter 2, could provide models for similar Australian campaigns. Submitters to the inquiry recommended a range of public education approaches to 'improve digital literacy and awareness, as well as to build resilience to misinformation and disinformation' and foreign interference through social media,[24] including:

increased visibility or exposure of the scale of foreign interference in Australia;[25]

more comprehensive and 'actionable, specific advice' regarding individual platforms;[26]

how communities and individuals can report instances of foreign interference through social media, including through workshops;[27]

education and guidance materials on values, critical thinking, how users can identify and counter misinformation and disinformation, how algorithms use personal data to tailor online experiences and how users can protect themselves on social media;[28] and

accessible and tailored education materials to meet the needs of all Australians, including across age demographics and CALD communities.[29]

7.25 Mr Albert Zhang of ASPI highlighted the role that government has in providing information to support civil society through public education:

The key takeaway and what the government could do is to actually be more transparent and offer more information where individuals, society or businesses can't do the research or do the due diligence to verify the information itself. This is really pertinent when it comes to foreign based or state backed disinformation or foreign interference. The individual is not going to be able to detect whether an authoritarian security agency is running an influence campaign against them. So, in those cases, it's really important for government to disclose their intelligence or disclose their findings to the public who are being targeted.[30]

7.26 The News and Media Research Centre suggested better engagement with media professionals and associations, peak professional bodies and community sector groups with expertise in media and information literacy training and in the provision of public education programs and resources. The centre further recommended funded research into the impacts of new media technology and how best to deliver media and information literacy.[31]

7.27 At the April 2023 public hearing, Dr Andrew Dowse from RAND Australia drew attention to a training package RAND Australia is developing for a government client 'to help people identify, understand and protect themselves from foreign interference … efforts to increase media literacy are critical in relation to that challenge, but they are not the only solution'.[32]

7.28 Similarly, Mr Richard Salgado, Director, Law Enforcement and Information Security, Google, noted that such government-run campaigns are not a panacea for the problem of foreign interference and disinformation on social media. Mr Salgado stated that:

I would say my experience with this and what we have seen so far would suggest that there is probably no single voice that's going to be effective; it has to be a chorus of voices. There are going to be plenty of people who are not going to believe a government agency when they say it, just as there would be plenty who won't believe it if a company or security researchers were to say it. I don't think that there is an answer in the terms of a single voice that will be heard by all those who you want the message to get to.[33]

Electoral public education

7.29 There has not been any identified foreign interference compromising the integrity of the 2022 Australian federal election. However, there are alleged examples of foreign interference in other democratic countries, most notably Russian interference in the 2016 United States presidential election and the 2022 midterm elections.[34] As noted by Internet 2.0:

… elections are a strategic opportunity to influence our democratic process. They [Australia's authoritarian competitors] believe that it is to their benefit if the voting population have less confidence in the true results of elections. By conducting divisive campaigns, they seek to divide our societies, to weaken us, and to fracture our uniting values.[35]

7.30 The Information and Influence University Partnership further explained the dangers of foreign interference in electoral processes, including through social media:

Foreign interference can thwart democratic participation in nation-building. … Disinformation operations can be used broadly to normalise anti-democratic views and actions, microtargeted to cultivate extremist views, and even to radicalise some segments of the population. Such operations may recruit and incite radicalised individuals and groups to engage in anti-democratic activities and assaults on the institution of democracy itself. Moreover, public displays of anti-democratic behaviours (and foreign operations to fuel them) may undermine the legitimacy of democracy as an institution, erode the citizens’ trust in the apparatus of democracy—parliament, government, and judiciary—and cause disillusionment, disenchantment, and apathy, thereby threatening democratic participation.[36]

7.31 As such, public education in relation to electoral matters was a specific area of interest raised by several witnesses. The Australian Electoral Commission (AEC) highlighted that:

Determining the success of any parliamentary election is ensuring that not only is every ballot counted, re-counted and results certified, but that Australians have confidence in the outcome.[37]

7.32 The AEC noted that Australia's laws regulate electoral and political communications; however, they do not, in general, regulate truth in electoral advertising:

… with one limited exception, it does not regulate truth in electoral communication. Furthermore, the High Court has found there is an implied freedom of political communication in the Commonwealth Constitution that limits the scope of the Parliament to make laws restricting political communications.[38]

7.33 Nevertheless, a survey conducted by the AEC found that 80 per cent of those surveyed:

… believed the Australian Government has a responsibility to educate people about disinformation on social media. In addition, 48 per cent of those surveyed reported seeing false or misleading content on social media (an increase from 38 per cent at benchmark) and 67 per cent agreed that groups/individuals deliberately spread false information online about voting.[39]

7.34 Witnesses drew attention to preparations for the 2022 federal election and the range of measures adopted by the AEC. These included a nationwide education and advertising campaign, the establishment of a disinformation register, working with platforms (including the development of a Statement of Intent for the Federal Election 2022), referring disinformation to online platforms for review, and responding directly to disinformation online (for example, through Twitter).[40]

7.35 In March 2023, it was reported that the AEC 'is looking to go a step further and set up information sharing arrangements with Australian Associated Press (AAP) Fact Check, RMIT FactLab and RMIT ABC Fact Check' in the lead-up to the referendum on an Indigenous Voice to Parliament.[41]

7.36 Platforms also have a role in 'the promotion of authoritative sources of information, partnering with fact checking organisations, and collaborating with the Australian Electoral Commission' to address election-related disinformation and to promote authoritative AEC content, including in the lead-up to The Voice referendum.[42]

7.37 TikTok drew attention to its election-related measures, including its work with partners such as AAP FactCheck, as well as prohibitions on political advertising, which in 2023 are being extended to restrict solicitations for campaign funding and political party donations.[43] It also highlighted its in-app election guide, which 'provided detailed, authoritative information on the election process, including information on where and how to vote and preferential voting explainers developed by the Commission [AEC]'.[44]

Australian Electoral Commission's 'Stop and Consider' campaign

7.38 For the 2022 federal election, the AEC established a strong digital presence and expanded on its 2019 'Stop and Consider' public education campaign, aimed at 'encouraging voters to stop, check and consider the source of electoral information they are consuming'.[45]

7.39 The AEC also placed advertising and other media across social media platforms, digital displays and search term advertising, and provided information in over 20 languages.[46]

7.40 The AEC observed that its approach was successful, and that its:

… digital presence [was] widely recognised as one of the most forthright and informative from a government agency. Delivering on the reputation management principles, the AEC took an approach focusing on being human in tone, swift and regular in reply, and knowledgeable. This approach helped solidify our digital presence as the authoritative voice on electoral processes, providing a defence to Australian democracy from the threat of disinformation.[47]

7.41 The AEC attributed the campaign's success to the fact that it directly addressed disinformation online, and to its friendly, often humorous approach, which engaged the public. Even the Washington Post reported on the campaign, quoting political scientist Ariadne Vromen: 'their meme game is pretty strong … and the informal language is really important. It's personalized. It's using everyday norms of engagement. And that is the kind of thing that people will notice and will share'.[48]

7.42 The AEC's very deliberate approach to informing and engaging the public, and encouraging the reporting of disinformation, resulted in numerous social media users reporting and tagging misleading content, opening the door for the AEC to step in with correct information.[49]

7.43 In its submission, the Law Society of New South Wales Young Lawyers recommended that the Electoral Integrity Assurance Taskforce be obliged to produce a periodic report on foreign interference through social media in Australian elections, stating that 'such a report would support vigilance among the Australian public as to what potential threats they may encounter online in upcoming elections'.[50]

School-based education

7.44 Regarding media literacy, the Joint Standing Committee on Electoral Matters (Electoral committee) previously stated in its Report on the conduct of the 2016 federal election and matters related thereto that:

… increased social media literacy, as part of a strengthened civics and electoral education curriculum, is a vital component in facing the challenges posed by this new social media environment. Australians must be better equipped to critically discern and judge any media which seeks to influence their voting behaviour.[51]

7.45 The Electoral committee subsequently recommended that:

Recommendation 30 | … the Australian Government consider ways in which media literacy can be enhanced through education programs that teach students not only how to create media, but also how to critically analyse it.

Recommendation 31 | … the Australian Electoral Commission examine ways in which media literacy can be incorporated into a modern, relevant civics education program.[52]

7.46 A number of submitters to this inquiry highlighted the importance of school-based information and media literacy campaigns to address disinformation and foreign interference through social media.

7.47 Ms Emily Moseley from ASPI highlighted the effectiveness of social media disinformation education in schools in northern Europe, stating that 'in particular, younger generations that are more influenced by social media are coming up in the generation and learning and determining which news sources are relevant or what to look for in terms of combatting disinformation'.[53]

7.48 Home Affairs noted that in 2022–23 the Australian Government provided $6 million in funding over three years to the Alannah and Madeline Foundation to:

… develop and deliver digital and media literacy education products for primary and secondary school students. These products will be made freely available to every Australian primary and secondary school, improving digital literacy and helping students to be critical, responsible and active citizens online.[54]

7.49 In 2022, the Australian Capital Territory Education Directorate also funded the News and Media Research Centre to co-create primary school educational resources, successfully demonstrating the value of civic online reasoning education.[55]

7.50 However, the News and Media Research Centre of the University of Canberra suggested that the current information literacy education model has failed, given the wide variations between state and territory institutional responses and because the teaching strategies used are commonly 'outmoded' and 'ineffectual'.[56]

7.51 Submitters, including University technical expert centres and the Australian Human Rights Commission (AHRC), recommended curriculum changes to ensure that information and media literacy is included in Australian schools and adapted to the current environment.[57] The News and Media Research Centre further recommended appropriate education and development programs for teachers to equip them with the required knowledge and skills.[58]

7.52 In discussing what content could be included in a school setting, Principle Co proposed teaching critical thinking and media literacy skills to students, which would allow them to 'sceptically question what [they] see and read to develop informed opinions from an early age', as well as examining the impacts of psychological processes such as cognitive miserliness, cognitive dissonance, confirmation bias, motivated reasoning, and repetition error.[59]

Mainstream media

7.53 Witnesses recognised that public broadcasters and mainstream media also have public education responsibilities to provide reliable, accurate and apolitical information about current events and political matters, with ASPI arguing that this information forms 'critical infrastructure' for informed decision-making.[60]

7.54 Discussing the declining trust in institutions, the News and Media Research Centre noted that 'when people think the mainstream media is not holding industries and governments to account, they view it as a mouthpiece for elite interests, and may be more likely to accept information that challenges conventional beliefs'.[61]

7.55 Social media is increasingly the source of news content for many Australians. Dr Seth Kaplan, a US political expert, told the committee 'there are significant numbers of people using it [social media] for news and for information and are being exposed to that. I think that has great implications for your social cohesion and the trust of your government in those vulnerable communities'. Dr Kaplan further argued that for foreign language speakers the news they consume is based on a single dominant view.[62]

7.56 This view was reiterated by Mr Kenny Chiu, a former Canadian MP:

I believe that in the multicultural society that we live in, respecting everybody's rights for the flow of information, unfortunately some of the diaspora community members are living in relative information isolation, and therefore that presents a huge opportunity for these regimes such as the CCP [Chinese Communist Party] to exploit and spread disinformation among them … they don't know what they don't know, and they have no way of fact-checking in their comfortable and preferred language.[63]

7.57 In order to address the risks posed to Australia's democracy by foreign interference through social media, Human Rights Watch made a range of recommendations, including promoting independent and professional journalism (in Chinese), investing in training and similar programs, and investing in open-source technologies that provide alternative channels of communication to enable communities to more easily bypass censorship.[64]

7.58 Principle Co similarly noted that governments need to 'avoid undermining' the work of journalists, stating that:

Strong safeguards should be developed to avoid future repeats of recent police raids on journalists, unlawful accessing of metadata to identify a journalist's source, and a spree of new national security legislation which criminalises journalistic activity.[65]

Diaspora communities

7.59 This section considers how communities and individuals, in particular diaspora communities, can be empowered to improve their resilience against foreign interference through social media, as well as to mitigate the impacts of disinformation, information campaigns, amplification and manipulation, propaganda and censorship.

7.60 The AHRC advised the committee of its concerns that foreign interference is increasing tensions and polarisation in Australia, writing that it:

… is increasingly disturbed by the role misinformation and disinformation plays in diminishing social cohesion, promoting distrust and division, and undermining principles of equality, respect and human dignity …

Where social media is utilised by foreign actors to sow discontent and division in pursuit of their own agendas, disinformation can have a serious impact on the rights and freedoms of all Australians.[66]

7.61 This view was supported by ASPI, which submitted that 'social media has created an unprecedentedly connected global community but this extensive reach has also allowed malign actors to intimidate, coerce and threaten violence beyond their borders, often with impunity and anonymity'.[67] It is also impacting on Australians' relationships with their family and friends overseas.[68]

7.62 As outlined in Chapter 4, the Australian Government has taken various steps to counter foreign interference through social media, including social cohesion, community liaison and outreach programs, media campaigns, and the publication of fact sheets by Home Affairs, ASIO and the Australian Federal Police (AFP).[69] Ms Sally Pfeiffer expanded on the work of Home Affairs:

… the Department of Home Affairs has a network of community liaison officers who worked very closely with culturally and linguistically diverse communities on a range of issues relating to Home Affairs activities, including some engagement on issues such as foreign interference. Those engagements are very important to us and we work very closely with those communities to make sure they have awareness of the potential for risks of foreign interference, and then what they can do to manage them.[70]

7.63 In July 2023, Home Affairs advised that since February 2023, engagements relating to foreign interference had been conducted with the Arab, Cambodian, Chinese, Eritrean, Indian, Iranian, Vietnamese, Kurdish, Russian, Rwandan and Ukrainian communities.[71]

7.64 Mr Stephen Nutt, Commander of Special Investigations with the AFP, told the committee of the importance of this type of work:

… my personal view is that the best strategy to defend against foreign interference is through high community and agency awareness and also knowledge of how to report it. If it's reported, we're able to understand it. And once we're able to understand it, we can work to disrupt and look at vulnerabilities that need to be corrected not only on individual case basis, but at more system level. So it's absolutely key.[72]

7.65 This community outreach approach has been welcomed, with Dr Dowse from RAND Australia telling the committee:

I know that the Department of Home Affairs has been looking specifically at communities where English is not the first language, and I think that is a very intelligent priority because a lot of diaspora communities have their own sources of information which may not be mainstream and might not be traditional social media either. So I think understanding where those diaspora communities get their news and their information is important. It may change from diaspora to diaspora. I think understanding that is important if we're going to have a harmonious society, whether that's ideological or related to where the ethnic origins of portions of our society are. I think, to start with, that understanding where those potential sources of division might come from, including the sources of information and the influences … and then understanding how you can balance that where there's concern about unfactual sources of information is an important priority. So I was really happy to see that Home Affairs has got that as a priority.[73]

7.66 The eSafety Commissioner also emphasised the targeted nature of the support it provides:

eSafety aims to provide support to a range of communities and groups, particularly to those most at risk of online harm. Our regulatory schemes can assist individuals who may be susceptible to foreign interference, such as diaspora groups, those with low digital literacy, and minority groups who may be the target of cyber abuse and illegal activity. These schemes intersect on a range of issues to provide multiple avenues of support.

In addition to targeted misogynistic abuse and gendered disinformation, women in the public eye who challenge autocratic leaders, traditional male power structures or human rights and gender issues tend to attract more organic and coordinated vitriol online.[74]

7.67 Chapter 6 expanded on the steps platforms have undertaken to improve transparency and the awareness of disinformation within diaspora communities and the public. Examples include platforms publishing enforcement and threat reports,[75] labelling state media accounts,[76] and Meta's development of a policy-focussed paper on misinformation and disinformation amongst diaspora groups, with an emphasis on Chinese language.[77]

7.68 However, other witnesses argued that more needs to be done. ASPI submitted that law enforcement and intelligence agencies should increase their community engagement to raise awareness, establish a reporting scheme to 'counter transnational repression', and reassure targeted individuals and groups.[78]

7.69 The Australian Tibetan Communities Association argued for better consultation by government with civil society and affected communities,[79] including in the development of new measures to counter foreign interference through social media and the drafting of new legislation.[80]

7.70 Specifically regarding the Chinese-Australian community, the Council on Middle East Relations urged the Australian Government to establish and maintain a presence on Chinese-language social media platforms in order to engage Chinese-Australians and Taiwanese-Australians and provide resources to assist these groups in identifying misinformation and disinformation.[81] The China Policy Centre likewise encouraged such outreach to Chinese-Australians, stating that it would serve three key purposes:

First, it enables the better communication of government policy, strategy and priorities with the opportunity to hear community feedback …

Second, regular and effective outreach by the Australian Government can assist with early identification of community concerns, and new challenges and threats. Chinese-Australians are the main targets of China's propaganda, misinformation, cyber surveillance, and censorship efforts through social media. Regular outreach provides an avenue to counter Chinese-language misinformation, especially in relation to government policy.

Third, the support of Chinese-Australians is crucial for an effective strategy addressing China's foreign interference efforts.[82]

7.71 The AHRC noted challenges to the rights to freedom from discrimination and freedom of expression, citing harassment, surveillance and censorship of Australians by social media. The AHRC argued that:

Transparency is the key to ensuring that censorship (including extraterritorial censorship) does not unduly restrict the exercise of free speech in Australia. With respect to the last of these examples, the Commission would endorse the recommendation previously made by the ASPI International Cyber Policy Centre that governments 'should mandate that all social media platforms publicly disclose, in detail, all the content they censor and make it an offence to censor content where that has not been publicly disclosed to users'.[83]

7.72 Dr Bruce Arnold of the University of Canberra submitted that an informed populace was in a significantly better position to be resilient. Dr Arnold observed that 'it is … viable to emphasise community self-help, with participants in social media platforms being alert to the likelihood of misrepresentation and equipped to discern that statements are malicious'. Dr Arnold further noted that 'ultimately a savvy Australian population–with access to information through a public transparency regime–is the best defence against foreign interference and misbehaviour by domestic actors'.[84]

7.73 Ms Shanthi Kalathil agreed, telling the committee:

We are really going to have to work on the long-term education and awareness raising of not just populations but different institutions of society …

I think the Nordic countries, for instance, have several decades of history of building education and awareness about information into their educational systems from a young age. There are certainly examples we can look to around the world that can provide, if not a blueprint, some kind of indication about what might work and what might not work. So I think we need to do some serious looking at some of these examples from around the world to understand this problem better.[85]

7.74 Regarding what practical activities individuals can undertake immediately, Principle Co encouraged Australians to:

follow a wide spectrum of sources;

seek out diversity of perspective, including by making sure that you are not just following voices that you naturally agree with; and

ask questions, such as: who produced this content, what reaction is it seeking, and how did the information get to you?[86]

Footnotes

[1]See, for example: Information and Influence University Partnership (comprising University of Adelaide, University of Melbourne and University of New South Wales), Submission 28, p. [3]; Meta, Submission 32, p. 18; Department of Home Affairs, Submission 1, pp. 4–5; Australian Security Intelligence Organisation, Submission 2, p. 7; Australian Human Rights Commission, Submission 9, p. 9; News and Media Research Centre, University of Canberra, Submission 21, pp. 4–5.

[2]Reset Australia, Submission 26, pp. 2 and 7–8. See also: CyberCX, Submission 16, pp. 9–10.

[3]Reset Australia, Submission 26, p. 2; CyberCX, Submission 16, pp. 4 and 9; Australian Strategic Policy Institute, Submission 13, p. 12.

[4]Mr Kenny Chiu, Private capacity, Committee Hansard, 21 April 2023, p. 16.

[5]Emphasis in the original submission. Defence and Security Institute, Submission 3, pp. 3 and 7. See also: Australia Institute, Submission 31.2: Foreign Interference through Social Media inquiry (46th Parliament), p. 20.

[6]University of New South Wales (NSW) Allens Hub for Technology Law and Innovation, Deakin University Centre for Cyber Security Research and Innovation and Institute of Electrical and Electronics Engineers (IEEE) Society on Social Implications of Technology (University Centres Joint Submission), Submission 19, pp. 6–7. See also: Australia Institute, Submission 31: Foreign Interference through Social Media inquiry (46th Parliament), p. 4., p. 2.

[7]Professor Rory Cormac, Director, Centre for the Study of Subversion, Unconventional Interventions and Terrorism, University of Nottingham, Committee Hansard, 20 April 2023, p. 40.

[8]University Centres Joint Submission, Submission 19, pp. 6–7; Defence and Security Institute, University of Adelaide (Defence and Security Institute), Submission 3, pp. 2–3 and 7.

[9]Information and Influence University Partnership, Submission 28, pp. [4–5].

[10]Improved transparency and access to platform data is discussed in Chapter 6. Reset Australia, Submission 26, p. 2; CyberCX, Submission 16, p. 19.

[11]Mr Josh Machin, Head of Public Policy, Australia, Meta, Committee Hansard, 11 July 2023, p. 8.

[12]Reset Australia, Submission 26, p. 2; CyberCX, Submission 16, p. 9.

[13]Defence and Security Institute, Submission 3, pp. 2–3 and Reset Australia, Submission 26, p. 2.

[14]CyberCX, Submission 16, p. 9.

[15]Meta, Submission 32, p. 17.

[16]Australian Supporters of Democracy in Iran, Submission 23, pp. 4–5.

[17]See, for example: News and Media Research Centre, University of Canberra, Submission 21, pp. 5 and 17; Law Council of Australia, Submission 27, p. 3; CyberCX, Submission 16, pp. 4 and 8; Australian Tibetan Community Association, Submission 31, p. 5; Department of Home Affairs, Submission 1, pp. 4–6; Australian Human Rights Commission, Submission 9, pp. 9–10; Information and Influence University Partnership, Submission 28, pp. [2–3]; RAND, Submission 11, p. 3; Ms Mia Garlick, Regional Director of Policy, Meta, Committee Hansard, 11 July 2023, p. 9.

[18]See, for example: News and Media Research Centre, Submission 21, p. 5; CyberCX, Submission 16, pp. 2 and 7–8; Reset Australia, Submission 26, pp. 1–2; Australian Muslim Advocacy Network, Submission 24, p. [3]; Australian Human Rights Commission, Submission 9, pp. 5–7; RAND Australia, Submission 11, p. 2.

[19]Law Council of Australia, Submission 27, p. 2.

[20]News and Media Research Centre, Submission 21, pp. 4 and 5.

[21]Department of Home Affairs, Submission 1, pp. 4–6; Department of Home Affairs, answers to questions on notice (no. 7), 12 July 2023 (received 18 July 2023).

[22]TikTok, Submission 30, pp. 1, 4–5 and 10; Meta Australia, Submission 32, pp. 14–16 and 18; Digital Industry Group Inc., Submission 36, p. 2.

[23]CyberCX, Submission 16, p. 8.

[24]Law Council of Australia, Submission 27, p. 3; Information and Influence University Partnership, Submission 28, pp. [2–3]; Australian Strategic Policy Institute, Submission 13, pp. 14–15.

[25]Law Council of Australia, Submission 27, p. 3.

[26]CyberCX, Submission 16, pp. 7–8; Ms Vicky Xu, Senior Fellow, Australian Strategic Policy Institute, Committee Hansard, 21 April 2023, pp. 35–36.

[27]Mr Kalsang Tsering, President, Australian Tibetan Community Association, Committee Hansard, 21 April 2023, p. 26.

[28]CyberCX, Submission 16, pp. 7–8; Australian Tibetan Community Association, Submission 31, p. 5; Australian Human Rights Commission, Submission 9, pp. 9–10 and 12; University Centres Joint Submission, Submission 19, Attachment 1, p. [4].

[29]Australian Human Rights Commission, Submission 9, p. 10; eSafety Commissioner, Submission 10, p. [2]; CyberCX, Submission 16, p. 4.

[30]Mr Albert Zhang, Analyst, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 24.

[31]News and Media Research Centre, Submission 21, p. 17.

[32]Dr Andrew Dowse, Director, RAND Australia, Committee Hansard, 20 April 2023, pp. 27 and 30.

[33]Mr Richard Salgado, Director, Law Enforcement and Information Security, Google, Committee Hansard: Foreign Interference through Social Media inquiry (46th Parliament), 30 July 2021, p. 13.

[34]Australian Electoral Commission, Submission 8, p. 2; RAND Australia, Submission 11, pp. 2–3; Australian Strategic Policy Institute, Submission 13, pp. 2 and 4; Dr William Stolz, Senior Fellow, National Security College, Australian National University, Submission 18, pp. 15–16; Reset Australia, Submission 26, pp. 5–6; Rachel Lee, Prudence Luttrell, Matthew Johnson and John Garnaut, Submission 34, pp. 4–5; Professor Rory Cormac, University of Nottingham, Committee Hansard, 20 April 2023, p. 38.

[35]Internet 2.0, Submission 17, p. 6.

[36]Information and Influence University Partnership, Submission 28, p. [2].

[37]Australian Electoral Commission, Submission 8, p. 1. See also CyberCX, Submission 16, p. 2.

[38]Australian Electoral Commission, Submission 8, p. 2. See also Ms Sally Pfeiffer, Acting First Assistant Secretary, Counter Foreign Interference, Department of Home Affairs, Committee Hansard, 12 July 2023, pp. 8–9.

[39]Australian Electoral Commission, Submission 8, p. 3.

[40]Department of Home Affairs, Submission 1, p. 4. See also the Australian Human Rights Commission, Submission 9, p. 12; Australian Strategic Policy Institute, Submission 13, p. 9; Digital Industry Group Inc., Submission 35, pp. 4–5.

[41]Sam Buckingham-Jones and Mark Di Stefano, 'AEC eyes tie-up with fact-checkers for Voice referendum', Australian Financial Review, 19 March 2023.

[42]Department of Infrastructure, Transport, Regional Development, Communications and the Arts, Submission 7, p. 2. See also DIGI, Submission 35, p. 5; Ms Mia Garlick, Regional Director of Policy, Meta, Committee Hansard, 11 July 2023, p. 2.

[43]TikTok, Submission 30, pp. 5 and 6.

[44]TikTok, Submission 30, p. 6.

[45]Australian Electoral Commission, Submission 8, p. 3.

[46]Australian Electoral Commission, Submission 8, p. 4; Mr Tom Rogers, Electoral Commissioner, Australian Electoral Commission, Committee Hansard, 12 July 2023, p. 36.

[47]Australian Electoral Commission, Submission 8, p. 3.

[48]Michael E Miller and Frances Vinall, 'The Twitter account defending Australian democracy', The Washington Post, 14 May 2022. See also: Cait Kelly, '"Firm but friendly": how the AEC Twitter account is winning friends and influencing people', The Guardian, 11 February 2022; Michelle Elias, '"Leave the eggplant off": How the AEC is engaging online to counter fake claims about voting', SBS News, 16 March 2022.

[49]Australian Electoral Commission, Disinformation register: 2022 Federal Election, 10 January 2023 (accessed 21 July 2023).

[50]Law Society of New South Wales Young Lawyers, Submission 33, p. 8.

[51]Joint Standing Committee on Electoral Matters, Report on the conduct of the 2016 federal election and matters related thereto, November 2018, p. 61.

[52]Joint Standing Committee on Electoral Matters, Report on the conduct of the 2016 federal election and matters related thereto, November 2018, pp. xxviii-xxix.

[53]Ms Emily Mosley, International Cyber Policy Centre Coordinator, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 24.

[54]Department of Home Affairs, Submission 1, p. 6.

[55]News and Media Research Centre, Submission 21, p. 9.

[56]News and Media Research Centre, Submission 21, p. 6.

[57]University Centres Joint Submission, Submission 19, pp. 2–3, 6 and 16 and Attachment 1, p. [1]; Australian Human Rights Commission, Submission 90: Foreign Interference through Social Media inquiry (46th Parliament), p. 10.

[58]News and Media Research Centre, Submission 21, p. 16.

[59]Principle Co, Submission 25: Foreign Interference through Social Media inquiry (46th Parliament), p. 8.

[60]Australian Strategic Policy Institute, Submission 13, p. 7. See, for example: Department of Home Affairs, Submission 1, p. 5; Special Broadcasting Service, Submission 14, pp. 1–4.

[61]News and Media Research Centre, Submission 21, p. 5. See also: Australian Muslim Advocacy Network, Submission 24, p. [4].

[62]Dr Seth Kaplan, Private capacity, Committee Hansard, 20 April 2023, p. 15. See also: Sora Park, Caroline Fisher, Kieran McGuinness, Jee Young Lee and Kerry McCallum, Digital news report: Australia 2021, June 2021, p. 10; Badiucao, Private capacity, Committee Hansard, 21 April 2023, p. 37.

[63]Mr Kenny Chiu, Private capacity, Committee Hansard, 21 April 2023, p. 16.

[64]Human Rights Watch, Submission 12, p. 8. See also: Ms Melissa Harrison, Submission 5: Foreign Interference through Social Media inquiry (46th Parliament), p. 26; Badiucao, Private capacity, Committee Hansard, 21 April 2023, p. 37.

[65]Principle Co, Submission 25: Foreign Interference through Social Media inquiry (46th Parliament), p. 5.

[66]Australian Human Rights Commission, Submission 9, p. 7.

[67]Australian Strategic Policy Institute, Submission 13, pp. 3 and 15. See also: Dr William Stoltz, National Security College, Australian National University, Submission 18, pp. 17–18.

[68]Australian Tibetan Communities Association, Submission 31, pp. 2–3; Mr Kalsang Tsering, President, Australian Tibetan Community Association, Committee Hansard, 21 April 2023, pp. 24–26; Ms Vicky Xu, Senior Fellow, Australian Strategic Policy Institute, Committee Hansard, 21 April 2023, p. 34.

[69]See, for example: Mr Stephen Nutt, Commander, Special Investigations, Counter Terrorism and Special Investigations Command, Australian Federal Police, Committee Hansard, 12 July 2023, pp. 18 and 22.

[70]Ms Sally Pfeiffer, Department of Home Affairs, Committee Hansard, 12 July 2023, p. 7.

[71]Department of Home Affairs, answers to questions on notice (no. 7), 12 July 2023 (received 18 July 2023).

[72]Mr Stephen Nutt, Australian Federal Police, Committee Hansard, 12 July 2023, p. 22.

[73]Dr Andrew Dowse, Director, RAND Australia, Committee Hansard, 20 April 2023, p. 30.

[74]eSafety Commissioner, Submission 10, p. [2].

[75]Meta Australia, Submission 32, pp. 15–16; Ms Emily Mosley, International Cyber Policy Centre Coordinator, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 23.

[76]TikTok, Submission 30, pp. 6 and 9; Mr Fergus Ryan, Analyst, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 20.

[77]Meta Australia, Submission 32, p. 14.

[78]Australian Strategic Policy Institute, Submission 13, p. 15.

[79]Australian Muslim Advocacy Network, Submission 24, pp. [18–19]; Australian Tibetan Communities Association, Submission 31, p. 5.

[80]Australian Tibetan Communities Association, Submission 31, p. 5.

[81]Council on Middle East Relations, Submission 25: Foreign Interference through Social Media inquiry (46th Parliament), p. 4.

[82]China Policy Centre, Submission 4: Foreign Interference through Social Media inquiry (46th Parliament), p. 3.

[83]Australian Human Rights Commission, Submission 9, p. 16. See also: Human Rights Watch, Submission 12, p. 8.

[84]Dr Bruce Arnold, Submission 7: Foreign Interference through Social Media inquiry (46th Parliament), p. 7. See also John Abdelmalek, Kyron Johnson, Michael Barberio, and Mathew Bubica, with Dr James Scheibner, Submission 40: Foreign Interference through Social Media inquiry (46th Parliament), p. 20.

[85]Ms Shanthi Kalathil, Private capacity, Committee Hansard, 20 April 2023, pp. 4–5.

[86]Principle Co, Submission 25: Foreign Interference through Social Media inquiry (46th Parliament), p. 8.