Chapter 5 - Government: ideas for action

5.1The Australian Security Intelligence Organisation (ASIO) described the threat of foreign interference as being at the highest level in Australia’s history.[1]

5.2Throughout this inquiry, the Select Committee on Foreign Interference through Social Media (committee) received advice from expert organisations, members of the community, government agencies and the platforms themselves on steps which are urgently required to improve the countermeasures taken against foreign interference through social media, and to protect Australia’s democratic resilience.

5.3A key theme emerged among the recommendations: the need for cross-sector, cross-border alliances, where government, platforms and civil society work together to tackle the threat of foreign interference. These recommendations are discussed in the next three chapters. While many of these recommendations are interconnected, they have been grouped according to the entity that would lead each one: government, platforms or civil society. This chapter outlines proposals for action where government would be either the sole or the lead entity.

5.4The committee heard a consistent message from a range of experts and members of impacted communities: government is not doing enough to address foreign interference and coordinated inauthentic behaviour (CIB) through social media.

5.5CyberCX submitted a range of recommendations and noted that it believes the Australian Government’s responses have not kept pace with the nature and scale of the risks of foreign interference through social media.[2]

5.6The Australian Strategic Policy Institute (ASPI) made a similar overall observation, and advised:

Today, our key recommendation for the Australian government is it needs to lead on the issue and not rely on social media platforms and civil society fact checkers to counter disinformation. This includes incorporating counter foreign interference through social media into a broader national security strategy and mandating platforms to disclose covert state backed influence operations…The ongoing issue of foreign interference through social media requires a whole-of-government and proactive strategy to protect the integrity of our democracy and the wellbeing of our citizens.[3]

Lead agency

5.7There have long been concerns expressed by the private sector expert community on the ‘lack of clarity on which parts of the Australian Government have leadership and authority in protecting which parts of Australia’s information environment—the public sector, the private sector and individual citizens—from interference by foreign actors acting with malicious intent’.[4]

5.8These concerns were repeated by many submitters and witnesses, who noted that although there are numerous specialist taskforces (outlined in Chapter 4), a single lead agency is needed to take core responsibility, while ensuring appropriate coordination and collaboration with other agencies, platforms and private sector experts.

5.9Indeed, the previous report of this committee also identified the absence of a single lead focal point for cyber-enabled foreign interference. This has been a long-standing problem across multiple governments. It should be noted that submitters acknowledged the need for a whole-of-government approach; a single lead agency would coordinate, not replace, that approach.

5.10CyberCX recommended:

A single lead entity is needed to coordinate government policy and action on cyber-enabled interference. Importantly, it will also create a clear ‘one-stop shop’ for citizens and private sector stakeholders to report, and seek advice about, foreign interference risk linked to social media.[5]

5.11ASPI also recommended the government take a stronger lead on this issue:

…to ensure clarity on which government institutions are responsible for dealing with cyber-enabled foreign interference from both an operational and policy perspective. Currently, uncertainty as to whom in government has responsibility is creating a disincentive for victims to report while not providing disincentives to perpetrators to cease their malicious activity.[6]

5.12ASPI further recommended that ‘cyber-enabled foreign interference should be incorporated into broader cybersecurity and national security strategies’.[7]

5.13The Australian Human Rights Commission (AHRC) noted:

In order to address this specific risk, the Australian Government should establish a permanent whole-of-government taskforce dedicated to preventing and combating cyber-manipulation in Australia. The terms of reference for this taskforce should extend beyond those of the Electoral Integrity Assistance Taskforce to encompass not solely threats to the integrity of a federal election or electoral integrity, but threats to Australia’s democracy more broadly.[8]

5.14A joint submission from specialist university research centres noted the complexity of the many and varied government agencies and taskforces with roles in this space, observing that:

…it is unclear what exactly each governmental body’s role is and whether foreign interference on social media falls within each body’s scope of concern. Moreover, there is often little information about how much progress these various governmental bodies have made in monitoring or mitigating disinformation. Compounding this complexity, there does not appear to be a single body responsible for coordinating the government’s response to the risks posed to democracy online (whether it be through disinformation or other phenomena on social media and the internet more broadly). Nor is there a single point of contact for the public regarding these risks.[9]

Standards-based regulation

5.15The inquiry examined in depth whether government should address the risks posed by specific platforms and applications by banning them, or whether a platform-neutral approach should be taken to set standards that all platforms must meet. However, the committee noted there is a clear distinction between platforms that originate in authoritarian states and those that originate in liberal democracies, with a specific set of risks attached to the former; that risk is discussed in the following chapter on platforms.[10]

5.16More than one witness described the approach of banning specific platforms as ‘whack-a-mole’, where dealing with risks platform by platform simply meant a new platform would pop up to replace each banned one.

5.17ASPI advised that legislated standards were preferable to bans on individual applications, but noted that ‘there's also a need for bespoke legislation that accepts the fact that there is a unique problem with social media apps that come from authoritarian countries’, and pointed to the United States RESTRICT Act as a step in the right direction.[11]

5.18Ms Lindsay Gorman of the Alliance for Securing Democracy also recommended an approach that outlines the safety standards expected of platforms and apps, but that also takes into account the additional risks inherent in apps based within authoritarian states:

My first recommendation was to come up with this overarching framework that includes the threat of platforms from authoritarian regimes and really spells out what the concerns are. Is it scale? Is it the degree of ownership and influence? Is it the type of platform itself? ... I think there needs to be a much broader framework and a much clearer framework so that when authoritarian internet apps do come to democracies, as they inevitably will in open societies, we have a framework for addressing it, and we don't have to wait until it becomes this behemoth of an issue to deal with a specific platform.[12]

5.19Ms Gorman pointed to the Prague proposals on 5G internet as a ‘fantastic starting point for this kind of legislative development which does identify countries of origin while also trying to promote standards that apply to any internet platform’.[13]

5.20Ms Gorman further advocated separate regulation of AI-generated content, because the ‘democratisation of the ability to create extremely realistic but completely fake content… increases avenues for propaganda [that is] designed to spoof and generate content that looks plausible, and sounds like it might be written by a human, but isn't necessarily true’. Ms Gorman expressed concern that ‘our knowledge base of what is actually true becomes at risk and our very trust in information becomes downgraded’.[14]

5.21For example, Ms Gorman pointed to existing content authenticity frameworks such as watermarks, which could be mandated ‘so you can tell when an image has been manipulated and when it hasn't, and you can track the history of that image’.[15]

5.22The AHRC recommended the Australian Government ‘introduce transparent user-data privacy and user-data protection frameworks that apply to all social media and internet companies’ and further recommended that any company that ‘refuses to comply with such frameworks should not be able to operate in Australia’.[16]

5.23ASIO noted that social media ‘has benefited from an era of unprecedented connectivity with limited regulation, thus creating the conditions to allow disinformation and misinformation to proliferate and indeed flourish’ and further noted that there are few laws governing online activity and the extraterritorial nature of the activity means that attribution to a foreign power ‘can be difficult to ascertain, even where such laws come into play’.[17]

Proportionality

5.24While calling for greater regulation of social media platforms to protect Australians from foreign interference, many expert organisations also noted the need for proportionality, so that solutions do not stray into curtailing the freedoms that are the bedrock of liberal democracies.

5.25The AHRC noted that ‘social media can be used for purposes that both strengthen or undermine Australia’s democracy and values’ and further advised:

Striking the right balance between regulating online activities and protecting free expression is an ongoing challenge. While there is a clear need to combat misinformation and disinformation online, there is also a risk that in doing so different perspectives and controversial opinions may be targeted. While reasonable minds may differ on exactly where the line should be drawn, if we fail to ensure robust safeguards for freedom of expression online, then the very measures taken to combat misinformation and disinformation could themselves risk undermining Australia’s democracy and values.[18]

5.26Mr Kenny Chiu, a former Canadian MP who had been subjected to foreign interference during an election, also argued against ‘blunt tools’ and called for proportionality that will ‘protect what we have treasured and worked so hard for while at the same time countering the disinformation and interference that we are facing together—not just as Australians but also with other countries such as Canada, the UK and the US’.[19]

5.27Human Rights Watch noted that it does not support bans because it wants to see free speech and freedom of expression, ‘but we want there to be credible news sources, and for people in these communities to be aware of these alternative news sources’.[20]

5.28Meta also argued in favour of regulation being considered ‘with regard to individual human rights, such as freedom of expression, freedom of speech, access to information and, of course, privacy’.[21]

5.29There was extensive discussion on the issue of censorship; this discussion is captured below in the section on the proposed new powers for the Australian Communications and Media Authority (ACMA).

Proposed new powers for the Australian Communications and Media Authority

5.30In June 2023, the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (Department of Communications) released an exposure draft of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023 (Disinformation bill).

5.31In its report on digital platforms’ disinformation measures discussed in Chapter 3, the ACMA noted that while steps being taken under the voluntary Australian Code of Practice on Disinformation and Misinformation (DIGI code) were good first steps, the ACMA recommended it have ‘a graduated set of new powers to combat misinformation and disinformation across the sector’ which would ‘increase transparency and ensure that digital platform services are held to account if voluntary industry efforts prove to be inadequate’.[22]

5.32In response to that report, the proposed powers in the Disinformation bill have been crafted to:

Enable the ACMA to gather information from platforms and require digital platform providers to keep certain records about matters regarding mis- and disinformation.

Enable the ACMA to request that industry develop a code of practice covering measures to combat mis- and disinformation on digital platforms, which the ACMA could register and enforce.

Allow the ACMA to create and enforce an industry standard (a stronger form of regulation), should a code of practice be deemed ineffective in combatting mis- and disinformation on digital platforms.[23]

5.33Digital Industry Group Inc. (DIGI), the industry body responsible for the current DIGI code, has expressed in principle support for the proposed powers outlined in the ACMA’s report.[24]

5.34The Department of Communications outlined that the intent of the Disinformation bill is to:

…tackle content that is reasonably likely to cause or contribute to serious harm. That, in a sense, is the intent of the bill: in a sense, for digital platforms to take steps and responsibility for the content on those platforms and, in doing so, take steps to address content that they judge could be likely to cause serious harm. So the focus is on the harm and the content rather than the intent or the source of the content, if that makes sense.[25]

5.35The Disinformation bill defines misinformation and disinformation as follows:

Misinformation is online content that is false, misleading or deceptive, that is shared or created without an intent to deceive but can cause or contribute to serious harm.

Disinformation is misinformation that is intentionally disseminated with the intent to deceive or cause serious harm.

Serious harm is harm that affects a significant portion of the Australian population, economy or environment, or undermines the integrity of an Australian democratic process.[26]

5.36ASPI did not agree that the above definitions were clear enough, ‘so it leaves subjectivity to the social media providers to decide what the litmus is for content that should be taken down’.[27]

5.37Meta described the current model of the voluntary DIGI code as ‘a good way to both address the concerns of policymakers and communities about the transparency of the approach that we take to combatting misinformation and foreign interference, while also balancing the very important imperative of protecting free expression and making sure we're not overly censoring people online’. Meta further argued that the draft legislation goes further than just enforcing the DIGI code, and warned that it saw ‘some potential for that power to be abused or for it to be used in a way [that] inadvertently chills free and legitimate political expression online’.[28]

5.38The Department of Communications disagreed with the premise that the Disinformation bill may have an unintended consequence of censorship or chilling of online discourse; however, it did concede that the overall effect of the draft bill would be to censor people who shared content that was unintentionally false, but only where that content causes serious harm.[29]

5.39Google and YouTube stated they were still working on reaching a formal position on the bill, but expressed initial support for the proposals, including giving the ACMA the powers to enforce the DIGI code. Google reminded the committee that it was one of the first signatories to the DIGI code, and wanted more companies to sign up to increase its coverage.[30]

5.40In media reports, Victorian barrister Peter Clarke described the Disinformation bill as a ‘dangerous piece of legislation’ and noted that the definitions of what constitutes harm were ‘so vague as to be dangerous’. Similarly, Sydney barrister Jeffrey Phillips SC said the bill ‘poses the ability to censor and shuts down debate’ and was a ‘bad direction for us to be going in’.[31]

5.41The Department of Communications is undertaking a consultation on the exposure draft, seeking submissions by 6 August 2023, which will inform any changes to the draft bill.[32]

Legal remedies

5.42In addition to the sector-wide approach of regulating platforms to ensure they are compliant with meeting responsibilities to reduce CIB, the committee heard that more could be done to improve the legal frameworks for reporting and charging people over instances of suspected foreign interference.

Improved reporting

5.43The Australian Federal Police (AFP) stressed that reporting instances of suspected foreign interference is the first line of defence:

Senator, my personal view is that the best strategy to defend against foreign interference is through high community and agency awareness and also knowledge of how to report it. If it's reported, we're able to understand it. And once we're able to understand it, we can work to disrupt and look at vulnerabilities that need to be corrected not only on individual case basis, but at more system level.[33]

5.44The AFP advised that foreign interference allegations ‘come to us through a number of means … through Crime Stoppers, through direct reports to the AFP, referred from other departments, as well as through the National Security Hotline, as well as other government departments’.[34] ASIO likewise submitted that it has a number of channels for the reporting of concerns about being contacted for the purpose of foreign interference for clearance holders, members of the public, government organisations, and private sector companies.[35]

5.45Home Affairs likewise stressed the importance of people reporting instances of foreign interference through the National Security Hotline and noted that ‘there are certainly times when there is an increase in certain communities reporting suspected foreign interference or interference, and we work very closely with those communities and will step up engagement with those communities to respond to that when we see those occurring’.[36]

5.46Mrs Kateryna Argyrou of the Australian Federation of Ukrainian Organisations told the committee that her organisation made sure to report all instances of harassment and interference:

Every time we've had a community member who has been harassed online or threatened—and we've had many community members who've received death threats through Telegram channels, through Facebook Messenger and through other social media—we've reported that [to] the police.[37]

5.47Mr Peter Murphy of Australian Supporters of Democracy in Iran noted the challenges people face in reporting foreign interference, but said that once reported, matters appeared to be taken seriously:

It was a bit of a merry-go-round. We'd ring the Federal Police, and they'd say go to the New South Wales police. We'd ring the New South Wales police, and they'd say go to the Federal Police, so eventually I had to go in person to the Federal Police during the pandemic because nobody would really talk on the phone. But I think it was a reasonable exchange. I was interrogated or questioned and was able to convey all the information I wanted to give, and it was taken seriously ... I’m more confident that our agencies are getting a clearer view of what is going on and are more willing to take action to protect people here than they might have been before.[38]

5.48However, Human Rights Watch raised concerns that members of diaspora communities in Australia who may be subjected to threats and intimidation ‘don't know what to do about those threats or who to report them to’ and recommended an information campaign to advise diaspora communities in Australia, including in other languages, not just English.[39]

5.49The AFP noted that, due to the sensitivities of the reported issues, people reporting via the National Security Hotline were generally not kept informed of progress on the matter they reported, and sometimes did not even receive acknowledgement that a report had been made. However, the AFP did stress the continued importance of making such reports, particularly because reports are then sent to ‘multiple agencies and agencies that need to know’.

5.50The AFP went on to advise that:

In terms of our response, we look at matters on a harm basis. So a high-harm report is something that is prioritised now. High harm in terms of community interference, or foreign interference in community, or transnational repression—it depends on which stream you're listening to—is really where there are actual threats or demands with menace that involve the potential for serious physical harm being directed at a person or a community. Those are the priority areas that we look at.[40]

5.51The AFP further advised that ‘The biggest challenge, as I pointed out earlier, is that foreign principal element. I know that there is a quantum of reporting where people are feeling intimidated or feeling that they're subject to foreign interference. In our case, if we are unable to establish a foreign actor involvement, that does limit our ability to respond under the foreign interference legislation’.[41]

Charges and prosecution

5.52Despite the prevalence of social media-based foreign interference, harassment and threats being experienced by diaspora communities in Australia—many of whom are Australian citizens—the AFP advised the committee there had been no instances where a person was charged under the National Security Legislation Amendment (Espionage and Foreign Interference) Act 2018 (Espionage Act) with foreign interference via a social media platform.[42]

5.53The AFP advised the committee of the range of elements that must exist in order for foreign interference to meet the threshold for offences under the Espionage Act:

… our primary consideration is the involvement of a foreign actor and whether an individual is working in collaboration with a foreign actor or as a proxy. That's a key requirement. And then obviously we have to look at the other elements of the offence, whether it goes to government processes, political processes, political rights or democratic rights being exercised, and whether an intelligence objective of the foreign principal is what the activity is related to or it's adverse against the national security of Australia. We then consider whether the conduct involved an element of covert and deception or whether there was a threat of serious harm or the demand with menace. That's the collective of what needs to be proven.[43]

5.54The AFP did note that where an action might not meet the threshold of the Espionage Act—which requires either a foreign principal or a proxy of a foreign principal to be undertaking the action—the action may still be a crime under other laws:

There also may be other criminal investigations, because conducting a threat or harassment or menace online is a separate offence, not under the foreign interference legislation…And of course if the posting is potentially hate crime related, this is not a Commonwealth jurisdiction; it's something that will fall to state and territory police.[44]

5.55The AFP stressed that ‘no report is lost’ and outlined that where individuals are criminally targeting the community, the AFP does prioritise those reports. Where a report does not fall within its jurisdiction, the AFP refers it to state and territory police; it advised the committee that ‘we don't just not respond’.[45]

5.56ASIO also noted a range of activities it can undertake where charges cannot be laid:

There are other things we do in the course of our engagement. We can engage with victims as well as perpetrators of foreign interference or people that we believe are likely to be involved in perpetrating foreign interference. That can have an effect of adding resilience, rendering the environment less permissive, if you like. I wouldn't go into the nature of those engagements. They're confidential by their nature. The point I'd leave you with is that the taskforce has a range of options at its disposal to seek to disrupt and reduce the harm from foreign interference, including what we call community interference or transnational repression.[46]

5.57The Law Council of Australia noted that widespread use of the foreign interference offences under the Espionage Act and the Criminal Code Act 1995 (Criminal Code) was unlikely 'given the challenges that exist in relation to successfully investigating and prosecuting persons who commit this offence when the "conduct" occurs outside Australia'. Nevertheless, the Law Council of Australia suggested that these measures could still prove useful in conjunction with other approaches.[47]

5.58This was supported by evidence from witnesses such as Mrs Kateryna Argyrou from the Australian Federation of Ukrainian Organisations, who told the committee:

Last time I checked, we'd had over 40 official complaints registered with police, with registered numbers. We continue to add to that list. The Australian law enforcement agencies have been very open and willing to listen, but I think there is an issue here between where they can use the full force of the law and where they can just sympathise and say, 'We hear you, but there is really nothing we can do' … So this is where we would hope that you [this inquiry] can hear our concerns on that point of view, and something can be discussed or done on that front.[48]

5.59A joint submission from specialist university research centres suggested that changes to relevant laws would be necessary to enable the use of technical methods and tools for monitoring or investigating foreign interference through social media, and the use of their results in judicial proceedings, writing:

Litigation is a potential, but currently inadequate, antidote to disinformation. Litigation removes individuals from their informational bubbles or echo chambers, assign neutral judges or regulators - instead of simply the marketplace of ideas - to render a judgment on the truth or falsity of certain beliefs and compel action by responsible parties. However, litigation has not yet delivered on its potential. For example, criminal prosecution of offenses of foreign interference through social media, while theoretically possible, is often dismissed as practically infeasible given the jurisdictional and evidentiary challenges.[49]

Countermeasures

5.60The committee heard that in addition to the significant gaps in identifying and reporting foreign interference, there are gaps in Australian Government countermeasures once foreign interference is detected.

5.61Overall, the committee heard the government should operate with greater transparency, including by being less reticent to ‘call out’ instances of foreign interference and providing advice on security concerns, and that countermeasure operations should be conducted under a more rigorous framework, such as those being used in the European Union (EU) and United States (US).

5.62Dr Andrew Dowse, Director of RAND Australia, advised that governments should take an analytical approach to developing countermeasures:

It may be instructive to consider how an actor, whether it be foreign or domestic, might go about undertaking a disinformation campaign in order to understand how then that influence could be countered at each step of the disinformation process. In my view these problems of disinformation and misinformation are not able to be defeated by a single action, but require a series of interventions, ranging from addressing the motivation of actors to addressing structural issues in social media networks, to various ways of reducing the likelihood of the audience believing or amplifying false content. In my view such interventions should be priorities for our government, as otherwise the risks and potential consequences of interference through social media will just continue to get worse.[50]

Transparent government communication

5.63The AHRC argued that any taskforce addressing foreign interference should, where possible, report more publicly than is currently done ‘to bring greater transparency to the ways in which misinformation and disinformation are being addressed both to enhance the public understanding of the risks to Australia, and ensure that other rights and freedoms are not disproportionately impacted’.[51]

5.64ASPI noted the success experienced by other governments being more willing to call-out interference operations:

So the question [is]…how do we shift the adversary's thinking about conducting these operations? For state actors, it might be about imposing diplomatic costs or pointing out that they are doing that and imposing a solution of just calling them out and sort of naming and shaming them. I think, at the international level, through multilateral agreements, there is a lot of scope for governments to make those public attributions. That does two things: it does deterrence but it does educate the public as well…Calling them out does really impose costs, and I think it is an easy win, really.[52]

5.65Ms Elaine Pearson, Asia Director, Human Rights Watch advised that:

…there is also a really important role that the Australian government can play in ensuring that the acts of intimidation, harassment and censorship that occur on apps like WeChat are publicised. Chinese-language communities in Australia need to be aware of that, and it shouldn't just be up to civil society organisations or think tanks, really, to be disclosing these incidents.[53]

5.66Dr William Stoltz also argued that in addition to reacting to single instances of foreign interference, ‘we need to be engaging in our own information operations to push our own positive narratives—again accepting that effectiveness is going to be ambiguous, but it is perhaps better to be proactive in pushing our own narratives into what is a maelstrom information environment and hope that we achieve a positive impact’.[54]

5.67Professor Rory Cormac agreed and noted that this type of response should not be restricted to ‘setpiece events like elections or regime change’ but should ‘counter that daily drip of disinformation, interference and subversion on an intangible level, which just seeks to sow confusion and discord, exploit preexisting schisms in society, and gradually chip away at trust in institutions like democracy’. Professor Cormac further noted the recent countermeasures taken by the UK’s National Cyber Force as showing ‘nuance, maturity and evolution’.[55]

Advising on security concerns

5.68As discussed in earlier chapters, TikTok was banned from being installed on Australian Government-issued devices. However, a gap was identified in relation to organisations providing contracted services to government.

5.69The Attorney-General's Department advised that the Protective Security Policy Framework (PSPF) ‘requires non-corporate Commonwealth entities to be accountable for security risks arising from any contractual arrangements they have and to then place security requirements on those they contract’. This was explained to mean that any entity with obligations under the PSPF must ensure that external contractors who work with sensitive government information must also adhere to the same risk approach to applications such as TikTok.[56]

5.70However, the Attorney-General's Department did concede that this level of granular detail was not expected to be reported by departments in their annual reporting on compliance with the PSPF, so it would not be known whether departments had turned their minds to applying the ban to government contractors.[57]

Regulated frameworks

5.71Chapter 3 on international issues provided an extensive outline of the foreign interference approaches being used in the EU and US, such as the Disinformation Analysis and Risk Management (DISARM) open-source framework for analysis of interference instances and its ‘kill-chain’ approach to countermeasures.

5.72RAND Australia submitted that because a disinformation campaign involves a sequence of steps ‘for false information to be created, disseminated, observed and amplified’, opportunities for countermeasures exist at each step, and only one countermeasure needs to succeed to break the whole chain:

Countering efforts at each of these steps may help mitigate the impact of disinformation, including the use of information by actors that represents foreign interference. A strong countering strategy should combine such efforts within a survival chain, rather than expecting a singular solution to defeat disinformation.[58]

5.73Dr Andrew Dowse, Director of RAND Australia, expanded on this directly to the committee, citing the kill-chain philosophy being increasingly used in the EU as a tactic to address foreign interference:

The steps to enabling a disinformation campaign go from the considering of the strategy and the message to setting up the means by which the message can be provided and transmitting the message. So there are steps. The countering of that is not necessarily sequential but can be concurrent. Some of the countering of those steps could be proactive. Some could be responsive. None of them are mutually exclusive, which is why I suggested you need lots of lines of effort.

It's like a defence in depth, where some will get through the first step and some will get through the third but, through all those lines of effort, you should defeat a majority of those activities…this kill chain/survival chain concept has been applied in cybersecurity, very effectively, and I think it lends itself very well to attempting to defeat disinformation campaigns as well.[59]
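
The defence-in-depth logic described above can be illustrated with a simple probability sketch. The step names and interception rates below are hypothetical assumptions for illustration only, not figures from the inquiry: if each countermeasure in the chain independently stops some fraction of campaigns, the probability that a campaign survives every layer falls multiplicatively, so even modest countermeasures compound.

```python
# Illustrative sketch of the 'kill chain / survival chain' idea: a
# disinformation campaign must pass EVERY step (strategy, setup,
# transmission, amplification), so a countermeasure at any single step
# can defeat it. Step names and stop rates are hypothetical.

steps = {
    "strategy": 0.10,        # chance this layer's countermeasure stops the campaign
    "setup": 0.30,
    "transmission": 0.40,
    "amplification": 0.25,
}

survival = 1.0
for step, stop_rate in steps.items():
    # the campaign continues only if it evades this layer
    survival *= (1.0 - stop_rate)

print(f"Probability a campaign survives all layers: {survival:.3f}")
print(f"Probability at least one layer defeats it:  {1 - survival:.3f}")
```

Under these assumed rates, no single layer stops even half of campaigns, yet the chain as a whole defeats a clear majority of them, which is the ‘lines of effort’ point Dr Dowse makes.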

International cooperation

5.74Ms Lindsay Gorman of the Alliance for Securing Democracy advised that Australia:

… join with democratic allies and partners to develop a comprehensive framework for addressing the threats posed by authoritarian internet apps and critical information infrastructure. Given Australia's strong work on foreign interference, it is naturally poised to take a leading role in these efforts.[60]

5.75Meta similarly called for a global collaborative approach, noting that ‘combatting foreign interference requires multisectoral, whole-of-society approaches towards building a strong security ecosystem not only within Australia but across the region and globally’.[61]

Cross-sector collaboration

5.76The committee heard that greater collaboration between government, tech firms and civil society experts would not only improve coordination of overall countermeasures, but would help each sector in the specific areas for which it is responsible. The areas of threat intelligence and technical research were highlighted in particular.

5.77Meta argued for ‘greater information sharing of IO threat signals among tech companies and between platforms, civil society and government, while protecting the privacy of innocent users who may be swept up in these campaigns’.[62]

5.78Both ASIO and Home Affairs noted that they do not share intelligence on specific instances of suspected foreign interference. Home Affairs advised:

We don't undertake referrals about specific pieces of information in relation to foreign interference. Our engagement with social media platforms is to the level of the broader understanding of misinformation and disinformation and how it could be used and the processes that the social media platforms might themselves look at with their terms of use and the referrals they receive regarding concerns.[63]

5.79ASIO advised that:

One of the challenges we'd have there is that even in those circumstances where we can identify it, providing that intelligence to individuals or companies that don't employ individuals with clearances is a real challenge for us.[64]

5.80Both Home Affairs and ASIO noted that they were aware of no policy reason that would preclude social media employees from seeking security clearances, and Home Affairs advised that clearances would then allow some information to be shared with them.

5.81With regard to technical research, the issue of funding arose. Meta recommended there should be more support for private and public ‘innovation and collaboration on technical detection of adversarial threats such as manipulated media, including deepfakes’.[65]

5.82Dr Dowse of RAND Australia noted the limited funding avenues to undertake research on social media data. He advised that RAND Australia had to seek US funding to undertake research on Australian issues, and stated ‘I would dearly like to see the Australian government perhaps providing more sources of funding so we can do this research’.[66]

5.83The following chapters, on platform and civil society responsibilities, discuss the issue of technical research in greater detail.

Footnotes

[1]Lisa Visentin and Matthew Knott, ‘“It feels like hand-to-hand combat”: ASIO boss warns on spy hives, foreign interference’, Sydney Morning Herald, 21 February 2023.

[2]CyberCX, Submission 16, p. 7.

[3]Mr Albert Zhang, Analyst, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 18.

[4]Danielle Cave and Dr Jacob Wallis, ‘AUSMIN 2022: Cyber-enabled foreign interference’, The Strategist, 1 December 2022.

[5]CyberCX, Submission 16, p. 7.

[6]Australian Strategic Policy Institute, Submission 13, p. 12. See also: Mr Albert Zhang, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 21.

[7]Australian Strategic Policy Institute, Submission 13, p. 12.

[8]Australian Human Rights Commission, Submission 9, p. 8.

[9]UNSW Allens Hub for Technology, Law and Innovation, Society on Social Implications of Technology and Deakin University Centre for Cyber Security Research and Innovation, Submission 19, p. 5.

[10]See: Dr William Stoltz, Private capacity, Committee Hansard, 20 April 2023, p. 38; Ms Lindsay Gorman, Senior Fellow for Emerging Technologies, Alliance for Securing Democracy, German Marshall Fund, Committee Hansard, 21 April 2023, p. 10.

[11]Mr Fergus Ryan, Analyst, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 21.

[12]Ms Lindsay Gorman, Alliance for Securing Democracy, Committee Hansard, 21 April 2023, p. 10.

[13]Ms Lindsay Gorman, Alliance for Securing Democracy, Committee Hansard, 21 April 2023, p. 14. The Prague Proposals on Cyber Security of Emerging and Disruptive Technologies were introduced at the Prague 5G Security Conference in 2021. They promote strategic thinking about security of emerging and disruptive technologies in the context of national security.

[14]Ms Lindsay Gorman, Alliance for Securing Democracy, Committee Hansard, 21 April 2023, p. 11.

[15]Ms Lindsay Gorman, Alliance for Securing Democracy, Committee Hansard, 21 April 2023, p. 11.

[16]Australian Human Rights Commission, Submission 9, p. 14.

[17]Australian Security Intelligence Organisation, Submission 2, pp. 4 and 5.

[18]Australian Human Rights Commission, Submission 9, pp. 3 and 8–9.

[19]Mr Kenny Chiu, Private capacity, Committee Hansard, 21 April 2023, p. 18.

[20]Ms Elaine Pearson, Asia Director, Human Rights Watch, Committee Hansard, 21 April 2023, p. 6.

[21]Ms Mia Garlick, Regional Director of Policy, Meta, Committee Hansard, 11 July 2023, p. 2.

[22]Department of Infrastructure, Transport, Regional Development, Communications and the Arts (Department of Communications), Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023—Fact sheet, June 2023, p. 3.

[23]Department of Communications, Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023—Fact sheet, p. 1.

[24]Digital Industry Group Inc, ‘DIGI Welcomes The Government Providing ACMA With Oversight Powers Over Misinformation’, Media release, 20 January 2023.

[25]Mr Richard Windeyer, Deputy Secretary, Communications and Media Group, Department of Communications, Committee Hansard, 12 July 2023, p. 25.

[26]Department of Communications, Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023—Fact sheet, p. 1.

[27]Ms Emily Mosely, International Cyber Policy Centre Coordinator, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 21.

[28]Mr Josh Machin, Head of Public Policy, Australia, Meta, Committee Hansard, 11 July 2023, pp. 5–6.

[29]Mr Richard Windeyer, Department of Communications, Committee Hansard, 12 July 2023, p. 25.

[30]Ms Rachel Lord, Senior Manager, Government Affairs and Public Policy, YouTube, Google, Committee Hansard, 11 July 2023, p. 37.

[31]Rhiannon Down and Sarah Ison, ‘“Dangerous and Orwellian”: Tech giants and lawyers warn on Labor misinformation bill’, The Australian, 11 July 2023.

[32]Department of Communications, New ACMA powers to combat misinformation and disinformation, www.infrastructure.gov.au/have-your-say/new-acma-powers-combat-misinformation-and-disinformation (accessed 17 July 2023).

[33]Mr Stephen Nutt, Commander, Special Investigations, Counter Terrorism and Special Investigations Command, Australian Federal Police, Committee Hansard, 12 July 2023, p. 22.

[34]Mr Stephen Nutt, Australian Federal Police, Committee Hansard, 12 July 2023, p. 23.

[35]Australian Security Intelligence Organisation, Submission 2, p. 6.

[36]Ms Sally Pfeiffer, Acting First Assistant Secretary, Counter Foreign Interference, Department of Home Affairs, Committee Hansard, 12 July 2023, p. 8.

[37]Mrs Kateryna Argyrou, Co-Chair, Australian Federation of Ukrainian Organisations, Committee Hansard, 21 April 2023, p. 28.

[38]Mr Peter Murphy, Co-Secretary, Australian Supporters of Democracy in Iran, Committee Hansard, 21 April 2023, p. 21.

[39]Ms Elaine Pearson, Human Rights Watch, Committee Hansard, 21 April 2023, pp. 4–5.

[40]Mr Stephen Nutt, Australian Federal Police, Committee Hansard, 12 July 2023, p. 18.

[41]Mr Stephen Nutt, Australian Federal Police, Committee Hansard, 12 July 2023, p. 18.

[42]Mr Stephen Nutt, Australian Federal Police, Committee Hansard, 12 July 2023, p. 16.

[43]Mr Stephen Nutt, Australian Federal Police, Committee Hansard, 12 July 2023, p. 17.

[44]Mr Stephen Nutt, Australian Federal Police, Committee Hansard, 12 July 2023, p. 23.

[45]Mr Stephen Nutt, Australian Federal Police, Committee Hansard, 12 July 2023, p. 24.

[46]Mr Mike Noyes, Deputy Director-General, Intelligence Service Delivery, Australian Security Intelligence Organisation, Committee Hansard, 12 July 2023, p. 10.

[47]Law Council of Australia, Submission 27, pp. 2–3.

[48]Mrs Kateryna Argyrou, Australian Federation of Ukrainian Organisations, Committee Hansard, 21 April 2023, p. 28.

[49]Joint University Submission, Submission 19, pp. 6–7.

[50]Dr Andrew Dowse, Director, RAND Australia, Committee Hansard, 20 April 2023, p. 27.

[51]Australian Human Rights Commission, Submission 9, p. 8.

[52]Mr Albert Zhang, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 25.

[53]Ms Elaine Pearson, Human Rights Watch, Committee Hansard, 21 April 2023, p. 3.

[54]Dr William Stoltz, Private capacity, Committee Hansard, 20 April 2023, p. 38.

[55]Professor Rory Cormac, Director, Centre for the Study of Subversion, Unconventional Interventions and Terrorism, University of Nottingham, Committee Hansard, 20 April 2023, p. 39.

[56]Ms Sarah Chidgey, Deputy Secretary, National Security and Criminal Justice, Attorney-General's Department, Committee Hansard, 12 July 2023, p. 20.

[57]Ms Sarah Chidgey, Attorney-General's Department, Committee Hansard, 12 July 2023, pp. 20–21.

[58]RAND Australia, Submission 11, p. 3.

[59]Dr Andrew Dowse, RAND Australia, Committee Hansard, 20 April 2023, p. 29.

[60]Ms Lindsay Gorman, Alliance for Securing Democracy, Committee Hansard, 21 April 2023, p. 9.

[61]Ms Mia Garlick, Meta, Committee Hansard, 11 July 2023, p. 2.

[62]Meta, Submission 32, p. 17.

[63]Ms Sally Pfeiffer, Department of Home Affairs, Committee Hansard, 12 July 2023, p. 12.

[64]Mr Mike Noyes, Australian Security Intelligence Organisation, Committee Hansard, 12 July 2023, p. 12.

[65]Meta, Submission 32, p. 17.

[66]Dr Andrew Dowse, RAND Australia, Committee Hansard, 20 April 2023, p. 28.