Chapter 8 - Committee view and recommendations


Overall committee view

8.1In authoritarian states, citizens are subjected to government repression in the form of censorship, where people are restricted from freely discussing issues that would enable them to make fully informed social and political choices. Such censorship is anathema to the values of democratic states such as Australia. This provides authoritarian countries with an asymmetric advantage because they can readily censor narratives within their own country while simultaneously exploiting the openness of democratic societies to push their agendas offshore.

8.2Because authoritarian governments cannot directly control the information environment in democratic states via overt censorship, they seek to control the information environment by flooding our online spaces with disinformation that skews public debate, undermines trust in our democratic institutions, establishes narratives that favour the interests of the perpetrating state and makes it difficult for the casual observer to separate truth from fiction.

8.3We cannot allow this rising authoritarianism to contaminate our democracy and undermine our social cohesion in the most challenging strategic environment since the Second World War.

8.4The Director General of the Australian Security Intelligence Organisation (ASIO) recently declared that Australia is facing an unprecedented challenge from foreign interference and espionage, and that he is not convinced that we fully appreciate the damage that it inflicts on our security, our democracy, our economy and our social fabric.[1]

8.5Because social media has become such a dominant form of communication it is also a dominant vector for foreign interference via disinformation, coordinated inauthentic behaviour networks and transnational repression, which carries profound consequences for our liberal democracy.

8.6Whether we like it or not, modern social media platforms are not just a mode of communication—in democratic states they have also become the new public square, where important issues are debated, interrogated and resolved. They help shape public opinion and, ultimately, the choices democracies make. For this reason they are a highly attractive domain for our potential adversaries, who engage in information operations on these platforms to influence these decisions.

8.7Protecting people's rights to participate in our democracy in the digital town square is a balancing act. As the Australian Human Rights Commission (AHRC) raised with this committee, social media can be used for purposes that either strengthen or harm our democracy and values.[2] The difficult task for governments is to ensure any policy responses strike the right balance and protect the foundations of our democracy. All Australians must have their free speech protected while also preserving the fundamental right to participate in a democratic society that is free from covert influence and manipulation.

8.8This balancing act is even more precarious when the platform in question is headquartered in a country with a form of government that is fundamentally incompatible with our own. When asked why a liberal democracy should be uncomfortable with a platform like WeChat being the dominant source of information for a diaspora community if it is potentially controlled by a foreign government, Dr Seth Kaplan advised the committee that 'instead of your democracy being a debate among people who live in your country, there's an additional voice that plays a large part in the conversation, and that voice is controlled by a foreign government that does not have your best interests at heart … that [is] the most clear danger'.[3] The committee heeds this warning, and believes we must take action now to shine a light on these covert, malign sources of influence in our democracy to make Australia a harder target for cyber-enabled foreign interference peddlers.

8.9Foreign interference does not only impact our public discourse. Many people living in Australia are not free of the long hand of their repressive former governments, which reaches across the seas to continue acts of intimidation, harassment and violence, including through social media. As Chinese-Australian artist and activist Badiucao testified to this committee, in relation to WeChat being used by the Chinese government for disinformation and propaganda purposes, we cannot allow a situation in which foreign authoritarian governments exploit social media platforms to export their censorship in a manner that is fundamentally at odds with our liberal democratic values and endangers our national security.[4] Australia must protect these people: they deserve to live a life free from fear, and they deserve the protection of the Australian Government when malicious actors try to take away the freedom that all Australians enjoy.

8.10This is not a partisan issue. Liberal democracies around the world agree that foreign interference and information manipulation, particularly from authoritarian states, are among the greatest contemporary challenges to democracy.

8.11As outlined by the AHRC, social media can be used to either strengthen or undermine democracy. Getting the settings right requires a clear-eyed, nonpartisan approach to confront the dangers our nation faces via the weaponisation of social media to damage our democracy.

8.12We must also recognise the speed at which this challenge is moving as technology evolves exponentially. We can no longer afford to react only once the impacts of foreign interference become apparent, because at that point it is too late for countermeasures to be effective. We must learn to anticipate future risks to our democracy and social cohesion.

8.13That means our countermeasures must be agile and our regulatory framework adaptable to emerging technologies. Legislative settings must be platform and technology agnostic, based on the principles of safety and transparency with clear benchmarks for external oversight, and undergirded by meaningful sanctions when transparency principles are breached.

Platform transparency

8.14Transparency is a foundational principle of any democracy. It is a key ingredient to build accountability and trust, and arguably the strongest antidote to disinformation in a system that seeks to preserve free speech. Australians must be able to trust that the information they see on social media platforms has not been manipulated to serve the strategic interests of a foreign government whose values and interests may run counter to our own.

8.15Democracy also relies on the polity being confident that accurate and truthful information is not being withheld through censorship, and that people are able to engage in free and frank discussion. However, the use of coordinated inauthentic behaviour (CIB) to distribute disinformation is neither free nor frank, as it leverages automatic processes—often coordinated from offshore—to amplify the voice of an interested party and create the illusion of support on a much larger scale. As one expert succinctly put it, why should a bot have the same free speech rights as me?

8.16The committee heard from numerous experts on cyber security, human rights, politics, and community cohesion, all of whom were consistent in their message that the lack of transparency about the way social media platforms operate creates a permissive environment for covert social manipulation that often goes undiscovered.

8.17Australia recognised the importance of transparency in strengthening our democracy via the introduction of the Foreign Influence Transparency Scheme (FITS) in 2018. The FITS requires foreign actors to register their lobbying and communications activities in Australia, and this scheme has since been replicated by other nations who have looked to us as a leader on these issues.

8.18Following a request from the Australian Government in 2020, the Digital Industry Group Inc (DIGI)—working in conjunction with industry—released the 2021 Australian Code of Practice on Disinformation and Misinformation (voluntary code), which committed a diverse set of technology companies to take steps to reduce the risk of online misinformation causing harm to Australians. Despite its admirable intent, there is little evidence to suggest the voluntary code has been effective in combating foreign state disinformation, which remains rife on both western-headquartered social media platforms and those from authoritarian states.

8.19The Australian Government has released an exposure draft of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023 (the Bill), which gives the Australian Communications and Media Authority (ACMA) powers to gather information from platforms on their records regarding misinformation and disinformation; to request that industry develop a code of practice; and to create and enforce industry standards should the code be deemed ineffective.

8.20Although the draft bill was raised during public hearings of the Select Committee, we did not conduct a formal inquiry into the bill. For that reason, it would not be appropriate for the committee to express a view about the merits or otherwise of the Australian Government's proposed approach.

8.21Regardless of whether the draft bill proceeds or not, the committee believes that the increasing risk that foreign interference and disinformation poses to Australia's democratic institutions, values and way of life, means that there is a pressing need for additional action to counter the efforts of authoritarian states that seek to weaponise social media platforms against liberal democracies like Australia.

8.22The committee agrees with evidence that censorship and the reporting of alleged misinformation can itself present a risk of foreign interference.

8.23The AHRC noted 'increased examples of extra-territorial censorship, where governments seek to suppress speech outside of their national borders'. ASIOadvised that acts such as these 'could become foreign interference if they involve the hidden hand of a foreign state'.[5]

8.24Foreign government officials pressuring social media companies to make censorship or content removal decisions which are in their political or strategic interests but contrary to Australia's democratic processes and national interests represents a foreign interference risk.

8.25While some platforms do report formal legal requests made by governments and any action taken as a result, it is the removal or restriction of content after informal or non-official requests from foreign government officials, particularly where it is later claimed by the platform that the content breached their policies, that is particularly lacking in transparency. As acknowledged by Twitter, this type of action reduces public trust.

8.26Additionally, though, if content removal, restriction or visibility filtering occurs after an informal request from a foreign government which is not disclosed by platforms, this can result in an unacceptable form of foreign interference in Australian public debate and discussion.

8.27Decisions made by social media companies to censor or remove content on the grounds that it is allegedly 'misinformation' or causes 'harm' can adversely affect public discourse in our democracy. This was particularly demonstrated by Meta's decisions during the pandemic to restrict content on its platform discussing the theory that a laboratory accident might have been involved in the origins of the virus as 'misinformation', despite the fact that we now know that the US Intelligence Community was asked by the Biden administration to investigate the origins of the virus and remains divided in its findings on this issue. The Australian public had no visibility over how such decisions were made and whether there was pressure from foreign governments to remove such content, and this demonstrates that less censorship, rather than more, is the safest option for our democracy.

8.28Current transparency measures by social media companies do not adequately cover pressure applied by government officials to censor content where this pressure is applied outside of formal legal channels.

8.29There is wide scope for foreign government officials to pressure social media staff to make a finding that content that they wish to have removed should be interpreted as being in breach of the platform's rules, despite not having been deemed to be so prior to the government's request.

8.30This issue goes directly to the concerns about censorship raised by submitters and witnesses, who argued that social media platforms should be required to be fully transparent about the content they censor or visibility filter in response to requests from foreign governments, whether as a result of a legal process or through other channels.

8.31Under current policies, decisions to censor content following requests from foreign governments can be obscured by statements from platforms that the content breached their content policies.

8.32There is currently an unacceptable risk of staff at social media companies making content moderation decisions which affect Australian political discourse and democratic processes under pressure from foreign government officials, without this being adequately disclosed to the Australian public.

8.33Several experts who testified before the committee recommended that the best path forward is to introduce enforceable transparency obligations that social media companies must meet to continue operating in Australia.

8.34Dr Seth Kaplan recommended a 'step-by-step approach' where we say 'these are the rules that function in a democracy, these are the things we expect and these are the penalties' and serious non-compliance could be met with enforcement measures, including up to a ban of the social media application.[6]

8.35Mr Albert Zhang from the Australian Strategic Policy Institute also favoured 'regulation or at least a discussion about how we standardise and create better reporting mechanisms to increase transparency', and Mr Fergus Ryan called for 'legislation that deals with any other emerging apps that may come out of countries like China'.[7]

8.36Similarly, Australian cybersecurity firm CyberCX called for 'mandatory compliance and report frameworks for social media platforms' so that the government has the final say on managing risks to our democracy.[8]

8.37Finally, the AHRC supported establishing 'clear and mandatory requirements, and pathways, for social media organisations to report suspected foreign interference' and to require social media platforms to publicly disclose the content that they censor and make it an offence to censor content where that has not been publicly disclosed to users.[9]

8.38Given this, the committee recommends an approach that favours transparency over censorship, whereby platforms are given reasonable hurdles to meet and it is incumbent on them to take reasonable steps to inform Australians about the origin, and, where possible, the intent of the content they are exposed to so they can make up their own minds about its merits. This serves as a reasonable alternative to a more heavy-handed approach, wherein decisions about what constitutes allowable speech are outsourced to tech platforms—most of which are headquartered overseas—in the name of user 'protection'.

8.39For example, content which has been published by a proxy of an authoritarian state, such as state media broadcasters like Russia's RT, or China's Xinhua, would be labelled as such, instead of being censored. Informed of the origins of this content, users can decide for themselves how much weight to award it. Some social media platforms already do this, while others have never done so, or recently ceased doing so.

8.40However, transparency should not be restricted to information that users see, it must also be for information that users do not see due to censorship or other forms of visibility manipulation. Australian laws should discourage censorship of political discussion and public policy debate by social media platforms outside of very clearly defined legal processes.

8.41Under this transparency-led model, platforms who fail to meet these reasonable standards would be fined, while repeated non-compliance could result in a ban given the harmful impact this conduct would have on our democracy.

8.42The committee therefore recommends the Australian Government immediately create a system of enforceable minimum transparency standards that all social media platforms operating in Australia must meet, underpinned by appropriate sanctions to ensure compliance up to and including banning platforms from operating within Australia.

Recommendation 1

8.43The committee recommends the Australian Government require all large social media platforms operating in Australia to meet a minimum set of transparency requirements, enforceable with fines. Any platform which repeatedly fails to meet the transparency requirements could, as a last resort, be banned by the Minister for Home Affairs via a disallowable instrument, which must be reviewed by the Parliamentary Joint Committee on Intelligence and Security.

8.44Requirements should include, at minimum, that all large social media platforms:

  1. must have an Australian presence;
  2. must proactively label state affiliated media;
  3. must be transparent about any content they censor or account takedowns on their platform;
  4. must disclose any government directions they receive about content on their platform, subject to national security considerations;
  5. must disclose cyber-enabled foreign interference activity, including transnational repression and surveillance originating from foreign authoritarian governments;
  6. must disclose any takedowns of coordinated inauthentic behaviour (CIB) networks, and report how and when the platform identified those CIB networks;
  7. must disclose any instances where a platform removes or takes adverse action against an elected official's account;
  8. must disclose any changes to their platform's data collection practices or security protection policies as soon as reasonably practicable;
  9. must make their platform open to independent cyber analysts and researchers to examine cyber-enabled foreign interference activities;
  10. must disclose the countries in which they have employees who could access Australian data, and keep auditable logs of any instance of Australian data being transmitted, stored or accessed offshore; and
  11. must maintain a public library of advertisements on their platform.

Platforms headquartered in authoritarian states

8.45Experts in cyber security and foreign interference sent a clear and consistent message to the committee: social media platforms that originate from authoritarian states present an additional layer of risk. CyberCX warned the committee that the collection and aggregation of the personal information of millions of Australians and the capacity for malign foreign actors to manipulate content consumed by Australians at scale is a heightened risk for platforms that are linked to authoritarian governments, including TikTok and WeChat.[10] As they pointed out, Chinese government authorities operate 'without the oversight and transparency mechanisms of rule-of-law democracies', and therefore it is clear to the committee that platforms headquartered in authoritarian states deserve even greater scrutiny because of the heightened risks to our democracy and citizens.

8.46It is not merely that authoritarian states tend to place less emphasis on personal privacy and data protection. It is that those authoritarian states have deliberately—themselves or by proxy—designed social media platforms that can covertly harvest vast swathes of user data while also manipulating the information environment to serve the interests of the intervening state. Dr William Stoltz from the Australian National University's National Security College advised this committee that information campaigns can be weaponised to 'flood the information sphere' with multiple and even contradictory narratives that sow confusion or discord, distract attention, encourage the target audience to question the authenticity of other information and induce wider cynicism.[11]

8.47ASIO highlighted the additional security concerns that authoritarian governments 'are able to direct their country's institutions—including media, businesses and society—to support intelligence or foreign policy objectives in ways which are not acceptable in a democracy'.[12]

8.48The committee is alarmed by this rising authoritarianism creeping into our society through the digital domain, and we must be alert and ready to respond to these grey zone tactics that fall short of war but still corrode our democracy and sovereignty.

8.49These platforms' ability to gain and maintain our attention through opaque algorithms and relentlessly collect user data did not develop by accident. Each platform's commitment to these features goes well beyond a simple commercial imperative as failure to comply with the will of an authoritarian government is often an existential decision for platforms headquartered in these countries. Consequently, these tactics are vigorously defended by both the platforms themselves as well as their authoritarian governments who have a vested interest in preserving access to harvested data from democratic nations. It is little wonder that China's Ambassador to Australia, Xiao Qian, protested the Australian Government's decision on 4 April 2023 to ban TikTok from government-owned devices and cease using technology from Chinese-manufactured companies which are ultimately beholden to the demands of the Chinese Communist Party (CCP).[13]

8.50The committee's concerns are echoed across the globe. As early as 2015, the European Council publicly stressed 'the need to challenge Russia's ongoing disinformation campaigns' and called for 'an action plan on strategic communication'.[14] A recent joint address by the heads of Security Service MI5 (United Kingdom) and the Federal Bureau of Investigation (United States) noted that the CCP is covertly applying pressure across the globe via a coordinated campaign on a grand scale that involves planned, professional activity in support of a decades-long strategic contest.[15]

8.51In the case of TikTok and WeChat, both platforms originate from China, where national security laws such as Article 7 of the National Intelligence Law of 2017 require those platforms, and their employees, to assist national security agencies with any request they may make, and furthermore, to keep those requests secret. As Ms Shanthi Kalathil testified to this committee, 'were the [Chinese] state to go to [ByteDance and Tencent] with certain demands, they would be unable to legally or in any other capacity resist those demands'.[16] The committee is deeply concerned that the consequence of this extrajudicial direction is that social media companies that are ultimately beholden to the CCP have unfettered access to the personal data of millions of Australians that could ultimately end up in the hands of a foreign intelligence service.

8.52Throughout this inquiry, both WeChat and TikTok have been reluctant participants in this parliamentary scrutiny of social media platforms—a scrutiny that is itself an expression of Australia's democracy. This reluctance to participate speaks volumes about the respect those platforms hold for Australian democratic processes, and stands in stark contrast to the more cooperative disposition of platforms headquartered in western countries, who recognise the fundamental importance of the checks and balances inherent in democratic systems.

8.53The committee notes that WeChat repeatedly declined multiple invitations to participate in a public hearing on grounds it does not have a legal presence in Australia, ignoring that the invitation allowed them to participate via videoconference from any location in the world—an option other witnesses utilised. As Chair of the Select Committee, Senator Paterson sent an open letter on 4 July 2023 to WeChat urging them to reconsider their refusal to appear at a public hearing. While WeChat reiterated its commitment to 'working collaboratively with Australian regulators and authorities' and said that it remained 'committed to providing responsive information to the committee in writing', its blanket refusal to front up in an open forum demonstrates to the Australian public that WeChat is not genuinely committed to being held accountable for the allegations of censorship, surveillance and foreign interference at the hands of the CCP that are rife on its platform.[17] WeChat's written responses, delivered on 26 July 2023, to the committee's 53 questions on notice do nothing to assuage our strongly held concerns. WeChat's responses demonstrate a non-genuine effort to assist the committee, with answers that completely lack credibility and are widely contradicted by the evidence of independent experts.

8.54TikTok was similarly reluctant to participate, requiring the committee to remind them of its powers to compel any Australian-based employee to appear at a hearing. Once in attendance, the committee found TikTok engaged in a determined effort to obfuscate and avoid answering simple questions about their platform's operations and its links to their parent company, ByteDance, and its relationship with the CCP.

8.55On all the available evidence, including TikTok's admission that the majority of ByteDance employees are based in China, the committee does not accept TikTok's statement that 'ByteDance operates as a global company without an officially designated headquarters.' This claim has been contradicted by TikTok and ByteDance's own public statements, including in court filings in the US.[18] It is clear that TikTok's parent company, ByteDance, is headquartered in China where the majority of its employees are based, and is therefore subject to the Chinese Government's laws including the 2017 National Intelligence Law. As TikTok witnesses reluctantly admitted, those China-based employees can remotely access Australian user data, and can even make changes to the algorithm.

8.56TikTok's obfuscation continued in its responses to questions on notice, which were not appropriately answered. TikTok told the committee that 'data is protected, kept and logged in our company'[19] but was unable to say how often Australian user data had been accessed by employees in China, stating that it did not have specific numbers and would take the question on notice. TikTok's follow-up answer was no more enlightening; it again failed to quantify the number of times Australian user data had been accessed from mainland China. The only reasonable conclusions the committee can reach are that either TikTok does not track and log this data access in the way it first suggested to the committee and has failed to correct the record, or it does do so but is refusing to provide the information because it would be a shockingly large number. Neither prospect reflects well on the company, its officers or its commitment to comply with Australian law.

8.57Again, this is not an issue that is isolated to Australia. Other nations have sought to engage with TikTok about ongoing security concerns they hold with the practices of the platform, and have also found TikTok to be reluctant participants who were evasive and misleading under questioning. In their letter to the US Federal Trade Commission, US senators Mark R. Warner (D-VA) and Marco Rubio (R-FL) criticised the 'repeated misrepresentations by TikTok concerning its data security, data processing, and corporate governance practices'.[20]

8.58It is also telling that all social media platforms who appeared before the committee stated that they found and removed CIB that sought to influence Australia. In contrast, TikTok declared that there was no such CIB on its platform.[21] The committee treats this declaration with great suspicion.

8.59Concerningly, in answers to questions on notice, TikTok Australia admitted that its much-vaunted Project Texas, designed supposedly to protect American user data, and Project Clover, which purports to protect European user data, have no Australian equivalent. Although widely criticised by independent experts—including Ms Shanthi Kalathil[22] and US Federal Communications Commissioner Mr Brendan Carr,[23] who testified before this committee—as insufficient to protect user data, TikTok in these jurisdictions at least feels compelled to offer this fig-leaf. It is telling that no similar proposal exists for Australian users. Australia cannot again be left behind, and must be included in any solution to the problem posed by ByteDance's ownership of the app.

8.60The committee notes the ongoing exploration of options in relation to ownership of TikTok in the US, which are centred on divesting ownership away from a company that is beholden to the national security laws of the CCP, thereby severing TikTok's obligations to cooperate with China's intelligence apparatus. Should the US Government take action to force ByteDance to divest ownership of TikTok, the Australian Government should take advantage of that move to implement similar requirements in Australia.

Recommendation 2

8.61The committee recommends that, should the United States Government force ByteDance to divest its stake in TikTok, the Australian Government review this arrangement and consider the appropriateness of ensuring TikTok Australia is also separated from its ByteDance parent company.

Apps on government devices

8.62On 4 April 2023, the Australian Government issued a directive under the Protective Security Policy Framework (PSPF) that TikTok could not be installed on any Australian Government-issued devices due to the serious espionage risk.

8.63Ms Abigail Bradshaw, the Deputy Director General of the Australian Signals Directorate and Head of the Australian Cyber Security Centre, outlined the rationale for the ban in compelling evidence to the committee. Ms Bradshaw detailed the extensive data collection practices of TikTok, including its attempts to access data held by other applications on a device, and noted the company was subject to the extrajudicial direction of the Chinese Government. When asked about TikTok's public statements that it has never been asked to hand over its data to the Chinese Government, Ms Bradshaw advised that she had 'not seen technical controls that would provide a basis for technical advice that the risks of sharing data are adequately mitigated'.[24]

8.64While the ban of TikTok from Australian Government devices is a promising first step, there are still security gaps that must be addressed. For example, the Attorney-General's Department (AGD) noted that while the PSPF also applied to government contractors, AGD did not have oversight over whether departments had turned their minds to ensuring contractors either knew about or were adhering to the TikTok ban. If TikTok is not safe to be on the device of a government employee, it is not safe to be on the device of a government contractor who has access to similarly sensitive information.

8.65During the inquiry it emerged that several major consulting firms took the decision to ban TikTok from their employees' work-issued devices following correspondence from the Shadow Minister for Home Affairs and Cyber Security.[25]

8.66WeChat poses a similar data security risk to TikTok and should be added to the PSPF ban, with appropriate exemptions available on a case-by-case basis provided sufficient security mitigations are deployed, as is the case with TikTok. Additional research on other platforms should be undertaken, with a view to developing more detailed guidance on the installation of any similar social media platforms onto devices that can access sensitive government data and information.

8.67Critical infrastructure entities regulated under the Security of Critical Infrastructure Act 2018 (SOCI Act), especially those designated as Systems of National Significance by the Minister for Home Affairs, are recognised as providers of essential services to the Australian people. Under the SOCI Act, entities are required to take steps to protect themselves from a range of risks, including risks to cyber security. Evidence provided by the Department of Home Affairs (Home Affairs) during Senate Estimates indicated it was an option available to the government to direct these entities to take reasonable steps to mitigate the risk of apps like TikTok.[26]

8.68Applications like TikTok and WeChat illustrate the broader risk to sensitive government information from software and hardware posed by high-risk vendors, particularly those subject to the control of authoritarian foreign governments. It was clear during the inquiry that there is currently no entity within government with central policy responsibility for mapping, assessing and mitigating the risks posed by these vendors. Too often, government is reactive in its approach to high-risk technologies, only taking action to ban software or restrict hardware after it has generated public controversy. We need to move to a proactive setting which considers the risk of emerging technologies before they are widely installed by government users and subsequently have to be removed or have mitigations applied at great cost and inconvenience.

Recommendation 3

8.69The committee recommends the Australian Government extend, via policy or appropriate legislation, directives issued under the Protective Security Policy Framework regarding the banning of specific applications (e.g. TikTok) to the devices of all government contractors who have access to Australian Government data; and

8.70The Minister for Home Affairs should review the application of the Security of Critical Infrastructure Act 2018, to allow applications banned under the Protective Security Policy Framework to be banned on work-issued devices of entities designated as Systems of National Significance.

Recommendation 4

8.71The committee recommends the Australian Government consider extending the Protective Security Policy Framework directive banning TikTok on federal government devices to WeChat, given it poses similar data security and foreign interference risks.

Recommendation 5

8.72The committee recommends the Australian Government continue to audit the security risks posed by the use of all other social media platforms on government-issued devices within the Australian Public Service, and issue general guidance regarding device security and, if necessary, further directions under the Protective Security Policy Framework.

Recommendation 6

8.73The committee recommends the Australian Government establish a national security technology office within the Department of Home Affairs to map existing exposure to high-risk vendors such as TikTok, WeChat and any similar apps that might emerge in the future. It should recommend mitigations to address the risks of installing these applications, and where necessary, ban them from being installed on government devices.

Building the capacity of government

Lead agency

8.74The committee considers that the Australian Government must urgently increase its expertise and technical capability in combatting foreign interference through social media to address current and emerging threats.

8.75In this report, the committee has mapped the departments, agencies and taskforces involved in addressing foreign interference through social media and created a visual diagram showing the complexity and diffusion of responsibility created by these arrangements. This was further highlighted in discussions with Home Affairs, who told the committee that despite having responsibility to run the Counter Foreign Interference Coordination Centre, it was not a member of the Counter Foreign Interference Taskforce.[27]

8.76While a range of agencies contribute their specialist knowledge to track, analyse and counter foreign interference, there is a general lack of accountability because no single entity is responsible for the outcomes. A dedicated body is needed to coordinate government efforts and serve as a one-stop shop for individuals and private sector stakeholders to report and seek advice about suspected foreign interference through social media. This body should also serve a public outreach function and regularly publish advice to support companies and individuals to recognise and appropriately respond to foreign interference activity.

8.77The Australian Human Rights Commission (AHRC), the Australian Strategic Policy Institute (ASPI) and CyberCX all testified to this committee that the Australian Government should nominate a lead government entity that is responsible for countering cyber-enabled foreign interference.[28]

8.78The committee notes that this proposed body, and existing agency efforts, need to be adequately resourced. Despite an announcement on 14 February this year that Home Affairs and ASIO have been tasked with a new community outreach program to counter foreign interference, Home Affairs informed the committee it was not given any new resources to undertake this work. It is vital that Australia's frontline agencies in the fight against foreign interference are adequately resourced so that the acquittal of their broader functions is not weakened or jeopardised due to resources being spread too thin.

8.79The committee further notes the recent agreement between the European Union and the US to use a common methodology for identifying, analysing and countering foreign interference. By using the DISARM framework, STIX2 standard and the OpenCTI platform, information between the two regions will be shared more efficiently, effectively and with a greater level of detail. A lead agency in Australia should quickly move to adopt the same standards and join this coalition so that we can leverage the learnings and scale of like-minded partners while sharing our own experiences to inform best practice.

Recommendation 7

8.80The committee recommends the Australian Government designate an entity with lead responsibility for whole-of-government efforts to counter cyber-enabled foreign interference, with appropriate interdepartmental support and collaboration, resources, authorities and a strong public outreach mandate.

Cyber security strategy should include foreign interference

8.81Home Affairs is currently developing the 2023–2030 Australian Cyber Security Strategy. Earlier this year Home Affairs published a discussion paper seeking views to inform the final strategy; however, despite the discussion paper noting that cyber technology provides a conduit for 'crime, foreign interference, espionage, disinformation and misinformation', none of these issues are addressed in the suggested 'core policy areas that the Expert Advisory Board expects will be addressed in the Strategy'.[29] The committee is concerned that this area of risk is not being appropriately addressed within the strategy, which provides a logical mechanism to address the risk of cyber-enabled foreign interference as part of broader efforts to uplift Australia's national cyber resilience.

Recommendation 8

8.82The committee recommends the Australian Government address countering cyber-enabled foreign interference as part of the 2023–2030 Australian Cyber Security Strategy.

Magnitsky sanctions

8.83There are existing provisions within the Autonomous Sanctions Act 2011 which allow the Australian Government to impose targeted financial sanctions and travel bans against perpetrators and beneficiaries of prohibited behaviours and actions. The scheme allows the government to create sanctions that are either country-specific or thematic. Thematic sanctions can currently be imposed for:

… the proliferation of weapons of mass destruction; threats to international peace and security; malicious cyber activity; serious violations or serious abuses of human rights; activities undermining good governance or the rule of law, including serious corruption; and serious violations of international humanitarian law.[30]

8.84The committee recommends that the scope of Australia's Autonomous Sanctions regime should be clarified to ensure it includes foreign interference through social media, via amendments to the Act if necessary. This will serve as a deterrent to foreign states seeking to conduct foreign interference in Australia, and will allow Australia to issue sanctions and attributions in concert with like-minded partners for foreign interference attempts, as we already have the ability to do for cyber-attacks.

Recommendation 9

8.85The committee recommends the Australian Government clarify that Magnitsky-style cyber sanctions in the Autonomous Sanctions Act 2011 can be used to target cyber-enabled foreign interference actors, via legislative amendment if necessary, and ensure it has appropriate, trusted frameworks for public attribution.

Capacity to lay charges for foreign interference

8.86The committee heard from a range of community groups about the impacts that foreign interference has on the lives of many culturally diverse people in Australia. These groups and individuals make numerous reports to the National Security Hotline on specific instances of interference, threats, harm and campaigns of harassment, many of which are conducted using social media platforms.

8.87Despite these numerous reports, the Australian Federal Police (AFP) reported to the committee that not a single person has been charged with offences relating to foreign interference via a social media platform under the National Security Legislation Amendment (Espionage and Foreign Interference) Act 2018 (the Espionage Act). The AFP noted there is a very high bar to laying such charges, which requires either a foreign principal or a proxy of a foreign principal to be undertaking the action. As such, the Espionage Act should be reviewed to ensure the settings are correct for the current operating environment.

8.88Further, the Law Council of Australia (Law Council) submitted that the intention was for offences introduced by the Espionage Act into the Criminal Code Act 1995 to be applied to conduct where social media is used to exert influence on behalf of a foreign actor by deceptive or covert means. However, the Law Council queried the utility of these provisions being used against instances of foreign interference through social media given 'the challenges that exist in relation to successfully investigating and prosecuting persons who commit this offence when the 'conduct' occurs outside Australia'.[31]

8.89The Parliamentary Joint Committee on Intelligence and Security (PJCIS) is currently reviewing the Foreign Influence Transparency Scheme. Despite evidence put to the committee that these reforms should be considered as part of a package with the Espionage and Foreign Interference Act 2018, the PJCIS is not required to complete a review of the Espionage Act, and has not been referred an inquiry to do so. The scheduled review of the Espionage Act by the Independent National Security Legislation Monitor (INSLM) is not likely to be completed in this term of the INSLM due to other review priorities.

Recommendation 10

8.90The committee recommends the Australian Government refer the National Security Legislation Amendment (Espionage and Foreign Interference) Act 2018 to the Parliamentary Joint Committee on Intelligence and Security for review, with particular reference to the Act's effectiveness in addressing cyber-enabled foreign interference.

Artificial intelligence: an emerging threat

8.91The committee is alarmed at the increasing national security risks associated with the potential weaponisation of artificial intelligence (AI) technologies by malicious actors. These concerns have sharply increased with the advent of generative AI tools such as ChatGPT and Midjourney, which can be used to generate content at a speed and scale we have never seen before.

8.92Expert witnesses told the committee that AI will exponentially increase the volume of disinformation and the number of foreign agents engaging in coordinated inauthentic behaviour (CIB). Mr David Robinson from Internet 2.0 warned the committee that 'because of artificial intelligence we assess time is not on our side' and that AI is 'increasing the effectiveness of influence and disinformation campaigns against elections', cautioning that we may not be able to reverse the loss of trust our system will suffer.[32]

8.93The Department of Industry, Science and Resources is currently leading the Australian Government's public consultation on the responsible use of AI. The Government's discussion paper on 'safe and responsible AI in Australia' specifically excludes consideration of the national security implications of AI, despite this being one of the most pressing areas of concern today.[33] The Australian Government must urgently address the national security implications in our policy response to AI.

Recommendation 11

8.94The committee recommends the Australian Government investigate options to identify, prevent and disrupt artificial intelligence (AI)-generated disinformation and foreign interference campaigns, in addition to the Government's Safe and Responsible AI in Australia consultation process.

Intelligence briefings

8.95The committee heard from both social media platforms and government agencies about the importance of collaborative relationships between the two sectors. However, cooperative platforms were often stymied in their ability to take targeted action because they lacked the threat intelligence that could assist them in focusing their search for disinformation and foreign-state coordinated inauthentic behaviour.

8.96One of the stated barriers to greater threat intelligence sharing is the limited number of security-vetted staff working for social media platforms. Noting that there are some platforms with which it would be inappropriate to share any classified material, sharing a greater degree of threat intelligence with appropriately vetted staff at trusted platforms would assist those platforms in streamlining efforts to counter foreign interference through social media.

Recommendation 12

8.97The committee recommends the Australian Government establish a program of vetting appropriate personnel in trusted social media platforms with relevant clearances to ensure there is a point of contact who can receive threat intelligence briefings.

Building the capacity of civil society

8.98The committee heard that Australians lack the information, expertise and resources to understand the risks presented by the use of social media platforms and the information they engage with online.

8.99Given the declining trust in democratic institutions and leaders, the increasing polarisation of opinion in society and the dominant role of social media platforms in our contested information environment, it is clear from the evidence received over the course of this inquiry that a coordinated national approach is required. While government and industry have a significant role to play, efforts to counter foreign interference and CIB through social media must include civil society.

8.100The ability of civil society—individuals, organisations and community groups—to make informed decisions about information consumption and sharing needs to be expanded. To address the threat of foreign interference through social media, government support and funding is required to build this capacity.

Improving independent research and frameworks

8.101The committee considered evidence supporting the role of expert centres and researchers—whether engaged by platforms or operating independently in civil society—to scrutinise the conduct of the platforms and those who seek to manipulate them for their own ends. These independent experts help people understand how social media platforms use and distribute data and content that is capable of shifting public opinion, including through personal data collection and use of algorithms which promote and filter content. Independent experts can also help highlight where these algorithms are being used maliciously or in an inauthentic manner to covertly sway public opinion or create chaos.

8.102However, the committee is very aware that the effectiveness of independent research will be largely determined by the transparency of social media platforms as discussed in Recommendations 1 and 2—including in relation to their data collection policies, algorithms, and open access to social media platforms.

8.103When provided access to relevant data, the committee has heard that expert centres and researchers can play a vital role in understanding how social media platforms operate, the effectiveness of foreign actors' content operations, and the nature and scale of certain cybersecurity threats.[34] Support from the Australian Government would ensure that cyber-enabled foreign interference activities or campaigns in Australia could be more consistently and capably monitored and publicly reported.[35]

8.104This support could take the form of grant funding for social media monitoring and investigations, and bolstering laws to ensure researchers can access more, and higher-quality, data.[36]

8.105The committee also acknowledges the vital work that independent organisations undertake in exposing and analysing foreign interference. Expert Mr David Robinson told the committee that, in one instance, social media companies investigating commentary on their platforms after a particular event declared them to be free of interference. However, when using a mathematical analysis, Mr Robinson and other cybersecurity experts identified an interference campaign.[37]

8.106This example demonstrates that, even when looking closely, sometimes the platforms themselves cannot discern bots from people or campaigns of influence from genuine free expression. It requires the expertise of specialist organisations to review available data so that all of society can map an accurate picture of foreign interference. Some companies are more transparent than others when it comes to making data available for this kind of review, due to commercial incentives, risk avoidance or other reasons. As such, Recommendations 1 and 2 will complement research efforts by ensuring platforms open themselves to independent cyber analysts and researchers to examine cyber-enabled foreign interference activities, but the Australian Government also needs to ensure independent researchers are supported to make the most of the opportunities this data provides.

Recommendation 13

8.107The committee recommends the Australian Government build capacity to counter social media interference campaigns by supporting independent research.

Protecting the public and diaspora communities

8.108As technology continues to evolve, it is important that all elements of Australian society—government, business, civil society, and individuals—be equipped to respond appropriately to the threat of foreign interference and disinformation.

8.109In addition to the recommendations above aimed at improving the integrity and transparency of all social media platforms operating in Australia, it is vital that the Australian Government lead an integrated whole-of-nation approach. This should include building the resilience of Australians, and particularly diaspora communities that might be subject to transnational repression and foreign interference. This can be achieved by building digital information and media awareness.

8.110ASPI has identified that well-resourced state and non-state actors are misusing all forms of social media to 'facilitate the transnational repression of individuals and marginalised communities in Australia', which is of grave concern for our liberal democracy.[38]

8.111Over the course of this inquiry the committee heard compelling accounts from diaspora communities and organisations on this rising trend of transnational repression. The perspectives of these community groups are uniquely important to the committee, as we understand they are often the targets of attempted foreign interference and victims of transnational surveillance, repression and coercion.

8.112The committee heard powerful testimony from individuals personally targeted by foreign interference activities, and these experiences were instructive in contemplating how to address these threats.

8.113Ms Vicky Xu, a journalist and policy analyst who has written extensively on the human rights abuses of the Chinese Communist Party (CCP), told the committee:

I'm still dealing with death threats. I'm still dealing with repeated hacking attempts. Just this week I received a dozen hacking attempts across all of my accounts … I've had to adapt the way I live, my lifestyle, to one that's akin to a criminal, I would say. People in democracies, politicians, academics and people with good social standing tell me things like I'm going to end up in history books, and that all sounds grand, but what about life?[39]

8.114Similarly, Badiucao told this committee:

… there are always certain topics that you cannot talk about. What happens is you'll be disappeared or your accounts will be suspended. This is actually a typical way of how the Chinese government manages to export its censorship outside of China regardless of local law or rights protection in Australian society. This is totally contradictory with our values, but functionally for Chinese diaspora and other people who are using WeChat, we actually have to be subject to China's law, which is not really a law for society. This for me is very unfair and fundamentally dangerous to our national security and core values of society.[40]

8.115In relation to an Iranian-Australian woman who has been outspoken on the human rights situation in Iran and advocated for the Iranian women's rights movement, Mr Peter Murphy from the Australian Supporters of Democracy in Iran, said '[t]he basis of our democratic society include freedom of association, freedom of speech and freedom of expression, and this family in Melbourne has been told to stop talking, stop attending certain meetings and stop expressing certain views. It is really a blatant attack on our basic rights by a foreign government'.[41]

8.116Additionally, Mrs Kateryna Argyou from the Australian Federation of Ukrainian Organisations described how the Ukrainian-Australian diaspora is dealing with instances of harassment and intimidation:

We have tried to be very structured and systemised in the way that we approach it. Every time we've had a community member who has been harassed online or threatened – and we've had many community members who've received death threats through Telegram channels, through Facebook Messenger and through other social media – we've reported that to the police…we'd had over 40 official complaints registered with police.[42]

8.117The impact on individuals who have to live with this fear cannot be overstated. Many have fled authoritarian regimes and have chosen to live in Australia because of our rule of law and systems of democracy that protect our people from repressive tactics designed to silence dissent. Yet these same authoritarian states are misappropriating social media channels to reach into our nation to continue attempts to silence people through fear and intimidation. Liberal democracies like Australia must be resolute that we will not tolerate this behaviour and it must end now.

8.118Law enforcement agencies and bodies such as the eSafety Commissioner are often privy to these personal experiences of coercion and repression, and as such are well-placed to share their observations and lessons learned with social media platforms and the broader public.

8.119With the benefit of this knowledge, individuals and organisations will be better placed to make informed choices about their own online activities and identify and respond to foreign interference attempts they may be subject to in the future.

Recommendation 14

8.120The committee recommends the Australian Government ensure that law enforcement agencies, and other relevant bodies such as the eSafety Commissioner, work with social media platforms to increase public awareness of transnational repression.

Improving digital and information literacy

8.121One of the most effective buffers against foreign interference through social media is an informed, resilient population that is capable of identifying problematic content on their online feeds. It is vital that an integrated approach include supporting and building the resilience of Australians, particularly diaspora communities, through improved digital literacy and media awareness.

8.122As technology continues to evolve, all Australians must be equipped to understand whether the information they are consuming is true or false, whether it comes from an authoritative source, whether it is emotionally manipulative, and whether they are presented with information from a diversity of sources.

8.123The Australian Government should take steps to build resilience against foreign interference at the individual and community level by providing resources to better equip these groups to navigate the fraught digital information landscape. Efforts should prioritise at-risk communities, and should consider tailored messaging to address the specific needs of these groups.

Recommendation 15

8.124The committee recommends the Australian Government empower citizens and organisations to make informed, risk-based decisions about their own social media use by publishing plain-language education and guidance material and regular reports and risk advisories on commonly used social media platforms, ensuring this material is accessible for non-English speaking citizens. Specific focus should be placed on protecting communities and local groups that are common targets of foreign interference, and on providing pre-emptive information and resources.

Independent media

8.125The committee has been reminded of the role of news media in providing reliable, accurate and apolitical news about politics and current events. It has received evidence about the potential dangers associated with consuming news content through social media, and the dangers of accessing information in an 'echo chamber' where people only encounter information or opinions which reflect and reinforce their own worldviews because of algorithmic preferences.

8.126Furthermore, evidence heard by the committee shows authoritarian states control news and other content on some social media platforms used by Australians. For example, ASPI cited leaked content moderation documents which revealed that TikTok previously instructed its moderators to censor videos that mention Tiananmen Square, Tibetan independence, or the banned religious group Falun Gong.[43] This demonstrates the ways in which information can be filtered and manipulated to support the strategic interests of the interfering country, often in ways inimical to our own interests.

8.127The committee heard of the specific challenges posed by TikTok and WeChat, given their ownership structures and the operation of Chinese national security laws. It also heard evidence of surveillance and restrictions of freedom of expression of Iranian- and Ukrainian-heritage people in Australia.

8.128However, the committee also heard that independent reporting in the original languages of diaspora communities in Australia could play a key role in countering disinformation, content moderation, propaganda, censorship, and foreign influence through social media.

8.129Witnesses explained that independent journalism provides diaspora communities with neutral and reliable sources of news about politics and current events. As noted by legal expert submitters, 'news and journalism generate significant benefits for society through the "production and dissemination of knowledge, the exposure of corruption, and holding Governments and other decision-makers to account"'.[44]

Recommendation 16

8.130The committee recommends the Australian Government support independent and professional foreign-language journalism by supporting journalism training and similar programs, thereby expanding the sources of uncensored news for diaspora communities to learn about issues such as human rights abuses inside their country of origin.

Protecting our neighbours

8.131Australia has a strong, longstanding relationship with its Pacific neighbours. Australia's security cooperation with Pacific countries covers defence, law enforcement, transnational crime, climate and disaster resilience, border management and human security.

8.132While the Australian Government is working with our Pacific neighbours through the Cyber and Critical Technology Cooperation Program to increase the information technology resilience of governments and civil society, we need to do more to specifically address the impacts that foreign interference can have both on governments and ordinary people across our neighbouring regions.

8.133The Australian Government should consider ways to extend the learnings and benefits of the other initiatives recommended by this report to our regional neighbours. These efforts will build country-to-country linkages, and empower our neighbours to identify and thwart foreign interference attempts within their own communities.

Recommendation 17

8.134The committee recommends the Australian Government promote digital literacy and digital infrastructure in developing countries in the Indo-Pacific region that are the targets of malicious information operations by foreign authoritarian states.

Senator James Paterson

Chair

Liberal Senator for Victoria

 

Footnotes

[1]Mike Burgess, Director-General of Security, Australian Security Intelligence Organisation, Director-General's Annual Threat Assessment, 21 February 2023.

[2]Australian Human Rights Commission, Submission 9, p. 3.

[3]Dr Seth Kaplan, Private capacity, Committee Hansard, 20 April 2023, pp. 12–13.

[4]Committee Hansard, 21 April 2023, p. 35.

[5]Australian Human Rights Commission, Submission 9, pp. 4 and 15.

[6]Dr Seth Kaplan, Private capacity, Committee Hansard, 20 April 2023, p. 16.

[7]Mr Fergus Ryan, Analyst, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 21.

[8]CyberCX, Submission 16, p. 4.

[9]Australian Human Rights Commission, Submission 9, p. 9.

[10]CyberCX, Submission 16, p. 5.

[11]Dr William Stoltz, National Security College, Australian National University, Submission 18, p. 9.

[12]Australian Security Intelligence Organisation, Submission 2, p. 4.

[14]European Council conclusions, 20 March 2015, EUCO 11/15, p. 5.

[15]Security Service MI5, Joint Address by MI5 and FBI Heads, 6 July 2022, www.mi5.gov.uk/news/speech-by-mi5-and-fbi (accessed 14 July 2023).

[16]Ms Shanthi Kalathil, Private capacity, Committee Hansard, 20 April 2023, p. 3.

[17]Senator James Paterson, 11 July 2023, https://twitter.com/SenPaterson/status/1678557229883207681 (accessed 29 July 2023).

[18]John Garnaut et al, Submission 34, p. 39, citing TikTok Inc v Donald J Trump, Plaintiffs' Notice of Filing, 10 November 2020: 'The parties [TikTok Inc and ByteDance Ltd] did not submit the Musical.ly transaction to CFIUS for review in 2017 because ByteDance was a Chinese-headquartered company and Musical.ly was also a Chinese-headquartered company', p. 12. www.scribd.com/document/483797602/TikTok-asks-U-S-federal-appeals-court-to-vacate-U-S-divestment-order#.

[19]Ms Ella Woods-Joyce, Acting Director of Public Policy, Australia and New Zealand, and Mr Will Farrell, Security Officer, TikTok US Data Security, TikTok, Committee Hansard, 11 July 2023, pp. 23–24.

[21]Ms Ella Woods-Joyce, TikTok, Committee Hansard, 11 July 2023, p. 20.

[22]Ms Shanthi Kalathil, Private capacity, Committee Hansard, 20 April 2023, p. 4.

[23]Mr Brendan Carr, Commissioner, US Federal Communications Commission, Committee Hansard, 20 April 2023, pp. 10–11.

[24]Ms Abigail Bradshaw, Deputy Director-General, Australian Signals Directorate, Committee Hansard, 12 July 2023, p. 3.

[25]Senator James Paterson, Committee Chair, and Ms Sarah Chidgey, Deputy Secretary, National Security and Criminal Justice, Attorney-General's Department, Committee Hansard, 12 July 2023, pp. 20–21.

[26]Mr Michael Pezzullo AO, Secretary, and Mr Hamish Hansford, Deputy Secretary, Cyber and Infrastructure Security Centre, Department of Home Affairs, Senate Legal and Constitutional Affairs Legislation Committee, Estimates Committee Hansard, 22 May 2023, pp. 67–68.

[27]Ms Sally Pfeiffer, Acting First Assistant Secretary, Counter Foreign Interference, Department of Home Affairs, Committee Hansard, 12 July 2023, p. 12.

[28]Australian Human Rights Commission, Submission 9, p. 8; CyberCX, Submission 16, p. 4; Mr Albert Zhang, Analyst, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, p. 21.

[30]Department of Foreign Affairs and Trade, Autonomous Sanctions Amendment (Magnitsky-style and Other Thematic Sanctions) Act 2021, www.dfat.gov.au/news/news/autonomous-sanctions-amendment-magnitsky-style-and-other-thematic-sanctions-act-2021 (accessed 24 July 2023).

[31]Law Council of Australia, Submission 27, pp. 2–3.

[32]Mr David Robinson, Director, Internet 2.0, Committee Hansard, 20 April 2023, p. 31.

[33]Department of Industry, Science and Resources, Safe and responsible AI in Australia: Discussion Paper, June 2023, p. 4.

[34]See, for example: Department of Home Affairs, Submission 1, p. 3; University of Adelaide, Submission 3; US Federal Communications Commission, Submission 4, p. 2; RAND Australia, Submission 11, p. 3.

[35]CyberCX, Submission 16, p. 9.

[36]Australian Strategic Policy Institute, Submission 13, p. 13.

[37]Mr David Robinson, Internet 2.0, Committee Hansard, 20 April 2023, p. 36.

[38]Australian Strategic Policy Institute, Submission 13, p. 3.

[39]Ms Vicky Xu, Senior Fellow, Australian Strategic Policy Institute, Committee Hansard, 21 April 2023, p. 35.

[40]Badiucao, Private capacity, Committee Hansard, 21 April 2023, p. 35.

[41]Mr Peter Murphy, Co-Secretary, Australian Supporters of Democracy in Iran, Committee Hansard, 21 April 2023, p. 22.

[42]Mrs Kateryna Argyrou, Co-Chair, Australian Federation of Ukrainian Organisations, Committee Hansard, 21 April 2023, p. 28.

[43]Mr Fergus Ryan, Australian Strategic Policy Institute, Committee Hansard, 20 April 2023, pp. 18–19.

[44]The Law Society of New South Wales Young Lawyers, Submission 33, p. 17.