Chapter 1 - Introduction


1.1 On 19 September 2024, the Senate referred the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 (the bill) to the Environment and Communications Legislation Committee (the committee) for inquiry and report by 25 November 2024.[1]

Conduct of the inquiry

1.2 In accordance with its usual practice, the committee advertised the inquiry on its website and wrote to relevant organisations inviting written submissions by 30 September 2024.

1.3 The committee published 105 submissions, which are listed in Appendix 2 and available on the committee's website. In addition, more than 8000 contributions from individuals were submitted to the committee, and a further 22 000 contributions were submitted through various campaigns. Representative samples of these are published on the committee's website as additional information to this inquiry.

1.4 Three public hearings were held in Canberra on 11 October 2024, 17 October 2024 and 11 November 2024. A list of witnesses who gave evidence at these hearings is available in Appendix 1.

1.5 In this report, references to Committee Hansard are to proof transcripts. Page numbers may vary between proof and official transcripts.

Acknowledgments

1.6 The committee acknowledges the significant public interest in this inquiry, and that polarising views and perspectives have been expressed about the bill and consultation process. The committee thanks the organisations and individuals that made written submissions, commented on the bill and appeared at the public hearings.

Purpose of the bill

1.7 In her second reading speech for the bill, the Minister for Communications, the Hon Michelle Rowland MP (the Minister), outlined that the bill would provide the Australian Communications and Media Authority (ACMA) with:

… new powers to create transparency and accountability around the efforts of digital platforms to combat misinformation and disinformation on their services, while balancing the freedom of expression that is so fundamental to our democracy.[2]

1.8 The bill has three key objectives:

to provide the ACMA with new regulatory powers to require digital communication platform providers to take steps to manage the risk that misinformation and disinformation on digital communications platforms poses in Australia;

to increase transparency about the way in which digital communications platform providers manage misinformation and disinformation; and

to empower users to identify and respond to misinformation and disinformation.[3]

1.9 The bill would provide the ACMA with new information-gathering, record keeping, code registration and standard making powers relating to digital communication platform providers.[4]

1.10 New obligations would be placed on digital platforms to increase their transparency with Australian users about how they handle misinformation and disinformation on their services.[5]

1.11 Notably, under these proposed arrangements, the ACMA's powers would be directed to digital communications platform providers and not individual end-users. The ACMA would not have a 'direct takedown power for individual content or particular accounts', except in the case of disinformation involving inauthentic behaviour (for example, coordinated bots, troll farms or fake accounts).[6]

1.12 The bill proposes to define 'misinformation' as the dissemination of content using a digital service where that content 'is reasonably verifiable as false, misleading or deceptive', is provided to one or more end-users in Australia, and where the provision of that content is 'reasonably likely to cause or contribute to serious harm'.[7]

1.13 'Disinformation' is required to meet the same elements as misinformation, but is distinguished by intent.[8] This is discussed further in Chapter 2 and Chapter 3.

Amendments passed by the House of Representatives

1.14 On 7 November 2024, the House of Representatives passed several amendments to the bill. These included:

amending the definition of 'professional news content' to also apply to news content produced by a person who is subject to the rules of the Community Radio Broadcasting Codes of Practice;

requiring digital communication platform providers to publish information regarding their policy or policy approach for supporting access by researchers to data relating to misinformation and disinformation on the platform;

providing the ACMA with the power to make digital platform rules to establish one or more data access schemes, permitting independent researchers to be given access to data held by digital communications platform providers in certain circumstances;

requiring a review of the data access schemes rules; and

clarifying that the triennial statutory review is an independent review.[9]

1.15 These amendments are outlined in further detail in Chapter 2 and the respective chapters of this report.

Scope and structure of the report

1.16 This report comprises six chapters:

Chapter 1 provides background to the bill, including its purpose, and the conduct of the inquiry. It provides contextual information relevant to the bill's development, including the voluntary Australian Code of Practice on Misinformation and Disinformation, and the Australian Government's consultation process on the 2023 Exposure Draft of the bill. The chapter also explores the general feedback received about the bill, including the views of individuals.

Chapter 2 outlines the key provisions of the bill, including Government amendments passed by the House of Representatives on 7 November 2024.

Chapter 3 canvasses submitter views on the bill's proposed scope and coverage, including definitional issues.

Chapter 4 discusses views expressed about the proposed transparency obligations of digital communication platform providers.

Chapter 5 outlines the ACMA's proposed regulatory powers under the bill.

Chapter 6 concludes with the committee's views and recommendation.

1.17 The following section outlines the financial impact of the proposed measures in the bill, its compatibility with human rights and relevant parliamentary committee scrutiny.

Financial impact statement

1.18 The Explanatory Memorandum outlines that the measures in the bill are expected to have a minor financial impact on Commonwealth expenditure. As part of the 2023–24 Budget, the ACMA was provided with $7.78 million over four years from 2023–24 to 2026–27 for regulatory powers to combat misinformation and disinformation as contained in the bill.[10]

Human rights compatibility

1.19 The Explanatory Memorandum stated that the bill is compatible with the human rights and freedoms recognised or declared in the international instruments listed in section 3 of the Human Rights (Parliamentary Scrutiny) Act 2011.[11]

1.20 According to the Explanatory Memorandum, the bill would serve to positively affect several human rights, such as the right to security of the person, the right to participate in public affairs, the right to vote and be elected at genuine periodic elections, the right to be protected against discrimination, and the right to the highest attainable standard of physical and mental health.[12]

1.21 However, the Explanatory Memorandum stated that the bill limits the right to privacy and the right to freedom of expression. With regard to the right to privacy, the Explanatory Memorandum notes that the '[b]ill's regulatory regime … burdens the right to privacy, to the extent that it imposes obligations on digital communications platform providers (and provides the ACMA with regulatory powers) that pertain to information disseminated on digital communication platforms'.[13]

1.22 The Explanatory Memorandum outlined that the bill's limitations on the right to privacy have been imposed in pursuit of a legitimate objective, are necessary and proportionate to the achievement of that objective, and have appropriate protections in place for individuals.[14]

1.23 On the right to freedom of expression, the bill limits this right by empowering the ACMA to require providers to take steps to manage the risk of misinformation and disinformation on their platforms.[15]

1.24 The Explanatory Memorandum explains that the six types of serious harms provided for in the bill 'align with the purposes for which international human rights law allows restrictions to be placed on freedom of expression'. This is because the measures in the bill are focused on systems and processes rather than the regulation of individual pieces of content, except in cases where disinformation involves inauthentic behaviour such as bots.[16]

Relevant parliamentary committee scrutiny

Consideration by the Parliamentary Joint Committee on Human Rights

1.25 The Parliamentary Joint Committee on Human Rights (PJCHR) scrutinises bills for compatibility with human rights, and considered the bill's engagement with those rights.

1.26 The PJCHR noted that, while the bill is directed towards a legitimate objective and has safeguards in place, many details are left to delegated legislation. The PJCHR stated that there is a risk that platforms could over-regulate content to avoid penalties:

It appears that the bill is directed towards a legitimate objective which is broadly of pressing and substantial concern, and would likely be rationally connected to (that is, capable of achieving) that objective. However, questions remain as to whether the scheme would constitute a proportionate limit on the right to freedom of expression and the right to privacy in practice. While the bill does establish several broad safeguards which would assist with the proportionality of the measure, and provide for regular review of the operation of the scheme, much of the detail of what the scheme would require providers to do would be set out in delegated legislation. Further, there may be a risk that, in practice, providers over-regulate content on their platforms in order to avoid the risk of a civil penalty for noncompliance with this scheme, meaning that the extent of the limitation on the right to freedom of expression (and privacy) may only become apparent as a matter of practice.[17]

1.27 The PJCHR suggested that:

the bill be amended to require the ACMA to 'have regard to the right to freedom of expression, as recognised under international human rights law, in approving a code or determining a standard';

consideration be given to whether the proposed scheme would appropriately protect content produced by 'citizen journalists' who are not subject to formalised editorial standards; and that

the statement of compatibility be updated to identify what, if any, remedy an individual may access if compliance, or purported compliance, with the proposed scheme resulted in a breach of their right to freedom of expression or privacy.[18]

1.28 The PJCHR did not seek a response from the Minister for Communications as much of the detail of the scheme would be set out in delegated legislation.

1.29 However, the Minister's response outlined that the bill provides that approved codes and standards are legislative instruments subject to parliamentary scrutiny and disallowance. The ACMA must also be satisfied that a code or standard is reasonably appropriate and adapted to achieving the purpose of providing adequate protection for the Australian community from serious harm caused or contributed to by misinformation or disinformation on the platforms, and goes no further than reasonably necessary to provide that protection. These requirements are similar to those set out under Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR).[19]

1.30 In relation to the statement of compatibility, the Minister outlined that the statement will be updated to explain the remedies available, if the actions of a platform constitute a breach of either an individual's right to freedom of expression or right to privacy (as recognised under international law).[20]

1.31 The PJCHR thanked the Minister for her response, reiterated its recommendations, and noted that it will assess the compatibility of any future legislative instruments made pursuant to this scheme.

1.32 With respect to the preparation of a statement of compatibility, the PJCHR outlined that, while its preparation is an important element of supporting the legislative scrutiny process, the 'requirement to prepare a statement of compatibility does not, in and of itself, constitute a sufficient safeguard to protect human rights, particularly as it does not require the legislation be compatible with human rights'.[21]

1.33 The PJCHR recommended:

The committee recommends that consideration be given to amending the bill to require the Australian Communications and Media Authority (ACMA) to establish a complaints mechanism to handle complaints with respect to breaches of human rights arising from the proposed scheme.[22]

Consideration by the Senate Standing Committee for the Scrutiny of Bills

1.34 The Senate Standing Committee for the Scrutiny of Bills (Scrutiny of Bills Committee) assesses bills against a set of accountability standards that focus on the effect of proposed legislation on individual rights, liberties and obligations, the rule of law and on parliamentary scrutiny. The Scrutiny of Bills Committee reviewed the bill, seeking the Minister's advice on:

why significant details had been left to delegated legislation;

whether more detail could be included in the primary legislation;

why there is no requirement to make rules regarding misinformation and disinformation complaints;

whether all of the ACMA's decisions made under the rules should be subject to merits review, unless the ACMA specifically excludes merits review in individual cases;

why privacy protections specified in the Explanatory Memorandum are not included in the bill;

whether the definition of 'professional news content' is overly narrow;

why it is appropriate to leave to codes and standards all processes by which participants in a digital platform industry are to prevent or respond to misinformation or disinformation, including why there is no requirement about what a code or standard must contain; and

whether the ACMA should be required to be satisfied that a misinformation code or standard appropriately balances protecting the community from serious harm with the right to freedom of expression.[23]

1.35 With respect to significant details being left to delegated legislation, such as obligations relating to risk management, media literacy plans, complaints and dispute handling, and record keeping, the Minister's response outlined various reasons why these matters would be specified in the rules. These include:

rules would more appropriately account for differences between digital platform providers, taking into account differences in respect of their user interfaces, their users, and the content shared;

to retain flexibility to consider how the system is operating in practice, and to respond to the evolving risk landscape; and

to provide greater flexibility for the digital communications platforms industry, particularly as the ACMA gains a more thorough understanding of the sector through its information-gathering powers.[24]

1.36 Regarding the content requirements of a code or standard, the Minister explained that the bill does not set out all the matters which must be in a code, 'as the nature of the risk and the appropriate measures will depend on the relevant industry sector and class of digital communication platform provider'.[25]

1.37 In relation to the ACMA being satisfied that a misinformation code or standard appropriately balances the protection of the community from serious harm with freedom of expression, the Minister stated:

Where it approves a misinformation code or determines a misinformation standard, the ACMA will have a legal obligation to prepare a statement of compatibility with Australia's international human rights obligations. In practice, this means that the ACMA will need to consider Australia's international human rights obligations, including those relating to freedom of expression, in deciding whether to approve a code or make a standard.

In the context of this legislation, any attempt to codify some aspects of that consideration, with reference [to] a specified human right, is likely to create legal uncertainty, with real risks of legal challenges and unintended consequences.

Codes and standards are also subject to parliamentary scrutiny and disallowance.[26]

1.38 The Scrutiny of Bills Committee outlined that it considered the Minister's response had largely addressed its concerns regarding the use of delegated legislation in relation to risk management and media literacy plans; however, it retained its scrutiny concerns with respect to providers implementing misinformation complaints and dispute handling processes. It recommended that consideration be given to requiring providers to implement and maintain these processes, or to providing that rules be made to establish this.[27]

1.39 With respect to protecting the right to freedom of expression, the Scrutiny of Bills Committee expressed concern that the proposed safeguards in relation to the approval of codes or making of standards 'may not be sufficient to fully protect the right to freedom of expression'. It stated:

While the committee [Scrutiny of Bills committee] notes that the bill includes safeguards, it cautions that the proposed scheme has the potential to unduly trespass on personal rights and liberties by potentially acting as a chilling effect on freedom of expression, as there are incentives for providers to remove content that might constitute misinformation or disinformation, while there is no incentive for providers to respect the right to freedom of expression...

To better protect the right to freedom of expression the committee recommends that consideration be given to amending the bill to require the ACMA to be satisfied that a misinformation code or standard appropriately balances the importance of protecting the community from serious harm with the right to freedom of expression.[28]

Background and context to the bill

1.40 This section outlines the context to the bill's development. It provides an overview of growing concerns about the impact of misinformation and disinformation on digital communication platforms, the Australian Government's response in related reviews and the outcomes of consultation on the 2023 Exposure Draft.

1.41 This section also outlines the broader policy and regulatory framework in which the bill operates.

Australians' increasing concern about the impact of misinformation and disinformation

1.42 With the growth in the use of digital platforms, research has indicated that Australians are increasingly concerned about the information they access online. For example, the ACMA's September 2024 report on the adequacy of the Australian Code of Practice on Disinformation and Misinformation found that 75 per cent of Australians were 'concerned' about misinformation and disinformation—an increase from 69 per cent in 2022.[29]

1.43 Similarly, the Australian Media Literacy Alliance's Adult Media Literacy 2024 report stated that four in five Australians want the spread of misinformation on social media to be addressed in Australia. This was an increase of six per cent from the 2021 report.[30]

1.44 These findings are consistent with international trends. The Reuters Institute Digital News Report 2024, an international survey of comparative data on media usage in 47 countries, found that concern about 'what is real and what is fake' with respect to online news had risen by three percentage points in the last year, with around six in ten people (59 per cent) stating that they are concerned.[31]

1.45 Australians are now among the consumers most concerned about misinformation, with levels of concern similar to those in the United States and the United Kingdom, and behind only Portugal.[32]

The findings of the ACCC's Digital Platforms Inquiry

1.46 In December 2017, the then-Treasurer, the Hon Scott Morrison MP, directed the Australian Competition and Consumer Commission (ACCC) to conduct an inquiry into digital platforms. As part of the inquiry's terms of reference, the ACCC examined the impact of digital platforms on consumers, businesses, and the quality of news and journalism.

1.47 The inquiry focused on Google and Facebook as the two largest and most used digital platforms in Australia. The inquiry's final report in 2019 noted the 'opaque operations' of digital platforms, and that these platforms 'act as gateways' to Australian consumers.[33] The ACCC argued that 'where digital platforms perform comparable functions to media businesses, they should be regulated similarly'.[34] The ACCC found that algorithms also pose risks for consumers by encouraging the spread of misinformation and disinformation.[35]

1.48 In its final report, the ACCC made several recommendations relevant to this bill. These include:

that digital platforms establish an industry code to govern the handling of complaints about disinformation;

that an independent regulator, like the ACMA, should be directed to monitor the voluntary initiatives of digital platforms to enable users to identify the reliability, trustworthiness and source of news content featured on their services; and that

the regulator should also be empowered to obtain data and information from digital platforms, publicly report on its findings and make recommendations in relation to regulatory action if voluntary initiatives are ineffective.[36]

1.49 The ACCC also recommended greater investment in the approach and delivery of digital media literacy resources and training, including in Australian schools.[37]

The Australian Code of Practice on Disinformation and Misinformation

1.50 In December 2019, as part of its response to the ACCC's Digital Platforms Inquiry final report, the then Australian Government requested that major digital platforms develop a voluntary code of conduct to address disinformation and news quality.[38] The ACMA was tasked with overseeing the code's development and reporting on platforms' measures, as well as on the broader impacts of disinformation in Australia.[39]

1.51 In line with the ACMA's code development practice, the Digital Industry Group (DIGI), a digital industry association, worked with eight digital service providers to develop the voluntary Australian Code of Practice on Disinformation and Misinformation (the Code).[40] The Code aims to manage the risks associated with disinformation (intentionally misleading content) and misinformation (false content spread without malicious intent) on digital platforms.

1.52 Signatories opt into commitments under the Code, including two mandatory commitments: to implement measures to reduce the risk of harms that may arise from misinformation and disinformation, and to publish an annual transparency report.[41]

1.53 DIGI can accept complaints about platforms' breaches of the Code, and can investigate and withdraw a company from the Code if necessary. The aim of the complaints facility is to 'resolve the complaints, so as to have a positive impact on misinformation and disinformation in Australia' rather than to penalise the platform.[42]

1.54 As of November 2024, the Code has nine signatories: Adobe, Apple, Google (including YouTube), Meta (formerly Facebook), Microsoft (LinkedIn), Redbubble, TikTok, Twitch and Legitimate.[43]

1.55 However, some major platforms with a significant Australian user base are not signatories.[44] These include Reddit (1.8 million Australian monthly active users), Snapchat (eight million Australian monthly active users), X (formerly Twitter) (2.9 million Australian monthly active users), and WeChat (400 000 Australian monthly active users).[45]

1.56 In November 2023, the Code's independent Complaints Sub-Committee withdrew X's signatory status. This action followed X's refusal to cooperate in an investigation or take corrective action after a complaint was lodged regarding X's decision to close, and leave closed, accessible channels for the public to report misinformation and disinformation on the platform during the Australian Voice to Parliament referendum.[46]

1.57 The ability for users to report misinformation and disinformation is a key feature of the Code.

The ACMA's reports on the adequacy of the Code

1.58 The ACMA has provided three reports to the Australian Government on the adequacy of the Code since its commencement.

1.59 The ACMA's first report, in June 2021, recommended that it should be provided with information-gathering and record keeping powers, and reserve regulatory powers to register codes and set standards. The ACMA also recommended that formal regulatory options should be considered, 'particularly for platforms that choose not to participate in the Code or reject the emerging consensus on the need to address disinformation and misinformation'.[47]

1.60 The ACMA argued that the current self-regulatory approach 'may prove insufficient to incentivise broader behavioural change across industry', noting that:

compliance with the Code is uncertain given the data provided by platforms;

it is uncertain whether current deficiencies with the Code will be addressed by the industry in its 12-month review;

there are a range of non-signatories to the Code; and that

usage of platforms may expand rapidly and new services may be introduced, without these new services being brought quickly into the Code's remit.[48]

1.61 In July 2023, the ACMA's second report found that platforms' voluntary 'transparency reports are not working to provide transparency about signatories' current and proposed measures under the Code'.[49] At the time of that report, consultation by the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA) on the Exposure Draft of the 2023 bill was underway (discussed below).

1.62 In September 2024, the ACMA's third report noted that the transparency reports of digital platforms subject to the Code were 'inconsistent', with 'patchy' data, and that it was 'difficult to track the progress of measures against the relevant outcome'. The report also noted that the implementation of regulatory responses to online harm internationally is starting to impact digital platform behaviour, citing novel regulatory approaches in the European Union and United Kingdom to improve accountability.[50]

1.63 This is discussed in more detail in Chapter 4.

Release of the 2023 Exposure Draft of the bill

1.64 In January 2023, the Minister announced that the Australian Government would introduce laws to provide the ACMA with new powers to combat online misinformation and disinformation.[51]

1.65 In June 2023, an Exposure Draft of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023 (the Exposure Draft of the bill) was released for public consultation.[52]

1.66 The Exposure Draft of the bill built upon the previous Government's commitment to provide the ACMA with greater powers to 'combat harmful disinformation and misinformation online'. In March 2022, the Hon Paul Fletcher MP, the former Minister for Communications, Urban Infrastructure, Cities and the Arts (the former minister), foreshadowed introducing 'legislation this year to combat harmful disinformation and misinformation online', and stated that such legislation would provide the ACMA with 'new information-gathering powers to incentivise greater platform transparency' as well as 'reserve powers' to register and enforce industry codes or make industry standards.[53]

1.67 DITRDCA undertook two phases of consultation on the Exposure Draft in 2023 and 2024.

First phase: public consultation

1.68 The first phase of consultation included a call for public submissions on the Exposure Draft, as well as targeted stakeholder roundtables with the digital communications platforms industry, civil rights groups, academia, religious organisations, and the media and broadcasting sectors. Several stakeholder roundtable sessions were held during this eight-week consultation period.[54]

1.69 The consultation process for the Exposure Draft received more than 24 000 responses, including 2418 public submissions.[55]

1.70 The following primary concerns were identified during the consultation:

freedom of expression and religious freedom—concern that the powers would unduly restrict Australians' implied right to freedom of political communication and would 'cancel' religious content online;

'government overreach' and censorship—concern that the Government would censor Australians who held contrary views;

transparency and oversight—concern about a lack of clarity in how digital communications platforms treat the content they host, and how decisions made under a future code or standard would be communicated to users. Concerns were also raised about how the ACMA would make decisions in exercising its powers; and

workability for platforms—the digital communications platform industry held concerns about how it would be able to comply with the powers, particularly with regard to the definitions provided in the Exposure Draft of the bill.[56]

Second phase: targeted consultation

1.71 The second phase of consultation on the Exposure Draft of the bill occurred in April and July 2024 and involved targeted consultation with key industry and other stakeholders on potential changes to the draft bill following feedback from the first phase.[57]

1.72 These potential changes included narrowing the scope of harms to better protect freedom of speech and religious expression; refining provisions regarding platform transparency and accountability requirements; and improving transparency and accountability measures relating to the ACMA, including a triennial review of the legislative framework and annual reporting by the ACMA to the Parliament.[58]

1.73 A revised bill was provided to stakeholders in July 2024.[59]

Key changes to the bill following consultation on the Exposure Draft

1.74 The consultation process on the Exposure Draft informed the drafting of the bill as introduced, refining definitions, improving workability, and reinforcing safeguards for freedom of expression. These changes are discussed in more detail in the following chapters, and include:

Reinforced protections to safeguard freedom of speech

the scope of 'serious harms' was narrowed in the bill to ensure greater alignment with Australia's obligations under international human rights law;

the excluded categories of content were refined to encompass the reasonable dissemination of content for any academic, artistic, scientific or religious purpose. Notably, the exclusions for government-authorised content and authorised electoral matter were removed;

explicit provisions were included to make clear that nothing in the bill could require the removal of content or blocking end-users unless it is disinformation that involves inauthentic behaviour (such as bots); and

the proposed information-gathering powers of the ACMA were clarified to ensure that the ACMA cannot require individuals to produce information or documents except where they are a platform employee, content moderator, fact checker or a person providing services to the provider of the platform.[60]

Improved workability of the bill

the definitions of misinformation and disinformation were refined to require that the content in question must be reasonably verifiable as false, misleading or deceptive; and

the definition of disinformation was amended to include 'false, misleading or deceptive information disseminated via inauthentic behaviour'.[61]

Strengthened transparency and accountability for the ACMA and digital platforms

core upfront obligations were placed on the platforms to publish a current media literacy plan, a risk assessment report, and policies or information on their approach to addressing misinformation and disinformation;

the ACMA would be able to make digital platform rules (disallowable by Parliament) with additional transparency requirements such as a complaints and dispute handling process for misinformation and disinformation complaints, risk management, and risk assessment; and

greater Parliamentary oversight was incorporated through triennial reviews of the bill's framework, annual reporting by the ACMA, and a requirement that codes approved by the ACMA be subject to parliamentary scrutiny and disallowance.[62]

Procedures to be observed by Senate Committees for the protection of witnesses

1.75 On 11 November 2024, the Australian Jewish Association notified a committee member of a serious threat made on a social media platform which was directly linked to its appearance before the committee. The committee member raised the matter with the committee prior to the public hearing, and the matter was also addressed during the hearing.

1.76 The committee takes the protection of witnesses seriously, and condemns any action which seeks to intimidate or threaten anyone who provides evidence as part of the inquiry process.

1.77 The Parliamentary Privileges Act 1987 provides some of the legal protections for senators, witnesses and others. In particular, any action harming or penalising a witness, or depriving them of a benefit, in consequence of their giving or proposing to give evidence, is a criminal offence which may be prosecuted in the courts.[63]

1.78 The Senate also has the power to punish contempts under section 49 of the Constitution.

1.79 Under Privilege Resolution 6, the Senate has declared that the following matters may constitute a contempt of the Senate:

Interference with witnesses

(10) A person shall not, by fraud, intimidation, force or threat of any kind, by the offer or promise of any inducement or benefit of any kind, or by other improper means, influence another person in respect of any evidence given or to be given before the Senate or a committee, or induce another person to refrain from giving such evidence.

Molestation of witnesses

(11) A person shall not inflict any penalty or injury upon, or deprive of any benefit, another person on account of any evidence given or to be given before the Senate or a committee.

1.80 Privilege Resolution 1 provides the procedures to be observed by Senate committees for the protection of witnesses. This resolution states that:

18. Where a committee has any reason to believe that any person has been improperly influenced in respect of evidence which may be given before the committee, or has been subjected to or threatened with any penalty or injury in respect of any evidence given, the committee shall take all reasonable steps to ascertain the facts of the matter. Where the committee considers that the facts disclose that a person may have been improperly influenced or subjected to or threatened with penalty or injury in respect of evidence which may be or has been given before the committee, the committee shall report the facts and its conclusions to the Senate.

1.81 Due to the serious nature of the threat made to the Australian Jewish Association, which may also constitute a criminal offence under other state or Commonwealth laws, the committee determined to refer the matter to the Australian Federal Police (AFP).

1.82 The committee takes this opportunity to report to the Senate that, as the matter has been referred to the AFP, the committee will refrain from further action at this time. This accords with the principles declared by the Senate of having regard to other remedies at law before resorting to the use of its contempt powers.

1.83 The committee will consider, once any AFP investigation is finalised, whether to report the matter to the Senate as a possible contempt.

1.84 The committee notes the right of individual Senators to pursue this matter separately.

Footnotes

[1]Journals of the Senate, No. 135, 19 September 2024, p. 4077.

[2]The Hon Michelle Rowland MP, Minister for Communications, House of Representatives Hansard, 12 September 2024, p. 7.

[3]Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024, Explanatory Memorandum, p. 1.

[4]Explanatory Memorandum, p. 1.

[5]Explanatory Memorandum, p. 1.

[6]Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA), Fact sheet - Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024, 12 September 2024, p. 2.

[7]Explanatory Memorandum, p. 44.

[8]Explanatory Memorandum, p. 45.

[9]Supplementary Explanatory Memorandum, p. 2.

[10]Commonwealth of Australia, Australian Communications and Media Authority, Portfolio Budget Statements: 2023–24, p. 153.

[11]Explanatory Memorandum, p. 5.

[12]Explanatory Memorandum, p. 7.

[13]Explanatory Memorandum, p. 16.

[14]Explanatory Memorandum, p. 15.

[15]Explanatory Memorandum, p. 7.

[16]Explanatory Memorandum, p. 6. More information relating to human rights can be found in the bill's Explanatory Memorandum, from page 5.

[17]Parliamentary Joint Committee on Human Rights, Scrutiny Report 9 of 2024, 10 October 2024, p. 91.

[18]Parliamentary Joint Committee on Human Rights, Scrutiny Report 9 of 2024, 10 October 2024, p. 92.

[19]Minister for Communications, Parliamentary Joint Committee on Human Rights, Ministerial response, Report 10 of 2024; [2024] AUPJCHR 82, p. 15.

[20]Minister for Communications, Parliamentary Joint Committee on Human Rights, Ministerial response, Report 10 of 2024; [2024] AUPJCHR 82, p. 18.

[21]Parliamentary Joint Committee on Human Rights, Report 10 of 2024; [2024] AUPJCHR 77, p. 82.

[22]Parliamentary Joint Committee on Human Rights, Report 10 of 2024; [2024] AUPJCHR 77, p. 82.

[23]Senate Standing Committee for the Scrutiny of Bills (Scrutiny of Bills Committee), Scrutiny Digest 13 of 2024, pp. 34–35, 37–38, and 43.

[24]Minister for Communications, Ministerial response to the Senate Scrutiny of Bills Scrutiny Digest 13 of 2024, [pp. 28‒57].

[25]Minister for Communications, Ministerial response to the Senate Scrutiny of Bills Scrutiny Digest 13 of 2024, [p. 53].

[26]Minister for Communications, Ministerial response to the Senate Scrutiny of Bills Scrutiny Digest 13 of 2024, [p. 55].

[27]Senate Standing Committee for the Scrutiny of Bills, Scrutiny Digest 14 of 2024; [2024] AUSStaCSBSD 211, pp. 99−100.

[28]Senate Standing Committee for the Scrutiny of Bills, Scrutiny Digest 14 of 2024; [2024] AUSStaCSBSD 211, pp. 107−108.

[29]Australian Communications and Media Authority (ACMA), Digital platforms' efforts under voluntary arrangements to combat misinformation and disinformation: Third report to government, September 2024, p. 1.

[30]Tanya Notley et al, Adult Media Literacy in 2024: Australian Attitudes, Experiences and Needs, Australian Media Literacy Alliance, August 2024, p. 15.

[31]Reuters Institute, Digital News Report 2024, p. 10.

[32]ACMA, Digital platforms' efforts under the Australian Code of Practice on Disinformation and Misinformation: Second report to government, July 2023, p. 1.

[33]Australian Competition and Consumer Commission (ACCC), Digital Platforms Inquiry: Final Report, June 2019, pp. 1 and 4.

[34]ACCC, Digital Platforms Inquiry: Final Report, June 2019, p. 2.

[35]ACCC, Digital Platform Services Inquiry Interim Report No. 6 – Social media services in Australia, March 2023, p. 156.

[36]ACCC, Digital Platforms Inquiry: Final Report: Executive Summary, June 2019, pp. 33–34.

[37]ACCC, Digital Platforms Inquiry: Final Report: Executive Summary, June 2019, pp. 33–34.

[40]Digital Industry Group (DIGI), Submission 79, p. 1.

[41]DIGI, About the Code (accessed 7 November 2024).

[42]DIGI, Complaints (accessed 18 October 2024).

[43]DIGI, Signatories (accessed 7 November 2024).

[44]DITRDCA, Online misinformation and disinformation reform: Impact Analysis, p. 20.

[45]DITRDCA, Online misinformation and disinformation reform: Impact Analysis, p. 21.

[46]DITRDCA, Online misinformation and disinformation reform: Impact Analysis, p. 12.

[48]ACMA, Adequacy of digital platforms disinformation and news quality measures, June 2021, pp. 80–81.

[50]ACMA, Third report to government on digital platforms' efforts under voluntary arrangements, September 2024, p. 7. The report outlines developments including the impact of the European Union's Digital Services Act, which places obligations on large online platforms and search engines, including transparency reporting obligations and the identification of systemic risks related to public security and electoral processes, among other matters. The United Kingdom's Online Safety Act 2023 also places a new duty of care on online platforms to remove illegal content (including foreign interference) and take down material that breaches their terms of service.

[52]DITRDCA, New ACMA powers to combat misinformation and disinformation (accessed 2 October 2024).

[53]The Hon Paul Fletcher MP, Minister for Communications, Urban Infrastructure, Cities and the Arts, Media Release, 21 March 2022 (accessed 8 October 2024).

[54]DITRDCA, Online misinformation and disinformation reform: Impact Analysis, p. 9.

[55]DITRDCA, Online misinformation and disinformation reform: Impact Analysis, p. 10.

[56]DITRDCA, Online misinformation and disinformation reform: Impact Analysis, p. 10.

[57]DITRDCA, Online misinformation and disinformation reform: Impact Analysis, p. 10.

[58]DITRDCA, Online misinformation and disinformation reform: Impact Analysis, p. 10.

[59]DITRDCA, Online misinformation and disinformation reform: Impact Analysis, p. 10.

[60]DITRDCA, Fact sheet - Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024, 12 September 2024, p. 2.

[61]DITRDCA, Fact sheet - Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024, 12 September 2024, p. 2.

[62]DITRDCA, Fact sheet - Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024, 12 September 2024, p. 2.

[63]Parliamentary Privileges Act 1987, s. 12.