Bills Digest No. 81, 2023-24

Criminal Code Amendment (Deepfake Sexual Material) Bill 2024


Author

Owen Griffiths


Key points

  • The Bill will amend the Criminal Code Act 1995 to create a new offence where a person uses a carriage service to transmit sexual material which depicts, or appears to depict, another person who is, or appears to be, 18 years of age or over, where:
  • the person knows the other person does not consent to the transmission; or
  • the person is reckless as to whether the other person consents to the transmission.
  • The new offence will apply regardless of whether the material is unaltered or has been created or altered in any way using technology. This means the new offence could apply to a variety of circumstances where image-based abuse occurs.
  • There are exceptions to the new offence, including a broad exception where ‘a reasonable person would consider transmitting the material to be acceptable’, having regard to a number of factors.
  • The Bill will also create two aggravated offences which may apply where a person commits the underlying offence and, additionally, where:
  • the offender has already been subject to 3 or more civil penalty orders relevant to the Online Safety Act 2021; or
  • the offender was ‘responsible for the creation or alteration of the material’.
  • The Bill will also repeal the two existing aggravated offences, which apply where a person commits the underlying offence of using a carriage service in a way that is menacing, harassing or offensive (section 474.17) and the commission of that underlying offence involves the transmission, making available, publication, distribution, advertisement or promotion of ‘private sexual material’.

 

 

Introductory Info

Date of introduction: 5 June 2024

 

House introduced in: House of Representatives

Portfolio: Attorney-General

Commencement: The day after Royal Assent

 

Purpose of the Bill

The Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 (the Bill) will amend the Criminal Code Act 1995 (the Criminal Code) to create a new offence where a person uses a carriage service to transmit sexual material which depicts (or appears to depict) another person (who is, or appears to be, 18 years of age or over) when:

  • the person knows the other person does not consent to the transmission or
  • the person is reckless as to whether the other person consents to the transmission.

The Bill will also create two aggravated offences which may apply where a person commits the new offence (the underlying offence) and:

  • has already been subject to 3 or more civil penalty orders relevant to the Online Safety Act 2021 or
  • was ‘responsible for the creation or alteration of the material’.

The Attorney-General, Mark Dreyfus, indicated that the introduction of the Bill was part of a commitment which recognised ‘the urgent and collective need to respond to the growing challenges associated with artificially generated sexual material’:

Digitally created and altered sexually explicit material that is shared without consent is a damaging and deeply distressing form of abuse. This insidious behaviour is degrading, humiliating and dehumanising for victims. Such acts are overwhelmingly targeted at women and girls and perpetuate harmful gender stereotypes and gender-based violence.[1]
 

Structure of the Bill

The Bill’s amendments to the Criminal Code are contained in Schedule 1.

Items 1–4 contain amendments which are consequential to the repeal of the existing aggravated offences in section 474.17A, which include reference to the transmission etc of ‘private sexual material’. Section 473.1 provides definitions for the telecommunication services offences in Part 10.6 of the Criminal Code. Items 1 and 2 will repeal the definitions of ‘private sexual material’ and ‘subject of private sexual material’ in section 473.1.

Section 473.4 provides the matters to be taken into account in deciding whether, in all the circumstances, reasonable persons would regard material as offensive. Items 3 and 4 will repeal subsections 473.4(2), (3) and (4) which relate to the definition of consent and how the consent of the subject(s) of private sexual material should be weighed in deciding if material would be regarded as offensive by reasonable persons.

Item 5 contains the main amendments of the Bill. It will repeal the existing aggravated offences in section 474.17A and replace them with a new offence in proposed section 474.17A for using a carriage service to transmit sexual material without consent and two aggravated offences in proposed section 474.17AA where the new offence occurred:

  • after certain civil penalty orders have been made against the person (proposed subsection 474.17AA(1)) or
  • when the person created or altered the sexual material which was transmitted (proposed subsection 474.17AA(5)).

It will also insert proposed section 474.17AB which will address double jeopardy matters arising from the new underlying offence and the aggravated offences.

Item 6 will repeal and replace section 474.17B which deals with the possibility of alternative verdicts to reflect the new underlying offence and the aggravated offences.

Item 7 provides for the application of the amendments. The amendments will apply to material transmitted after commencement ‘regardless of whether the material was created or altered before or after’ commencement. 

 

Background

Non-consensual sharing of intimate images and deepfakes

The accessibility of digital manipulation technologies, in particular new generative artificial intelligence (AI) image tools and programs, has increased public concern regarding the non-consensual creation and sharing of ‘deepfakes’. The eSafety Commissioner’s position statement on deepfakes notes:

Manipulation of images is not new, but over recent decades digital recording and editing techniques have made it far easier to produce fake visual and audio content, not just of humans but also of animals, machines and even inanimate objects.

Advances in artificial intelligence (AI) and machine learning have taken the technology even further, allowing it to rapidly generate content that is extremely realistic, almost impossible to detect with the naked eye and difficult to debunk. This is why the resulting photos, videos and sound files are called ‘deepfakes’.[2]

The spread of AI-generated, explicit images of popular singer Taylor Swift highlighted the issue of the creation and distribution of sexualised deepfakes.[3] In Australia, sexualised deepfakes recently received attention when a Victorian teenager was arrested for allegedly generating and sharing a number of explicit deepfakes of other schoolchildren.[4]

The capability to generate realistic sexually explicit ‘deepfakes’ through programs and online services is becoming increasingly accessible. In December 2023 a report by Graphika, a US-based social media analytics firm, found that ‘[t]he creation and dissemination of synthetic non-consensual intimate imagery (NCII) has moved from a custom service available on niche internet forums to an automated and scaled online business that leverages a myriad of resources to monetize and market its services’.[5]

On 1 May 2024, the Albanese Government announced a number of measures concerned with ‘Tackling online harms’:

The Albanese Government will introduce legislation to ban the creation and non-consensual distribution of deepfake pornography. Digitally created and altered sexually explicit material is a damaging form of abuse against women and girls that can inflict deep harm on victims. The reforms will make clear that creating and sharing sexually explicit material without consent, using technology like artificial intelligence will be subject to serious criminal penalties.[6]

Legislative context

Constitutional context

The Commonwealth lacks a direct constitutional head of power to legislate regarding technology generally (including computing), which appears to restrict its capacity to create offences targeting individuals who create, produce or generate deepfakes.[7] However, section 51(v) of the Australian Constitution grants the Commonwealth legislative power over telecommunications, allowing laws to be made in relation to ‘postal, telegraphic, telephonic, and other like services’. In the Criminal Code, the link to this head of power is generally made by including the ‘use of a carriage service’ as one of the elements of an offence.

Commonwealth criminal law

There is a complicated legislative background to the Criminal Code offences to be amended by the Bill. Under section 474.17, a person commits an offence if they use a carriage service in a way that reasonable persons would regard as being, in all the circumstances, menacing, harassing or offensive. The scope of this offence is limited by section 473.4, which provides matters to be taken into account in deciding if reasonable persons would regard the particular material or the particular use of a carriage service as offensive.[8]

In 2018, the Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Act 2018 amended the Criminal Code to insert section 474.17A which included:

  • a standard aggravated offence where the underlying offence in section 474.17 involved ‘private sexual material’
  • a special aggravated offence where the underlying offence involved ‘private sexual material’ and, prior to the offence, 3 or more civil penalty orders had been made against the person relating to posting an intimate image without consent under the then subsection 44B(1) of the Enhancing Online Safety Act 2015 (since superseded by subsection 75(1) of the Online Safety Act).

It also inserted definitions of ‘private sexual material’ and ‘subject’ of private sexual material, as well as a clarification of the regard which must be had to the consent of the subjects of ‘private sexual material’ in determining whether material was offensive. This clarification also defined the term ‘consent’ as meaning ‘free and voluntary agreement’.[9]

The Online Safety (Transitional Provisions and Consequential Amendments) Act 2021 further amended this part of the Criminal Code. In particular, it changed the special aggravated offence (paragraph 474.17A(4)(d)) so that it applies where, prior to the underlying offence, the person had been the subject of civil penalty orders relating to contraventions of:

  • subsection 75(1) of the Online Safety Act (posting an intimate image without consent) and/or
  • section 91 of the Online Safety Act which relates to removal notices given under section 89 of that legislation.

Section 91 of the Online Safety Act creates a civil penalty for non-compliance with removal notices issued by the eSafety Commissioner (under sections 88, 89 or 90) concerning cyber‑abuse material targeted at an Australian adult.[10] In particular, section 89 provides for the eSafety Commissioner to give end-users removal notices concerning cyber-abuse material which has been the subject of a complaint.

Online safety regulation

The Online Safety Act commenced on 23 January 2022. The legislation bolstered the powers of the eSafety Commissioner to address harmful online content including an updated image-based abuse scheme to address the sharing (and threatened sharing) of intimate images without the consent of the person shown.[11] For example, under section 75, it is unlawful for a person to post, or make a threat to post, an ‘intimate image’ of another person without their consent if either person is ordinarily resident in Australia. This is subject to a maximum civil penalty of 500 penalty units ($156,500).[12] The eSafety Commissioner can also give remedial directions where a person has contravened, or is contravening, section 75 (section 83).
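For reference, the dollar figure for the maximum civil penalty can be checked with the simple calculation below, assuming the Commonwealth penalty unit value of $313 that applied at the time of the Bill’s introduction:

\[
500 \ \text{penalty units} \times \$313 \ \text{per penalty unit} = \$156{,}500
\]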

The scheme also allows the eSafety Commissioner to issue removal notices to social media services, relevant electronic services or designated internet services, hosting service providers and end-users concerning intimate images (sections 77, 78, 79). A person who fails to comply with a requirement under a removal notice may be the subject of a civil penalty (section 80).[13]

On 13 February 2024, it was announced that Delia Rickard PSM would lead a statutory review of the Online Safety Act. The terms of reference of the review are available here. The issues paper released for the review indicates a report will be made to the Minister by 31 October 2024.[14]

States and territories

Most Australian jurisdictions have criminal offences which cover the non‑consensual sharing or distribution of intimate or explicit images (some of which may apply to altered material).[15] Victoria appears to be the only jurisdiction which has expressly criminalised both the production and the distribution of deepfakes.[16]

Under section 53R of the Crimes Act 1958 (Vic), a person will commit an offence if they intentionally produce an intimate image depicting another person, and they know it is, or probably is, an intimate image, and the production of the intimate image is ‘contrary to community standards of acceptable conduct’. The Explanatory Memorandum to the legislation states that the new offence ‘applies to the production of deepfake or altered intimate images’ due to the definition of ‘produce’ and the meaning given to the word ‘image’.[17]

Overseas jurisdictions

United Kingdom (UK)

In April 2024, the UK government announced that amendments to the Criminal Justice Bill will include a new offence in relation to making deepfakes:

Last year, reforms in the Online Safety Act criminalised the sharing of ‘deepfake’ intimate images for the first time. This new offence, which will be introduced through an amendment to the Criminal Justice Bill, will mean anyone who makes these sexually explicit deepfake images of adults maliciously and without consent will face the consequences of their actions.

The existing UK ‘sharing’ of intimate material offences are contained in section 66B of the Sexual Offences Act 2003 (UK).

United States (US)

Free speech protections in the US complicate the criminalisation of the distribution of deepfakes.[18] Most states in the US make the dissemination of ‘nonconsensual pornography’ a criminal offence.[19] In 2022, the US Congress authorised a federal civil claim for individuals where ‘intimate visual depictions’ are disclosed by persons who know that, or recklessly disregard whether, the individual has not consented to the disclosure (15 US Code § 6851).

The Disrupt Explicit Forged Images and Non-Consensual Edits Act, currently before the US House of Representatives, would extend civil remedies for victims who are identifiable in a ‘digital forgery,’ which is defined as a visual depiction created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means to falsely appear to be authentic.[20]

European Union (EU)

In May 2024, the EU Directive on combating violence against women and domestic violence (2024/1385) outlined obligations for member states including in relation to offences concerning ‘non-consensual production, manipulation or altering, for instance by image editing, including by means of artificial intelligence, of material that makes it appear as though a person is engaged in sexual activities’. It states: 

Such production, manipulation or altering should include the fabrication of ‘deepfakes’, where the material appreciably resembles an existing person, objects, places or other entities or events, depicts the sexual activities of a person, and would falsely appear to other persons to be authentic or truthful. In the interest of effectively protecting victims from such conduct, threatening to engage in such conduct should also be covered.[21]
 

Committee consideration

Senate Standing Committee on Legal and Constitutional Affairs

The provisions of the Bill were referred to the Senate Legal and Constitutional Affairs Legislation Committee (Senate Committee) for inquiry and report by 8 August 2024.[22] The inquiry received 35 submissions and a public hearing for the inquiry was held on 23 July 2024. Further details concerning the inquiry are available here.

The Senate Committee’s report:

  • recommended the Attorney-General review the threshold for the first aggravated offence (where the person has already been subject to 3 or more civil penalty orders relevant to the Online Safety Act) after two years of the Bill’s operation
  • recommended the Attorney-General continues work already underway via the Standing Council of Attorneys-General in relation to development of harmonised offences across Australian jurisdictions for the:
  • non-consensual creation of sexual material
  • threat to non-consensual creation and/or distribution of sexual material that does not use a carriage service
  • recommended the Education Ministers Meeting continues to progress their work to strengthen respectful relationships in schools, and noted ‘the work of Education Ministers to implement version 9 of the Australian Curriculum and the work of schooling systems to implement age and developmentally appropriate programs on consent and online safety within the context of respectful relationships’
  • subject to these recommendations, recommended that the Senate urgently pass the Bill.[23]

The Deputy Chair, Senator Paul Scarr, made additional comments (discussed below).

Senate Standing Committee for the Scrutiny of Bills

The Senate Standing Committee for the Scrutiny of Bills (Scrutiny Committee) raised a number of issues concerning the Bill (discussed further below). These included:

  • the unclear scope of the new offence, including the lack of a definition for ‘sexual pose’;
  • the reversal of the evidential burden of proof in the offence-specific exceptions; and
  • the potential for the double jeopardy provisions to result in defendants being required to stand trial twice for the same factual circumstances.[24]
 

Policy position of non-government parties/ independents

Coalition

Paul Fletcher MP, the Manager of Opposition Business in the House, stated that the Coalition ‘supports the intention’ behind the Bill, but added that they would ask for it to be ‘closely scrutinised’. In particular, he questioned why the Bill repealed existing parts of the Criminal Code amended by the Coalition in 2018 and argued the government had not done ‘an adequate or satisfactory job’ of explaining why the existing law needed to change.[25] He characterised the Bill as having ‘several strange features’:

For example, this bill says the new offences are based on the distribution of material without a person's consent. But this bill also repeals the definition of consent. That is a very strange thing to do. Right now, the law says, 'Consent means free and voluntary agreement.' The government is getting rid of that definition. Why? Is the government suggesting that consent to share deepfake sexual material can be implied? Can consent be understood from the circumstances? How do the offences work if a person withdraws their consent? And what does that mean when you go to prosecute these laws? Will we have victims being cross-examined about whether they consented to their image being used?

These laws are also very broad. They apply to deepfakes but they also apply to true images and to obvious fakes like cartoons. Why? There are other questions: Do these new offences apply to historical figures? How do these laws interact with child pornography offences? Some of the interactions are unclear.

It seems this law is drafted backwards. Under the current law, a prosecutor must show the material is offensive. There is no equivalent under the new law. Under the new law, it falls to the defendant to show, on the balance of probabilities, that the material is acceptable. Why?

More seriously, for these offences the government has changed the definition of 'recklessness' when it comes to consent. 'Recklessness' has a defined meaning that applies in almost every other part of the Criminal Code but not here. Why not? What are they capturing that they could not capture previously, and what are the consequences of that change?[26]

In additional comments to the Senate Committee’s report on the Bill, the Deputy Chair, Senator Paul Scarr stated that, while the Coalition supported the policy intent of the Bill, it ‘would have benefitted from an additional period of scrutiny and review’.[27] He made three recommendations:

Recommendation 1

It is recommended that the Senate consider whether the policy objectives of the Bill could be better achieved by retaining features of the framing of the offence in section 474.17 of the Criminal Code Act 1995 that promote certainty, with appropriate amendments made to the current offence to address the concerns raised by the Commonwealth Director of Public Prosecutions ...

Recommendation 2

It is recommended that, prior to passing the Bill, the Senate should have the opportunity to consider the response of the Scrutiny of Bills Committee to any response received from the Attorney General to the issues raised by the Scrutiny of Bills Committee in its report dated 26 June 2024 ...

Recommendation 3

It is recommended that the Bill be amended to provide for an independent statutory review of the operation of the Bill and related matters after 2 years of operation. Preparation of the terms of reference for the review should be informed by (amongst other things) the issues raised during this inquiry.[28]

Australian Greens

The Australian Greens spokesperson for Justice and Digital Rights, Senator David Shoebridge, has been reported as highlighting the risk that the provisions of the Bill could be applied to children:

[He] said it was ‘not credible for the government to say that new criminal offences about online content creation and sharing will not see children end up in prison’... He said the government needed to ‘take on the power of platforms and algorithms’ and that while expanding the law to cover AI-generated images made sense, police would struggle to enforce it.[29]

Independents

Zali Steggall MP ‘strongly supported’ the Bill but emphasised the ‘need to go further in addressing deepfakes’. She moved a 2nd reading amendment which highlighted the harmful use of deepfakes and called on ‘the Government to take immediate action to extensively ban the creation and transmission of deepfakes without consent of the subject’.[30]

Zoe Daniel MP also supported the passage of the Bill in the House of Representatives but raised her concerns regarding ‘its limitation to the transmission of content depicting an individual who is exclusively over the age of 18’ and ‘whether teenager-to-teenager deepfake abuse would be captured under existing laws’. She considered there was ‘a strong argument for banning all deepfakes that are used without permission’:

As well as legislation, tech companies can contribute by using technology to track, trace and prevent deepfakes. Deepfakes are a threat to democracy and public trust, and we must step in strongly to prevent their nefarious use. This legislation, which I will support, is part of that process.[31]

Kate Chaney MP characterised the Bill as a ‘very welcome criminal justice response to the unlawful dissemination of non-consensual images’. She also referred to the recent amendments to the Basic Online Safety Expectations (BOSE) made by the Minister under the Online Safety Act for social media service providers:

Serving a notice on providers that requires them to explain how they're doing this is not enough. The BOSE fall short of requiring platforms and service providers to develop ways of independently detecting harmful content and deepfake images. I look forward to the report of the Statutory Review of the Online Safety Act, as well as the findings of the joint select committee on the impact of social media in Australia. I'll be advocating sharper instruments and stronger solutions.[32]
 

Position of major interest groups

Support for new offences and the need for reform

Submissions to the Senate inquiry mainly expressed support for robust offences to address the non-consensual transmission of sexualised content. The eSafety Commissioner considered that the Bill, combined with the existing functions and powers under the Online Safety Act, would ‘lead to a more holistic and comprehensive range of support, redress options, and choice for victim survivors of image-based abuse’.[33] The Australian Information Industry Association also supported the Bill’s ‘harms-specific and technology neutral’ approach.[34]

The importance of sending a clear message which reflected the serious harms associated with non-consensual sexual deepfakes was also emphasised in evidence to the Senate Committee. Noelle Martin told the Senate Committee inquiry:

In terms of the individual implications for people, I think it is life-destroying and shattering. It is something that impacts every aspect of a person's life. It impacts their being able to find work and their employability to being able to manage and control their reputation. It impacts a person's dignity, autonomy and agency. It compromises and threatens people's capacity to self-determine, to flourish and to exercise their sexual agency.[35]

However, support was not universal and a number of issues with the Bill were raised. The joint submission from the NSW Council of Civil Liberties and the Scarlet Alliance considered the offences proposed in the Bill overlapped with existing legislation and the focus on increased penalties failed ‘to recognise there are other more effective ways to prevent and minimise the harm caused by non-consensual dissemination of sexual material’.[36] The Law Council of Australia also raised concerns and made recommendations for amendments to the proposed offences (discussed below).[37]

The need for reform was highlighted in the submission by the Commonwealth Director of Public Prosecutions (CDPP). It noted that it had received referrals from investigative agencies in relation to the creation and dissemination of ‘deepfakes’ and had considered the ability of the existing offence in the Criminal Code to capture potential offending relating to deepfake material. The CDPP expressed concern that some conduct is unlikely to constitute an offence contrary to section 474.17A(1) because of the way ‘private sexual material’ is defined:

The issue that arises here is that, with certain deepfakes, it cannot be said that any expectation of privacy attaches to the depiction of a victim. For example, if an accused were to transpose an image of a victim’s face onto a publicly available pornographic video, this would generally speaking, not be ‘private sexual material’. This is because the creator of the deepfake uses, as source material, a depiction of a part of the victim (for example, their face) with respect to which it cannot be said there would be an expectation of privacy.[38]

Further measures

A number of submitters to the Senate Committee argued that limiting the proposed underlying offence to the ‘transmitting’ of non-consensual sexualised material would be insufficient.[39] Professor Rebecca Delfino argued ‘[t]he creator’s criminal culpability is equal to, if not greater than, those who transmit the harmful content’.[40] The Australian Human Rights Commission recommended that the Australian Government ‘should enact a standalone offence for the creation and/or alteration of sexual material without consent’.[41]

Others also highlighted that the proposed offences do not extend to situations where victims are threatened with distribution. For example, Dr Rachael Burgin, Chief Executive Officer of Rape and Sexual Assault Research and Advocacy, told the Senate Committee:

The creation of those images, whether or not they are distributed, is a significant harm and, commonly, threats to circulate intimate images, including deepfakes, is a tactic used by an abuser to instil fear and exert control and power over another person. Those threats need not be made explicitly to achieve their ends of controlling. We would suggest though that an offence of 'threat to distribute' should also be created under this bill, so we do recommend that the bill go further to ensure that the creation alone is captured.[42]

A number of contributors to the Senate Committee inquiry argued for further regulation to address facilitators and platforms used to access services to create sexual deepfakes. Professor Clare McGlynn and Ruya Toparlak from Durham University submitted that criminalising the creation of sexually explicit deepfakes could ‘inspire and force platforms’ to act through:

  • challenging the payment providers that continue to prop up the deepfake financial ecosystem;
  • saying to Google and Bing they can no longer highly rank ‘deepfake porn’ sites and apps;
  • making YouTube remove the tutorial videos telling people how to create sexually explicit deepfakes;
  • removing adverts for nudify apps on mainstream social media such as X (formerly Twitter) and Instagram.[43]

Support was also expressed for increased public awareness strategies in relation to image abuse and non-consensual sexual deepfakes.[44] Independent Schools Australia stated:

A national community education campaign is recommended to raise awareness about the harms of deepfake sexual material and the legal consequences of misusing it, as education on this issue cannot be the sole responsibility of schools or parents. This should not be limited to deepfake sexual material, but any misuse of deepfake images that can create harm or widespread misinformation.[45]

Children

The potential application of the proposed offences to children concerned a number of contributors to the Senate Committee inquiry.[46] This included the Law Council of Australia which recommended that strengthened safeguards should be considered to ensure that the proposed offences did not have a disproportionate impact on children and young persons.[47]

The National Children’s Commissioner, Anne Hollonds, also reportedly raised concerns the criminalisation of deepfakes could result in children as young as 10 years old going to prison:

Ms Hollonds acknowledged the government's deepfake bill was intended to address real harm in the community and that having sexual images shared was ‘horrifying’ for victims. But she said the problem of minors sharing such images was ‘a classic example’ of young people having access to technology without proper safeguards. ‘It's actually on us, as the adults,’ she said. ‘We've allowed the regulation of the digital world, the online world to be so weak, that it allows kids to do things, or get involved in these negative actions, that have terrible consequences on others’.[48]

She indicated her office had not been consulted on the Bill and the potential for the Bill to be applied to children highlighted an urgency for the federal government to raise the age of criminal responsibility to ‘at least 14’.[49]

 

Financial implications

The Explanatory Memorandum states the amendments made by the Bill will have no financial impact.[50]

 

Statement of Compatibility with Human Rights

As required under Part 3 of the Human Rights (Parliamentary Scrutiny) Act 2011 (Cth), the Government has assessed the Bill’s compatibility with the human rights and freedoms recognised or declared in the international instruments listed in section 3 of that Act. The Government considers that the Bill is compatible.[51]

Parliamentary Joint Committee on Human Rights

The Parliamentary Joint Committee on Human Rights had no comment on the Bill.[52]

 

Key issues and provisions

Elements of the underlying offence

The new offence in proposed subsection 474.17A(1) will apply where a person uses a carriage service to transmit material of another person who ‘is, or appears to be 18 years or older’ and the material ‘depicts, or appears to depict’:

  • the other person engaging in a sexual pose or sexual activity (whether or not in the presence of other persons); or
  • a sexual organ or the anal region of the other person; or
  • if the other person is female—the other person’s breasts.

For the purposes of the new offence, the term transmit is to be broadly interpreted as including ‘make available, publish, distribute, advertise and promote’ (proposed subsection 474.17A(4)). The term ‘sexual pose’ is not defined, but the Dictionary to the Criminal Code does contain a definition of ‘sexual activity’:

sexual activity means:

(a) sexual intercourse; or

(b) any other activity of a sexual or indecent nature (including an indecent assault) that involves the human body, or bodily actions or functions (whether or not that activity involves physical contact between people).

Proposed paragraph 474.17A(1)(d) contains the fault elements of the new offence in relation to consent. These are that the person:

  • knows that the other person does not consent to the transmission of the material; or
  • is reckless as to whether the other person consents to the transmission of the material.[53]

The Bill clarifies that being reckless in relation to consent includes ‘not giving any thought to whether or not the person is consenting’ (proposed subsection 474.17A(5)).

The new offence will be punishable by a maximum penalty of 6 years imprisonment.

Scope of the underlying offence

The Explanatory Memorandum clarifies that ‘[t]he use of the phrase “appears to depict” is intended to cover material where the depiction reasonably or closely resembles an individual to the point that it could be mistaken for them’.[54]

Proposed subsection 474.17A(2) provides that, for the purposes of the offence, it is irrelevant whether the material is transmitted ‘in unaltered form’ or ‘has been created, or altered in any way, using technology’. The Explanatory Memorandum outlines that this provision has been ‘framed broadly to refer to the use of technology, to ensure [it] is technology neutral and can apply to existing and future technologies’.[55] The appended note to the subsection states that material which ‘has been created, or altered in any way, using technology’:

... includes images, videos or audio depicting a person that have been edited or entirely created using digital technology (including artificial intelligence), generating a realistic but false depiction of the person. Examples of such material are ‘deepfakes’.

Unlike the aggravated offence which it replaces, the new offence in proposed subsection 474.17A(1) does not have the requirement that the person’s use of a carriage service first be found to be done in a way that ‘reasonable persons would regard as being, in all the circumstances, menacing, harassing or offensive’. Potentially, this significantly broadens the scope of the offence where sexual material is transmitted without consent. On the other hand, the new offence created by the Bill will not address ‘threatening to transmit’ behaviour (which is included in the Online Safety Act’s civil penalty for posting, or threatening to post, an intimate image without consent).[56]

Some contributors to the Senate Committee inquiry argued the Bill’s offences should be extended to address ‘threats to transmit’ sexual material without consent. In response to this proposal, the Attorney-General’s Department stated:

The making of a threat to create or disseminate sexually explicit material concerning a person over a carriage service may constitute an offence against section 474.17 of the Criminal Code, which makes it an offence to use a carriage service to menace, harass or cause offence. State and territories have extortion and blackmail related offences that may also cover this type of conduct.[57]

The Scrutiny Committee raised several issues with the potential scope of the proposed offence. In particular, it noted that no definition was provided for the term ‘sexual pose’ and the explanatory materials do not provide ‘clarity on how this term should be interpreted and it is unclear what is expected to constitute a sexual pose for the purposes of the offence’.[58]

It also noted the requirement in the existing aggravated offence in section 474.17A, that the relevant material depicts ‘a person in circumstances that a reasonable person would regard gives rise to an expectation of privacy’, would be repealed. The Scrutiny Committee queried ‘whether there may be situations in which non-sexual nudity is acceptable, where there is no reasonable expectation of privacy, and seeking consent as to transmitting this material may not always be feasible...’. It sought advice concerning whether ‘consideration was given to retaining the existing offence in section 474.17A of the Criminal Code for transmission of private sexual material, while creating a new offence targeted at deepfake material...’.[59]

The Law Council’s submission considered the drafting of the proposed offences to be ‘at times unnecessarily complex’ with the ‘potential to apply to a wide range of innocuous conduct’. It stated:

In our view, the elements of a well-drafted offence should clearly define what is prohibited behaviour. This is because the rule of law requires that law is readily known and certain so that people are able to know in advance whether their conduct might attract criminal sanction. Offence provisions should not be so broadly drafted that they may inadvertently capture a wide range of potentially benign conduct and are thus overly dependent on police and prosecutorial discretion to determine, in practice, what type of conduct should or should not be subject to sanction.[60]

The Law Council recommended the Senate Committee should consider ‘improvements to enhance the clarity of the offence provision and to ensure that innocuous conduct (such as educational material or material that has been subject to consensual prior distribution) is not inadvertently captured’.[61]

Consent

The Explanatory Memorandum states that the Bill’s offences ‘turn on whether the person depicted in the material consents to its transmission’.[62] As noted above, the Bill repeals both the existing definition of ‘consent’ and the required consideration of whether the subjects of the ‘private sexual material’ consented in determining if the material was ‘offensive’.[63] In relation to the issue of consent, the Explanatory Memorandum states:

Consent is required for each particular instance of transmission of the material. For example, consent to share an image with one individual would not cover the sharing of an image with a different individual. If the person who received the image then wanted to share the image with another person, further consent from the person depicted in the image would be required. 

Consent is not defined under this section and relies on its ordinary meaning. In this circumstance, a person would be taken to have consented to the transmission if the person freely and voluntarily agrees to the transmission... The offence is not intended to capture private communications between consenting adults or interfere with private sexual relationships involving adults.[64]

In the absence of a definition, it is not clear how the issue of consent to the transmission of the material will be interpreted. This may be complicated by situations involving implied consent to transmission, where consent is withdrawn after material is made available, where there is genuine consent to a limited distribution but not to a wide distribution of material, or where the relevant sexual material is a composite which depicts more than one person. A lack of clarity concerning consent issues was raised in the RMIT University Enterprise AI and Data Analytics Hub submission:

The amendment should address scenarios where an individual initially consents to the distribution of sexual material but later revokes that consent. It is unclear how the timing of consent and its revocation would impact the legal process. For instance, if a person consents to the distribution now but revokes it a few months later, the legal framework should clearly articulate the implications. We recommend including provisions that detail the process and consequences of revoking consent at a later stage, ensuring that individuals’ rights are protected throughout.[65]

The civil penalty provision for posting an intimate image without consent in the Online Safety Act provides that it ‘does not apply if the second person consented to the posting of the intimate image by the first person’.[66] This is not mirrored in the Bill’s proposed offence. The Online Safety Act also includes a definition of consent for the purposes of the application of the legislation to ‘an intimate image or private sexual material’. This outlines that consent means ‘express’, ‘voluntary’ and ‘informed’ and clarifies that it does not include consent by children; or adults who have a mental or physical condition which makes them incapable, or substantially impairs their capacity, to give consent.[67]

The issue of a person’s consent to the transmission of material may be relevant to the exceptions to the offence, but these must be raised by the defendant (discussed below).

Recklessness

As noted above, proposed subsection 474.17A(5) provides that being reckless in relation to consent includes ‘not giving any thought to whether or not the person is consenting’.

The Law Council was ‘doubtful about the merits of developing an offence-specific definition of recklessness...’. It stated:

We express caution that the proposed modification of the meaning of recklessness is contrary to its accepted meaning and apt to result in confusion and inconsistency across Commonwealth criminal law...

[W]hile some states and territories include recklessness as the relevant mental element in respect of the circumstance element of non-consent, or alternatively, define consent affirmatively, such that the accused person is required to take positive steps to ascertain consent—the maximum penalty in all cases is not greater than 3 years. The Law Council is concerned by the possibility that a person can be convicted of a serious criminal offence without forming any specific intent.[68]

If the maximum sentences of the proposed offences were to be retained, the Law Council recommended consideration of the ‘appropriateness of recklessness as the mental element with respect to the consent element of the offence’.[69]

Exceptions

Proposed subsection 474.17A(3) contains three narrow exceptions and one broad exception to the new offence. The narrow exceptions provide the offence does not apply where transmitting the material is:

  • necessary for, or of assistance in, enforcing or monitoring compliance with, or investigating a contravention of, a Commonwealth, State or Territory law; or
  • for the purposes of proceedings in a court or tribunal; or
  • for a genuine medical or scientific purpose.

The broad exception, contained in proposed paragraph 474.17A(3)(d), provides the offence does not apply if a reasonable person would consider transmitting the material to be acceptable, ‘having regard to’ a range of factors. These factors are:

  • the nature and content of the material
  • the circumstances in which the material was transmitted
  • the age, intellectual capacity, vulnerability or other relevant circumstances of the person depicted, or appearing to be depicted
  • the degree to which the transmission of the material affects the privacy of the person depicted, or appearing to be depicted
  • the relationship between the person transmitting the material and the person depicted, or appearing to be depicted, in the material and
  • ‘any other relevant matters’.

The Explanatory Memorandum states:

This is an objective test and the qualification that a reasonable person must regard the transmission as acceptable, allows community standards and common sense to be imported into a decision on whether the offence at 474.17A is made out. Examples of this include where a person has downloaded material that was published online and expected that consent was provided for the material due the commercial nature of such material and its availability, photographs of models that were specifically taken with permission for advertising or publication, or images that are solely satirical in nature.[70]

Defendants will bear the evidential burden in relation to matters in the exceptions.[71] The Explanatory Memorandum justifies this approach as ‘reasonable and necessary’:

If a person had a particular reason for thinking that they were transmitting the material for legitimate purposes or in circumstances where it would have been considered reasonable and acceptable, it would not be difficult for them to describe how they came to those conclusions. It would be significantly more cost effective for the defendant to assert this matter rather than the prosecution needing to prove, beyond a reasonable doubt, that the transmission of the material without consent was neither necessary, reasonable, or for a genuine purpose, in all the circumstances. 

Defences which place an evidential burden on the defendant are proportionate because, the prosecution will still be required to prove each element of the offence beyond reasonable doubt. Further, if the defendant discharges an evidential burden, the prosecution will be required to disprove those matters beyond reasonable doubt, consistent with section 13.1 of the Criminal Code.[72]

However, the Scrutiny Committee requested a ‘detailed justification as to why it is proposed to use offence-specific exceptions (which reverse the evidential burden of proof) in relation to the offence’. In relation to the broad exception in proposed paragraph 474.17A(3)(d), it stated:

... it is unclear to the committee how what a reasonable person would consider in relation to a transmission is a matter that is peculiarly within any person’s knowledge. The committee further notes that the circumstances to which a reasonable person would have regard to in this exception, such as the age, intellectual capacity, vulnerability, and violation of privacy of the person being depicted, and the nature of their relationship with the person capturing the material, are not information or evidence in relation to the transmission. Rather, these are perceptions of the transmission itself. The committee considers that this indicates the offence may be drafted in overly broad terms, and that these are matters that are more appropriately disproven by prosecution by including them as elements of the offence under proposed subsection 474.17A(1).[73]

The Law Council opposed reversing the evidential onus in relation to exceptions in proposed paragraph 474.17A(3)(d) ‘because the matters encompassed in this exception are central to establishing the criminality of the proscribed conduct’. It considered that ‘matters referring to objective (rather than subjective) community standards of acceptable behaviour should be included in the elements of the offence’.[74]

Application of the new offence to children

As noted above, concerns have been raised regarding the possible application of the new offence to children. In August 2023, the eSafety Commissioner noted her office had ‘received our first reports of sexually explicit content generated by students using this technology to bully other students’.[75] However, the Bill’s new offence will only apply where the person depicted is, or appears to be, over 18 years of age.[76]

The Attorney-General has acknowledged that potentially children ‘could be charged’ but has noted that ‘...they would be dealt with as children are dealt with in the criminal law’. He commented:

... that's going to be a matter for courts but by and large children are not gaoled in Australia. The police will obviously exercise discretion in what they investigate and how they apply these newly created offences. But we've got here behaviour that affects women and girls who are the target of this kind of deeply offensive and harmful behaviour. We know that it can inflict deep and long-lasting harm on victims. The Albanese Government has no tolerance for this kind of criminal behaviour and that's why we are legislating in this clear way.[77]

Section 7.1 of the Criminal Code provides that a child under 10 years old cannot be liable for an offence against a law of the Commonwealth. Subsection 7.2(1) provides that a child aged 10 years or more but under 14 years old can only be criminally responsible ‘if the child knows that his or her conduct is wrong’, and subsection 7.2(2) specifies that ‘whether a child knows that his or her conduct is wrong is one of fact’ with the burden of proving this on the prosecution.

In deciding whether or not the public interest warrants the prosecution of ‘juveniles’, the CDPP’s Prosecution Policy of the Commonwealth provides for consideration of a number of specific factors including ‘age and apparent maturity and mental capacity’. It states:

Prosecution of a juvenile should always be regarded as a severe step, and generally speaking a much stronger case can be made for methods of disposal which fall short of prosecution unless the seriousness of the alleged offence or the circumstances of the juvenile concerned dictate otherwise. In this regard, ordinarily the public interest will not require the prosecution of a juvenile who is a first offender in circumstances where the alleged offence is not serious... The practice of the DPP is for any decision to proceed with a prosecution in respect of a juvenile to be made by a senior lawyer.[78]

The Criminal Code includes a number of offences relating to using a carriage service for child abuse material. Notably, subsection 474.24C(1) provides that prosecutions for these offences must not be commenced without the consent of the Attorney-General if the defendant was under 18 at the time he or she allegedly engaged in the conduct constituting the offence. This type of precautionary provision, which may allow the Attorney-General to prevent inappropriate prosecutions where children are involved in offences, would not apply to the offences created by the Bill.

In its submission to the Senate Committee, the Law Council noted it had:

…received feedback that children and young people are over-represented in the offences of producing and sharing of sexual images, in part due to a lack of understanding of the criminality of their actions and experience lifelong difficulties that flow from a finding of guilt for a sexual offence.[79]

It proposed that the Bill include a legislative requirement that prosecution of children and young people, under the age of 16 years, not be commenced without the prior consent of the CDPP.[80]

There has been discussion at the Standing Council of Attorneys-General concerning raising the minimum age of criminal responsibility in Australia.[81] Some jurisdictions have moved to increase the age of criminal responsibility.[82]

The aggravated offences

Proposed section 474.17AA contains the two aggravated offences which will apply where particular circumstances exist in addition to the underlying offence. The maximum penalty for each aggravated offence would be 7 years imprisonment.

After certain civil penalty orders

The aggravated offence in proposed subsection 474.17AA(1) is ‘transmission of sexual material without consent after certain civil penalty orders were made’. This will apply if, prior to the underlying offence, the person was the subject of 3 or more civil penalty orders under the Regulatory Powers (Standard Provisions) Act 2014 in relation to:

  • contraventions of subsection 75(1) of the Online Safety Act (which creates a civil penalty for posting or threatening to post online an intimate image without consent) or
  • contraventions of section 91 of the Online Safety Act, that relate to removal notices given under section 89.

As noted above, section 91 of the Online Safety Act relates to non-compliance with removal notices issued by the eSafety Commissioner (under sections 88, 89 or 90) concerning cyber-abuse material. In particular, section 89 provides for the eSafety Commissioner to give end-users removal notices concerning cyber-abuse material which has been the subject of a complaint.

The civil penalty orders can be against either or both categories of contravention of the Online Safety Act and absolute liability will apply to this element of the proposed new aggravated offence.

The inclusion of the cyber-abuse material removal notices framework in the aggravated offence is incongruous, as the Online Safety Act contains a separate, specific framework for addressing the non-consensual sharing of intimate images, including removal notices (in Part 6). In particular, section 78 allows the eSafety Commissioner to give removal notices to end-users concerning intimate images posted online without consent. If such a removal notice is not complied with, a civil penalty may apply under section 80 of the Online Safety Act.

The Explanatory Memorandum justifies the increased penalty for the aggravated offence as follows:

This is appropriate because the aggravated offence applies where a person commits an offence to the criminal standard and after repeatedly being found liable [of] contraventions of the Online Safety Act relating to the non-consensual distribution of intimate images. This indicates that the person has had a continued disregard for the effect the distribution of sexual material without consent can have upon victims.[83]

Given that the explanatory materials do not refer to the cyber-abuse material framework in the Online Safety Act, this may be a drafting error—an unintended legacy of when the aggravated offence previously related to the offence in section 474.17 for using a carriage service to menace, harass or cause offence. If this is the case, proposed subparagraph 474.17AA(1)(b)(ii) may require amendment to appropriately connect the aggravated offence to the non‑consensual sharing of intimate images civil penalties in the Online Safety Act:

(ii) contraventions of section 80 of the Online Safety Act 2021 that relate to removal notices given under section 78 of that Act.

In its submission to the Senate Committee inquiry, the eSafety Commissioner noted it is often able to act informally to get harmful online material taken down. Accordingly, situations where an offender would be subject to 3 or more civil penalty orders under the Online Safety Act are ‘likely to only be enlivened in rare and serious circumstances’.[84] This threshold for the aggravated offence was viewed as too high by some submitters to the Senate Committee inquiry.[85] For example, the Australian Human Rights Commission recommended the threshold be lowered ‘from requiring three or more previous civil penalty orders, to requiring one or more previous civil penalty orders’.[86]

Responsible for creation or alteration

The aggravated offence in proposed subsection 474.17AA(5) will apply if the person commits the underlying offence and is ‘responsible for the creation or alteration of the material’. The Explanatory Memorandum indicates the fault element for this part of the aggravated offence will be ‘intention’ and notes that, under the Criminal Code, ‘a person has intention with respect to conduct if he or she means to engage in that conduct’. The increased penalty of the aggravated offence ‘reflects the inclusion of the additional element in the aggravated offence that the person intentionally created the material before intentionally transmitting it, knowing or reckless as to the lack of consent’.[87]

Potentially, the legal concept of ‘responsibility’ may be clouded by the creation or alteration of material by generative AI programs. For example, generative AI text-to-image programs such as Stable Diffusion are developed by training software on millions of images, including images of real persons, and can generate their results from imprecise user ‘prompts’ rather than robotically following the user’s specific commands. In these generative AI image programs, the user only has a degree of control over the content that will be generated by a prompt.[88]

Double jeopardy

Proposed section 474.17AB addresses the double jeopardy issues which arise where aggravated offences build on an underlying offence. The inserted table clarifies that a person who has been convicted or acquitted of one of the aggravated offences cannot also be convicted of the underlying offence or the other aggravated offence. Similarly, proposed subsection 474.17AB(3) provides that a person who has been convicted or acquitted of the underlying offence cannot additionally be convicted of either of the two aggravated offences.

The reliance on the existence of certain civil penalty orders for the first aggravated offence (in proposed subsection 474.17AA(1)) raises the prospect that these orders may be set aside or reversed on appeal.[89] Where this occurs, proposed subsection 474.17AB(4) provides that a conviction for the aggravated offence must be set aside. However, this setting aside does not prevent proceedings from being instituted for the underlying offence or the other aggravated offence.

The Scrutiny Committee sought the Attorney-General’s justification for why this arrangement may allow ‘a person to stand trial twice for the same factual circumstances when guilt as to the offence under proposed subsection 474.17A(1) would already have been established in a previous proceeding’. It noted the impact criminal trials have on individuals and therefore considered that ‘a strong justification should be provided as to why it is necessary in this instance for an individual to stand trial twice’.[90]

Alternative verdicts

Item 6 repeals and replaces the existing provision for alternative verdicts in section 474.17B to reflect the new underlying offence and aggravated offences. Effectively this allows for the ‘trier of fact’ (either the judge or a jury), where they are not satisfied a person is guilty of one of the aggravated offences, to nonetheless find the person guilty of the underlying offence or the other aggravated offence. Proposed subsection 474.17B(2) provides these alternative verdicts only apply if the person has been accorded procedural fairness in relation to the finding.

Application

Item 7 provides that the Bill’s amendments will apply to material transmitted after the commencement of the Bill regardless of whether the material was created or altered before or after commencement. The Explanatory Memorandum characterises this approach as ‘operationally necessary because it would be impossible for the prosecution to prove, based on admissible evidence, precisely when material was created or altered and whether that was before or after the commencement of this Bill’. It states:

It is appropriate for the new offences to apply to material created or altered prior to the commencement of the Bill given that the prosecution will still need to prove that the transmission of the material and absence of consent occurred following the commencement of the Bill. In a prosecution of the aggravated offence at subsection 474.17AA(5), the prosecution would also still need to prove that the person intentionally created or altered the material, it just will not matter whether that occurred before or after the commencement of the Bill.[91]