Chapter 2 - The crime of child sexual exploitation

2.1 The internet has become the dominant method for obtaining and sharing child abuse material (CAM), but regrettably this crime is not a new problem. The Office of the eSafety Commissioner (eSafety) described recent developments:

The phenomenon of producing and sharing child sexual exploitation material pre-dates the Internet. However, the pre-online trade came with significant risks to offenders, reliant as it was on distributing hard copy material either through the post or via small interpersonal networks. Processing photographs and film depicting the sexual abuse of children presented considerable risk, given the need to outsource to film processing labs. In consequence, the demand for material through this period was frequently catered to by child sexual exploitation magazines with names such as Lolita and Nudist Moppets.

With the advent of dial-up Internet, the opportunity to connect with likeminded offenders with relative ease and anonymity increased substantially. Digitised versions of CSEM [child sexual exploitation material] imagery, often scanned from magazines, were shared on bulletin boards and via email. However, file sizes were still limited by dial-up connection speeds and shaky infrastructure.

Connection speeds and bandwidth improved through the early 2000s. Alongside this technical development, digital cameras became affordable household items. It did not take long before digital cameras were integrated into mobile phones and, later, smartphones. The Internet began to abound with images produced and shared by offenders abusing children in their care. Websites, peer-to-peer networks, imageboards and forums became common and highly accessible locations to encounter CSEM.[1]

2.2 This chapter examines recent trends and changes in the crime of child exploitation, with a focus on online CAM.[2] In so doing, it conveys the criminal landscape to which law enforcement must respond. The chapter reviews key evidence as follows:

The severe harms caused by child exploitation and child abuse material.

The alarmingly high volume of CAM online, comprising:

the prospect that there is even more CAM that is undetected;

the implications of technological developments; and

the effect of the COVID-19 pandemic in facilitating CAM.

The prevalence of this crime in Australia.

The increasing severity of CAM offending.

Different forms and features of CAM offending, comprising:

use of anonymising technologies;

use of both the dark web and clear web;

coercion of children to produce CAM of themselves;

live online child sexual abuse (also known as live streaming);

use of virtual currencies; and

risks associated with emerging technologies.

Harm caused by child exploitation and child abuse material

2.3 While the focus of this report is on law enforcement's ability to respond to child exploitation, this brief section relays some evidence to the committee that illustrates the magnitude of harm caused by this crime.

2.4 The Australian Institute of Criminology submitted that sexual offending against children is 'a complex and harmful crime associated with ongoing trauma and lifelong adverse consequences for child victims, including psychiatric disorders, substance abuse, revictimisation and offending in adulthood'.[3]

2.5 The Cyber Security Cooperative Research Centre emphasised:

Accessing and viewing CAM is not a victimless crime, as many perpetrators believe. It has a devastating and lifelong impact on the children that are abused, with abuse images living on unfettered in the uncensored world of clandestine online CAM groups.[4]

2.6 Emeritus Professor Roderic Broadhurst and Mr Matthew Ball highlighted that:

Child sex abuse is not a free speech or privacy issue but a harmful crime of domination sheltered and facilitated by anonymity platforms and encrypted social media services.

One survivor explained: 'The abuse stops and at some point, also the fear of abuse; the fear of the material never ends.' Survivors of CEM [child exploitation material] often suffer post-traumatic stress disorder. Images can be replicated and shared again. Victims experience anxiety using the Internet because their image may reappear and they will be traumatized again. This constant re-victimization process and stress impact day-to-day functioning, degrades quality of life, increases potential physiological and mental harm, and negatively affects life course.[5]

2.7 The Carly Ryan Foundation submitted that the 'uncertainty of knowing who has seen them being exploited as a child carries life-long, unresolved paranoia and emotional trauma on a victim'. It quoted from a respondent to the Canadian Centre for Child Protection's Survivor Survey:

Look at it like this. The hands-on was horrible. But at the very least it is over and done with. The constant sharing of the abuse will never end; therefore the reminder of its existence will never end… If you ask me, a crime that will never end is worse than one that is over; no matter how much more serious it may appear. That this is something inescapable. That there will never be total absolution.[6]

2.8 The Uniting Church in Australia, Synod of Victoria and Tasmania (Uniting Church Synod), provided a series of distressing quotes from survivors of child sexual abuse. The themes of these quotes included a sense of powerlessness, fear, suicidal thoughts, and continued harm to the survivors due to the knowledge that images of their abuse are still in existence.[7] The Uniting Church Synod also quoted Dr Sharon Cooper, Developmental and Forensic Paediatrician and Adjunct Professor of Paediatrics, University of North Carolina at Chapel Hill School of Medicine, as follows:

Child sexual abuse is a life changing adversity and an injury which research now reveals can manifest a harmful impact upon a child's physical health, immunity, ability to learn, to grow, and mental well-being. Children with pre-existing health problems often have worsening of symptoms when they suffer this and other forms of abuse. Survivors tell us that the memorialisation of child sexual abuse through the production of abusive images and videos and even worse, its distribution, constitutes a most egregious insult to an already severe injury.[8]

2.9 Ms Madeleine West, an advocate against child sexual exploitation, described some of the harms to our society in this way:

I would ask: what is the social cost of failing to protect our children? And I can speak from personal experience. Walking wounded who can't work, can't function, drug dependence, unsafe sexual practices, alcohol abuse, crime, eating disorders. Some even go on to perpetuate these crimes themselves. We shake our heads and we wring our hands at the thought of some monster seeking to harm our most vulnerable citizens, and that's our children, yet most of us blindly refuse to acknowledge that very predation is playing out in our towns, our schools, our streets, on our screens.[9]

High volume of child abuse material online

2.10 Inquiry participants emphasised that there is a staggeringly large amount of CAM online. Many pointed to the millions of reports of suspected online exploitation of children that are made each year to the United States National Center for Missing and Exploited Children (NCMEC).[10] The NCMEC forwards these reports to law enforcement agencies around the world.[11]

2.11 According to NCMEC data on the number of reports of suspected online child exploitation, it received approximately:

16.9 million reports in 2019;[12]

21.7 million reports in 2020;[13]

29.3 million reports in 2021;[14] and

32 million reports in 2022.[15]

2.12 The vast majority of these reports were made by electronic service providers.[16] NCMEC data show that reports from electronic service providers:

in 2021 included 39.9 million images of which 16.9 million were unique, and 44.8 million videos of which 5.1 million were unique;[17] and

in 2022 included 49.4 million images of which 18.8 million were unique, and 37.7 million videos of which 8.3 million were unique.[18]

2.13 Reports to NCMEC can relate to multiple forms of online child sexual exploitation, but, in 2021, over 99 per cent of the reports 'regarded incidents of suspected CSAM [child sexual abuse material]'.[19] In 2022, it was over 99.5 per cent.[20]

2.14 The Australian Institute of Criminology submitted that '[t]here is evidence that CSAM has proliferated in recent years' and cited analysis of NCMEC data which found that 'reports of sexually abusive videos of children dramatically increased from under 1,000 video reports per month in 2013 to over two million video reports per month in 2017'.[21]

2.15 The Uniting Church Synod also drew attention to the upward trend of NCMEC reports, though pointed out that 'it is not clear how much of the increase in reports is due to a rise in the amount of online child sexual abuse material. Some of the growth in reports may be due to better detection and reporting of such content'.[22] Senior Social Justice Advocate, Dr Mark Zirnsak, also observed that some of the reports are 'repeat reports of the same content'. He said:

It's still a problem for the victims and the survivors whose material is out there all the time; that's ongoing abuse for them. But it's a much smaller pool of victims that relate to that large pool of material. On the flipside of that, of course, that's only detected material; there's still a pool of undetected material as well. So often when measuring these things there is quite a bit of uncertainty, but I would certainly agree there's been an increase in the prevalence of this abuse and human rights abuses online.[23]

2.16 eSafety provided data from other organisations that also suggest there is a high volume of online CAM, including the following:

In 2021, the United Kingdom (UK) Internet Watch Foundation assessed 361,062 reports, of which almost 70 per cent 'led to online material depicting children being sexually abused'.

The Canadian Centre for Child Protection leads Project Arachnid, which has 'led to 6 million images and videos of child sexual exploitation being removed from more than 1,000 electronic service providers across more than 100 countries worldwide'.

Members of the International Association of Internet Hotlines (INHOPE) 'exchanged reports about nearly one million URLs depicting suspected CSEM' during 2021. In 2020, 39 per cent of content URLs were unknown; this increased to 82 per cent in 2021.[24]

2.17 Evidence to the committee indicates this is a global problem.[25] For instance, the United Nations Office on Drugs and Crime, Regional Office for Southeast Asia and the Pacific (UN Office on Drugs and Crime) submitted that the 'amount of CSAM uploaded and shared online in Southeast Asia has increased dramatically in recent years, as reported by both law enforcement agencies and industry partners'. It also said 'the number of offences reported to law enforcement is likely to be only a fraction of the actual cyber-related dangers currently threatening Southeast Asian children and adolescents'.[26]

2.18 Research by the UK National Crime Agency suggested that 'there are likely to be 550,000 to 850,000 UK based individuals posing varying degrees of sexual risk to children, with a central estimate of 700,000'. This covers a 'broad spectrum' of potential offences.[27]

2.19 The Uniting Church Synod reported that 'children's rights network Terre des Hommes has estimated that there will be roughly 750,000 men worldwide looking for online sex with children at any time of the day'.[28]

Potential for more child abuse material that is undetected

2.20 The Australian Institute of Criminology submitted that 'it is possible that a large proportion of CSAM offending remains undetected'. It referred to research finding that 'only 30 per cent of sexual assault victims in Australia report their abuse to police', and said:

There have been similar findings relating to CSAM offending. A survey of 133 victim-survivors of CSAM offending found only one in four (23%) of the CSAM incidents were reported to the police or a child welfare agency…[29]

2.21 The Uniting Church Synod also cited evidence that 'the number of victims of online child sexual abuse is underestimated', submitting:

For example, interviews with children across 12 countries in the East Asia and Pacific and Eastern and Southern Africa regions during 2020 to 2021 indicated that between one and 20% of children suffered online sexual exploitation and abuse in the past year. Only one in three told anyone about the abuse they suffered.[30]

2.22 The eSafety Commissioner, Ms Julie Inman Grant, was quoted in an eSafety media release regarding the high number of reports to NCMEC in 2021:

Ms Inman Grant said these reports appear to be the tip of a very large iceberg. eSafety has handled more than 76,000 investigations concerning child sexual exploitation material since 2017 and believes there is a lot more child sexual exploitation material looming beneath the surface.[31]

2.23 The Australian Federal Police (AFP) advised that 'the volume of child abuse material facilitated by and shared on the dark web is difficult to quantify' (use of the dark web is discussed later in this chapter).[32] Nonetheless, the AFP advised in December 2021 that:

…the ACCCE [Australian Centre to Counter Child Exploitation] has identified over 1.6 million registered accounts using anonymised platforms such as the dark web and encrypted apps solely for the purposes of facilitating and distributing child abuse material.[33]

The implications of technological developments

2.24 As explained by eSafety at the start of this chapter, technological developments have amplified CAM-related crimes. Other inquiry participants also highlighted this point; for instance, the Australian Institute of Criminology submitted that viewing, sharing and producing CAM 'is a borderless crime that is flourishing with ongoing advances in technology in the online environment, including internet sites and platforms'.[34]

2.25 On this matter, the Uniting Church Synod advanced:

The emergence of the online world has dramatically facilitated the rape, torture and sexual abuse of children across the globe. Child sexual abuse perpetrators can now find their victims online by using advanced technologies and taking advantage of online platforms and services to go undetected.[35]

2.26 The Uniting Church Synod also discussed the challenge of general deterrence in the online world:

The combination of completely anonymous identities, communication channels that police cannot access in any circumstances and technology corporations being able to conceal and destroy evidence of serious crimes creates an online environment where those wishing to harm others can have a sense of impunity. This encourages higher levels of severe criminal behaviour. The higher levels of serious criminal behaviour mean that police can deal with a shrinking portion of the online criminal behaviour, which in turn increases the level of people engaged in severe criminal behaviour. It becomes a vicious circle.[36]

2.27 The UN Office on Drugs and Crime pointed to an increase in the sharing of CAM on social media platforms 'due to the self-destruct function and encryption some of these applications offer'. It said that 'false user profiles are easily created, taken down, and recreated, enabling the set-up of hidden social media offender groups distributing such illicit media'.[37] In relation to live streaming, the office submitted that:

…as the Internet expanded to more parts of the world, foreign child sex offenders no longer needed to travel to gain access to Southeast Asian children. They can now easily contact local traffickers, select children, view and even direct long-distance video calls in real-time from any location while maintaining their anonymity.[38]

2.28 Destiny Rescue pointed out that language differences previously provided a barrier between western-based abusers and non-English speaking communities, but 'over the past five years, the rise in automatic electronic translation software has broken down these barriers'.[39]

2.29 The Department of Home Affairs submitted that information and communications technologies 'have provided a vehicle for the proliferation of child sexual abuse at a global scale, and created an online market for the exchange of child abuse material, including on the darknet, where offenders can operate with anonymity'. In relation to anonymising technologies, it advanced:

The anonymity offered by the dark web and other forms of anonymising technologies, combined with the rise of live-streaming and pay-per-view services, and the use of virtual currencies is making it increasingly difficult to identify and track offenders.[40]

The effect of the COVID-19 pandemic

2.30 Some inquiry participants observed an increase in reports of CAM during the COVID-19 pandemic.[41] The Cyber Security Cooperative Research Centre explained that 'though widespread prior to COVID-19, the pandemic has created the perfect environment for the spread of online CAM to become ever more pervasive'. It said a January 2021 United Nations report found that:

…COVID-19 has increased and accelerated the sexual exploitation of some of the world's most vulnerable children, "amplifying the risks of exposing them to sale, trafficking and sexual exploitation globally". The report notes the pandemic and an increased use of online platforms has increased unsupervised time spent on the internet, exacerbating already existing patterns of sexual exploitation, with criminal groups dedicated to sexual exploitation quick to adapt their ways of working, by escalating the use of online communication.[42]

2.31 The AFP explained that 'as a result of the growing number of households spending considerably increased time online (with sometimes limited security measures) the availability and access to children online has increased'.[43] The AFP reported that it and the ACCCE have:

…observed the emergence of a number of child abuse forums established as a result of COVID-19 stay at home measures. These forums now include more than 1,000 members combined and specifically provide advice on how to establish online relationships with children in the COVID-19 environment.[44]

2.32 Victoria Police referred to AFP reports that 'the March 2020 COVID-19 lockdown resulted in websites hosting child abuse material to crash due to the increased volume of user traffic'.[45]

2.33 While recognising the increase in reported CAM during the pandemic, ECPAT International suggested that this does not necessarily reflect an increase in new material being created:

Recent research funded by the Australian eSafety Commissioner on the impact of COVID-19 on online child sexual exploitation indicates that Australian law enforcement agencies, child helplines and online reporting mechanisms showed increased reporting of a range of online related child sexual abuse and exploitation. This noted increase in reporting was also seen globally. However, global law enforcement bodies have also noted that there is not yet evidence indicating an increase in new child sexual abuse material. It could be that during movement restrictions, global attention turned to our online lives and has resulted positively in more vigilance, and more concerns being raised.[46]

2.34 However, ECPAT International also highlighted how the pandemic has hindered law enforcement in other ways:

Despite increased attention on online child sexual exploitation during this period, the COVID-19 pandemic has globally resulted in fewer reports reaching police, difficulties in moving forward with existing investigations and reduced use of the global International Child Sexual Exploitation database due to movement restrictions and other priorities faced by law enforcement personnel.[47]

CAM offending related to Australia

2.35 Regrettably, online CAM is of particular concern for Australia. As the Cyber Security Cooperative Research Centre submitted, 'Australians are some of the most prolific consumers of this material, with research indicating Australia is the third largest market for live, online child sexual abuse'.[48]

2.36 Commander Hilda Sirec of the ACCCE reported that 'Australia [is] not immune to offenders disseminating child abuse material'.[49] In 2022-23 the ACCCE Child Protection Triage Unit received over 40,000 reports of online child sexual exploitation, which is more than double the 14,285 reports received five years prior.[50] The AFP provided the following table showing its recent activity relating to child exploitation.

Table 2.1 Data from the Australian Federal Police about recent activity relating to child exploitation

                                                  2019-20   2020-21   2021-22   2022-23
Persons arrested / summonsed /
Court Attendance Notice issued[51]                    161       235       221       186
Charges laid                                        1,214     2,772     1,746       925
Children removed from harm:
— domestically                                         67        88        33        36
— internationally                                      67       144        46        84
— total                                               134       232        79       120
New victims identified[52]                             37       184       170       141

Source: Data from table in Australian Federal Police, Submission 38.1, [p. 2] (reformatted for this report). Also see Australian Federal Police, Submission 38, p. 1.

2.37 The AFP has suggested there are several potential explanations for the increase in data over recent years, including improvements to victim identification capability and greater awareness in the community.[53] The data also reflect major AFP operations during 2020-21.[54] The AFP highlighted Operation Molto, coordinated by the AFP-led ACCCE.[55]

Box 2.1 Operation Molto

Operation Molto was described in the AFP's 2021-22 annual report:

The ACCCE continues to establish, coordinate and lead major operations to counter child exploitation. In 2020 it began coordinating Operation Molto to target offenders sharing child abuse material online. This was a significant nationwide operation, which closed in 2021–22, involving the AFP and all Australian state and territory police, including the JACETs [Joint Anti Child Exploitation Teams], the ACIC [Australian Criminal Intelligence Commission] and AUSTRAC.

The operation began when the ACCCE received intelligence from the New Zealand Te Tari Taiwhenua (Department of Internal Affairs) showing over 200,000 potential suspects using a cloud storage platform to share abhorrent child abuse material online.

In Australia, approximately 1,440 persons of interest were assessed. At the conclusion of the operation in March 2022, police from every state and territory in Australia had executed 158 search warrants in Australia, charged 121 men with 1,248 charges, and removed 51 children from harm.

Some of the alleged Australian offenders are accused of producing their own child abuse material online and were allegedly in possession of material produced by a man the AFP arrested in 2015 as part of Operation Niro. Initial review by the ACCCE in establishing Operation Molto identified that most of the material was categorised as being in the top tiers of severity.

Globally, the multinational law enforcement effort has resulted in 153 children being removed from harm in many countries around the world.[56]

2.38 The Department of Home Affairs collaborates with the Australian Border Force and AUSTRAC to discover previously unknown travelling child sex offenders and those carrying CAM across the border. It advised in October 2022 that since international travel resumed in early 2022, '31 significant events have occurred as a result of the Home Affairs Intelligence Division and AUSTRAC data profile, with 17 targets being detected with CAM at the border, nine of which were subsequently arrested'.[57] A representative of the Australian Border Force told the committee 'the more we look the more we find. That's the sad truth of the sorts of environment that our officers work in each and every day'.[58]

2.39 The Attorney-General's Department observed an increase over recent years in the number of matters referred to the Commonwealth Director of Public Prosecutions that involve Commonwealth online child sex exploitation offences.

Table 2.2 Number of referrals of matters involving Commonwealth online child sex exploitation offences to the Commonwealth Director of Public Prosecutions

2018-19   2019-20   2020-21   2021-22
    229       347       385       387

Source: Data from Attorney-General's Department, Submission 43.1, [p. 1] (reformatted for this report).

2.40 The Attorney-General's Department also submitted that, in the first three months of 2022-23, there was an average of 41 referrals per month:

If this trend is maintained, the CDPP [Commonwealth Director of Public Prosecutions] will receive almost 500 referrals in the 2022-23 financial year, which would be an increase of 27% in a single year. When compared against data from 2018-19, this referral rate is more than a 100% increase of referrals compared to 5 years ago.[59]

2.41 Alongside the prevalence of CAM offending by adults in Australia, the committee heard that the risks of child exploitation extend to children in Australia. For instance, the AFP's Deputy Commissioner, Operations, Mr Brett Pointing, advised:

As of 2021, 89 per cent of the Australian public are active internet users. One in three of those users are children. Of those children online, one in five are sexually solicited.[60]

2.42 Mr Stephen Dametto, Acting Assistant Commissioner at the AFP, advised that '[o]ver 100 children report each month to Australian law enforcement that they are a victim of sextortion, and analysis confirms that this is a fraction of the real number'.[61]

2.43 The Western Australia Commissioner for Children and Young People provided results from its 2021 Speaking Out Survey, which included a 'fully representative sample of 16,532 Western Australian school-aged children and young people'. It seems the results do not necessarily reflect the involvement of an adult; nonetheless, the Commissioner reported:

56 per cent of female students and 30 per cent of male students in years 9-12 reported they had been sent unwanted sexual material.

Of the students who had received unwanted sexual material, 41.9 per cent reported they had received it three or more times in the last 12 months, and 35.7 per cent reported they had received unwanted sexual material once or twice in the last 12 months.

The vast majority of students who had been sent unwanted sexual material continued to report they had received the material via social media (92.9 per cent, unchanged from 2019), with a lower proportion reporting receiving it via text message (18.7 per cent).[62]

2.44 eSafety—which receives complaints about illegal and restricted online material—has handled over 90,000 complaints since 2015, of which 'the majority' involve child sexual exploitation material. It submitted that the 'sustained, global growth is often outstripping capacity to respond, and is an issue of worldwide concern'.[63]

2.45 Ms Kirra Pendergast, Chief Executive Officer of Safe on Social, leads education programs in schools and told the committee that she has received disclosures from children 'at least once a week…around sexual assault that's been stimulated online'.[64]

Increasingly severe offending

2.46 A range of inquiry participants highlighted a worrying trend towards more egregious offending. As the Department of Home Affairs submitted, '[t]he scale and severity of online offending has escalated over time, with material now depicting increasingly younger children and higher degrees of violence'.[65] It also cited research by the Internet Watch Foundation in 2017 that was:

…based on analysis of over 2,000 images and video captures from live streamed sexual abuse of children [and which] revealed that 40 per cent were classified as containing serious sexual abuse, including the rape and torture of children.[66]

2.47 eSafety reported that of nearly one million reports exchanged between members of the INHOPE network in 2021, '82% of all reported CSEM involved the abuse or exploitation of pre-pubescent children'.[67]

2.48 The UN Office on Drugs and Crime also highlighted a 'substantial escalation in CSAM where the victims are infants or "pre-verbal" children'. It explained:

These children cannot self-report the abuse and are often abused by someone they know. The Internet Watch Foundation (IWF) reported that 94 per cent of CSAM found online contains images of children aged 13 or under. IWF also found that 39 per cent of the images were children aged 10 or under.[68]

2.49 ECPAT International submitted that, during interviews with law enforcement officers in 2018, the officers 'indicated that they were seeing increasing levels of violence in the child sexual abuse material they encountered in their work and that more egregious images involved younger children and were often produced within a family context'.[69]

2.50 Collective Shout also submitted 'that more horrific sexual content featuring children has emerged over time', and referred to analysis of a sample of INTERPOL's International Child Sexual Exploitation database, which showed:

The younger the victim, the more severe the abuse.

84% of images contained explicit sexual activity.

More than 60% of unidentified victims were prepubescent, including infants and toddlers.

65% of unidentified victims were girls.

Severe abuse images were likely to feature boys.

92% of visible offenders were male.[70]

2.51 Regarding live online child sexual abuse, International Justice Mission described the horrific severity of abuse:

Internet Watch Foundation (IWF) research on child sex abuse livestreaming reveals 98% of victims are 13 or under. Forty percent of the livestream captures or recordings were classified by IWF as containing 'serious' sexual abuse, with 18 percent involving the rape and sexual torture of children. This is consistent with IJM's [International Justice Mission's] on-the-ground casework experience in the Philippines. In the over 330 cases IJM has worked on, the livestreamed abuse suffered by children at the behest of Australian and other offenders who watch on video calls is rarely limited to erotic displays: it usually includes forcible sexual penetration constituting rape in most jurisdictions. Children are forced to engage in sex acts with other children, sexually abused by an adult, and sometimes harmed in other degrading ways, such as in bestiality. IJM social workers and lawyers have journeyed with hundreds of survivors as they pursued healing and justice from these traumatic harms perpetrated both in person and online.[71]

2.52 The Cyber Security Cooperative Research Centre highlighted that 'the average number of images police seize from offenders has exploded – in the early to mid-2000s, offenders averaged about 1,000 images; now it is 10,000 to 80,000 images and videos'.[72] Evidence from Victoria Police also suggested an increase; it advised that:

Victoria Police offence data between 2010-2020 shows that incidents involving child abuse material consistently make up about 5% of all sex offences. However, the average volume of material involved in each of these incidents has increased over this time, particularly in 2020.[73]

Forms and features of CAM offending

2.53 As the Australian Institute of Criminology submitted, 'online sexual exploitation of children is an evolving crime, with a recent trend towards more harmful and financially motivated methods of exploitation'.[74] This section reviews evidence about different forms and features of this offending, which have implications for how law enforcement must respond to it.

Use of anonymising technologies

2.54 eSafety submitted that anonymous communication is 'a cornerstone of promoting freedom of speech, expression and privacy on the Internet, but it can also be misused to control and abuse people'. Anonymity techniques include using virtual private networks (VPNs), end-to-end encryption, and taking on a fictional identity under a false name. In relation to CAM, eSafety explained:

Most investigations into CSEM involve individuals posting the content online anonymously. These investigations have shown that content contributors will go to great lengths to remain anonymous, often using one or more anonymising security measures to hide their identities.

Sexual predators also commonly use anonymous, fake, imposter and impersonator accounts to lure victims and gain their trust. For example, they may use an avatar in a game to pretend they are the same age and gender as a child so they can become a fake friend and groom them for sexual interaction.[75]

2.55Queensland Police referred to offenders' use of fake accounts to make contact with children.[76] Victoria Police reported that the 'absence of any requirement to provide identification to access the internet and set up online accounts has limited law enforcement's ability to track suspected offenders and bar recidivist offenders from being online and reoffending'.[77]

2.56The Uniting Church Synod advanced that the 'ease with which it is possible to set up multiple anonymous and false identities has greatly assisted those who seek to abuse children online'. It also observed that there is 'increasing availability of products that help people conceal their online identities'.[78] Dr Zirnsak provided an example of how online anonymity hinders law enforcement and therefore facilitates the exploitation of children:

To highlight the damage done, there is the recent case of Alladin Lanim from Sarawak, Malaysia. He was recently sentenced to 48 years in prison. He started posting child sexual abuse material online in 2007 but, because of his ability to have an online anonymous identity, it was not until this year [2021] that law enforcement agencies globally were able to identify him. In that period, he was able to post approximately 10,000 images and videos of child sexual abuse. Australian law enforcement agencies have identified 34 of the children who were the victims of his abuse but there may have been more beyond those that have been identified. So that is the consequence of allowing people to have these completely anonymous identities rather than a situation where the platform knows who you are, even if you have an anonymous identity that is public-facing which is needed for some other reasons.[79]

2.57eSafety said that the use of fake accounts makes it 'very difficult' for regulators and law enforcement to identify and act against the user, and 'almost impossible for social media services and other users to deal with abusers breaching the terms of service'. eSafety called for a balance:

…where the misuse of anonymity and identity shielding is restricted without removing any of the legitimate benefits. Steps can be taken by services to verify accounts before users start to operate them, or to take down accounts that violate the terms of service and prevent them from resurfacing.[80]

Use of the dark web and clear web

2.58The Cyber Security Cooperative Research Centre described the dark web:

The dark web is not like the surface web, the external interface of the internet most people are familiar with. It is part of the internet that evades indexing by search engines, instead requiring the use of an anonymising browser (like Tor) that routes traffic through multiple servers, encrypting it along the way. To help ensure anonymity, dark web browsers isolate sites to prevent tracing, automatically clear browsing history, prevent surveillance of connections, clone or dupe users' appearances to avoid fingerprinting and relay and encrypt traffic three times as it runs across the network. Because access to specific secret sites is required, criminals that use the dark web to plan their activities can hide these activities and work hard to ensure that their groups are not infiltrated by law enforcement. Hence, given the anonymity the dark web affords, it is unsurprising it has been exploited by a wide range of criminal actors, including those producing and seeking CAM.[81]

2.59The AFP highlighted that the technologies needed to access the dark web 'are free or low cost for perpetrators to use, yet make a significant impact on the ability for law enforcement to detect and access'.[82]

2.60Submitters suggested the dark web is widely used for CAM; for example:

The UN Office on Drugs and Crime cited recent analyses of the dark web indicating that although 'only 2 per cent of hidden web services on Tor hosted CSAM, these 2 per cent accounted for 80 per cent of the Darkweb traffic'.[83]

The NSW Police Force submitted that there is:

…a strong shift among the more technically proficient offenders towards the use of anonymising technologies, such as virtual private networks and the dark web, to host and access file sharing platforms for the distribution of child exploitation material.[84]

Associate Professor Benoit Leclerc, Associate Professor Jesse Cale and Professor Thomas Holt reported that, according to the UK's Internet Watch Foundation, 'the last several years have seen the greatest overall rise of Dark Web markets engaged in the sale of CEM', usually via virtual currencies.[85]

2.61Associate Professor Leclerc, Associate Professor Cale and Professor Holt also submitted that offenders who use the dark web increase the scale of others' offending:

While some offenders may not have the skills and knowledge to access and engage on the Dark Web, those who do can dramatically increase the consumption, distribution and production of CEM online for other offenders.[86]

2.62The AFP explained that offenders using the dark web:

…are cautious of law enforcement, and often produce and share 'how to' guides to assist perpetrators on avoiding law enforcement detection alongside instructional guides for producing child abuse material. Traditional law enforcement techniques struggle to address the scale of this problem.[87]

2.63The committee heard that forums on the dark web are used to share information about accessing CAM; the Uniting Church Synod submitted:

Those engaged in child sexual abuse online teach each other how to become anonymous online. They are more commonly educating each other on using private chats, Internet voice and video chat software, forums and anonymisation software. The feeling of impunity, because of those carrying out the abuse being able to conceal their identity, has enabled them to diversify their activities.[88]

2.64Collective Shout quoted the following comments by Europol about offenders' efforts to avoid detection:

Users regularly publish information and safety manuals aimed at avoiding detection by law enforcement authorities. Some users are also attentive to law enforcement operations and regularly publish news articles or even summary reports of the techniques used during successful operations. Cross-posting across various boards and forums highlights a collective approach to improve operational security for all.[89]

2.65The Cyber Security Cooperative Research Centre gave an example of a dark web forum with over 45,000 members:

The website required members to post new child exploitation material every 30 days in order to retain membership, utilising TOR computer software to mask their identity.

Membership came with designated access to different areas of the forum, access to the rules of membership and technical forums directed towards encryption, software and internet safety advice. Members also had access on private areas where there was discussion surrounding the sexual abuse of children and 'rare content'. In addition, members could become special VIPs, honorary members or Private Zone members.[90]

2.66The Australian Institute of Criminology submitted that 'offenders tend to network with and encourage one another to sexually abuse children', and gave an example that reflects the magnitude of offending on the dark web:

In 2019, media outlets reported that the United Kingdom's (UK) National Crime Agency took down a dark web site containing 250,000 videos of children being sexually abused…This resulted in 337 arrests of site users in 11 different countries. It was revealed that users were incentivised to upload their own material of children being abused by receiving 'points' that they could then use to download more material. Forty-five percent of the abusive videos were new to authorities, according to NCMEC…[91]

2.67eSafety highlighted that CAM extends beyond the dark web into the clear web:

Far from being a threat that exists solely on the 'dark web', this is all too often a crime and form of abuse that is playing out in front of us. The 'clearweb' (that part of the Internet that is indexed and can be reached by common browsers) remains a preferred medium for the distribution and hosting of CSEM at scale. On the clearweb, well-known top-level domains such as .com and .net are routinely abused to host CSEM, and open websites provide access to hundreds of thousands of images.[92]

2.68Collective Shout submitted that not all webpages hosting or distributing CAM 'even attempt to avoid detection'.[93]

2.69The AFP advised that law enforcement:

…continues to observe offenders using non-encrypted communications channels, such as web forums and social media chat functions. Offenders share insights, tips and protocols with each other, with the aim of preventing or defeating law enforcement detection. It is the case that this communication is, at times, not in the encrypted communications space.[94]

2.70The AFP also pointed out that 'the vast majority of reports received by NCMEC are from clear net electronic service providers'.[95] The UN Office on Drugs and Crime also submitted that:

…there is still a tremendous amount [of child sexual exploitation material] available on the Clearnet with criminals creating closed forums on social media platforms to add security for streaming actions and the dissemination of this material all around the world.[96]

2.71The UK National Crime Agency said it demonstrated—most recently on 30 July 2021—that CAM can be accessed 'via search engines, where within three clicks material can be found in ten seconds'. It explained this can drive further offending:

[T]he low bar to being able to offend on the open web due to availability of images can lead to; normalisation, including through discussions with likeminded individuals on forums on the open web and dark web; offenders sharing methods to commit offences and to evade detection, making them more difficult to target; and in some cases, escalation and incitement into severe offending including contact sexual abuse of young children.[97]

2.72eSafety advised that an increasing challenge for authorities seeking to remove CAM is the host websites themselves:

Increasingly, websites that contain CSEM are hosted by network providers that deliberately obscure their corporate footprint. This obfuscation can be achieved by providers registering company details in foreign jurisdictions, distributing registration across jurisdictions, and deliberately undermining the integrity of the global WHOIS database. Some providers openly market themselves as being 'bulletproof' implying that they are resistant to takedown and disruption and with a high tolerance to hosting illegal content. Removal of CSEM by INHOPE members, industry and law enforcement can be complicated by these tactics.[98]

'Self-generated' explicit material and extortion[99]

2.73The AFP explained this problem:

Children and young people are being targeted by online child sex offenders through social networking, image, video and instant messaging applications to self-produce online child sexual exploitation material.

Self-generated content can occur for a number of reasons, including but not limited to, consensual sexting, feeling pressured or coerced, sexual extortion, financial gain and in some instances children are being groomed and blackmailed to produce more extreme material.[100]

2.74ECPAT International submitted that this is 'a complex issue and includes a range of different experiences, risks and harms'. It explained how 'self-generated' material can arise in different contexts:

Some self-generated content is created and shared by adolescents voluntarily and such exchanges may be increasingly becoming part of young people's sexual development. In some instances young people have expressed that it may provide advantages in their relationships and increase their self-esteem. However, the creation and sharing of self-generated sexual content can be coerced, for example through grooming, threats or peer-pressure. While coercion can clearly be seen as a crime and leads directly to harm, there can be negative consequences for children sharing any sexual content including in cases where sharing is not coerced. Material shared voluntarily may not cause harm at first, but there remain risks if it is later shared beyond the control of the person who created it. Once it exists, such content can also be obtained deceptively or by using coercion, and circulated by offenders perpetually.[101]

2.75Several inquiry participants highlighted the growth in self-generated explicit material.[102] For instance, the UK National Crime Agency referred to reports by the Internet Watch Foundation of 'a 77% global increase in the number of reports that included self-generated indecent imagery…including material consensually shared between peers, or elicited by offenders'.[103] The UN Office on Drugs and Crime also submitted that this 'relatively new phenomenon' has:

…grown significantly on a global scale in recent years and the trend is likely to continue due to children's increasing access to smart devices and their lack of awareness of the risks of producing and sharing SGEM [self-generated explicit material].[104]

2.76The Queensland Police Service reported it has seen a 'significant increase' in the first half of 2021, compared to previous years, in matters referred from TikTok:

These referrals primarily related to self-produced material with no indicators of coercion or exploitation. Investigations indicate that material is produced by children using internet capable devices with little or no supervision or understanding of the impacts of their behaviour. These referrals result in child protection reports which are referred regionally for police to investigate and act to prevent further offending.[105]

2.77Regarding an increase in online solicitation of children, the UN Office on Drugs and Crime cited a study which found that:

…most adults in SEA [Southeast Asia] do not consider grooming as child sexual abuse. Parents often lack awareness regarding OCSAE [online child sexual abuse and exploitation] and the risks of online grooming, leaving children without proper guidance or supervision in a fast-developing virtual world. Many young people in SEA, especially girls, experience online grooming and usually do not have the skills or knowledge required to address the problem, e.g., by reporting, blocking or using the privacy settings on the social media platforms they use.[106]

2.78More locally, the AFP advised that the ACCCE 'identified that confusion still exists in the community around what constitutes online child sexual exploitation, where to report matters, and the work of the ACCCE'.[107]

2.79The Attorney-General's Department described the worrying link between selfgenerated material and sexual extortion, also called 'sextortion', which is:

…a crime that can involve child victims being coerced by online offenders into sending sexualised images, often through the offender pretending to be another young person. An offender then threatens to on-share the content to others unless their demands are met. These demands can include large amounts of money, gift cards, online gaming credits, more child abuse images, and sexual favours. Despite complying with an offender's demands, the victim may continue to be threatened or extorted.[108]

2.80Queensland Police highlighted an associated form of offending known as 'capping', in which offenders use fake accounts to engage privately with children who 'are "groomed" into live-streaming sexual acts whilst the offender records the video stream'. The material that is produced is then:

(i) used as leverage to blackmail/'sextort' the child for additional sexually explicit material, sometimes forcing children to include friends and siblings;

(ii) exchanged in private communications or in online public forums with other offenders, who may themselves then blackmail/sextort the child; or

(iii) used in offenders' fake social media profiles to 'cap' other children.[109]

2.81Submitters emphasised the severe harms that sexual extortion inflicts on child victims; for instance, the Uniting Church Synod submitted:

The negative psychological impacts of sexual extortion include feelings of low self-esteem, withdrawal, worthlessness, anger and guilt. In some cases, victims have engaged in self-harm or killed themselves.[110]

2.82The UN Office on Drugs and Crime said that 'where coercive techniques are used, an emerging trend can be identified towards more extreme, violent, sadistic, or degrading demands by perpetrators'.[111] It also submitted that offenders are 'typically male adults or young people, who may or may not know their victims', but added that 'online sextortion is also committed by organised cybercriminal networks that operate out of locations similar to call centres'.[112] It gave the example of Operation Strikeback, which was launched in 2014 'specifically targeting organised online sextortion rings in the Philippines':

The two-day raid led to the arrest of 58 individuals, the seizure of over 250 pieces of electronic equipment, and the identification of over 190 individuals associated with organised crime in the Philippines. Three of the men arrested had harassed and sextorted a Scottish teenager, who later committed suicide.[113]

2.83Commander Sirec of the ACCCE also told the committee that sexual extortion can be 'highly organised'. She said:

We are seeing individuals who are able to extort a number of victims at one point. The methodology and the typology of the offender are changing to a profit generated methodology rather than the commodification of child abuse material, but we're also seeing a network of child abuse syndicates. They apply organised crime methodology, but they are also highly organised in terms of their syndication and the way that they share information and details about law enforcement obfuscation. They create structures that would be akin to organised criminal syndicates, but, again, their commodity is child abuse material.[114]

2.84The AFP has observed 'a significant and sustained rise in numbers of sextortion referrals received by the ACCCE', with 396 victim reports received between April and August 2022. 'Extrapolat[ing] from that number, with our understanding that sextortion is substantially underreported, the number of victims could be much greater and exceed well over a thousand'.[115]

2.85The AFP submitted that self-generated content is challenging for law enforcement because:

…it is a highly stigmatised issue and in the majority of cases parents and carers are unwilling to discuss the topic with their children or with others; which can adversely affect the number of cases reported.

Research conducted by the ACCCE indicates that 21% of parents and carers thought the topic of self-generated material to be too sickening or disgusting to think about, 21% of parents thought that online child sexual exploitation could not happen in any form to their child and 15% of parents and carers reported that if their child was exploited online they would be too embarrassed to discuss this with others.[116]

Live online child sexual abuse

2.86The UN Office on Drugs and Crime distinguished between three forms of live streamed material:

Live-streaming of voluntary SGEM [self-generated explicit material], which features children or adolescents engaging voluntarily in live-streaming of nudity or sexual behaviour and is typically initially shared with a peer.

Live-streaming of SGEM that is the result of grooming or sextortion, in which case the victims are coerced into livestreaming while they are undressed or performing sexual acts.

Distant live-streamed child sexual abuse, which entails webcam "shows" often pre-ordered by child sex offenders, in which an adult is either physically involved in the abuse or is coercing the victim into performing sexual acts.[117]

2.87The Australian Institute of Criminology submitted that live streaming of child sexual abuse is 'a hybrid form of online child exploitation' because:

…it involves the real-time sexual abuse of a child by a third-party, often directed by a live streaming consumer from a distance. Offenders do this often in exchange for money and specify the type of abuse they wish to see… This crime blurs the line between contact and non-contact sexual offending because offenders direct the abuse of a child in another location. They do this by giving directions to either the facilitator (trafficker) or the victim themselves over online text or video chat…[118]

2.88The Cyber Security Cooperative Research Centre advised that reported costs of a live stream are 'as little as AU$14–57', and that 'Australian men have been identified as some of the most voracious consumers'.[119]

2.89Dr Rick Brown of the Australian Institute of Criminology gave an example of Australians paying facilitators in the Philippines:

[T]hrough the AFP, we were able to obtain information about the individuals that were sending money to those facilitators. Working with AUSTRAC, we identified the financial transactions that were going to those individuals. That operation identified 256 Australians that were purchasing live streaming of child sex abuse in the Philippines. That was just through that one operation. There were 2,700 transactions over 13 years by those individuals.[120]

2.90The UN Office on Drugs and Crime cited a 2016 UNICEF report that said the Philippines has:

…become the "global epicentre of the live-stream sexual abuse trade" resulting in tens of thousands of Filipino children being victimised via chatrooms and other social media offering online child sex performances. This abuse is not only due to the population's high level of Internet penetration and entrenched poverty, but also widespread knowledge of the English language.[121]

2.91International Justice Mission referred to its work in the Philippines and submitted that '84% of the time, traffickers are relatives of the children being exploited, the very people who are supposed to protect them'.[122] The UN Office on Drugs and Crime also highlighted the involvement of family members in a child's abuse:

An increasing amount of child sexual abuse in SEA [Southeast Asia] is being livestreamed from people's homes and is mostly operated by the victims' families rather than by organised crime syndicates online. Child victims are reported to "work" throughout the day, having to cater to offenders located in different time-zones. As the abuse is carried out by their family or parents, children are often oblivious to the exploitation and they experience the abuse as normal or accept it so they can better provide for themselves and their family. Hence, they are often unwilling to incriminate their relatives in court. With such "shows" costing between US$5 and US$200, the United Nations has estimated that the child abuse industry in the Philippines is already worth over US$1 billion.[123]

2.92Dr Brown, Australian Institute of Criminology, also discussed the prevalence of live streamed child sexual abuse that occurs within families:

There are huge social issues in those countries that, essentially, Australian men are exploiting, in a sense. That abuse comes in different ways. Some is organised through sex dens in those countries, but, increasingly, because of the ease with which the technology is available and the ease with which people living in poverty can make some money, it's becoming more of a locally produced form of crime, within the family. That can be a parent, an uncle or aunty or indeed an older sister that's actually involved in perpetrating. One of the things we found, actually, looking at this, was that often it is a female family member that's the facilitator who engages with the consumer in a country like Australia and makes that connection, followed by the abuse then being arranged.[124]

2.93Regarding the prevalence of live online child sexual abuse, the UK's National Crime Agency submitted:

The online streaming of live abuse is no longer an emerging trend but rather is an established reality. The growth in online streaming of CSA [child sexual abuse] has also been facilitated by the expanding reach of 4G, and recently 5G, in many parts of the world. Access to such technology has made it almost effortless for offenders to network and enable the exploitation of children's use of social media platforms.[125]

2.94The Australian Institute of Criminology said 'it is difficult to measure prevalence' but 'anecdotal evidence suggests global demand for CSA [child sexual abuse] live streaming is high':

In 2013, four researchers from Terre des Hommes Netherlands posed as prepubescent Filipino girls on 19 different online chat forums. Over a 10-week period, 20,172 people from 71 different countries asked the researchers posing as children to perform a webcam sex show.[126]

2.95The Cyber Security Cooperative Research Centre reported that, during the COVID-19 pandemic, 'the volume of livestreamed abuse increased, with AUSTRAC reporting a "three-fold" increase of suspicious financial transactions indicating payment for such content in 2019-20'.[127]

2.96International Justice Mission emphasised that cases of live online child sexual abuse have 'recently been identified across dozens of countries' including Australia:

Australian children are also victims of child sexual abuse production and distribution via livestreaming. According to the Australian Centre to Counter Child Exploitation (ACCCE), "Australian children as young as eight are being coerced into performing live-streamed sexual acts by online predators, who often record and share the videos on the dark net and sexually extort victims into producing even more graphic content."[128]

2.97The Department of Home Affairs observed that live online child sexual abuse is challenging for law enforcement because:

…live-streaming leaves no visual evidence and, unless an offender records that abuse, investigators often need to rely on session logs, data usage trails and financial transactions identified as suspicious via AUSTRAC's intelligence reports.[129]

2.98Indeed, the AFP pointed out that live online child sexual abuse is 'seen as a "safer" way to offend'.[130] It also raised concerns that '[a]s a result of the relationships formed between the offenders (consumers and facilitators) and the victim, there remains a risk that the consumer may travel to offend in person against the victim or other children'.[131]

Use of virtual currencies

2.99Detective Superintendent Jayne Doherty of the NSW Police Force reported:

There are two reasons why people are producing CAM: it is either for a financial gain or to receive back further child abuse material.[132]

2.100In discussing the use of the dark web for CAM, Emeritus Professor Broadhurst and Mr Ball advised that 'about 7.5% of the CSAM on the Tor network is estimated to be sold for a profit'. They submitted that the 'majority of those involved are not in it for monetary purposes; most CSAM is simply "swapped" but a profit driven model has emerged with some services charging fees for content'.[133]

2.101Associate Professor Leclerc, Associate Professor Cale and Professor Holt submitted that there is 'growing evidence suggesting that offenders are now using virtual currencies to trade CEM'. They cited a finding by the Internet Watch Foundation that 'most of the identified websites in 2018 were commercial and only accepted payment in virtual currencies'.[134] Associate Professor Leclerc also advised that in 'the most recent years, since 2018 particularly, we have seen kind of a substantial rise in offenders using bitcoins, for instance, to get access to the material'. This use of virtual currencies 'adds this extra layer of complexity for law enforcement to track them down'. Associate Professor Leclerc also explained:

…20 years ago we would say offenders would trade material to get other material. 'I have produced some material, and I'm going to trade that material to get some novel or fresh material; something that is new and is not on any social media platforms.' They get into that sort of thing. But now we can see quite an important increase of individuals—not necessarily sex offenders; just individuals—getting into these sorts of criminal markets just to make money. So it provides another incentive for what we call facilitators to facilitate—to get children to do stuff, to record that stuff, and to send it to offenders to get paid for it.[135]

2.102Ms Mary Jane Welsh, Detective Superintendent, Cybercrime Division, Crime Command, Victoria Police, advised that the use of cryptocurrency 'adds a layer of complexity for law enforcement', and gave some examples of how it is used:

It may be the use of a cryptocurrency to access a child abuse forum in the first instance, or it may be the use of cryptocurrency to access the live abuse of children in a pay-per-view type situation, or it may be the use of cryptocurrency simply to buy CAM that that particular person wishes to add to their collection.[136]

2.103From the perspective of the Commonwealth Director of Public Prosecutions, Mr Mark de Crespigny told the committee that:

…I'm not certain how many cases I've actually seen which have involved cryptocurrency. Live-streaming cases normally involve transfers of money, obviously, to the person who is facilitating over in the Philippines et cetera, but I haven't seen any involving cryptocurrency. I've seen other offending involving cryptocurrency where investigators have been able to address the movement of money.[137]

2.104The AFP reported that the 'increasing use of cryptocurrency as [a] method of payment for child exploitation offenders has presented difficulties for the AFP'.[138] It explained:

Though the AFP has limited abilities to seize or restrain cryptocurrencies, these powers first require identification of the private key or seed phrase that provides access or compliance from the person of interest to facilitate access.

Where access has not been possible, or an offender refused to comply with a section 3LA order,[139] the AFP has limited further recourse, and suspects and other parties retain the ability to dissipate the funds upon notification of law enforcement interest.[140]

2.105The Department of Home Affairs advised that 'AUSTRAC has a small, specialist cryptocurrency intelligence capability'.[141]

Risks associated with emerging technologies

2.106The committee heard that emerging technologies will continue to present risks in relation to child exploitation. The Department of Home Affairs submitted:

Despite continued collective pressure on digital industry to put safety at the forefront of their design, commercial interests continue to drive development decisions and emerging technology exposes more children to risk.[142]

2.107ECPAT International cited an emerging trend involving 'the use of entertainment tools based on virtual reality technology to contact children for the purpose of sexual exploitation'. It said the implications of child sexual abuse through virtual reality technology 'are yet to be fully understood or captured well in legislative frameworks'.[143]

2.108Victoria Police highlighted the use of 'deepfake technology' for grooming; this technology 'allows offenders to mask their identity and to impersonate known individuals or peers to their child target'. Victoria Police also expressed concern about spyware, which 'refers to software that is installed remotely on a device, typically without permission, and gathers information about the device's owner, including their location, browsing habits and personal information'. While spyware is 'marketed as a legitimate tool for parents to monitor their children', it 'can be readily used by offenders'. This includes placing smart phones under surveillance to gather information to facilitate grooming, or using more sophisticated spyware that 'allows remote command of a phone to record live pictures and stream videos, turning a smartphone into an audio and video "bug", allowing offenders to operate the device and send fake messages'.[144]

2.109eSafety expressed 'significant concerns about the use of immersive technologies as a tool for online child sexual abuse, including through the use of augmented reality (AR), virtual reality (VR) such as the metaverse, mixed reality (MR) and haptics'. eSafety explained how these 'hyperrealistic experiences' might be exploited by predators:

For example, sexual assaults might be experienced virtually through a haptic suit, augmented realities could be used to fake a sexually explicit three-dimensional image or video of a real person and interact with it, without their consent, and a virtual experience may feel private because you are physically isolated, but if you use it to create an intimate image or video the file could be livestreamed, stored, stolen, or shared without consent.[145]

2.110eSafety cited reports of perpetrators 'using virtual reality headsets to view and store child sexual abuse material' and submitted that '[a]ssault offences that were previously limited to the physical environment could be extended to the online environment and experienced in an intensely real way by the victim, with lasting feelings of fear, anxiety and trauma'.[146]

2.111One notable technological development is generative artificial intelligence (AI). In August 2023, eSafety reported that it had received its first reports of sexually explicit content generated by students using generative AI to bully other students. This is in addition to 'reports of AI-generated child sexual abuse material…'. eSafety has published a 'position statement on generative AI which provides specific Safety by Design interventions that industry can adopt immediately to improve user safety and empowerment'.[147]

2.112In August 2023, the eSafety Commissioner indicated her expectation that 'over the next year, we'll have huge increases in these kinds of reports [related to AI]'.[148] In November 2023, in an article for the Australian Strategic Policy Institute, she wrote:

And generative AI poses significant risks in creating synthetic child sexual abuse material. This harm is undeniable; all content that normalises child sexualisation and AI-generated versions of it hamper law enforcement.

eSafety's hotline and law-enforcement queues are starting to fill with synthetic child sexual abuse material, presenting massive new resourcing and triaging challenges.

We are also very concerned about the potential of manipulative chatbots to further weaponise the grooming and exploitation of vulnerable young Australians.

These are not abstract concerns; incidents of abuse are already being reported to us.[149]

2.113Other jurisdictions have also recognised the risks of generative AI in relation to CAM. On 25 October 2023, the Internet Watch Foundation (IWF)—the UK organisation responsible for detecting and removing child sexual abuse imagery from the internet—published a research report showing that 'most AI child sexual abuse imagery identified by IWF analysts is now realistic enough to be treated as real imagery under UK law'. The IWF stated:

…the most convincing imagery would even be difficult for trained analysts to distinguish from actual photographs, and warns text-to-image technology will only get better and pose more obstacles for the IWF and law enforcement agencies.[150]

2.114The IWF report offered the following conclusions in relation to AI child sexual abuse material:

…this report evidences a growing problem that boasts several key differences from previous technologies. Chief among those differences is the potential for offline generation of images at scale – with the clear potential to overwhelm those working to fight online child sexual abuse and divert significant resources from real CSAM towards AI CSAM.

In this context, it is worth re-emphasising that this is the worst, in terms of image quality, that AI technology will ever be. Generative AI only surfaced in the public consciousness in the past year; a consideration of what it will look like in another year – or, indeed, five years – should give pause.

At some point on this timeline, realistic full-motion video content will become commonplace. The first examples of short AI CSAM videos have already been seen – these are only going to get more realistic and more widespread.[151]

2.115In July 2023 the United States Government announced it had secured voluntary commitments from seven leading AI companies (Amazon, Anthropic, Google, Inflection, Meta, Microsoft and OpenAI) 'to help move toward safe, secure and transparent development of AI technology'. The companies have committed to:

Ensuring products are safe before introducing them to the public, where the companies:

undertake internal and external security testing of their AI systems before their release;

share information across the industry and with governments, civil society, and academia on managing AI risks;

Building systems that put security first, where the companies:

invest in cybersecurity and insider threat safeguards to protect proprietary and unreleased model weights;

facilitate third-party discovery and reporting of vulnerabilities in their AI systems;

Earning the public's trust, where the companies:

develop robust technical mechanisms to ensure that users know when content is AI generated, such as a watermarking system;

publicly report their AI systems' capabilities, limitations, and areas of appropriate and inappropriate use;

prioritize research on the societal risks that AI systems can pose, including on avoiding harmful bias and discrimination, and protecting privacy;

develop and deploy advanced AI systems to help address society's greatest challenges.[152]

2.116On 1–2 November 2023, the UK government hosted an AI safety summit at which 28 governments, including Australia, signed the 'Bletchley Declaration', agreeing that AI should be 'designed, developed, deployed and used in a manner that is safe, in such a way as to be human-centric, trustworthy and responsible'. The declaration further stated:

We welcome the international community's efforts so far to cooperate on AI to promote inclusive economic growth, sustainable development and innovation, to protect human rights and fundamental freedoms, and to foster public trust and confidence in AI systems to fully realise their potential.[153]

Footnotes

[1]Office of the eSafety Commissioner (eSafety), Submission 20, p. 7.

[2]This chapter's focus on online CAM reflects much of the evidence to this inquiry. For an example of evidence regarding contact and 'offline' offending, see Project Paradigm, Submission 54, pp. 2–9.

[3]Australian Institute of Criminology, Submission 37, p. 3.

[4]Cyber Security Cooperative Research Centre, Submission 1, p. 3 (citations omitted).

[5]Emeritus Professor Roderic Broadhurst and Mr Matthew Ball, Submission 27, p. 4 (citations omitted).

[6]Carly Ryan Foundation, Submission 21, pp. [3–4] (citations omitted).

[7]Uniting Church in Australia, Synod of Victoria and Tasmania (Uniting Church Synod), Submission 17, pp. 14–15 (citations omitted). Also see, for example, International Justice Mission, Submission 53, p. 3.

[8]Uniting Church Synod, Submission 17, p. 9.

[9]Ms Madeleine West, private capacity, Committee Hansard, 20 February 2023, p. 11.

[10]For example, Cyber Security Cooperative Research Centre, Submission 1, pp. 4–6; Australian Institute of Criminology, Submission 2, p. 3; Collective Shout, Submission 16, p. 6; Uniting Church Synod, Submission 17, pp. 9–13; Carly Ryan Foundation, Submission 21, [p. 2]; Department of Home Affairs, Submission 25, p. 3; National Crime Agency, United Kingdom, Submission 31, p. 2.

[11]eSafety, Submission 20, p. 12.

[12]National Center for Missing and Exploited Children (NCMEC), 2019 CyberTipline Reports by Electronic Service Providers (ESP), 2020, p. 1.

[13]NCMEC, 2020 CyberTipline Reports by Electronic Service Providers (ESP), 2021, p. 1.

[14]NCMEC, 2021 CyberTipline Reports by Electronic Service Providers (ESP), 2022, p. 1.

[15]NCMEC, 2022 CyberTipline Reports by Electronic Service Providers (ESP), 2023, p. 1.

[16]eSafety, Submission 44, pp. 11–12.

[17]NCMEC, CyberTipline 2021 Report, April 2023, p. 5.

[18]NCMEC, 'CyberTipline data', webpage, undated, https://www.missingkids.org/cybertiplinedata (accessed 23 June 2023).

[19]NCMEC, CyberTipline 2021 Report, April 2023, p. 3.

[20]NCMEC, 'CyberTipline 2022 Report', webpage, https://www.missingkids.org/cybertiplinedata (accessed 23 June 2023).

[21]Australian Institute of Criminology, Submission 37, p. 3.

[22]Uniting Church Synod, Submission 17, p. 12.

[23]Dr Mark Zirnsak, Senior Social Justice Advocate, Uniting Church Synod, Committee Hansard, 15 November 2022, p. 3.

[24]eSafety, Submission 44, p. 8.

[25]For example, Associate Professor Benoit Leclerc, Associate Professor Jesse Cale and Professor Thomas Holt, Submission 8, [p. 2].

[26]United Nations Office on Drugs and Crime, Regional Office for Southeast Asia and the Pacific (UN Office on Drugs and Crime), Submission 7, [pp. 5–6].

[27]The National Crime Agency noted that the application of its research methodology in a child sex abuse context is 'exploratory', and it has 'moderate confidence' in the estimate. National Crime Agency, United Kingdom, Submission 31, p. 2.

[28]Uniting Church Synod, Submission 17, p. 9.

[29]Australian Institute of Criminology, Submission 37, p. 6.

[30]Uniting Church Synod, Submission 40, p. 2.

[31]eSafety, 'Twitter, TikTok and Google forced to answer tough questions about online child abuse', Media release, 23 February 2023.

[32]Australian Federal Police, Submission 18, p. 4. Also see Emeritus Professor Broadhurst and Mr Ball, Submission 27, p. 6.

[33]Mr Brett Pointing, Deputy Commissioner, Operations, Australian Federal Police, Committee Hansard, 10 December 2021, p. 34.

[34]Australian Institute of Criminology, Submission 2, p. 3.

[35]Uniting Church Synod, Submission 17, p. 9 (citations omitted).

[36]Uniting Church Synod, Submission 17, p. 19; also see p. 21.

[37]UN Office on Drugs and Crime, Submission 7, [p. 9].

[38]UN Office on Drugs and Crime, Submission 7, [p. 15], also see [p. 2].

[39]Destiny Rescue, Submission 52, p. 14.

[40]Department of Home Affairs, Submission 25, p. 3.

[41]See, for example, Australian Institute of Criminology, Submission 2, p. 3, UN Office on Drugs and Crime, Submission 7, [p. 7] and Attachment 1, [p. 2]; Collective Shout, Submission 16, p. 6; Uniting Church Synod, Submission 17, p. 10; Department of Home Affairs, Submission 25, p. 3; eSafety, Submission 44, p. 7.

[42]Cyber Security Cooperative Research Centre, Submission 1, p. 4. The United Nations report is by the Special Rapporteur on the sale and sexual exploitation of children, including child prostitution, child pornography and other child sexual abuse material, Mama Fatima Singhateh, Impact of coronavirus disease on different manifestations of sale and sexual exploitation of children, Human Rights Council, 46th session, A/HRC/46/31, 22 January 2021.

[43]Australian Federal Police, Submission 18, p. 2.

[44]Australian Federal Police, Submission 18, p. 3. Also see Cyber Security Cooperative Research Centre, Submission 1, p. 5.

[45]Victoria Police, Submission 30, p. 3.

[46]ECPAT International, Submission 9, p. 2.

[47]ECPAT International, Submission 9, pp. 2–3.

[48]Cyber Security Cooperative Research Centre, Submission 1, p. 3; also see p. 5.

[49]Ms Hilda Sirec, Commander, Australian Centre to Counter Child Exploitation and Human Exploitation, Australian Federal Police, Committee Hansard, 15 November 2022, p. 24.

[50]Australian Federal Police, Submission 38.1, [p. 1].

[51]The AFP advised this '[i]ncludes National and ACT Policing'.

[52]The AFP advised this could be '[a] result of a referral to, or referral from, a victim identification team, where the victim was spoken to by law enforcement and identity confirmed'.

[53]Australian Federal Police, Submission 18, p. 3; Australian Federal Police, Submission 38.1, [p. 1].

[54]Australian Federal Police, Submission 38, pp. 1–2.

[55]Australian Federal Police, Submission 18, p. 9.

[56]Australian Federal Police, Annual Report 2021-22, pp. 29–30.

[57]Department of Home Affairs, Submission 39, pp. 1–2.

[58]Mr James Watson, Assistant Commissioner South, Australian Border Force, Department of Home Affairs, Committee Hansard, 20 February 2023, p. 25.

[59]Attorney-General's Department, Submission 43, p. 4.

[60]Mr Pointing, Australian Federal Police, Committee Hansard, 10 December 2021, p. 34.

[61]Mr Stephen Dametto, Acting Assistant Commissioner, Northern Command, Australian Federal Police, Committee Hansard, 15 November 2022, p. 23.

[62]Commissioner for Children and Young People, Western Australia, Submission 33, pp. 1–2.

[63]eSafety, Submission 44, p. 7; also see p. 9.

[64]Ms Kirrily (Kirra) Pendergast, Chief Executive Officer, Safe on Social, Committee Hansard, 20 February 2023, p. 10. Also see Mrs Jen Hoey, Founder and Parent Cyber Safety Consultant, Not My Kid, and Ms West, private capacity, Committee Hansard, 20 February 2023, pp. 9, 14.

[65]Department of Home Affairs, Submission 25, p. 3. Also see, for example, Associate Professor Leclerc, Associate Professor Cale and Professor Holt, Submission 8, [p. 2]; Australian Federal Police, Submission 18, p. 3.

[66]Department of Home Affairs, Submission 25, p. 6.

[67]eSafety, Submission 44, p. 8.

[68]UN Office on Drugs and Crime, Submission 7, [p. 3]; also see [p. 9].

[69]ECPAT International, Submission 9, p. 1.

[70]Collective Shout, Submission 16, pp. 6–7.

[71]International Justice Mission, Submission 53, p. 3 (citations omitted).

[72]Cyber Security Cooperative Research Centre, Submission 1, p. 3.

[73]Victoria Police, Submission 30, p. 4.

[74]Australian Institute of Criminology, Submission 37, p. 3.

[75]eSafety, Submission 20, p. 16.

[76]Queensland Police Service, Submission 29, pp. 2–3.

[77]Victoria Police, Submission 30, p. 5.

[78]Uniting Church Synod, Submission 17, p. 24.

[79]Dr Zirnsak, Uniting Church Synod, Committee Hansard, 9 December 2021, p. 1.

[80]eSafety, Submission 20, p. 16.

[81]Cyber Security Cooperative Research Centre, Submission 1, p. 11. Also see Associate Professor Leclerc, Associate Professor Cale and Professor Holt, Submission 8, [p. 3]; Emeritus Professor Broadhurst and Mr Ball, Submission 27, pp. 2–3.

[82]Australian Federal Police, Submission 18, p. 4. Also see Cyber Security Cooperative Research Centre, Submission 1, p. 11; Australian Institute of Criminology, Submission 2, p. 3; Associate Professor Leclerc, Associate Professor Cale and Professor Holt, Submission 8, [p. 6]; Collective Shout, Submission 16, p. 11; Department of Home Affairs, Submission 25, p. 3; NSW Police Force, Submission 26, p. 10.

[83]UN Office on Drugs and Crime, Submission 7, [p. 18].

[84]NSW Police Force, Submission 26, p. 10.

[85]Associate Professor Leclerc, Associate Professor Cale and Professor Holt, Submission 8, [p. 2].

[86]Associate Professor Leclerc, Associate Professor Cale and Professor Holt, Submission 8, [pp. 2–3]. Also see, for example, Uniting Church Synod, Submission 17, p. 24.

[87]Australian Federal Police, Submission 18, p. 4.

[88]Uniting Church Synod, Submission 17, p. 24; also see p. 9. Also see Australian Institute of Criminology, Submission 2, p. 6.

[89]Collective Shout, Submission 16, p. 12. Also see, for example, Uniting Church Synod, Submission 17, pp. 16–17; Australian Federal Police, Submission 18, p. 4.

[90]Cyber Security Cooperative Research Centre, Submission 1, p. 9. Also see, for example, UN Office on Drugs and Crime, Submission 7, [p. 12]; Australian Federal Police, Submission 18, p. 3.

[91]Australian Institute of Criminology, Submission 37, p. 6 (citations omitted).

[92]eSafety, Submission 20, p. 7.

[93]Collective Shout, Submission 16, p. 11.

[94]Australian Federal Police, Submission 18, p. 4.

[95]Australian Federal Police, Submission 18, p. 4.

[96]UN Office on Drugs and Crime, Submission 17, [p. 2], also see [p. 7].

[97]National Crime Agency, United Kingdom, Submission 31, p. 3.

[98]eSafety, Submission 44, p. 9.

[99]This form of CAM is described in different ways, though the term 'self-generated' was common in evidence to this inquiry. A 'Global Guide' report submitted by UNICEF discussed some sensitivity regarding the terminology though nonetheless used the term 'self-generated sexual content' (UNICEF Australia, Submission 41, Attachment 1, p. 63). While the term 'self-generated' is used in this report, the committee does not suggest in any way that children are to blame for abuse and exploitation they suffer in relation to this material.

[100]Australian Federal Police, Submission 18, pp. 4–5.

[101]ECPAT International, Submission 9, pp. 3–4.

[102]See, for example, Carly Ryan Foundation, Submission 21, [p. 2]; Queensland Police Service, Submission 29, p. 2; Victoria Police, Submission 30, p. 4; eSafety, Submission 44, p. 8.

[103]National Crime Agency, United Kingdom, Submission 31, p. 4.

[104]UN Office on Drugs and Crime, Submission 7, [p. 13]. Also see, for example, Victoria Police, Submission 30, p. 4; Australian Institute of Criminology, Submission 37, p. 7.

[105]Queensland Police Service, Submission 29, p. 2.

[106]UN Office on Drugs and Crime, Submission 7, [p. 12].

[107]Australian Federal Police, Submission 38, p. 3.

[108]Attorney-General's Department, Submission 43, p. 5. Also see, for example, Uniting Church Synod, Submission 17, p. 24; Department of Home Affairs, Submission 25, p. 20; Australian Federal Police, Submission 38, p. 2.

[109]Queensland Police Service, Submission 29, p. 3. Also see, for example, Australian Institute of Criminology, Submission 37, p. 7; Ms Mary Jane Welsh, Detective Superintendent, Cybercrime Division, Crime Command, Victoria Police, Committee Hansard, 9 December 2021, pp. 36–37.

[110]Uniting Church Synod, Submission 17, p. 18. Also see, for example, UN Office on Drugs and Crime, Submission 7, [p. 15]; Attorney-General's Department, Submission 43, p. 5.

[111]UN Office on Drugs and Crime, Submission 7, [p. 14].

[112]UN Office on Drugs and Crime, Submission 7, [p. 15]. Also see, for example, Uniting Church Synod, Submission 17, p. 18.

[113]UN Office on Drugs and Crime, Submission 7, [p. 14].

[114]Ms Sirec, Australian Federal Police, Committee Hansard, 15 November 2022, p. 24.

[115]Australian Federal Police, Submission 38, p. 2. Also see, for example, Attorney-General's Department, Submission 43, p. 5; Mr Toby Dagg, Acting Chief Operating Officer, eSafety, Committee Hansard, 20 February 2023, p. 26.

[116]Australian Federal Police, Submission 18, p. 5. Also see, for example, Ms Anne-Louise Brown, Director of Corporate Affairs and Policy, Cyber Security Cooperative Research Centre, Committee Hansard, 9 December 2021, p. 13.

[117]UN Office on Drugs and Crime, Submission 7, [p. 16].

[118]Australian Institute of Criminology, Submission 2, p. 7. Also see Dr Rick Brown, Deputy Director, Australian Institute of Criminology, Committee Hansard, 15 November 2022, pp. 36–37.

[119]Cyber Security Cooperative Research Centre, Submission 1, p. 5.

[120]Dr Brown, Australian Institute of Criminology, Committee Hansard, 15 November 2022, p. 37.

[121]UN Office on Drugs and Crime, Submission 7, [p. 16]. Also see International Justice Mission, Submission 53, p. 2.

[122]International Justice Mission, Submission 53, p. 2 (citations omitted).

[123]UN Office on Drugs and Crime, Submission 7, [p. 17].

[124]Dr Brown, Australian Institute of Criminology, Committee Hansard, 15 November 2022, p. 37.

[125]National Crime Agency, United Kingdom, Submission 31, p. 5.

[126]Australian Institute of Criminology, Submission 2, p. 7 (citations omitted).

[127]Cyber Security Cooperative Research Centre, Submission 1, p. 5.

[128]International Justice Mission, Submission 53, p. 2 (citations omitted).

[129]Department of Home Affairs, Submission 25, p. 6. Also see Australian Federal Police, Submission 18, p. 5; Mr Dametto, Australian Federal Police, Committee Hansard, 15 November 2022, p. 24; International Justice Mission, Submission 53, p. 4.

[130]Australian Federal Police, Submission 18, p. 5.

[131]Australian Federal Police, Submission 18, p. 6. Also see, for example, Emeritus Professor Broadhurst and Mr Ball, Submission 27, [p. 2].

[132]Detective Superintendent Jayne Doherty, Commander, Child Abuse and Sex Crimes Squad, NSW Police Force, Committee Hansard, 10 December 2021, p. 30. Also see, for example, Ms Lesa Gale, Assistant Commissioner, Northern Command, Australian Federal Police, Committee Hansard, 10 December 2021, p. 42.

[133]Emeritus Professor Broadhurst and Mr Ball, Submission 27, p. 10 (citations omitted).

[134]Associate Professor Leclerc, Associate Professor Cale and Professor Holt, Submission 8, [p. 2].

[135]Associate Professor Leclerc, private capacity, Committee Hansard, 9 December 2021, p. 23.

[136]Ms Welsh, Victoria Police, Committee Hansard, 9 December 2021, p. 37.

[137]Mr Mark de Crespigny, Deputy Director, Illegal Imports and Exports and Human Exploitation and Border Protection Practice Groups, Commonwealth Director of Public Prosecutions, Committee Hansard, 10 December 2021, p. 25.

[138]Australian Federal Police, Submission 18, p. 17. Also see UN Office on Drugs and Crime, Submission 7, [p. 18]; Department of Home Affairs, Submission 25, p. 3.

[139]This refers to section 3LA of the Crimes Act 1914, under which police may apply to a magistrate for an order requiring a person to provide information or assistance that is reasonable and necessary to access certain data. Australian Federal Police, Submission 18, pp. 16–17.

[140]Australian Federal Police, Submission 18, p. 17.

[141]Department of Home Affairs, answers to questions on notice, 10 December 2021 (received 14 January 2022), [p. 4]. Also see Ms Ciara Spencer, First Assistant Secretary, Law Enforcement Policy Division, Department of Home Affairs, Committee Hansard, 10 December 2021, p. 36.

[142]Department of Home Affairs, Submission 25, p. 14.

[143]ECPAT International, Submission 9, p. 2.

[144]Victoria Police, Submission 30, p. 7.

[145]eSafety, Submission 44, pp. 12–13.

[146]eSafety, answers to questions on notice, 28 February 2023 (received 14 March 2023), [p. 1].

[147]eSafety, 'New industry recommendations to curb harms of generative AI', Media release, 15 August 2023.

[148]Nick Bonyhady, '"Deepfake" images used to bully', Australian Financial Review, 16 August 2023, p. 10.

[149]Julie Inman Grant, 'Safety by Design: protecting users, building trust and balancing rights in a generative AI world', Australian Strategic Policy Institute: The Strategist, 1 November 2023.

[150]Internet Watch Foundation, '"Worst nightmares" come true as predators are able to make thousands of new AI images of real child victims', Media release, 25 October 2023. The study focused on a single dark web forum dedicated to child sexual abuse imagery. Text-to-image technology allows a user to type in what they want to see and the software will generate an image.

[151]Internet Watch Foundation, How AI is being abused to create child sexual abuse imagery, October 2023, p. 12.

[152]The White House, 'Fact Sheet: Biden-Harris Administration Secures Voluntary Commitments from Leading Artificial Intelligence Companies to Manage the Risks Posed by AI', Statements and releases, 21 July 2023.

[153]United Kingdom Government, 'The Bletchley Declaration by Countries Attending the AI Safety Summit, 1-2 November 2023', Policy Paper, 1 November 2023. Also see statement regarding a potential 'online safety and security memorandum of understanding' between Australia and the United Kingdom in the Hon Michelle Rowland MP, Minister for Communications, 'Address to the National Press Club', Speech, 22 November 2023.