Coalition Members' additional comments

List of recommendations

Recommendation 1

1.95That the Australian Government require social media companies to provide easily accessible features for users to control or reset their recommender systems, including to reduce or remove persuasive design features such as infinite scrolling, and to have more proactive ability to customise their feeds. Features would favour the ability for a consumer to opt in or out of recommended content.

Recommendation 2

1.129That the Australian Government enforce minimum transparency and reporting requirements for social media platforms in relation to actual or suspected foreign interference or transnational crime, transnational commercial activities, and foreign news and media, with financial and legal penalties for noncompliance.

Recommendation 3

1.130Recommendation 25 of the House of Representatives Standing Committee on Education and Employment Report Study Buddy or Influencer: Inquiry into the use of Generative Artificial Intelligence in the Australian Educational System called for the Australian Government to establish a Centre of Digital Educational Excellence, modelled on existing co-operative research centres, which could act as a thought leader in relation to communication technology, media and digital literacy skills in children and students in Australia.

Recommendation 4

1.146That the Australian Government introduce a proactive obligation on social media companies to provide to the relevant intelligence or law enforcement agencies any and all messages involving the actual or suspected trade in child sexual abuse material (CSAM), child sexual exploitation material (CSEM), or non-consensual sexual images, regardless of whether end-to-end encryption is used.

Recommendation 5

1.147That the Australian Government introduce a proactive obligation on search engines, metasearch engines, web indexes and similar platforms, to provide to the eSafety Commissioner a detailed annual report as to how they are combatting the indexation, publication, promotion and dissemination of unlawful Class 1 and Class 2 material, with financial penalties for noncompliance.

Recommendation 6

1.153That social media companies be held liable for failing to remove and report links, tools and users who, through ‘link-in-bio’ platforms, facilitate access to Class 1 or Class 2 material and scam ads.

  • Class 1 – likely to be refused classification under the National Classification Scheme (including content that depicts child sexual abuse and exploitation material, advocates terrorist acts, or promotes, incites, or instructs in matters of crime or violence); and
  • Class 2 – likely to be classified R 18+ or X 18+ under the National Classification Scheme (including adult pornography and high-impact material that may be inappropriate for children under 18 years old, such as material featuring violence, crime, suicide, death, and racist themes).[1]

Recommendation 7

1.163That the Australian Government work with experts, youth representatives, and lived experience participants to develop and implement a strategy to improve the online safety and wellbeing of boys, acknowledging the disproportionate impact of sexual exploitation, radicalisation, and addiction on young males.

Recommendation 8

1.171That the Australian Government investigate options to resource and support the public research and development of technologies to combat child sexual exploitation and abuse material online.

Recommendation 9

1.172That the Australian Government increase the resources available to the eSafety Commissioner and the Australian Centre to Counter Child Exploitation in light of the significant increase to demand on their citizen-facing services and operational capabilities.

Recommendation 10

1.211That the Australian Government develop a reporting framework that requires social media companies to provide regular transparency reports, in an effort to ensure data transparency and a deeper understanding of consumer rights and data.

1.212The framework would require an articulation of:

  • the type of data social media companies collect on users, and for what purpose it is collected;
  • the data management systems social media companies use to store data on users; and
  • the type of data social media platforms use to generate personalised content on both recommender systems and advertising systems.

Recommendation 11

1.213That the Australian Government direct social media companies to provide detailed reports to the eSafety Commissioner annually, declaring the amount of revenue received from companies in restricted and regulated industries including but not limited to alcohol, gambling, pornography, cigarettes, cannabis, weapons, pharmaceuticals, weight loss treatments, vaping, debt collection, and scams.

Recommendation 12

1.240That the Australian Parliament legislate to impose a statutory duty of care on social media platforms and similar technology companies for their users. An overarching duty of care should articulate requirements for platforms to implement diligent risk assessments and safety by design principles, as well as broad obligations on those platforms to take all appropriate actions to ensure that identified risks are mitigated. The duty of care regulation would define risks clearly to inform risk assessments. As part of the duty of care, companies should be required to report annually on identified risks and the actions taken to address them, with an ability for regulators to review these reports, and penalties where a regulator finds that insufficient action has been taken. There should also be investigation of how the duty of care can be enforced.

Recommendation 13

1.270That the Australian Parliament establish a Joint Standing Committee on Online Safety, Artificial Intelligence and Technology, tasked with investigating the strengths and weaknesses in Australia’s regulatory framework, legislative tools, industrial base and technological capabilities.

Acknowledgements

1.1Coalition Members of the Joint Select Committee on Social Media and Australian Society (committee) acknowledge the enormous contribution of over 200 witnesses who have provided written, oral and, in some cases, multimedia submissions since the inception of the committee on 16 May 2024.

1.2In particular, we are grateful for the testimony of witnesses including young Australians, victim-survivors, and the parents and loved ones of those who have lost their lives as a result of the unmitigated harm perpetrated through social media. We thank you for your honesty, courage and determination.

Introduction

1.3While Coalition Members agree with all recommendations in the Final report, these additional comments offer further analysis and practical recommendations to address the concerns raised in evidence. While the committee's report is not an exhaustive list of policy prescriptions, we believe that failing to offer suggestions for improvement, drawn from the extensive evidence we have received, would be a lost opportunity.

1.4The impact of social media on child safety, public safety, online safety, and the health and wellbeing of Australians is urgent, extensive, and pervasive.

1.5At present, social media provides opportunities for meaningful connection between Australians, but it also poses a significant threat to democracy, our social cohesion, and indeed our national security.

1.6For too long, social media platforms have deferred responsibility for safety to users, including children and their families.

1.7While Coalition Members believe in the principles of personal responsibility, both of young people and their parents, we must recognise the power dynamic within which individuals and communities make their choices.

1.8It has been made abundantly clear to Coalition Members through the evidence heard during this inquiry that the unregulated nature of social media has resulted in a significant power inequity between platforms and users.

1.9We agree with the view that Collective Shout submitted:

Children should not be required to build capacity to keep themselves safe on platforms that are inherently dangerous, and for which children lack the developmental capacity to navigate safely.[2]

1.10As eSafeKids rightly assert:

… it is manifestly unfair and unreasonable for children to be responsible for avoiding illegal and harmful content.[3]

1.11Coalition Members were concerned that, while a significant amount of evidence indicated social media has facilitated or exacerbated crimes such as child exploitation and fraud, the committee's Final report pays insufficient regard to these explicit harms.

1.12Similarly, we know from the evidence before this committee that social media has in some areas become a vector for radicalisation and extremism, with malicious actors seeking to exploit our most vulnerable and the fabric of our society, and this report fails to address these issues in detail.

The need for age verification, restrictions & strong leadership

On government intervention

1.13It is clear from the evidence provided to the committee that 'big tech' cannot be trusted to self-regulate in the interests of Australian users – and particularly children.

1.14In her evidence, the eSafety Commissioner stated that platforms are:

… not living up to any of those pledges that they made. So, I do think this harder edged regulation is what's necessary. I don't know that there's anyone in this room that can credibly say self-regulation has worked.[4]

1.15Alcohol Change Australia said:

… the self-regulatory system is inadequate, ineffective, and lacks transparency and accountability.[5]

1.16Reset.Tech Australia warned that:

… harm happens as governments wait for self-regulation and co-regulation to fail.[6]

1.17Some platforms have recently taken steps in the right direction to make their services safer, with embedded safety features, basic algorithm reporting, and alternative platforms targeted towards children such as YouTube Kids and Instagram for Teens. However, the bulk of evidence supplied to this inquiry, and the overwhelming experience of parents, experts, and law enforcement alike, shows that more must be done.

1.18The Heads Up Alliance said:

While the inclusion of safety features is a step in the right direction, it ultimately falls short of tackling the deeper problems of online bullying, grooming, tech-addiction and exposure to harmful content.[7]

1.19Coalition Members believe parents are the best judges of how their children should be raised. However, the committee heard repeatedly that parents do not have a full or accurate view of their children's online lives.

1.20The harms of platforms designed to retain users' attention, which become addictive in nature, cannot be mitigated by parental control and vigilance alone. In this respect, the evidence of the Heads Up Alliance was particularly persuasive.

1.21Testimony from Mr Ali Halkic recounted events leading up to his son's suicide, describing his son Allem as 'so beautiful and so confident. He was probably vulnerable at the same time, because he was never exposed to bad words or evil or anything like that'.[8] Mr Halkic also said that if 'Allem were here and you lined up 10 000 children, he would be down towards the end. He was no way known to be in a position like this. This consumed him, and within four to six weeks Facebook basically destroyed us as a family'.[9]

1.22Mr Halkic felt that had he known more about the online danger his son was facing, he would have behaved differently in how he allowed his son access to a phone, and to social media.

1.23Mr Halkic stated he felt he was a 'contributing factor to my own son's death, and I will live with that every single day. I wake up with it and go to sleep with it. I shower with it. I eat with it'.[10]

1.24Mr Halkic called on the committee to recommend to the Australian Government (government) that parents be educated so that they understand the digital world their children will experience.

1.25Mr Elachi, also from the Heads Up Alliance, said:

You ask what messaging parents need. Just like we tell parents, 'Smoking is hazardous for your children,' we need to tell them: 'Social media is hazardous for your children. TikTok is hazardous for your child.' We need that messaging out there.[11]

1.26In relation to an age ban for social media, Mr Elachi argued:

… a law that supports that messaging, even if it's wholly unenforceable, would still support that message and would send a very strong signal to parents: 'Stop. Think about this. There's a reason why the government has said there is a minimum age for social media'.[12]

Which age for restrictions

1.27We welcome the government's decision announced in September 2024 to finally legislate an age limit of 16 for social media access, after a sustained campaign from the Coalition, desperate parents, and experts across the country.

1.28The Heads Up Alliance advocated strongly for a minimum age of 18, arguing:

… balancing digital participation and preserving childhood experiences away from screens is essential.[13]

1.29Now that the government has agreed to implement the decision, it is imperative that they do so as a matter of urgency and without grandfathering provisions.

Age verification processes

1.30It is of immense concern to Coalition Members that at the time of reporting, the government has failed to commence the age verification trial to which it committed in April 2024.

1.31In testimony before the committee, the responsible government department, the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA) asserted it did not have the statutory powers to compel platforms to take part in the trial.[14]

1.32The government could have legislated for those statutory powers at any point during the term of the 47th Parliament.

1.33The committee heard testimony that recognised some of the challenges inherent in an age restriction regime for social media use.

1.34In the Roadmap for Age Verification, released in August 2023, the eSafety Commissioner pointed to research which showed that 47 per cent of 16–18-year-olds had first encountered pornography before the age of 16—over a third of them through social media feeds, ads, messages and group chats.[15]

1.35Without robust age verification, children will continue to be exposed to inappropriate material that can have lasting negative effects—harm which can be avoided, or at least delayed, until young people have greater maturity.

1.36As Children and Media Australia submitted, the very existence of an age verification mechanism:

… will slow down and deter some users who would otherwise have a free pass to inappropriate content and contact.[16]

1.37We acknowledge that age verification is not a perfect solution or a panacea. There will be some who find ways to circumvent any age limit or approach to age verification, however robust. Yet Coalition Members are confident that a mechanism can be devised to protect a significant number of people under 16.

1.38Children and Media Australia put it well, when they said that age verification raises 'a need to balance privacy and safety. However, looking at all relevant considerations from a practical perspective, we conclude that age verification is a challenge that can and must be met'.[17]

1.39Collective Shout observed in their submission:

We cannot allow large-scale reform to be scuttled by disagreements about the technical aspects.[18]

1.40In a 2021 statement to the US Senate Committee on Commerce, Science and Transportation, Meta whistleblower Frances Haugen, who also briefed the committee, stated:

Facebook wants you to believe that the problems we're talking about are unsolvable. They want you to believe in false choices. They want you to believe you must choose between connecting with those you love online and your personal privacy.[19]

1.41Coalition Members of the committee believe that a robust age verification regime to support a minimum age limit for social media access is a necessary element of the multifaceted approach required to keep kids safe online. The Hon Peter Dutton MP, Leader of the Opposition, first announced in June 2024 that, within 100 days of the election, a Coalition Government would raise to 16 the minimum age at which kids can access social media.

1.42It is important to remember that the Australian Parliament has already looked at this issue in some detail. Almost five years ago, the House of Representatives Standing Committee on Social Policy and Legal Affairs released the Protecting the Age of Innocence report, calling for a roadmap to implementing an age verification regime in Australia.

1.43This was a recommendation which received bipartisan support.

1.44In March 2023, the eSafety Commissioner, tasked with expeditiously developing that roadmap, submitted a comprehensive plan and background paper to the government. Almost six months later, the government provided a response to the roadmap, refusing to support an age verification regime in favour of industry self-regulation—a decision which was greeted with the support of the EROS Association and other representatives of the predatory porn industry.

1.45At that time, the Coalition stood alongside concerned parents, experts, practitioners and journalists in calling for a reversal of the government's position, proposing legislation to implement an age assurance trial. After a sustained campaign, the government relented, agreeing to an age verification trial in April 2024.

1.46It has since been revealed during the hearing of evidence from DITRDCA, in the course of this inquiry, that social media companies will not be required to participate in that trial.

1.47This committee has heard testimony from over 200 families, experts, and victim-survivors—the majority of whom support age verification as one tool to help keep kids safe online. In spite of this evidence, and despite the release of two interim reports, the trial is yet to commence—six months after the government's commitment to a trial was made.

1.48On 8 November 2024, Prime Minister the Hon Anthony Albanese MP committed to raising the age limit, promising legislation would be introduced to the Australian Parliament within the fortnight. This would mean that any age verification trial and social media age limit would not be practically implemented until 2025.

1.49It is of immense concern to Coalition Members of this committee that, at the time of reporting, the government had yet to commence its age verification trial.

The addictive nature of social media and algorithms

1.50'It is ... estimated that at least 10 per cent or more of teenagers in total have some form of problematic screen use'.[20]

1.51This is a distressing statistic to consider when acknowledging that the digital world is changing young brains.

1.52According to Black Dog Institute's Future Proofing Study:

In 2024 … 93% of young Australians reported using social media at least once a day. On average, young Australians spend 2–3 hours per day on social media, with the most popular social media platforms being Instagram (79%), Snapchat (74%) and TikTok (67%).[21]

1.53While technological evolution and online connection are inevitable, so too are potential negative health implications for consumers. To safeguard technology that works for Australian society, the government needs to respond to this issue as a matter of urgency.

1.54Coalition Members were glad to see that Recommendation 4 acknowledged the need for more user control, but we would like to highlight some further evidence that provides the background context for why user choice requirements would benefit consumers of social media products, as well as to highlight the need to revisit social media products' addictive characteristics in future public health, workforce planning and other policy arenas.

1.55At this point in the Parliament's comprehensive response to social media, the root cause of these problems has not been addressed in plain terms: social media is designed to be addictive. Further, this phenomenon has not been treated as a public health issue to date, nor have potential impacts such as those on workforce planning been considered:

It is often argued that the exposure of children and young people to advanced forms of digital media is crucial for the development of digital competence, leading to the acquisition of skills and capabilities necessary for thriving in a digitalized world. However, digital media should be integrated into the user’s life in a way that promotes wellbeing.[22]

1.56Recreational screen use, as distinct from the use of a screen for a specific task with a beginning, middle and end (a task at work, paying a bill online, watching a movie), is proving to be a complex and harmful experience for many users, without regulation or consideration of consumer protection.

1.57Social media companies consistently told the committee that the canon of research on this topic is inconclusive; in one instance, Meta told the committee that:

… the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes.[23]

1.58Meta also highlighted that the platform:

… [does] not and cannot measure whether an individual user is experiencing problematic use.[24]

but in the same document also suggested that:

… problematic use of social media is not a clinical disorder, but rather a behaviour that can be managed with strategies to restore balance and encourage healthy social media integration.[25]

1.59In analysing the evidence shared by Meta, Coalition Members have concluded that social media platforms are unwilling to acknowledge the role their platforms play in harming the mental health and user experience of consumers.

1.60Meta also stated that:

… we understand that for some people, social media use may feel disruptive. However, as far as we are aware, there is no formally recognised medical or psychological condition known as social media or screen time addiction (for example, it is not mentioned in the Diagnostic and Statistical Manual of Disorders).[26]

1.61The research the Australian Gaming and Screens Alliance (AGASA) presented in its submission highlights that while this is technically correct, there is mounting evidence which suggests that social media addiction is becoming a significant clinical problem for all age groups, especially young people.

Currently there are two official platform-specific screen disorder diagnoses, both to do with gaming … Clinically, clients presenting with social media use that is clearly disordered have similar symptoms, similar addiction-like behaviours, and are subject to the same substantial negative impacts on key areas of function, as clients with gaming and other screen disorders.[27]

1.62AGASA stated that:

… because there is good prevalence data for gaming disorders, we start with this.[28]

1.63AGASA then highlighted that:

(a) gaming disorders affect around 3 times as many males as females;

(b) females tend to present with disorders around social media rather than gaming;

(c) many females [and some males] present for treatment for their problematic social media use; and

(d) from our clinical observations, problematic reeling (problematic levels of 'doom scrolling' on short video sharing social media platforms) seems to have been increasing in the last 18 months, with males and females both well represented.[29]

1.64AGASA highlights that criteria such as difficulty in reducing or resisting use, increased prioritisation of the behaviour over other activities and continuation despite substantial negative effects on key parts of life:

… are recognised as appropriate to other potential screen based addictive disorders such as disordered behaviour with social media, internet use, use of short video clip reels and phone use.[30]

1.65Most distressingly, AGASA has found evidence of a significant loss of grey matter in the brain for those with gaming or internet disorders,[31] who, as described above, have symptoms similar to those of people with addictive behaviours in their social media use.

1.66AGASA asserts that there are serious brain health concerns being observed with all social media users, even without drawing on evidence from gaming disorders with similar symptoms:

… a recent longitudinal brain imaging study (Maza et al., 2023) found that 'social media behaviours in early adolescence may be associated with changes in adolescents' neural development, specifically neural sensitivity to potential social feedback' and 'changes in the brain's sensitivity to social rewards and punishments'. Studies do find reductions in cognitive function specifically in populations with problematic social media use, demonstrating a decrement in brain function and a likely decrement to brain structure. For example, Muller et al (2024), using multiple measures, found a clear association between problematic social media use and reduced executive function, and Aydin et al (2020), who used a single measure, found evidence for reduced cognitive function in two subscales of their measure.[32]

1.67More distressingly, AGASA highlights that:

… the brain changes found by Maza et al were in normal 6th and 7th grade students, not those with problematic social media use, suggesting that social media use does not need to be problematic to have an impact on brain development.[33]

1.68AGASA pointed to a study that highlights causal effects between screen-based sedentary activities (TV viewing) and Alzheimer's disease:

This study, for the first time, investigated the causal association between TV viewing and Alzheimer's disease … and found bidirectional causal effects between TV viewing time and Alzheimer's disease.[34]

1.69AGASA also pointed to a study that highlights the emerging evidence of excessive screen exposure affecting brain development:

Converging evidence from biopsychosocial research in humans and animals demonstrates that chronic sensory stimulation (via excessive screen exposure) affects brain development increasing the risk of cognitive, emotional, and behavioural disorders in adolescents and young adults. Emerging evidence suggests that some of these effects are similar to those seen in adults with symptoms of mild cognitive impairment … in the early stages of dementia, including impaired concentration, orientation, acquisition of recent memories (anterograde amnesia), recall of past memories (retrograde amnesia), social functioning, and self-care. Excessive screen time is known to alter gray matter and white volumes in the brain, increase the risk of mental disorders, and impair acquisition of memories and learning which are known risk factors for dementia.[35]

1.70AGASA is concerned by predictions that:

… the rates of ADRD [Alzheimer's Disease and Related Dementias] will rise far above the projected estimates from the CDC of a 2-fold increase from 2014 levels[36] up to a 4- to 6-fold increase in 2060 and beyond.[37]

1.71Another study AGASA pointed to identified a relationship between screen time and an exacerbation of ADHD symptoms:

Results showed that increases in screen time in a given year were associated with an exacerbation of ADHD symptoms within that same year[38] … our findings indicated that social media use, television viewing and video gaming were shown to be directly associated with ADHD symptoms during adolescence.[39]

1.72Persuasive design draws upon the motivation and reward system humans use to ensure our basic survival, much like gambling does. Social media currently uses persuasive design for its recommender systems, which provide an infinite dopamine loop and never let users achieve a sense of satisfaction. AGASA highlights the link between addiction and persuasive design, detailing that persuasive design is:

… an important component of the development of social media and other screen products, and is consistently involved in the risk of disordered use. It uses psychology, neuroscience and AI user tracking to increase the degree of user engagement with the product and to customise the user experience in ways that increase the length of time spent consuming it. All these elements seek to keep people’s attention on screens at levels that can be problematic, addictive and/or disordered.[40]

1.73Persuasive design includes elements such as frequent pop-ups, banners and notifications; autoplay; the use of colour, movement and striking design to capture attention; frequent small rewards such as likes; reward schedules similar to those used in poker machines; and the monitoring of users for specific emotional vulnerabilities or waning attention in order to tailor advertising or screen activity.

1.74AGASA also asserts that former Facebook President Sean Parker stated that the:

… thought process that went into building these applications … was all about: 'How do we consume as much of your time and conscious attention as possible?'[41]

1.75The impact of persuasive design on children, older people, and other vulnerable cohorts is not yet fully understood.

1.76The other concerning element of social media use for many people—especially children—is screen time.

1.77Social media platforms in this inquiry did not acknowledge any harms associated with screen time, or the addictive nature of their products, with Meta stating:

We do not have a view on the appropriate amount of screentime based on the age of our users because such an assessment is inherently subjective based on the amount of free time a user has, when the user is using our apps, and how our apps are being used by the user.[42]

1.78TikTok also would not explicitly characterise screen time as either positive or negative:

While there's no collectively endorsed position on the 'right' amount of screen time or even the impact of screen time more broadly, we consulted the current academic research and experts from the Digital Wellness Lab at Boston Children's Hospital in choosing this [60 minute default for under the age of 18] limit.[43]

1.79Confusingly, after contesting some Committee Members' views that social media screen time can be problematic, and asserting that it is impossible to determine the appropriate amount of time to spend on an app, both TikTok and Meta highlighted how easy it is to set screen limits and the proactive prompts they have enabled in-app, suggesting some acknowledgement of the benefit of consumers attempting to regulate their use of the products.

1.80In its responses to questions taken on notice from Committee Members, TikTok omitted that these limits can be overridden and are not absolute. Meta did not explain whether the time limits set by parents or caregivers could be enforced, or whether an absolute time limit could be set to limit regular use.

1.81In its response to a question on notice, TikTok stated:

… for all users, irrespective of age, we make it easy to set screen time limits, so our community can select for themselves a screen time limit that they are comfortable with. We also provide in-app prompts when users have been on the app for a particular period of time, encouraging them to 'take a break'.[44]

1.82In its response to a question on notice, Meta highlighted that:

… we provide many ways for people to control their time and the content they see.[45]

1.83As well as contradicting the clinical evidence provided by AGASA, this contradicts the evidence we heard from numerous parents and organisations whose submissions highlighted the signs of soft addiction. The Heads Up Alliance, a grassroots group of parents across Australia who are concerned about their kids' use of social media, described its experience:

Experiments have been conducted in every family home in the country, and we know that social media is distracting our children, addicting them, depressing them, exhausting them, inducing anxiety in them, isolating them, crushing their self-esteem, serving them X rated content, facilitating predation, scandalising them, radicalising them, bullying them, sleep depriving them, sextorting them, preying on their specific vulnerabilities, driving them to self-harm and, in some cases, even to suicide.[46]

1.84Witness Ms Toni Hassan observed that:

… we've already pointed to the reality that they are eyeball merchants, that this is addictive by design.[47]

1.85The experience of not being fulfilled by social media, yet still spending hours on it a day, echoes across many submissions and witness hearings for the committee—in other words, as Dr Sandersan Onie from the Black Dog Institute suggests, the experience of:

… mindless scrolling that might take up many hours late into the night and reduce sleep.[48]

1.86In its 2022 National Youth Mental Health Survey, headspace found that 33 per cent of surveyed young people experienced problematic social media use.[49]

1.87Considerable evidence was heard by the committee on what specific features witnesses thought contributed to addictive patterns of use—from the perspective of a consumer.

1.88Orygen proposed that 'putting an end to "sticky features" like infinite scroll' would contribute to improving the safety and experience of young people on social media, which headspace agreed would reduce the addictive nature of social media, whether through user-control settings or total removal of the infinite scroll/auto-play functions.[50]

1.89headspace also called out infinite scrolling as an avenue to 'excessive use'—a term social media platforms did not want to address:

Social media is designed to capture and hold young people's attention, often leading to excessive usage through features like recommended content, push notifications, and infinite scrolling.[51]

1.90Dr Zac Seidler, Global Director of Policy at Movember, contended that:

There is no ability to change the system unless you change the profiteering.[52]

1.91In other words, the design features are marketed to users as a meaningful, individualised and engaging experience, but there are profit motives behind this that are neither regulated nor acknowledged in the current assessment of this industry.

1.92We are grateful that the report has strong wording in Recommendation 4 to reflect this, but Coalition Members encourage future governments to consider consumer rights, and the vulnerability of users, given that such addictive design is currently not only legal but profitable in the Australian economy.

1.93Coalition Members acknowledge there are established developmental impacts of persuasive design (including as used in recommender systems) and recreational screen use for all people, especially children, and that the longer-term impacts are not yet known.

1.94To that end, we recommend that the government require social media companies to provide easily accessible features for users to control or reset their recommender systems, including to reduce or remove persuasive design features such as infinite scrolling, and to have more proactive ability to customise their feeds. Features would favour the ability for a consumer to opt in or out of recommended content.

Recommendation 1

1.95That the Australian Government require social media companies to provide easily accessible features for users to control or reset their recommender systems, including to reduce or remove persuasive design features such as infinite scrolling, and to have more proactive ability to customise their feeds. Features would favour the ability for a consumer to opt in or out of recommended content.

The risk of radicalisation, division and foreign interference

1.96Coalition Members acknowledge that the Australian Security Intelligence Organisation (ASIO) has raised the terrorism threat level from possible to probable in 2024.[53]

1.97ASIO has highlighted that the face of terrorist attacks has changed:

The most likely terrorist attack involves an individual or small group, using rudimentary weapons such as a knife, improvised explosives or a gun … the acceleration of radicalisation[54]

1.98ASIO stated this was concerning:

Individuals are moving to violence with little or no warning, and little or no planning. Acts of violence can be almost spontaneous or purely reactive … Extremist ideologies, conspiracies and misinformation are flourishing in the online ecosystem, and young Australians are particularly vulnerable.[55]

1.99ASIO also raised concerns over the number of young people engaging in violent extremism.

In the recent cases, the oldest alleged perpetrator was 21, the youngest was 14.[56]

1.100ASIO also raised concerns that the forums in which, and the ways in which, this behaviour flourishes are characteristic of the current counter-terrorism landscape:

Finally, the diverse drivers of extremism. When we last raised the threat level, individuals were often being radicalised by sustained exposure to a particular extremist ideology, or by an authority figure. Now, individuals are being motivated by a diversity of grievances and personalised narratives.[57]

1.101Personalised narratives, an individual experience, expedited timelines, and the spike in young perpetrators, as well as overt references to the content being spread online, highlight to Coalition Members a clear link between social media platforms and national security risks.

1.102In response to a question on notice, the Department of Home Affairs further advised that:

Between 1 June 2024 and 31 August 2024, Home Affairs referred 1,656 instances of Terrorist or violent extremist content (TVEC) to digital platforms for removal consideration against their terms of service.[58]

1.103CyberCX told the inquiry that:

… agencies of the People's Republic of China (PRC) and their affiliates have long sought to use social media as a means to undertake covert information campaigns against foreign states and communities, both domestic and foreign.[59]

1.104The Australian Federal Police (AFP) stated that social media platforms are a hotbed for this material:

The AFP caseload reflects these trends in the online environment, with a concerning number of young people being radicalised online (including through self-radicalisation) and accessing and sharing propaganda and violent extremist material. In some cases, young people may distribute extremist content without an understanding of the repercussions of their actions. Many have no offline social connections with extremists, instead they are accessing and sharing propaganda and violent extremist material among likeminded individuals and groups online.[60]

1.105In August 2024, CyberCX uncovered a network of at least 5000 inauthentic accounts (bots) on X:

… which are almost certainly controlled in concert by an artificial intelligence large language model system based in China. This network, which we have dubbed the 'Green Cicada network', primarily engages with divisive US political issues and may plausibly be used to interfere in the upcoming presidential election. It has also amplified hot-button political issues in other democracies, including Australia. The Green Cicada network is one of the largest networks of inauthentic activity publicly exposed to date and may be the first significant China related information operation to use generative AI as a core element of its operations.[61]

1.106The network was designed to amplify and divide discourse on hot topics in US politics, and on other issues being debated across democracies, including Australia.

1.107In its 2023 Report on Antisemitism in Australia, the Executive Council of Australian Jewry said:

With the internet, vilification and other forms of racism are easily and widely propagated, often with few restraints, and often anonymously. Avowedly antisemitic organisations and individuals have been able to express and propagate their views on websites, social media platforms, video channels, and even via online mainstream media sites.[62]

1.108In 2023, the Senate Select Committee into Foreign Interference Through Social Media recommended a minimum set of transparency requirements, enforceable with fines. Coalition Members reiterate that recommendation and believe that it should be expanded to require proactive labelling of foreign government and foreign political party pages, not just state-affiliated media.

1.109On ABC's Insiders on 11 August 2024, ASIO Director-General Mike Burgess said, 'social media does make our job harder', and highlighted that social media and the internet were an 'incubator of violent extremism'. He cited 'the algorithms companies use to direct content' as one of the vectors of this.[63]

1.110Meta responded to a question citing ASIO's comments:

We do not allow organizations or individuals that proclaim a violent mission, or are engaging in violence, to have a presence on Facebook and Instagram. We do not allow content that praises, supports, or represents individuals or groups engaging in terrorist activity or organized hate.[64]

1.111Meta's response, as well as contradicting the evidence submitted by the AFP, clearly misunderstands many of the ways in which extremism, radicalisation, division and foreign interference are manifested on recommender systems. Recommender systems encourage increasingly outrageous and reactive content.

1.112As well as the actual presence of extremist, terrorist and radicalising content online, one issue not sufficiently addressed in this report is the role recommender system technology plays in proliferating this content and causing societal division. If people felt satiated quickly in their social media experience, they would not continue to spend extended time on a screen, and social media companies would not have a compelling profit model.

1.113Meta frames much of its language around user experience as being positive, tailored and meaningful. When pushed on how it defines a 'positive experience', Meta explained only that a positive experience is determined by a series of signals that describe someone's level of engagement:

[That] might include who created the post and how you previously interacted with them, whether the post is a photo, a video or a link, or how many of your friends liked the post. A person can also hide a post, which helps to minimise similar content from appearing in your Feed.[65]

1.114What seems more likely, from evidence the committee heard, is that a positive experience is defined by the interaction a user has had with a piece of content. Interaction does not just come from a sense of happiness. Negative emotional reactions increase screen time and engagement, which could, according to Meta's definition, be classified as a positive experience. As Dr Seidler stated:

… the algorithm will feed it to [the user] because negative content is naturally going to drive their attention. They are watching something that they find disgusting. Disgust is a far more complex overwhelming emotion than positivity, so they keep watching it.[66]

1.115This view is substantiated by the Australian Human Rights Commission, which suggested that:

… algorithms are incentivised to provide content which is meant to be more engaging for users. However, this material is often more extremist, sensationalist or plainly incorrect, with algorithms having 'learnt' that such content generates greater engagement. Algorithms appear to prioritise optimising user engagement and advertising revenue over the human rights and safety of users.[67]

1.116The eSafety Commissioner posited that recommender systems can reaffirm an identity or sense of belonging.[68]

1.117The attention economy that social media companies profit from creates an environment where users are guided and encouraged to spend more time online. To ensure people stay, the content that is served to them by their recommender systems is more aggressive, extreme, controversial and likely to incite outrage than content served to them at the beginning of a session on a platform.

1.118Reset.Tech Australia conducted an experiment for Ms Zoe McKenzie MP in 2024 on the amount of time it would take for a fictional 17-year-old boy's Instagram account to receive content on Andrew Tate, a commonly known extreme misogynist 'manosphere' figure who captivates a young, typically male audience. The fictional boy searched and liked 40 Jordan Peterson posts, and within two hours he was being recommended Andrew Tate content on Instagram Reels.[69]

1.119Dr Seidler highlighted that, in his time analysing and sitting in 'incel' forums as part of his role:

… there is a transgressive entertainment based attention focus that young men are seeking.[70]

1.120Dr Seidler also asserted that the volume and the way in which content is shared with users in a recommender system model means that users:

… can't necessarily discern what they like and what they don't.[71]

1.121This is substantiated by Meta's own description of its recommender system model. Meta also articulated that:

If many people have interacted in a positive way with a post on Instagram or with similar content, the post will appear higher in a person's feed. Alternatively, if those interactions were negative or if a piece of content is predicted to be objectionable based on our standards, guidelines or integrity policies, the content is removed or ranked lower in the feed.[72]

1.122The Department of Home Affairs stated that:

… while digital platforms may not intend their algorithms to cause harm, algorithmic curation of information nonetheless gives rise to or exacerbates risks to individuals and society, including … terrorist and violent extremist content.[73]

1.123As well as the nature of recommender systems, Reset.Tech Australia highlighted concern that the private data of social media users is being weaponised against them by foreign agents.[74]

1.124Recommendation 13 of the Senate Select Committee into Foreign Interference Through Social Media's inquiry called for the Australian Government to build capacity to counter social media interference campaigns by supporting independent research.[75] Coalition Members wish to reiterate this sentiment in the context of this report.

1.125CyberCX identified the need for collaboration with:

… independent researchers to develop better shared understanding of the evolving tactics, techniques and procedures used by malicious actors on their platforms.[76]

1.126The International Centre for Missing & Exploited Children Australia shared similar sentiments, stating that:

… collaboration is key to protecting children and creating a safer digital environment.[77]

1.127Finally, there is currently a limited landscape of digital literacy services that specifically target Australian kids and apply an Australian cultural lens to news locally and globally. Children and students need to be aware of how to consume media, and how to be respectful digital contributors in a democracy.

1.128As Mr Bryce Corbett of Squiz Kids identified:

We're talking about nothing less important than the future of Australian democracy here. Having a media literate generation, having a generation of children who are able to tell online fact from fiction, is going to ultimately protect and safeguard Australia's democracy.[78]

Recommendation 2

1.129That the Australian Government enforce minimum transparency and reporting requirements for social media platforms in relation to actual or suspected foreign interference or transnational crime, transnational commercial activities, and foreign news and media, with financial and legal penalties for noncompliance.

Recommendation 3

1.130Recommendation 25 of the House of Representatives Standing Committee on Education and Employment Report Study Buddy or Influencer: Inquiry into the use of Generative Artificial Intelligence in the Australian Educational System called for the Australian Government to establish a Centre of Digital Educational Excellence, modelled on existing co-operative research centres, which could act as a thought leader in relation to communication technology, media and digital literacy skills in children and students in Australia.

The use of technology to sexualise, exploit and harm children

1.131While there is reference in the report to child sexual abuse on social media platforms, and to the concerns raised by both committee members and witnesses about this material being so prevalent, Coalition Members wanted to provide further analysis to ensure the gravity of the evidence taken could be fully appreciated.

1.132When asked in a question on notice how many CSAM reports were made by Australian end-users, platforms failed to answer, redirecting to global statistics or pointing to their own efforts rather than providing the data requested.

1.133When asked in a question on notice about how many scam posts were reported by Australian end-users, Meta responded with:

We do not have the data to share in response to these questions with respect to an unspecified time frame.[79]

1.134In a public hearing on 28 June 2024, TikTok representative Ms Ella Woods-Joyce claimed of TikTok that:

… there is no pornography on our app.[80]

1.135On the same day, Ms Antigone Davis appeared on behalf of Meta, as its Vice President and Global Head of Safety. When asked about children's access to pornography on Meta's platforms, Ms Davis replied:

We don't have pornography on our site, so let me just correct that statement.[81]

1.136When asked whether social media is a safe experience for children, Snapchat replied:

Snapchat is intentionally designed very differently from traditional social media platforms. Snapchat is primarily used as a messaging service to talk with friends, not to find new friends or share ideas or information with large groups of people.[82]

1.137Yet, their app features 'Quick Add' friend requests which connect users with strangers; maps, stories and subscribe functions with public viewing and access; and algorithms and recommender systems which fuel both—not unlike other social media platforms.

1.138The opaque responses to committee questioning show the lack of good faith and transparency social media companies have demonstrated with respect to children's mental health, wellbeing and public safety.

1.139As Holt identified:

… the technology industry has been criticised for being less than enthusiastic about 'proactively policing' CSAM online.[83]

1.140In her evidence, the eSafety Commissioner aptly called out what she saw as 'a degree of wilful blindness' on the part of those companies.[84]

On the reporting of child sexual abuse and exploitation material

1.141Coalition Members recommend that the government introduce a proactive obligation on social media companies to provide to the relevant intelligence or law enforcement agencies any and all messages involving the actual or suspected trade in CSAM, CSEM, or non-consensual sexual images, regardless of whether end-to-end encryption is used.

1.142In its evidence, International Justice Mission (IJM) called for an obligation on:

… digital services and platforms who are not currently making reports of suspected and actual child sexual exploitation material to the National Center for Missing and Exploited Children to report these to the Australian Federal Police.[85]

1.143IJM also called for 'more appropriate penalties for failure to report content' than those which currently exist.[86]

1.144This is in line with the Joint Committee on Law Enforcement's Inquiry into Law Enforcement Capabilities in Relation to Child Exploitation, which in 2023 recommended that the eSafety Commissioner:

… apply pressure to technology companies to ensure that they apply robust procedures and technologies to detect child abuse material and report it to law enforcement, regardless of whether end to end encryption is used.[87]

1.145Coalition Members believe that social media companies and digital platforms must be required to report actual and suspected child abuse material to law enforcement, including on end-to-end encrypted services, and should be penalised if this requirement is not adhered to. Requiring social media companies and other digital providers to report actual or suspected sexual exploitation, and financial exploitation related to sexual offences, will further strengthen the ACIC, AUSTRAC, the AFP and law enforcement in their efforts to counter child exploitation online.

Recommendation 4

1.146That the Australian Government introduce a proactive obligation on social media companies to provide to the relevant intelligence or law enforcement agencies any and all messages involving the actual or suspected trade in child sexual abuse material (CSAM), child sexual exploitation material (CSEM), or non-consensual sexual images, regardless of whether end-to-end encryption is used.

Recommendation 5

1.147That the Australian Government introduce a proactive obligation on search engines, metasearch engines, web indexes and similar platforms, to provide to the eSafety Commissioner a detailed annual report as to how they are combatting the indexation, publication, promotion and dissemination of unlawful Class 1 and Class 2 material, with financial penalties for noncompliance.

On the use of third-party tools to avoid publishers' liability

1.148Creators and distributors are also employing third-party tools to promote or procure harmful content. Currently, children and vulnerable users are able to access Class 1 and Class 2 material, as well as otherwise inappropriate content and scam ads, through third-party link-in-bio tools. Link-in-bio providers offer hyperlinked landing pages for platforms which do not provide, or which limit the number of, external links available in headlines and profiles, most notably on the Instagram platform.

1.149Landing pages can include links to websites or other platforms, which may contain pornographic material, scams, or other kinds of harmful content.

1.150While admitting that they 'ingest vetted lists of external sites known for hosting CSAM and block access to those sites',[88] Meta, when explicitly asked about the continued use of third-party linking tools, failed to acknowledge that harmful material is only one click away for children and vulnerable users.

1.151Some of the more popular linked platforms, including the likes of OnlyFans, JustForFans, X/Twitter, and encrypted private messaging apps such as Telegram, are known for graphic, violent, sexual, and exploitative material.

1.152There is no meaningful distinction between a social media platform allowing users to publish harmful material and allowing users to publish links to harmful material. In both instances, social media companies ought to be held accountable for failing to protect children and vulnerable users from harm.

Recommendation 6

1.153That social media companies be held liable for failing to remove and report links, tools and users who, through ‘link-in-bio’ platforms, facilitate access to Class 1 or Class 2 material and scam ads.

Class 1 – likely to be refused classification under the National Classification Scheme (including content that depicts child sexual abuse and exploitation material, advocates terrorist acts, or promotes, incites, or instructs in matters of crime or violence); and

Class 2 – likely to be classified R 18+ or X 18+ under the National Classification Scheme (including adult pornography and high-impact material that may be inappropriate for children under 18 years old, such as material featuring violence, crime, suicide, death, and racist themes).[89]

On the changing attitudes and disproportionate risk for boys

1.154Men, especially young men, seem to be particularly vulnerable to some of the more insidious elements of current social media design. While young women commonly experience significant body image problems on social media, young men seem to be consistently more vulnerable in both their risk of radicalisation and their views of gender and respect as they mature.

1.155As boys increasingly integrate social media into their everyday lives, the need to equip them to counter the risks to their safety, and to combat the rise in harmful gender stereotypes, is becoming increasingly urgent. It also plays an important role in the national endeavour to reduce family, domestic and sexual violence.

1.156The disproportionate impact of social media algorithms on boys and young men extends to advertising as well. The ARC Centre of Excellence for Automated Decision-Making and Society also submitted:

… that young people, and especially young men aged 18-24, are targeted with unhealthy food, predominantly fast food, ads more than other demographic.[90]

1.157Women's Health Victoria highlighted the danger of both social media advertising and algorithms regarding violence against women and gender stereotypes, using Andrew Tate and The Real World platform as an example of the way that TikTok and YouTube recommender systems are further entrenching dangerous attitudes toward women.[91]

1.158Distressingly, the evidence is clear that boys are the most at risk from financial sexual exploitation and 'sextortion'.

1.159According to Thorn research, 90 per cent of victims of sexual exploitation detected in the National Center for Missing and Exploited Children’s reports were male, aged 14 to 17.[92]

1.160WeProtect Global Alliance research shows that boys between 13 and 17 are more likely to fall victim to financial sexual extortion due to their engagement in discussion forums, instant messaging, dating apps, and multi-player gaming platforms. They are also less likely to report.[93]

1.161Despite this, research shows that caregivers perceive that girls are at higher risk of harmful online sexual encounters.[94] Meanwhile, misogynistic micro-messaging, violence, and exposure to pornography on social media and digital platforms are fuelling 'sexist and unhealthy notions of sex and relationships'.[95]

1.162As the Australian Institute of Family Studies found, 'the content of pornography may reinforce double standards of an active male sexuality and passive female receptacle'.[96]

Recommendation 7

1.163That the Australian Government work with experts, youth representatives, and lived experience participants to develop and implement a strategy to improve the online safety and wellbeing of boys, acknowledging the disproportionate impact of sexual exploitation, radicalisation, and addiction on young males.

On the need for research and development to counter child exploitation

1.164There needs to be more research and development to counter child exploitation. As the IJM highlights, AI-powered recognition technology works even within encrypted platforms and could be:

… implemented across the board to detect and prevent the sharing, production, and creation of child sexual abuse material.[97]

1.165CyberCX posited that:

… platforms should also be investing in research and innovating to mobilise AI tools to identify relevant content at scale. Use of such tools will be most important when it comes to identifying explicit, abhorrent or otherwise illegal AI-generated material that needs to be removed.[98]

1.166The Attorney-General's Department identified that 'the deployment of safety technologies, and client-side scanning tools to support prevention efforts' is an option to provide greater protection for children against child sexual abuse.[99]

1.167While private companies are increasingly investing in technology solutions to identify and prevent child sexual abuse and exploitation material, this research is not necessarily publicly available.

1.168More work and greater resources must be directed toward cutting-edge research and development into pattern detection, open-source intelligence, network analysis, biometrics, machine learning, and the plethora of other technologies available to law enforcement.

1.169We are grateful that the committee has agreed to recommend an increase to the resources available to the eSafety Commissioner (Recommendation 12) but, given the increasingly important role played by the Australian Centre to Counter Child Exploitation in combatting this insidious crime, we believe that it should also be included.

1.170This reflects the Coalition's commitment to double the size of the Australian Centre to Counter Child Exploitation.

Recommendation 8

1.171That the Australian Government investigate options to resource and support the public research and development of technologies to combat child sexual exploitation and abuse material online.

Recommendation 9

1.172That the Australian Government increase the resources available to the eSafety Commissioner and the Australian Centre to Counter Child Exploitation in light of the significant increase in demand on their citizen-facing services and operational capabilities.

The lack of transparency in user data and targeted advertising

On transparency in the data used by platforms

1.173While Coalition Members recognise that Recommendation 3 addresses a clear need for effective mandatory data access, Coalition Members wish to set out in detail the type of data that should be accessed, and in what format.

1.174The government should develop a reporting framework that requires social media companies to provide regular transparency reports, in an effort to assure data transparency and a deeper understanding of consumer rights and data.

1.175The framework would require an articulation of:

the type of data social media companies collect on users, and for what purpose it is collected;

the data management systems social media companies use to store data on users; and

the type of data social media platforms use to generate personalised content on both recommender systems and advertising systems.

1.176Coalition Members are not satisfied with the level of granularity provided by social media companies. There should be mechanisms to revisit transparency requirements. Most importantly, in the articulation of transparency measures it should be noted that this does not mean the personally identifiable data of consumers.

1.177In the course of the inquiry, it was hard to ascertain what information Meta uses, for what purposes, and why.

1.178Recommender systems provide consumers with a personalised experience, but what is not clearly understood is what data feeds into the individualisation of that algorithm. Social media platforms have been, at best, ambiguous in their articulation of what informs an algorithmically driven user experience.

1.179Meta describes their recommender systems as follows:

Over the years there has been a growing amount of content shared online and so it has been harder for people to find all of the content they cared about. We use a range of different algorithms to help us rank content. The ones that people are often most familiar with are those that we use to rank content in their Feeds on Facebook and Instagram. Those algorithms that help with ranking play different roles.[100]

1.180The current regulatory environment governing social media algorithms provides little opportunity to understand the data and data management systems behind recommender and ad algorithms employed by social media companies. There are no reporting obligations on the data captured and fed to recommender and ad algorithm systems by social media platforms.

1.181Researchers, thinktanks and mental health organisations consistently called for more ability to get 'under the hood' of social media algorithms, recommender systems and data use.

1.182Ms Rosie Thomas, Director of Campaigns at CHOICE, stated that:

I think, ultimately, we need to make sure that our regulators have the powers they need to be able to look under the hood here a little bit more to shine a light on what might be happening.[101]

1.183Black Dog Institute, ReachOut and Beyond Blue all called for government to:

Require social media companies to provide regular transparency reports on the content served by their algorithms to Australian users. Additionally, provide real-time API access to public health researchers to monitor and analyse the impact of algorithms on mental health and wellbeing.[102]

1.184Dr Seidler stated that social media companies reject requests from people like himself to use platform data to understand how platforms work:

That is transparency: the ability to actually understand what is happening under the hood. That is literally all that we are asking for.[103]

1.185Dr Seidler highlighted the experiments organisations like his have to do to find information they need, which he states in some instances can take 'hundreds of hours to do'.[104]

1.186As well as expressing frustration at obtaining the data, Dr Seidler highlighted that he believes social media platforms do in fact have this data, but are choosing not to share it:

[They] have the API. They have the ability to offer that to us. They choose not to. Seemingly, understanding the harms is not of interest to them. That's really problematic to me.[105]

TikTok is detecting harm. Meta is detecting harm. They know because they are getting rid of certain types of content. They always say it's Chatham House rules whenever we go to a conference and TikTok or Meta show up. They say, 'Don't talk to anyone about this.'[106]

1.187Reset.Tech Australia stated that:

… industry is unwilling or unable to produce meaningful baseline data on how their systems are working. That's a transparency issue.[107]

1.188CyberCX said that the lack of data on how social media platforms work 'under the hood' was stopping them from uncovering the extent of a network of foreign bot accounts designed to flood X with divisive content.

Again, that lack of transparency, let alone accountability, makes it very difficult to understand to what extent social media content is driving particular narratives either to try and influence an election outcome or any other type of outcome or simply to undermine the broader faith that people have in institutions and democracy.[108]

1.189Noting the desperation expressed by mental health organisations, research thinktanks, and other witnesses with a vested interest in consumer protection and human rights, as well as Meta's poor articulation of the services it provides and of how personalised recommender systems and other user experience elements are developed, Coalition Members believe assuring transparency should be a significant priority for regulation and consumer protection policy relating to social media companies.

1.190Dr Onie (also a witness to this inquiry) highlighted in a Ted Talk he gave in 2022 that having oversight of the data that drives recommender systems and other algorithms, including advertising or recommended content, could even have positive mental health and public health impacts, such as reaching people at 'the source' – on a platform and in an experience:

Australians spend about 40 hours a week on the internet, and 93% of the time, it starts on a search page. Could the search page be an ideal place to intervene? What if we could use the same algorithms that tech companies use to tailor ads to identify and reach out to individuals thinking of suicide? We could use information like age, gender, language, and geography to do so in a deeply personal and engaging way.[109]

1.191Consumers of social media products should be assured, first, that their data will remain private and, further, that it will not be used against them. When their data is used to create a more personalised experience, consumers should understand what that data is, how it is being used, how it is being profited from, and what options exist to opt out of this customisation (as touched on in Recommendation 4 in the report).

On transparency and harm in targeted advertising

1.192Social media advertising has transformed the way Australian businesses attract and retain customers. Through AI-enabled advertising approval and targeted advertising systems, social media companies are able to target advertisements like never before.

1.193However, the evidence presented to this inquiry has shown that these systems 'can exacerbate risks such as eating disorders and contribute to negative mental health impacts'.[110]

1.194Worse still, some companies like Meta have allowed advertisers to buy access to the data of young people profiled as having harmful interests.

1.195We also know that big tech companies are hosting scam ads, and are, as a consequence, profiting from advertising expenditure.

1.196Dr Seidler contended that:

There is no ability to change the system unless you change the profiteering.[111]

1.197Reset.Tech Australia warned the committee that 'personalised and persistent scam calls, texts and advertisements linked to digital advertising business models' are causing significant economic harm to Australians.[112]

1.198But even lawful and legitimate industries which are restricted or regulated are impacting on the health and wellbeing of vulnerable Australians—including children.

1.199In its submission, Alcohol Change Australia quoted a Deakin University study which found that:

Children and young people are being bombarded with online marketing for junk food, alcohol, gambling and vapes.[113]

1.200In its submission to the Senate Economics References Committee inquiry into the influence of international digital platforms, the Australian Medical Association identified that:

The alcohol industry collects data through loyalty programs which can be matched with social media data, to generate models that link purchase patterns with time of day, week or month, mood and social events.[114]

1.201At the same time, Butterfly Foundation pointed out that problematic body image and eating disorder content is being driven by advertising, submitting that:

… weight loss programs (including smartphone applications which charge users for dieting regimes under the guise of psychology, such as Noom) have proliferated as part of a 'fitspo' culture and a 'wellness' industry.[115]

1.202Thanks to app-tracking and the use of user data in AI-driven targeted advertising:

… people who are interested in appearance-related content (including those searching for help) may be exposed to such content at a higher rate, thereby increasing their risk for eating disorders.[116]

1.203As the Australian and New Zealand Academy for Eating Disorders identified, social media algorithms:

… can reinforce challenges related to development of an eating disorder and treatment seeking and recovery.[117]

1.204Some platforms have taken steps to remediate eating disorder and body image issues in their advertising and recommender systems. Meta has advertising standards which are theoretically supposed to protect users from:

… content that attempts to generate negative self-perception in order to promote diet, weight loss or other health related products.[118]

1.205Pinterest prohibited all advertising with weight loss language and imagery in 2021, taking a big step toward the disruption of dangerous fad diet advertising.

1.206While embedded safety features, strict content moderation, and the Children's Online Privacy Code recommended by the committee are steps in the right direction, Australia must address the lack of transparency in social media advertising.

1.207When asked how much money Meta makes per year in Australia from scam ads, Meta failed to provide a clear answer.[119]

1.208When asked for Meta’s revenue and expenditure for advertising by Australian businesses, Meta replied, 'Meta does not, in the ordinary course of business, separately track revenue and expenditure figures by user country'.[120] When asked 'how much money has Meta made from alcohol and gambling advertising?', Meta replied: 'Meta does not, in the ordinary course of business, separately track revenue by advertisement type for financial reporting purposes'.[121]

1.209Meta would not say how much it had earned from scam ads, or even whether this figure had been quantified. Ms Garlick from Meta assured the committee that whatever it may incidentally earn from scam ads, it spends more on their prevention:

I think I can quite confidently say we're not profiting, given the amount of work that is involved in trying to stay a step ahead.[122]

1.210Social media companies are making money out of advertising from regulated and restricted industries. They therefore have a vested interest in using user data to exploit users for their clients. For these reasons, Coalition Committee Members recommend that government mandate detailed annual reporting to the eSafety Commissioner on revenue from regulated and restricted industries.

Recommendation 10

1.211That the Australian Government develop a reporting framework that requires social media companies to provide regular transparency reports, in an effort to assure data transparency and a deeper understanding of consumer rights and data.

1.212The framework would require an articulation of:

the type of data social media companies collect on users, and for what purpose it is collected;

the data management systems social media companies use to store data on users; and

the type of data social media platforms use to generate personalised content on both recommender systems and advertising systems.

Recommendation 11

1.213That the Australian Government direct social media companies to provide detailed reports to the eSafety Commissioner annually, declaring the amount of revenue received from companies in restricted and regulated industries including but not limited to alcohol, gambling, pornography, cigarettes, cannabis, weapons, pharmaceuticals, weight loss treatments, vaping, debt collection, and scams.

The issues in enforcing legal and taxation obligations

On issues of clarity and definition

1.214Coalition Members are concerned with the inconsistency and lack of clarity surrounding crucial definitions of social media under current legislation and the proposals of both the government and this committee.

1.215There is a current definition of social media at s13(1)(a)(i) of the Online Safety Act 2021.

1.216The Hon Michelle Rowland MP, Minister for Communications, highlighted in a speech on 13 November 2024 that:

… our legislation will include a new definition of social media that is broad and robust and is designed to capture more services under the term 'age-restricted social media' than what is currently covered under the existing Online Safety Act. Common social media services such as Facebook, Instagram, TikTok, and X will be subject to the minimum age limit. Messaging and gaming services will not be in scope of this definition.[123]

1.217This new model operates 'by exemption' – meaning there is an incentive for companies not to define themselves as social media:

Our legislation will also contain positive incentives as part of an exemption framework to encourage safe innovation, and also provide for access to social media type services that enable education or health support for young people. Social media platforms that can demonstrate they meet set criteria and do not employ harmful features, or provide positive benefits for children, may apply to the regulator for approval.[124]

1.218Whilst Coalition Members understand this logic and appreciate the concept of incentivising social media companies to innovate into safer versions of themselves, it should also be pointed out that to date, social media platforms have not demonstrated good faith or a meaningful duty of care when the onus on them is to self-regulate.

1.219Further, Coalition Members remain highly concerned about the government's proposed exemptions that some platforms may be able to secure from the government under the guise of either not falling within the definition of 'social media' or because it may have taken ‘reasonable steps’ to mitigate the harm to young Australians.

1.220Coalition Members remain steadfast in their view that platforms like Snapchat must not obtain an exemption on the basis that they are 'messaging services' rather than a social media platform. Likewise, platforms must not receive a ‘get out of jail free card’ simply because they may offer a more age appropriate platform that the government considers less risky to children.

1.221Social media companies have demonstrated time and time again that they cannot be trusted to act in the best interests of their users, or Australian children.

1.222The risk of this approach is that social media companies will phoenix and break apart into smaller offerings that remain unregulated.

1.223Dr Onie from the Black Dog Institute highlighted that social media is so broad and vague as a market offering that it is hard to establish which elements of social media are, and are not, the problem:

When we talk about social media platforms as a whole, it's like talking about medication without discussing the active ingredient. We talk about what is dangerous about social media. Is it the user interface? Is it the user experience? Is it the commenting function? Is it the scroll function? What are the different components within social media that make it dangerous? When we understand, we're able to break down these different components and that's when I feel we can really target and reduce what is harmful about it.[125]

1.224The lack of definitional clarity also poses serious questions in relation to the enforceability of age verification and restrictions, and of those obligations outlined in the committee’s recommendations and our additional comments. Since announcing its commitment to the Coalition’s age restriction policy, the government has been forced to give a running commentary as to which platforms and entities will be included in, and excluded from, its age restriction proposals.

1.225Having a dynamic discussion around what defines social media could mitigate the risk of platforms 'phoenixing' in creative ways to ensure they can exempt themselves from a definition.

1.226Coalition Members believe a regularly reviewed definition of social media – by function, by the inclusion of certain technologies, by user experience, by format, by component, or simply by platform entity – would support a dynamic policy response to a dynamic platform environment.

On a duty of care

1.227Coalition Members agree that the Australian Parliament should legislate to impose a duty of care on social media platforms and similar technology companies, for the benefit of their users.

1.228This would bring social media and big tech in line with consumer law expectations across the economy.

1.229The committee has heard compelling evidence on the requirements for a case for change.

1.230In the wording of Recommendation 2 in this report, the scope of a duty of care burden requires 'digital platforms implement diligent risk assessments and risk mitigation plans' but does not also include a need for a defined set of risks, a burden to implement reasonable steps to mitigate each identified risk, or a requirement for these risks and attempted mitigations to be reviewed by regulators with attached penalties.

1.231The wording at present could limit the efficacy of a duty, noting industry has provided excellent evidence that a requirement for these assessments and mitigation efforts to be provided to regulators for review would support more effective risk mitigation of online platforms.

1.232Reset.Tech Australia believes that with:

an overarching duty of care placing broad obligations on platforms to ensure user safety in systemic ways;

requirements for platforms to assess all their systems and elements for a defined set of risks as part of a risk assessment process;

a requirement for platforms to provide these risk assessments and mitigation efforts to regulators for review; and

the ability to 'issue penalties that match the scale of global profits of digital platforms',[126]

a duty of care model would be responsive, broad reaching, proactive and dynamic—acknowledging the quick evolution of technology and social media platforms today.

1.233When asked if social media companies owed their users a duty of care, witnesses were almost unanimous in their affirmation.

1.234Joint witnesses ReachOut, Beyond Blue and the Black Dog Institute submitted that they:

… believe social media companies owe a duty of care to their users because their products have both positive and negative impacts on users' mental health.

1.235Catholic School Parents WA considered that whether a duty of care was owed by platforms to users was not a matter of contention, stating that:

… social media companies are organisations doing business in Australia and as such owe a duty of care to their users, particularly to our children and young people.[127]

1.236When asked by the committee: 'Do you think you have a duty of care in respect of the potential risk of addiction?' Meta did not support a duty of care burden on social media companies:

The phrase 'duty of care' is vague and undefined such that we do not think agreeing to it would be helpful to users or the industry. Instead of a ‘duty of care’, we support clearly defined standards that would apply equally to all social media platforms.[128]

1.237Ms Davis told the committee:

… I don't think that social media has done harm to our children. I think that social media provides tremendous benefits.[129]

1.238Further, Meta stated that the:

… existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes.[130]

1.239This directly contradicts significant evidence heard by the committee, and further demonstrates that social media platforms are not at present taking seriously their obligations or their duty of care to Australians. As a consequence of their patent failure to do so, the Australian Government should step in and legislate that duty.

Recommendation 12

1.240That the Australian Parliament legislate to impose a statutory duty of care on social media platforms and similar technology companies for their users. An overarching duty of care should articulate requirements for platforms to implement diligent risk assessments and safety by design principles, as well as broad obligations on those platforms to take all appropriate actions to ensure that the identified risks are mitigated. The duty of care regulation would define risks clearly to inform risk assessments. As part of the duty of care, there should be a requirement for companies to report annually on identified risks and the actions taken to mitigate them, with an ability for regulators to review these reports, and penalties where a regulator believes not enough action has been taken. There should also be investigation of how the duty of care can be enforced.

On issues of jurisdiction and onshoring

1.241The committee heard compelling evidence that regulating social media platforms effectively is challenging when the entities do not have a comprehensive and easily accessible Australian presence that can provide top-to-toe resolution and redress for consumer and legal issues.

1.242Coalition Members are pleased to see that this is addressed in Recommendation 1, but note that calling out onshoring requirements for tax and legal obligations specifically, unlike the current recommendation wording, would better highlight the nature of and conditions for onshoring.

1.243Tattarang highlighted as a witness to this committee that:

… by structuring their businesses so that all relevant operations are managed and controlled by US based companies with no relevant entities based in Australia they can frustrate attempts at service, refuse to comply with codes of practice, refuse to comply with legislation, render voluntary their compliance with injunctions and other court orders and force litigants to go through a convoluted process to sue or get a court order enforced in the US.[131]

1.244At the moment, social media companies, due to their digital and global nature, are able to structure their legal and corporate operations in such a way that it becomes nearly impossible to ensure there are clear lines of risk escalation, redress and resolution, and accountability. While social media companies may have a presence in Australia, this can mean little more than that they 'file returns with ASIC and the ATO'.[132]

1.245The structure of Meta, for example, is so opaque that it is hard to determine which functions of the Meta Inc platform and corporation are 'in' Australia, as opposed to remote, and what consequences this has in circumstances where direct accountability is needed or consumer redress issues must be resolved.

1.246When committee members asked Meta who responds to complaints, it was unclear whether that team or responsibility was located in Australia or internationally—and what the practicalities and realities of this corporate structure mean for the user:

The community operations team are staffed in different centres of excellence so that we can follow the sun but also make sure that they're getting the support they need in terms of some of the content that they might see and also that they can be in these dedicated centres to be making consistent decisions. They're based in different parts of the globe. Some of them are based in Singapore, some of them are based in different parts of the US and some of them are based in Ireland.[133]

1.247While this is the increasing reality of globalised business structures and remote work, there is clearly a lag in appropriate regulation to ensure compliance, liability and consumer redress for Australian users in the Australian market. Tattarang highlighted that this type of regulatory enforcement gap is not seen in other parts of the economy:

In other sectors of the economy which have the potential to pose serious systemic or other risks we require foreign corporations to submit to the Australian jurisdiction through licensing regimes and other mechanisms. For example, foreign banks must have local subsidiaries that hold Australian banking licences because the banking system is so critical to the economy and society.[134]

1.248Mr Tony McDonald, Assistant Secretary, Competition and Consumer Branch, Department of the Treasury, highlighted this when he said:

… some of the challenges we have are applying the policy tools, if you like, for an analogue world, where you have infrastructure that is physically here and companies that are actually physically here and have key personnel here, to those that very much look like they're operating here, but, from a legal basis, that may not be quite as straightforward …[135]

and

Essentially, parliament can pass laws but, if we're not able to enforce them and validly impose sanctions, the question I'm grappling with is how we're able to effectively enforce the laws the parliament passes.[136]

1.249Mr Jeffrey Howard, Managing Director and Chief Executive Officer of Seven West Media, contended that:

Australia should not acquiesce to the demands of the digital platforms. They should be made to play by our rules. Other multinational industries are compelled to comply with all manner of laws and regulations when they want to trade here. It's time for the social media exemption to be addressed.[137]

1.250In its submission, Tattarang shared this view, calling for government to address the issue:

It is not acceptable that Australian litigants and regulators face jurisdictional impediments when seeking to enforce the rights and obligations that the Australian Parliament has considered appropriate for social media platforms servicing Australians on their Australian platforms.[138]

1.251Tattarang reiterated further that '[u]nless that fundamental jurisdictional issue is addressed first other measures will be ineffective'.[139]

1.252This evidence and these comments align with the recommendations of the Senate Select Committee on Foreign Interference through Social Media, which highlighted the need for an Australian presence for social media companies at paragraph 8.44 under Recommendation 1 of its Final report.

1.253The reality is that unless social media platforms have an Australian presence, including liability for Australian legal, regulatory and taxation purposes, no recommendation or legislation made by the Parliament will be meaningful, effective or enforceable.

The scope of the inquiry and the need for ongoing scrutiny

1.254While the scope of this inquiry was limited to social media and its impact on Australian society, many witnesses submitted evidence that the threats and opportunities posed by other online platforms, digital media, artificial intelligence and emerging technologies extend beyond what a single inquiry can address.

1.255For example, IJM highlighted that while many livestreaming and social media platforms are being used in the online sexual exploitation of children (OSEC):

… it is so important to note that OSEC is happening on platforms that most legislators wouldn't dream of - such as on school websites.[140]

1.256In its evidence, IJM also identified that dating apps and websites play a significant role in romance scams and child sexual exploitation, calling for those companies:

… and social media platforms to collaborate closely to stem romance scamming.[141]

1.257The Australian Institute of Criminology also highlighted research which strengthens the case for dating platform regulation. Their research found:

… that within a sample of almost 10,000 people living in Australia who had used a mobile dating app or website in the last five years, nearly three-quarters (72.3%) had been subjected to online sexual harassment, aggression or violence by someone they had connected with through an online dating platform during this period.[142]

1.258The IJM also pointed to peer-to-peer sharing platforms as the most frequented channels for the dissemination of child sexual exploitation material, affirming that:

… more general, everyday communication applications with end-to-end encryption, such as WhatsApp and Telegram, are also frequently used.[143]

1.259In its initial submission and its responses to questions on notice on 20 October 2024, the IJM further identified that Skype, Messenger and WhatsApp, among other platforms, were used by offenders to livestream child sexual abuse material, citing 2022 research from the Australian Institute of Criminology, a 2023 study from the University of Nottingham, and a 2024 scoping review published in the journal Trauma, Violence, & Abuse.

1.260Gaming platforms like Roblox and Fortnite are also being used by predators to harm and exploit children. As the Synod of Victoria and Tasmania of the Uniting Church in Australia submitted:

Online grooming has evolved particularly insidiously within social gaming environments.[144]

1.261Safe on Social, in their submission, said that platforms such as these:

… need to be scrutinised for grooming and child exploitation.[145]

1.262Even Google Maps has been used to publish age-inappropriate photos and videos, according to the Australian Parents Council.[146]

1.263We also have an opportunity to get ahead on the emerging augmented, extended and virtual reality technologies. WeProtect Global Alliance spoke about regulating the metaverse, highlighting that:

This is a tipping point. We are at a unique moment in time where there is potential to mitigate the risk of harm to children. While harms against children are not currently being reported on the same scale as other online harms across social media, gaming and live-streaming environments, as XR technology become more available, we have a window of opportunity to shape current and new technologies to ensure that they are safe for children from the start.[147]

1.264This issue extends beyond child safety to matters of public safety, too.

1.265In answer to a question on notice, the Department of Home Affairs also identified that:

Mainstream platforms moderate user-generated content to ensure it is legal and in compliance with that platform’s terms of service. Fringe and alt tech platforms do not. These platforms may market themselves as freedom of speech or anti-censorship alternatives to their mainstream counterparts. People with extremist ideologies can utilise mainstream platforms to attract people to join fringe and alt-tech platforms.[148]

1.266In its 2022 report, the House Select Committee on Social Media and Online Safety recommended that the government propose the appointment of a 'House Standing Committee on Internet, Online Safety and Technological Matters'.[149]

1.267Coalition Members believe that a Joint Standing Committee would enable wider input and emphasise the importance of the committee as a mechanism to respond to the dynamic nature of technology's role in the lives and livelihoods of Australians.

1.268For these reasons, Coalition Members recommend that Parliament establish a new Joint Standing Committee on Online Safety, Artificial Intelligence and Technology, tasking it with investigating the strengths and weaknesses in Australia's regulatory framework, legislative tools, industrial base and technological capabilities, in the same spirit as the UK Science, Innovation and Technology Committee and the US House Science, Space, and Technology Committee.

1.269As a matter of priority, and in light of the evidence provided to this inquiry, the new JSCOSAIT could consider inquiring into safety and wellbeing concerns in relation to dating apps, sites and platforms; into gaming apps, sites and platforms; into messaging and livestreaming platforms which do not meet the criteria for social media obligations; and into the opportunities and threats posed by the emerging metaverse, including augmented, extended and virtual reality technologies.

Recommendation 13

1.270That the Australian Parliament establish a Joint Standing Committee on Online Safety, Artificial Intelligence and Technology, tasked with investigating the strengths and weaknesses in Australia’s regulatory framework, legislative tools, industrial base and technological capabilities.

Senator the Hon Sarah Henderson

Member

Senator for Victoria

Ms Zoe McKenzie MP

Member

Member for Flinders

Senator Jacinta Nampijinpa Price

Member

Senator for the Northern Territory

Mr Andrew Wallace MP

Member

Member for Fisher

Footnotes

[1]Office of the eSafety Commissioner, Submission 1, p. 22.

[2]Collective Shout, Submission 163, p. 6.

[3]eSafeKids, Submission 112, p. 4.

[4]Mrs Julie Inman Grant, Commissioner, Office of the eSafety Commissioner, Proof Committee Hansard, 21 June 2024, p. 59.

[5]Alcohol Change Australia, Submission 78, p. 3.

[6]Reset.Tech Australia, Submission 16, p. 1.

[7]Heads Up Alliance, answers to written questions on notice from Ms Zoe McKenzie MP (received 14 October 2024).

[8]Mr Ali Halkic, Member, The Heads Up Alliance, Proof Committee Hansard, 30 September 2024, p.31.

[9]Mr Ali Halkic, Member, The Heads Up Alliance, Proof Committee Hansard, 30 September 2024, p.35.

[10]Mr Ali Halkic, Member, The Heads Up Alliance, Proof Committee Hansard, 30 September 2024, p.35.

[11]Mr Dany Elachi, Co-Founder, The Heads Up Alliance, Proof Committee Hansard, 30 September 2024, p.35.

[12]Mr Dany Elachi, Co-Founder, The Heads Up Alliance, Proof Committee Hansard, 30 September 2024, p.35.

[13]Heads Up Alliance, answers to written question on notice (received 2 October 2024).

[14]Ms Bridget Gannon, Acting First Assistant Secretary, Online Safety Branch, Department of Infrastructure, Transport, Regional Development, Communications and the Arts, Proof Committee Hansard, 2 July 2024, p. 3.

[15]eSafety Commissioner, Roadmap for Age Verification, August 2023, p. 23.

[16]Children and Media Australia, Submission 140, p. 4.

[17]Children and Media Australia, Submission 140, p. 3.

[18]Collective Shout, Submission 163, p. 6.

[19]Ms Frances Haugen, 4 October 2021, Opening Statement, US Senate Committee on Commerce, Science and Transportation, Subcommittee on Consumer Protection, Product Safety and Data Security.

[20]Australian Gaming and Screens Alliance (AGASA), Submission 59, p. 7.

[21]ReachOut, Beyond Blue, and Black Dog Institute, Submission 168, p. 9.

[22]AGASA, Submission 59, p. 1.

[23]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 28 June 2024 (received 26 July 2024).

[24]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 28 June 2024 (received 26 July 2024).

[25]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 28 June 2024 (received 26 July 2024).

[26]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 28 June 2024 (received 26 July 2024).

[27]AGASA, Submission 59, pp. 3–4.

[28]AGASA, Submission 59, p. 7.

[29]AGASA, Submission 59, p. 7.

[30]AGASA, Submission 59, p. 4.

[31]AGASA, Submission 59, p. 5.

[32]AGASA, Submission 59, p. 5.

[33]AGASA, Submission 59, p. 5.

[34]Shiqi Yuan, Wanyue Li, Yitong Ling, Xiaxuan Huang, Aozi Feng, Shanyuan Tan, Ningxia, Li Li, Shuna Li, Anding Xu and Jun Lyu, 'Associations of screen‑based sedentary activities with all cause dementia, Alzheimer’s disease, vascular dementia: a longitudinal study based on 462,524 participants from the UK', BMC Public Health, 2023, https://doi.org/10.1186/s12889-023-17050-3, p.9.

[35]Laurie A Manwell, Merelle Tadros, Tiana M Ciccarelli, Roelof Eikelboom, 'Digital dementia in the internet generation: excessive screen time during brain development will increase the risk of Alzheimer's disease and related dementias in adulthood', IMR Press, 2022, DOI: 10.31083/j.jin2101028.

[36]Laurie A Manwell, Merelle Tadros, Tiana M Ciccarelli, Roelof Eikelboom, 'Digital dementia in the internet generation: excessive screen time during brain development will increase the risk of Alzheimer's disease and related dementias in adulthood', IMR Press, 2022, DOI: 10.31083/j.jin2101028.

[37]Laurie A Manwell, Merelle Tadros, Tiana M Ciccarelli, Roelof Eikelboom, 'Digital dementia in the internet generation: excessive screen time during brain development will increase the risk of Alzheimer's disease and related dementias in adulthood', IMR Press, 2022, DOI: 10.31083/j.jin2101028.

[38]Jasmina Wallace, Elroy Boers, Julien Ouellet, Mohammad H. Afzali & Patricia Conrod, 'Screen time, impulsivity, neuropsychological functions and their relationship to growth in adolescent attention‑deficit/ hyperactivity disorder symptoms', Scientific Reports, (2023) 13:18108, https://doi.org/10.1038/s41598-023-44105-7, p. 1.

[39]Jasmina Wallace, Elroy Boers, Julien Ouellet, Mohammad H. Afzali & Patricia Conrod, 'Screen time, impulsivity, neuropsychological functions and their relationship to growth in adolescent attention‑deficit/ hyperactivity disorder symptoms', Scientific Reports, (2023) 13:18108, https://doi.org/10.1038/s41598-023-44105-7, p. 8.

[40]AGASA, Submission 59, p. 10.

[41]AGASA, Submission 59, p. 10.

[42]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 28 June 2024 (received 26 July 2024).

[43]TikTok Australia, answers to written questions on notice (received 31 July 2024).

[44]TikTok Australia, answers to written questions on notice (received 31 July 2024).

[45]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 28 June 2024 (received 26 July 2024).

[46]Mr Dany Elachi, Co-Founder, The Heads Up Alliance, Proof Committee Hansard, 30 September 2024, p.29.

[47]Ms Toni Hassan, private capacity, Proof Committee Hansard, 30 September 2024, p. 33.

[48]Dr Sandersan Onie, Research Fellow, Black Dog Institute, Proof Committee Hansard, 1 October 2024, p.6.

[49]headspace, answers to written questions on notice (received 15 October 2024).

[50]Orygen, answers to written questions on notice (received 15 October 2024).

[51]headspace, Submission 153, p. 4.

[52]Dr Zac Seidler, Global Director of Policy at Movember, Proof Committee Hansard, 1 October 2024 p.17.

[53]ASIO Director-General of Security, Mike Burgess AM National Terrorism Threat Level, 5 August 2024.

[54]ASIO Director-General of Security, Mike Burgess AM National Terrorism Threat Level, 5 August 2024.

[55]ASIO Director-General of Security, Mike Burgess AM National Terrorism Threat Level, 5 August 2024.

[56]ASIO Director-General of Security, Mike Burgess AM National Terrorism Threat Level, 5 August 2024.

[57]ASIO Director-General of Security, Mike Burgess AM National Terrorism Threat Level, 5 August 2024.

[58]Department of Home Affairs, answers to questions on notice, 1 October 2024 (received 24 October 2024).

[59]CyberCX, answers to written questions on notice (received 14 October 2024).

[60]Australian Federal Police, Submission 161, p. 2.

[61]Mr Jordan Newnham, Executive Director, Corporate Affairs, Brand and Policy, CyberCX, Proof Committee Hansard, 30 September 2024, p. 50.

[62]Executive Council of Australian Jewry (2023), Report on Antisemitism in Australia, p. 152.

[63]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 4 September 2024 (received 24 September 2024).

[64]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 4 September 2024 (received 24 September 2024).

[65]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 4 September 2024 (received 24 September 2024).

[66]Dr Zac Seidler, Global Director of Policy at Movember, Proof Committee Hansard, 1 October 2024 p.19.

[67]Australian Human Rights Commission (AHRC), Submission 79, p. 9.

[68]Office of the eSafety Commissioner, Submission 1, p. 16.

[69]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 28 June 2024 (received 26 July 2024).

[70]Dr Zac Seidler, Global Director of Policy at Movember, Proof Committee Hansard, 1 October 2024 p.19.

[71]Dr Zac Seidler, Global Director of Policy at Movember, Proof Committee Hansard, 1 October 2024 p.19.

[72]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 28 June 2024 (received 26 July 2024).

[73]Department of Home Affairs, Submission 41, p. 5.

[74]Reset.Tech Australia, Submission 16, p. 9.

[75]Senate Select Committee on Foreign Interference through Social Media, para 8.107.

[76]CyberCX, answers to written questions on notice (received 14 October 2024).

[77]International Centre for Missing & Exploited Children Australia (ICMEC), Submission 141, p. 4.

[78]Mr Bryce Corbett, Director, Newshounds by Squiz Kids, Proof Committee Hansard, 1 October 2024, p. 32.

[79]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 28 June 2024 (received 26 July 2024).

[80]Ms Ella Woods-Joyce, Director, Public Policy, TikTok Australia, Proof Committee Hansard, 28 June 2024, p. 29.

[81]Ms Antigone Davis, Vice President and Global Head of Safety, Meta, Proof Committee Hansard, 28 June 2024, p. 19.

[82]Snap Inc, answers to written questions on notice and questions taken on notice at a public hearing, 28 June 2024 (received 26 July 2024).

[83]Holt (2018), 'Regulating Cybercrime through Law Enforcement and Industry Mechanisms', The Annals of the American Academy of Political and Social Science, Volume 679, p. 157.

[84]Mrs Julie Inman Grant, eSafety Commissioner, Proof Committee Hansard, 28 June 2024, p. 61.

[85]International Justice Mission Australia, answers to written questions on notice (14 October 2024).

[86]International Justice Mission Australia, answers to written questions on notice (14 October 2024).

[87]Joint Committee on Law Enforcement, Inquiry into Law Enforcement Capabilities in Relation to Child Exploitation, para 7.97.

[88]Meta, answers to written questions on notice and questions taken on notice at a public hearing, 4 September 2024 (received 24 September 2024).

[89]Office of the eSafety Commissioner, Submission 1, p. 22.

[90]ARC Centre of Excellence for Automated Decision-Making and Society, Submission 120, p. 16.

[91]Women's Health Victoria, Submission 123, p. 7.

[92]Thorn & National Center for Missing and Exploited Children, Trends in Financial Sextortion, 2024, p.11.

[93]WeProtect Global Alliance, A Web of Deceit: Financial Extortion of Children and Young People, 2024, pp.3–4.

[94]Thorn, The Role of Caregivers: Safeguarding & Enhancing Youth Resilience Against Harmful Sexual Encounters Online, 2022, p. 62.

[95]Ms Toni Hassan, Submission 201, p. 11.

[96]Australian Institute of Family Studies, The effects of pornography on children and young people: An evidence scan, 2017, p. 18.

[97]International Justice Mission Australia, answers to written questions on notice (14 October 2024).

[98]CyberCX, answers to written questions on notice (received 14 October 2024).

[99]Attorney-General's Department, answer to a written question on notice (received 12 July 2024).

[100]Meta, Submission 46, p. 44.

[101]Ms Rosie Thomas, Director of Campaigns, CHOICE, Proof Committee Hansard, 10 July 2024, p. 51.

[102]Black Dog, ReachOut, and Beyond Blue, Submission 168, p. 23.

[103]Dr Zac Seidler, Global Director of Research, Movember, Proof Committee Hansard, 1 October 2024, p. 21.

[104]Dr Zac Seidler, Global Director of Research, Movember, Proof Committee Hansard, 1 October 2024, p. 21.

[105]Dr Zac Seidler, Global Director of Research, Movember, Proof Committee Hansard, 1 October 2024, p. 21.

[106]Dr Zac Seidler, Global Director of Research, Movember, Proof Committee Hansard, 1 October 2024, p. 21.

[107]Ms Alice Dawkins, Executive Director, Reset.Tech Australia, Proof Committee Hansard, 10 July 2024, p. 9.

[108]Mr Jordan Newnham, Executive Director, Corporate Affairs, Brand and Policy, Cyber CX, Proof Committee Hansard, 30 September 2024, p. 55.

[109]Dr Sandersan Onie Ted Talk, The link between big data and suicide prevention, 31 October 2022.

[110]Reset.Tech, Submission 16, p. 5.

[111]Dr Zac Seidler, Global Director of Research, Movember, Proof Committee Hansard, 1 October 2024, p. 17.

[112]Reset.Tech Australia, Submission 16, p. 1.

[113]Alcohol Change Australia, Submission 78, p. 3; Deakin University, #DigitalYouth: How children and young people are targeted with harmful product marketing online, 2023, p. 2.

[114]Senate Economics References Committee, inquiry into the influence of international digital platforms, Australian Medical Association, Submission, p. 4.

[115]Butterfly Foundation, Submission 49, p. 10.

[116]Butterfly Foundation, Submission 49, p. 11.

[117]Australian and New Zealand Academy for Eating Disorders, Submission 3, p. 1.

[118]Butterfly Foundation, Submission 49, p. 12.

[119]Meta, answers to written questions on notice, 28 June 2024 (received 26 July 2024).

[120]Meta, answers to written questions on notice, 28 June 2024 (received 26 July 2024).

[121]Meta, answers to questions on notice, 4 September 2024 (received 24 September 2024).

[122]Ms Mia Garlick, Regional Director, Policy, Australia, Japan, Korea, New Zealand and Pacific Islands, Meta, Proof Committee Hansard, 4 September 2024, p. 15.

[123]The Hon Michelle Rowland MP, Minister for Communications, Speech at The Sydney Institute - The governance of digital platforms, 13 November 2024.

[124]The Hon Michelle Rowland MP, Minister for Communications, Speech at The Sydney Institute - The governance of digital platforms, 13 November 2024.

[125]Dr Sandersan Onie, Research Fellow, Black Dog Institute, Proof Committee Hansard, 1 October 2024, p. 6.

[126]Reset.Tech Australia, Submission 16, p. 15.

[127]Catholic School Parents WA, answers to written questions on notice, 30 September 2024 (received 14 October 2024).

[128]Meta, answers to written questions on notice, 28 June 2024 (received 26 July 2024).

[129]Ms Antigone Davis, Vice President and Global Head, Safety, Meta, Proof Committee Hansard, 28 June 2024, p. 11.

[130]Meta, answers to written questions on notice, 28 June 2024 (received 26 July 2024).

[131]Mr Bruce Meagher, Head, Public Affairs, Tattarang, Proof Committee Hansard, 28 June 2024, p. 40.

[132]Ms Mia Garlick, Regional Director, Policy, Australia, Japan, Korea, New Zealand and Pacific Islands, Meta, Proof Committee Hansard, 28 June 2024, p. 7.

[133]Ms Mia Garlick, Regional Director, Policy, Australia, Japan, Korea, New Zealand and Pacific Islands, Meta, Proof Committee Hansard, 28 June 2024, p. 6.

[134]Mr Bruce Meagher, Head, Public Affairs, Tattarang, Proof Committee Hansard, 28 June 2024, p. 40.

[135]Mr Tony McDonald, Assistant Secretary, Competition and Consumer Branch, Department of the Treasury, Proof Committee Hansard, 25 June 2024, p. 3.

[136]Mr Tony McDonald, Assistant Secretary, Competition and Consumer Branch, Department of the Treasury, Proof Committee Hansard, 25 June 2024, p. 6.

[137]Mr Jeffrey Howard, Managing Director and Chief Executive Officer, Seven West Media, Proof Committee Hansard, 21 June 2024, p. 2.

[138]Tattarang, Submission 58, p. 2.

[139]Tattarang, Submission 58, p. 9.

[140]International Justice Mission, answers to written questions on notice (received 14 October 2024).

[141]International Justice Mission, answers to written questions on notice (received 14 October 2024).

[142]Wolbers, H, Boxall H, Long C & Gunnoo A (2022), Sexual harassment, aggression and violence victimisation among mobile dating app and website users in Australia: Research Report No. 2, Australian Institute of Criminology, p. 1.

[143]International Justice Mission, answers to written questions on notice (received 14 October 2024).

[144]Synod of Victoria and Tasmania of the Uniting Church in Australia, Submission 178, p. 24.

[145]Safe on Social, Submission 37, p. 1.

[146]Australian Parents Council, Submission 136, p. 3.

[147]WeProtect Global Alliance (2024), Beyond the Headset: Charting a course for safer experiences for children in extended reality environments, p. 9.

[148]Department of Home Affairs, answers to questions taken on notice, 1 October 2024 (received 24 October 2024).

[149]House of Representatives Select Committee on Social Media and Online Safety, Parliament of Australia, Inquiry into Social Media and Online Safety, p. xvii at para. 1.25.