Chapter 8 - Online safety


Overview

8.1 Many submissions raised concerns about children’s online safety and the role of digital platforms.

8.2 This chapter considers the potential risks arising from the collection of children’s data by digital platforms. It then examines evidence received by the committee, with a focus on two main themes: unethical and criminal behaviours online.

8.3The chapter will then examine the current digital platform regulatory framework supporting online safety for children, and potential ways to strengthen protections for children online, including international approaches.

Digital platforms, data, and children

8.4 Digital platforms harness high volumes of data from consumers while connecting them to unprecedented levels of information. Those consumers include children, young adults, and vulnerable people, who unknowingly generate data that is captured, processed, and used for undisclosed purposes. As UNICEF highlighted, the ‘digital ecosystem is so complex and seamless that neither children or their adult guardians are fully aware of how their data is being captured and used, nor what the potential benefits and risks are’.[1]

8.5The committee considered evidence on the risks to consumers more broadly from digital platforms’ data collection practices in Chapter 5: Data.

8.6 There is a strong connection between interaction with digital platforms and risks to vulnerable groups of people, in particular, children. Submissions from the Office of the eSafety Commissioner (eSafety) and UNICEF both highlighted that children increasingly rely on digital platforms to learn, develop cognitive skills, socialise, and build their identity.[2] Research indicates that 94 per cent of children in Australia are already online by the age of 4 years.[3]

8.7Ms Sarah Davies, Chief Executive Officer of the Alannah & Madeline Foundation, detailed the potential risks arising from children engaging with digital platforms:

Many digital platforms are designed to be highly engrossing and difficult to put down. They handle vast amounts of personal information about their users, and the users do not understand and certainly don’t have any say over what happens to their personal information. Now, there may be no intention to cause harm to children, but many of these platforms have turned out to be risky by design. Those risks include loss of control over personal information, contact with undesirable individuals, financial loss, exposure to age-inappropriate advertising and other content, and dysregulated tech use that then has flow-on effects for children’s mood, health, and overall well-being.[4]

8.8Child safety risks associated with data collection and algorithms include:

friend/follower suggestions that can pressure children to interact with strangers and recommend dangerous accounts;

encouraging ‘doomscrolling’[5], which can limit exposure to diverse content, and deliver increasingly problematic content;

encouraging dangerous viral challenges;[6]

promoting beauty stereotypes which may be unrealistic or harmful;

normalising the sexualisation of young people; and

recommending content that may be appropriate for adults but harmful to children who are not developmentally ready for it (e.g. violent or sexually explicit material).[7]

8.9 Many submitters raised concerns about the quantity of children’s data being collected by digital platforms.[8] Ms Alice Dawkins, Executive Director at Reset.Tech Australia (Reset.Tech), emphasised the volume of data collected for the purpose of marketing to children:

The best available estimate of the amount of data collected by advertisers about children and young people is that, by the time a child has turned 13, there are about 72 million data points on them … within that 72 million, that’s hundreds or, perhaps, thousands of companies who know things such as the precise geolocation of a child for the purpose of selling them products.[9]

8.10 Ms Dawkins added that children’s data is collected from multiple points of interaction with digital platforms, not only social interactions but also educational ones, and emphasised that regulation of digital platforms’ data collection practices is not currently effective (for an example, see Box 8.1).[10] Ms Davies echoed these sentiments during the public hearing, agreeing that there is no day-to-day regulation of children’s data protection, ‘and there needs to be.’[11]

Box 8.1 Case study: Children’s data collection

A prominent videoconferencing provider in Victoria was found to be collecting data from students, presumably during their school day, including unique identifiers such as their phone handset serial codes.

Crucially, the provider was engaging in something called ‘ID bridging’, which means it was identifying individual students and linking their data to enhance the data profile on them.

The provider collected precise location data of children, the time at their current location and their last known location. The provider collected contact information, including saved profile photos of contacts on children’s phones. The provider embedded three programs known as Software Development Kits (SDKs) that allowed third-party advertising technology (adtech) companies to access the students’ data.

The privacy policy was found to be deceptive and failed to declare the collection of these persistent identifiers, the practice of ID bridging, the call logs, the contact information, the use of embedded SDKs and the identity of third parties receiving user data.

There was a lack of ramifications for the deceptive practices.

Source: Ms Alice Dawkins, Executive Director, Reset.Tech Australia, Proof Committee Hansard, 26 July 2023, pp. 16–17.

8.11In its submission, UNICEF highlighted the primary concern around children’s data collection:

Children are more vulnerable than adults and less able to understand the long-term implications of consenting to their data collection. Given children’s greater cognitive, emotional, and physical vulnerabilities, privacy concerns that exist for adults are amplified for children.[12]

8.12The Australian Medical Association echoed this concern and added that:

Children’s location data in particular is extremely sensitive, presenting significant risk if this data is inappropriately disclosed.[13]

Unethical and criminal online behaviours

8.13 Child safety in the digital age was a high priority for a number of submitters to the inquiry. Concerns ranged from harmful products being marketed to children and young people through targeted advertising, through to online child sexual exploitation and abuse.

8.14Further, the Attorney-General’s Department emphasised that ‘while these digital platforms provide critical educational and social connections for young people, they have also enabled existing and new forms of online harms, the most serious being child sexual exploitation and abuse’.[14]

Harmful product marketing

8.15As discussed in Chapter 5: Data, profiling refers to the platforms’ practice of building a ‘profile’ of a person’s personal attributes and interests through tracking their behaviour over time, which can then be used for targeted advertising or to manipulate or discriminate against individuals.[15]

8.16 The Attorney-General’s Department highlighted that the risks of targeting are especially ‘acute for children due to their particular susceptibility and developing cognitive abilities’.[16]

8.17 Harmful product marketing was highlighted by submitters as an online safety concern for children. Evidence suggested that harmful product marketing targets children on digital platforms, undermining children’s health and wellbeing. Submitters raised specific concerns about aggressive alcohol advertising, gambling, the promotion of unhealthy food choices, and the targeting of children with dieting products, which may increase the risk of developing body image issues.[17]

8.18Alcohol Change Australia’s submission identified concerning behaviour by large social media digital platforms:

Meta has been found to have flagged children as being ‘interested’ in harmful products, including alcohol. It has also been found to use personal data collected to create profiles of young people with harmful or risky interests, including 13–17-year-olds interested in alcohol, smoking, and gambling. Even worse, Meta allowed advertisers to buy access to the young people profiled as having harmful interests.[18]

8.19 The Alcohol and Drug Foundation echoed these concerns, highlighting that the unethical practices of online tracking, profiling and data collection enabled by digital platforms have facilitated harmful marketing of alcohol. For example:

During the COVID-19 pandemic, the alcohol industry used digital platforms to aggressively promote rapid delivery services and drinking at home, exacerbating vulnerabilities already caused by the pandemic.[19]

8.20 The committee was warned that alcohol and unhealthy diets are risk factors for cancer, and that increased exposure to harmful marketing is contributing to alcohol use among young people.[20]

8.21 Submitters also highlighted concerns about the marketing of dieting products and the potential effects this can have on some people and communities, particularly young people and those who have experienced or are at risk of eating disorders.[21]

Child sexual exploitation

8.22 The committee also heard concerns around the use of platforms to further illegal activities. Online child sexual exploitation and abuse occurring on digital platforms has continued to escalate year on year, both in the volume of reports and in the severity of offences.[22] The Attorney-General’s Department emphasised the increasing volume of online child sexual abuse material (CSAM), noting:

The United States based National Centre for Missing and Exploited Children (NCMEC) reported 29.3 million reports of apparent child sexual abuse material made to their CyberTipline in 2021, up from 21.7 million in 2020 …

The UK-based Internet Watch Foundation reported that, ‘Imagery of primary school aged children being coached to perform sexual acts online has soared by more than 1,000 percent since the UK went into lockdown during the pandemic.’[23]

Current Australian regulations and frameworks

8.23The current regulation of online safety is spread across several different government agencies as outlined in Table 8.1.

Table 8.1 Regulatory functions of government agencies

Agency | Legislation | Function
Australian Competition and Consumer Commission | Competition and Consumer Act 2010 | Competition and consumer issues
Office of the Australian Information Commissioner | Privacy Act 1988 | Data and privacy issues
Australian Communications and Media Authority | Broadcasting Services Act 1992 | Industry code relating to targeting of a person with disinformation and misinformation
Attorney-General’s Department | Criminal Code Act 1995; Telecommunications (Interception and Access) Act 1979; Surveillance Devices Act 2004; Privacy Act 1988 | Commonwealth criminal justice and law enforcement frameworks across multiple areas
eSafety Commissioner | Online Safety Act 2021 | Regulatory function for online safety

Source: eSafety Commissioner, Submission 2, p. 1; Attorney-General’s Department, Submission 51, p. 4.

eSafety Commissioner

8.24The primary agency focused on online safety is eSafety. The Online Safety Act 2021 (OSA), eSafety’s enabling legislation, provides the regulatory functions for online safety, including administering complaints and investigations schemes for four types of online harms:

cyberbullying of children;

cyber abuse of adults;

the non-consensual sharing of intimate images; and

illegal or restricted online content.

8.25eSafety also holds powers to regulate digital platforms’ broader systems and processes.[24]

Mandatory industry codes

8.26 The OSA requires certain online industry sectors to develop mandatory industry codes to deal with class 1[25] and class 2[26] illegal and restricted content online.[27] These include providers of social media, email, messaging, gaming, dating, search engine and app distribution services, as well as internet and hosting service providers, manufacturers and suppliers of equipment used to access online services, and those that install and maintain the equipment.[28]

8.27 The codes set out measures to ensure industry participants take reasonable and proactive steps to prevent access to, exposure to, distribution of, and online storage of class 1A material.[29]

8.28Other matters the codes address include:

measures to facilitate consultation and cooperation with other industry participants around removal and disruption and restriction of class 1A and 1B material and associated accounts.

ensuring communication and cooperation with eSafety with respect to the relevant material, including complaints.

tools and information to help people avoid exposure to these materials.

clear, accessible and effective reporting mechanisms and complaints mechanisms around handling of reports.

mechanisms to respond effectively to reports and complaints about report handling.

publication of an annual report on compliance with the codes.[30]

8.29Following assessment by eSafety, the codes will be registered if they meet statutory requirements, or eSafety may determine an industry standard if requirements are not met by the proposed code.[31] Once registered, compliance with the codes is mandatory and enforceable.[32] eSafety will have powers to investigate breaches and direct platforms to comply, with civil penalties, enforceable undertakings and injunctions available to ensure compliance.[33]

8.30eSafety has so far registered six industry codes to deal with class 1 content for:

social media services

app distribution services

hosting services

internet carriage services

equipment

internet search engine services.

8.31 The obligations contained in these codes will come into effect on 16 December 2023,[34] except for the Internet Search Engine Services code, which will come into effect on 12 March 2024.[35]

8.32 eSafety is drafting industry standards for Electronic Services and Designated Internet Services in consultation with industry and the public, as the proposed codes did ‘not provide appropriate community safeguards for users in Australia’.[36]

8.33A second phase of industry codes will follow, focusing on class 2 material and measures to prevent children accessing high-impact age-inappropriate content that can be harmful.[37]

Regulated reporting requirements: Basic Online Safety Expectations

8.34 The OSA also introduced the Basic Online Safety Expectations (BOSE), which outline the Australian Government’s expectations that social media, messaging and gaming service providers, and other apps and websites, will take reasonable steps to keep Australians safe.[38]

8.35 eSafety can establish mandatory reporting requirements for online service providers to report how they are meeting the BOSE, such as by protecting children from age-inappropriate content or proactively minimising unlawful material or activities. The reporting obligation is enforceable, backed by civil penalties and other mechanisms (see Box 8.2 for an example).[39]

8.36Since the OSA commenced, eSafety has issued 12 BOSE reporting notices focussed on the steps being taken by platforms to prevent, detect and remove child sexual exploitation and abuse on their services.[40] Statements summarising these responses can be found on eSafety’s website.[41]

8.37eSafety’s world-first report, Basic Online Safety Expectations (BOSE): Summary of industry responses to the first mandatory notices, found that some of the world’s biggest technology companies needed to do more to tackle child sexual exploitation and abuse material.[42]

Box 8.2 Findings of non-compliance: Google and Twitter (X)

eSafety found that Google and Twitter (now X Corp.) failed to comply with non-periodic notices given on 22 February 2023, contravening paragraph 56(2)(b) and section 57 of the Online Safety Act 2021.

Google

Google failed to adequately answer several questions in response to the notice, in some cases providing generic information where specific information was sought.

Google was issued a formal warning, notifying it of its failure to comply, and warning against non-compliance in the future.

Twitter (X)

eSafety considered Twitter’s failure to comply more serious. Twitter failed to provide any response to some questions, and provided some responses that were incomplete and/or inaccurate.

eSafety issued a service provider notification to Twitter, confirming its non-compliance, and an infringement notice for $610 500.

Source: eSafety, Responses to transparency notices, www.esafety.gov.au/industry/basic-online-safety-expectations/responses-to-transparency-notices (accessed 6 July 2023).

Safety by Design

8.38eSafety also promotes online safety measures through its Safety by Design initiative. Safety by Design encourages industry to anticipate potential harms and implement risk-mitigating and transparency measures throughout the design, development and deployment of a product or service.

8.39The initiative promotes online safety through three guiding principles:

Service provider responsibility – that platforms are responsible for the safety of users.

User empowerment and autonomy – that users should be empowered with safety tools and provided with autonomy.

Transparency and accountability – that platforms should be transparent and held accountable for their actions.[43]

The Attorney-General’s Department

8.40 The Attorney-General’s Department is responsible for the Commonwealth’s criminal justice and law enforcement frameworks in relation to child sexual exploitation and abuse. Its focus includes combating CSAM. It operates under powers conferred by the following Acts:

Criminal Code Act 1995;

Telecommunications (Interception and Access) Act 1979;

Surveillance Devices Act 2004; and

Privacy Act 1988 (Privacy Act).[44]

8.41The Attorney-General’s Department also works with domestic and international agencies to combat CSAM. Examples of recent collaborative work include:

The Australian Government’s signing of the agreement on the US Clarifying Lawful Overseas Use of Data Act in December 2021.[45]

The Australian Federal Police and AUSTRAC’s partnership on ‘Operation Huntsman’ to combat sexual extortion in Australia.[46]

The Attorney-General’s Department’s work with digital platforms such as Google to obtain information for CSAM investigations.[47]

Age Verification Roadmap

8.42In March 2023, eSafety submitted a roadmap on age verification to the Australian Government for consideration. It included complementary measures to prevent and mitigate harm to children from online pornography.

8.43The roadmap makes a number of recommendations for Government, reflecting the multifaceted response needed to address the harms associated with Australian children accessing pornography.[48]

Measures implemented by Big Tech

8.44Many large digital platforms highlighted their commitments to promoting online child safety.

8.45 Meta submitted that it recognises its responsibility to young people and invests heavily in their safety. Meta applies many default protections for young people, including restricting advertisers to targeting young people by age and location only. Ms Mia Garlick, Regional Director of Policy, Meta, stated that:

Since 2016 we've invested over US$16 billion. Really it's in our commercial interest to invest in safety and security, because people will only continue to use our services if they feel welcome and safe.[49]

8.46 Mr Kyle Andeer, Vice President, Products and Regulatory Law, Apple Inc., stated Apple provides a range of tools for parents to help kids safely use Apple devices. These include parental controls that:

… empower parents to decide exactly which apps can be downloaded on their kids' devices. It allows parents to block or limit specific content, features or websites, including explicit content. It allows parents to block apps from accessing their children's most personal information, including their contacts, photos and even their location. Parents can help protect their kids from what they see and send by setting up communication safety on their kids' devices. This feature uses privacy friendly on-device machine learning to analyse photos and videos. If your kid receives or attempts to send photos or videos that might contain nudity, we warn them, give them an opportunity to stay safe and offer to connect them to a trusted adult.[50]

8.47Microsoft submitted it invests in child safety across four pillars:

Platform architecture – Microsoft recognises that the design of platforms impacts safety and is working to embed safety by design principles across its consumer services.

Content moderation – Microsoft publishes and enforces clear policies, such as its Code of Conduct Agreement which states users should not engage in any activity that exploits, harms, or threatens to harm children.

Culture – Microsoft believes in creating safe and inclusive communities, which includes providing users and families with tools and information to support their engagement, providing choices about the content and users with whom they interact, and raising awareness of online safety risks.

Collaboration – Microsoft states a collaborative approach between regulators, industry and civil society is critical.[51]

8.48 Google stated it is an industry leader in fighting child sexual abuse, including using proprietary technology to deter, detect and report offences, and to remove material from its platforms. It also partners with NGOs and industry on programs to share technical expertise, develop and share tools to help organisations fight CSAM, and works with law enforcement.[52] Further, Google stated it is working to implement the new codes under the Online Safety Act which, alongside the BOSE, provide an effective protection framework.[53]

Concerns with the current regulatory framework

Fragmentation

8.49As highlighted above, the committee notes that there are a number of government bodies with a function in children’s online safety.

8.50In its submission, the Office of the Australian Information Commissioner outlines that ‘the different harms that can arise in the online environment have resulted in intersections between regulatory spheres in regulating digital platforms, which highlights the importance of regulatory cooperation and coordination’.[54]

Regulatory gaps

8.51Several submissions highlighted that fragmentation of regulations has resulted in regulatory gaps.

8.52 Reset Australia commented that issues surrounding children’s rights, especially privacy, data, and engagement with harmful communities, are currently overlooked:

Our framework focuses on a narrower understanding of online safety that does not adequately reflect the full scope of the risks children and young people face online. That is, when Australian children and young people engage with the digital world, many of the risks they encounter currently sit outside our regulatory system.[55]

8.53 The committee was advised, for example, that the current legislative framework has limited capacity to comprehensively address harmful digital marketing practices.[56]

8.54eSafety’s remit under the OSA addresses criminal and antisocial online behaviours and seriously illegal content. The committee was advised that there has been less Australian Government investment in addressing online safety aspects that arise from the design of digital platforms, such as contact risks.[57]

Effectiveness of BOSE

8.55BOSE provides eSafety with the power to compel online service providers to produce compliance reports, thus enhancing transparency of the actions digital platforms are taking to ensure user safety. BOSE also encourages platforms to maintain a focus on safety concerns.[58]

8.56 However, while eSafety’s first BOSE report indicated technology companies needed to do more to tackle child sexual exploitation and abuse material,[59] eSafety has no powers to compel changes to the way platforms operate.

8.57The Australian Research Alliance for Children and Youth noted that ‘the inability to enforce the BOSE is insufficient in preventing online harm from occurring … any impetus derived from reputational damage caused by the disclosure [is] inconsequential because of the lack of market pressures to conform.’[60]

Ineffective self-regulation

8.58Submitters commented that voluntary or co-regulatory approaches are insufficient for appropriately regulating technology companies.[61]

8.59The Alcohol and Drug Foundation stated:

The existing regulatory system, relying largely on voluntary, industry-managed codes and practices, has proven to be ineffective in protecting the community from the negative impact of unhealthy marketing. The industry’s clear conflict of interest means that the industry-led processes will never restrict alcohol marketing in a genuinely effective manner. Existing codes do not adequately restrict alcohol companies from marketing on digital platforms that are heavily used by children, and there are concerning examples of alcohol advertising directed to children online.[62]

8.60 Dr Jessie Mitchell, Advocacy Manager, Alannah & Madeline Foundation, stated that, while participating in consultation to develop the industry codes under the Online Safety Act, the foundation found ‘there was not a clear commitment to pursuing common global standards of best practice in upholding children’s safety’.[63]

8.61 Reset Australia also argued that the weakness of co-regulation is demonstrated by Australia’s limited protections compared with the children’s codes in the United Kingdom (UK), Ireland and California. Reset Australia recommended not registering co-regulatory codes in future, and progressively replacing self- or co-regulatory codes with mandatory regulations.[64]

8.62The Human Rights Law Centre commented:

Co-regulation is inappropriate for such a powerful and high-risk sector, in which business models frequently come into conflict with community needs and the public interest. In the European Union, introduction of the Digital Services Act was driven by growing recognition that self- and co-regulatory models are inadequate and ineffective.[65]

Globally inconsistent controls

8.63 The committee notes online safety is a global concern. Australia does not appear to have aligned safety standards for Australian children with the rights of children overseas or with emerging best practice. The committee heard:

… this is a transnational world where children and young people are living digitally across all of these platforms globally.[66]

8.64 Several submitters argued that various codes in force in international jurisdictions, such as the UK, California and Ireland, provide higher protections for children than in Australia (see Box 8.3 for more information on two of these codes).[67]

8.65The Alannah & Madeline Foundation commented that Big Tech is currently taking a market-by-market approach and applying different standards in different jurisdictions.[68] Ms Davies provided an example:

… there are different standards for the privacy default setting in Australia, which is 16, and in other jurisdictions which are regulated independently, where the default setting is 18. There are different standards for collecting geolocation data.[69]

8.66Dr Mitchell added:

Something we experienced recently was the opportunity to take part in the consultation to inform the development of industry codes under Australia’s Online Safety Act. While we value the opportunity to take part, we did come away with the belief that there was not a clear commitment to pursuing common global standards of best practice in upholding children’s safety.[70]

8.67Ms Davies elaborated that this approach is ‘disingenuous’ and allows platforms to act based on what they think they will get away with rather than employing high safety settings by default.[71]

8.68eSafety and UNICEF also emphasised that existing international regulatory instruments, such as the UK Age Appropriate Design Code, recognise the importance of services being able to identify which of their users are children and young people so they can create safe, private and appropriate online experiences for them.[72]

The Age Appropriate Design Code (UK)

8.69 Dr Mitchell compared the approaches of Australia and the UK to online safety:

When we contrast what the new industry codes will require for digital platforms in Australia compared to what’s expected of them in the UK under their children’s code, we can see that there are some lower standards being accepted in Australia than would be accepted in the UK.[73]

8.70Dr Mitchell highlighted the important considerations that have been included in the UK’s Age Appropriate Design Code (Children’s Code):

Under the UK children’s code, services that are likely to be accessed by someone under 18 have to have fairly comprehensive, high privacy settings by default, whereas in the Australian situation both the definition of ‘children’ and the definition of ‘privacy’ are narrower. Children became defined as ‘under 16’ and the high privacy settings focus on preventing contact with strangers but not necessarily stopping inappropriate handling of children’s data—like in geolocation data, for example.[74]

8.71The committee heard about the positive changes resulting from introduction of the UK Children’s Code:

Following its introduction, we did see a number of positive changes being made by digital platforms that were operating there. As a few examples, you had Instagram introducing prompts to encourage children to take a break from scrolling, Google making SafeSearch their default browsing mode for children and YouTube autoplay being turned off. While we can't directly attribute that to operating in a country that had a more rigorous code, the timing is suggestive.[75]

Box 8.3 International approaches to children’s codes

The Age Appropriate Design Code (UK)

The UK Children’s Code came into force on 2 September 2020. It is enforced by the UK Information Commissioner. The Code is derived from the principles in the UK Data Protection Act and European Union General Data Protection Regulation, including data minimisation, purpose limitation, and data protection by design and default.

The code mandates that all online services ‘likely to be accessed by children’ provide data security for children such as by:

providing a high level of privacy by design and default;

explaining the nature of the service in child-friendly language;

requiring data use to be in the child’s best interests; and

not using children’s data to recommend harmful material.[76]

California Age Appropriate Design Code Act (US)

This Act, passed on 15 September 2022, aims to keep children safe online by requiring companies to consider the privacy and protection of children in the design of any digital product or service that children are likely to access.[77] It also restricts data collection and profiling of children, requires high privacy settings for children by default, and prohibits the use of nudge techniques to encourage children to weaken their privacy protections.[78] On 19 September 2023, the US District Court for the Northern District of California, San Jose Division, granted an injunction prohibiting the California Attorney General from enforcing the Act until ordered otherwise by the court, on the basis that it likely violates the First Amendment of the US Constitution.[79]

Strengthening Australia’s framework

8.72A range of measures were proposed to the committee in response to the current challenges of regulating online safety.

Current processes

8.73 The Tech Council of Australia supported a wait-and-see approach, allowing time to observe and evaluate the impact of the recent reforms.[80]

8.74The current online safety framework will be assessed through a legislated independent review of the operations of the OSA, to commence by January 2025.

8.75 The Department of Infrastructure, Transport, Regional Development, Communications and the Arts advised that this timeframe allows for a proper assessment of the operations and weaknesses of the OSA, as well as consideration of emerging international approaches, and that the review is the appropriate mechanism for considering amendments.[81]

8.76Many submitters were in favour of implementing additional regulatory measures to protect children without delay.[82]

Support for the Privacy Act Review

8.77As discussed in Chapter 5: Data, many submitters raised broad support for the proposed amendments under the Privacy Act Review.[83]

8.78 The Privacy Act Review report puts forward proposals to strengthen protections for children, such as encouraging entities to consider ‘whether the collection, use or disclosure of personal information is in the best interests of the child’. It also proposes prohibiting direct marketing to children, targeting of children, and trading in children’s data.[84]

8.79 eSafety highlighted that, as part of the Attorney-General’s Department’s review of the Privacy Act 1988, it proposed the creation of an Australian Children’s Online Privacy Code, to be modelled on the UK Age Appropriate Design Code and developed in consultation with eSafety.[85]

8.80 Reset.Tech supported the proposals, describing them as ‘sound. They’re justified, and they should be heartily supported by anyone with an interest in these issues’.[86]

Education

8.81UNICEF recommended any emerging regulations should be accompanied by greater education efforts:

The emerging standards for age assurance are embracing the full range of options on offer. These need to be coupled with education to understand the value of privacy-preserving age assurance options and the role they can play in improving online safety for children, along with a broad multi-faceted approach to protecting children online beyond age assurance.[87]

8.82 The Tech Council of Australia suggested Kindergarten to Year 12 education should include e-safety courses on topics such as cyberbullying, online privacy, digital footprints, online scams and safe online practices.[88]

A strong regulator

8.83The committee heard that any regulation should be overseen by an independent public regulator[89] and be adequately resourced so it can be ‘sufficiently muscular’.[90]

8.84 Reset Australia submitted that the risks posed by the sector require strong enforcement because they are:

High impact, and include significant public health and community safety concerns

Significant to the community, and the public has an appetite for the certainty of robust regulations

Unable to be adequately dealt with by lighter touch regulations. Digital platforms have demonstrated a track record of systemic compliance issues, including multiple breaches of existing legislation and a generally anaemic response to self-regulation

This warrants a pivot towards primary and subordinate legislation and regulation for the sector. Alongside strengthening existing regulation, regulators need to be resourced and enabled to enforce this, and joined up in ways that do not reproduce the issue-by-issue approach hampering current legislative remedies.[91]

8.85The Alannah & Madeline Foundation argued that regulators need adequate resources and access to high-quality, up-to-date information about new and emerging developments in digital technology:

Legislation and regulations are only as good as a system's ability to enforce them consistently and reasonably. We urge the committee to reflect on the capacity of our national regulators such as the eSafety Commissioner and the Office of the Information Commissioner. Even at the level of individual complaint-handling, pressure on regulators is growing. For example, in the year between 2020-21 and 2021-22, eSafety received a 65% increase in reports of child cyber bullying and a 55% increase in reports of image-based abuse.

Moreover, these regulators are negotiating with large international digital platforms with immense wealth and lobbying power. This poses challenges for regulators - for example, legislation may allow for digital platforms to be fined for certain practices, but regulators still need the resources to pursue these cases, especially if platforms decide to contest them.[92]

8.86Dr Mitchell commented on the UK as an example:

… the experience in the UK has also been that constant vigilance is needed. Certainly, there are still digital platforms there that are not operating to the standards of the code. So, whenever there's a new form of regulation introduced, it's very much an ongoing commitment that that regulator needs to have the resourcing, the expertise and the teeth to be able to monitor and enforce what they've introduced.[93]

New regulation

8.87 Reset Australia and UNICEF recommended that new mandatory online safety codes be implemented to bring Australia into line with comparable jurisdictions such as the UK and California.[94]

8.88 Evidence recommended that any new or revised regulation, such as a children’s code, include specific features such as:

obligations to ensure data is processed in children's best interests;[95]

writing privacy collection and consent notices in child friendly language;[96]

banning collection of data for children under 18 (unless specifically necessary for the provision of that service);[97]

banning or limiting the use of children’s data for targeted advertising or other content;[98]

preventing children from being contacted by unknown adults, that is, users who the platform knows from their user profile are over the age of 18 and who have not been explicitly tagged as friends or contacts;[99] and

prohibiting the tracking, profiling, monitoring or targeting of children for commercial purposes.[100]

Best interests of the child

8.89 Some submissions recommended that new regulations be implemented that follow the ‘best interests of the child’ principle.[101] A proposal in the Privacy Act Review Report 2022 recommended enshrining a principle that recognises the best interests of the child.[102]

8.90UNICEF suggested application of the principle requires an assessment of the specific context and should have regard for all children’s rights, including their rights to:

seek, receive and impart information;

be protected from harm; and

have their views given due weight.[103]

8.91 Applying the principle would also require transparency about the assessment of the best interests of the child and the criteria that have been applied.[104]

8.92 Reset.Tech submitted that the best interests of the child should be considered in data collection and processing:

This means that where profiling, behavioural advertising or other uses are not clearly in young people’s best interests, it should not be allowed … We also believe that children and young people should also have more control and say in how data is collected and used, where it is in their best interests and not too much has been collected … Lastly, as a principle, we believe that children and young people should have the right to delete their data. We would like to see clear and simple ways developed that young people can ask for their data to be deleted, including for advertising and profiling if it is collected.[105]

Footnotes

[1]UNICEF, Submission 14, p. 11.

[2]Office of the eSafety Commissioner (eSafety), Submission 1, p. 11; UNICEF, Submission 14, p. 8.

[3]eSafety, Submission 1, p. 11.

[4]Proof Committee Hansard, 26 July 2023, p. 12.

[5]Doomscrolling refers to excessively scrolling through bad news on social media and is considered problematic news consumption. See Ms Freya Thomson, What is doomscrolling and why is it bad for us?, 9 September 2022, www.openaccessgovernment.org/what-is-doomscrolling-and-why-is-it-bad-for-us/143139/ (accessed 22 August 2023).

[6]Viral challenges are online or social media driven trends encouraging users to upload videos of themselves copying a dare or stunt. For example, the ‘ice bucket challenge’ saw users pouring a bucket of ice water over their own or another person’s head. Challenges are usually started by social media users, and some can be significantly dangerous.

[7]eSafety, Submission 2, p. 4; Association of Heads of Independent Schools of Australia (AHISA), Submission 13, p. 8.

[8]See, for example, eSafety, Submission 2; Cancer Council Australia, Submission 5; Alcohol and Drug Foundation, Submission 10; Alcohol Change Australia, Submission 11; Association of the Heads of Independent Schools of Australia, Submission 13; UNICEF, Submission 14; Obesity Policy Coalition, Submission 19; Reset.Tech Australia, Submission 31; Alannah & Madeline Foundation, Submission 41; Office of the Australian Information Commissioner, Submission 61.

[9]Proof Committee Hansard, 26 July 2023, p. 16.

[10]Ms Alice Dawkins, Executive Director, Reset.Tech Australia, Proof Committee Hansard, 26 July 2023, p. 16.

[11]Ms Sarah Davies, Chief Executive Officer, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 14.

[12]UNICEF, Submission 14, p. 11.

[13]Australian Medical Association, Submission 66, p. 3.

[14]Attorney-General’s Department, Submission 51, p. 4.

[15]Human Rights Law Centre, Submission 50, p. 12.

[16]Attorney-General’s Department, Submission 51, p. 16.

[17]See, for example, Cancer Council Australia, Submission 5; Alcohol and Drug Foundation, Submission 10; Alcohol Change Australia, Submission 11; Association of the Heads of Independent Schools of Australia, Submission 13; Obesity Policy Foundation, Submission 19; Australian Medical Association, Submission 66, pp. 3–5.

[18]Alcohol Change Australia, Submission 11, [p. 2].

[19]Alcohol and Drug Foundation, Submission 10, pp. 1–2.

[20]See, for example, Cancer Council Australia, Submission 5, [p. 1]; Obesity Policy Coalition (OPC), Submission 19, pp. 2–4.

[21]See, for example, eSafety, Submission 2, p. 4; Alcohol and Drug Foundation, Submission 10, p. 2; Reset Australia, Submission 74, p. 4.

[22]Attorney-General’s Department, Submission 51, p. 5.

[23]Attorney-General’s Department, Submission 51, p. 5.

[24]eSafety, Submission 2, p. 1.

[25]Class 1 material is material that is or would likely be refused classification under the National Classification Scheme, see eSafety, Illegal and Restricted online content, www.esafety.gov.au/key-topics/Illegal-restricted-content (accessed 11 October 2023).

[26]Class 2 material is material that is, or would likely be, classified as either: X18+ (or, in the case of publications, category 2 restricted), or R18+ (or, in the case of publications, category 1 restricted) under the National Classification Scheme, because it is considered inappropriate for general public access and/or for children and young people under 18 years old, see eSafety, Illegal and Restricted online content, (accessed 11 October 2023).

[27]eSafety, Submission 2, p. 6.

[28]eSafety, Industry codes and standards, www.esafety.gov.au/industry/codes (accessed 6 July 2023).

[29]eSafety, Consolidated Industry Codes of Practice for the Online Industry (Class 1A and Class 1B Material) Head Terms, www.esafety.gov.au/sites/default/files/2023-09/Consolidated-Industry-Codes-of-Practice-Head-Terms-12-September-23.pdf (accessed 15 September 2023).

[30]eSafety, Consolidated Industry Codes of Practice for the Online Industry (Class 1A and Class 1B Material) Head Terms, (accessed 15 September 2023).

[31]eSafety, Industry codes and standards, (accessed 6 July 2023).

[32]eSafety, Submission 2, p. 6.

[33]eSafety, Online industry asked to address eSafety’s concerns with draft codes, www.esafety.gov.au/newsroom/media-releases/online-industry-asked-address-esafetys-concerns-draft-codes-0 (accessed 15 September 2023).

[34]eSafety, Industry codes and standards, (accessed 6 July 2023).

[35]eSafety, Online industry asked to address eSafety’s concerns with draft codes, (accessed 15 September 2023).

[36]eSafety, Industry codes and standards, (accessed 6 July 2023).

[37]eSafety, Industry codes and standards, (accessed 6 July 2023).

[38]eSafety, Basic Online Safety Expectations, www.esafety.gov.au/industry/basic-online-safety-expectations (accessed 6 July 2023).

[39]eSafety, Basic Online Safety Expectations, (accessed 6 July 2023).

[40]eSafety, Submission 2, p. 7.

[41]eSafety, Responses to transparency notices, www.esafety.gov.au/industry/basic-online-safety-expectations/responses-to-transparency-notices (accessed 6 July 2023).

[42]UNICEF, Submission 14, p. 7.

[43]eSafety, Submission 2, p. 7.

[44]Attorney-General’s Department, Submission 51, pp. 4–5.

[45]Ms Lucinda Longcroft, Director of Government Affairs and Public Policy, Google, Proof Committee Hansard, Parliamentary Joint Committee on Law Enforcement, 10 August 2023, p. 4.

[46]Attorney-General’s Department, Submission 51, p. 5.

[47]Ms Lucinda Longcroft, Director of Government Affairs and Public Policy, Google, Proof Committee Hansard, Parliamentary Joint Committee on Law Enforcement, 10 August 2023, p. 4.

[49]Proof Committee Hansard, 22 August 2023, p. 17.

[50]Proof Committee Hansard, 3 October 2023, p. 6.

[51]Microsoft, Submission 47, pp. 13–14.

[52]Google, Submission 49, p. 11.

[53]Google, Submission 49, p. 14.

[54]Office of the Australian Information Commissioner, Submission 61, p. 1.

[55]Reset Australia, Submission 74, p. 4.

[56]See, for example, Foundation for Alcohol Research and Education, Submission 33, p. 9; UNICEF, Submission 14.

[57]Dr Jessie Mitchell, Advocacy Manager, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 14.

[58]Australian Research Alliance for Children and Youth (ARACY), Submission 21, [p. 2].

[59]UNICEF, Submission 14, p. 7.

[60]ARACY, Submission 21, [p. 2].

[61]See, for example, Ms Alice Dawkins, Executive Director, Reset.Tech Australia, Proof Committee Hansard, 26 July 2023, p. 17; Mr Mark Nottingham, Submission 37, p. 5; Human Rights Law Centre (HRLC), Submission 50, p. 9; Obesity Policy Coalition, Submission 19, p. 2.

[62]Alcohol and Drug Foundation, Submission 10, p. 2.

[63]Proof Committee Hansard, 26 July 2023, p. 13.

[64]Reset Australia, Submission 74, p. 15.

[65]HRLC, Submission 50, p. 9.

[66]Ms Sarah Davies, Chief Executive Officer, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 13.

[67]See, for example, Reset Australia, Submission 74, p. 15; Alannah & Madeline Foundation, Submission 41, p. 8.

[68]Ms Sarah Davies, Chief Executive Officer, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 12.

[69]Ms Sarah Davies, Chief Executive Officer, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 12.

[70]Dr Jessie Mitchell, Advocacy Manager, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 13.

[71]Ms Sarah Davies, Chief Executive Officer, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 12.

[72]eSafety, Submission 1, p. 13; UNICEF, Submission 14, pp. 4–5.

[73]Dr Jessie Mitchell, Advocacy Manager, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 13.

[74]Dr Jessie Mitchell, Advocacy Manager, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 13.

[75]Dr Jessie Mitchell, Advocacy Manager, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 14.

[76]5 Rights Foundation, Demystifying the Age Appropriate Design Code, https://5rightsfoundation.com/uploads/demystifying-the-age-appropriate-design-code.pdf (accessed 28 June 2023).

[77]5 Rights Foundation, We Need to Keep Kids Safe Online: California has the Solution, https://5rightsfoundation.com/uploads/California-Age-Appropriate-Design-Code_short-briefing.pdf (accessed 28 June 2023); The California Age-Appropriate Design Code Act 2022 (US).

[78]5 Rights Foundation, We Need to Keep Kids Safe Online: California has the Solution, (accessed 28 June 2023).

[79]Adi Robertson, Court blocks California’s online child safety law, 19 September 2023, www.theverge.com/2023/9/18/23879489/california-age-appropriate-design-code-act-blocked-unconstitutional-first-amendment-injunction (accessed 21 November 2023).

[80]Tech Council of Australia, Submission 63, p. 14.

[81]Department of Infrastructure, Transport, Regional Development, Communications and the Arts, Submission 9, p. 6.

[82]See, for example, Reset Australia, Submission 74, p. 19; Mr John Livingstone, Advocacy Manager, UNICEF Australia, Proof Committee Hansard, 26 July 2023, p. 26; Obesity Policy Coalition, Submission 19, p. 1; Alannah & Madeline Foundation, Submission 41, p. 3; ARACY, Submission 21, [p. 3]; Children and Media Australia, Submission 53, p. 1; Uniting Church of Australia, Submission 52, p. 1; Cancer Council Australia, Submission 5, [p. 2].

[83]See, for example, ACCAN, Submission 20, p. 37; DIGI, Submission 65, p. 1; Mr Roger Somerville, Head, Australia and New Zealand Public Policy, Amazon Web Services, Proof Committee Hansard, 3 October 2023, p. 4.

[84]Office of the Australian Information Commissioner, Submission 61, p. 4.

[85]eSafety, Submission 2, p. 13; UNICEF, Submission 14, pp. 4–5.

[86]Ms Alice Dawkins, Executive Director, Reset.Tech Australia, Proof Committee Hansard, 26 July 2023, p. 17.

[87]UNICEF, Submission 14, p. 7.

[88]Tech Council of Australia, Submission 63, p. 14.

[89]Dr Jessie Mitchell, Advocacy Manager, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 12.

[90]Ms Alice Dawkins, Executive Director, Reset.Tech Australia, Proof Committee Hansard, 26 July 2023, p. 17. Also see CPRC, Submission 60, p. 5.

[91]Reset Australia, Submission 74, p. 3.

[92]Alannah & Madeline Foundation, Submission 41, p. 9.

[93]Dr Jessie Mitchell, Advocacy Manager, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 14.

[94]Reset Australia, Submission 74, p. 19; Mr John Livingstone, Advocacy Manager, UNICEF Australia, Proof Committee Hansard, 26 July 2023, p. 26.

[95]Mr John Livingstone, Advocacy Manager, UNICEF Australia, Proof Committee Hansard, 26 July 2023, p. 26.

[96]Mr John Livingstone, Advocacy Manager, UNICEF Australia, Proof Committee Hansard, 26 July 2023, p. 26.

[97]Ms Sarah Davies, Chief Executive Officer, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 13.

[98]See, for example, Ms Sarah Davies, Chief Executive Officer, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 13; Reset Australia, Submission 74, p. 21; Ben Blackburn Racing, Submission 1, p. 9; Foundation for Alcohol Research and Education, Submission 33, p. 5.

[99]Ms Sarah Davies, Chief Executive Officer, Alannah & Madeline Foundation, Proof Committee Hansard, 26 July 2023, p. 13.

[100]Foundation for Alcohol Research and Education, Submission 33, p. 11.

[101]See, for example, UNICEF, Submission 14, p. 4; Reset.Tech, Submission 31, p. 3; Alannah & Madeline Foundation, Submission 41, p. 3.

[102]Attorney-General’s Department, Submission 51, p. 17.

[103]UNICEF, Submission 14, p. 4.

[104]UNICEF, Submission 14, p. 4.

[105]Reset.Tech, Submission 31, p. 3.