5. Online Safety 2.0

The Pursuit of a New Digital Culture

Introduction

5.1
As outlined in the previous chapters, online spaces present unique and difficult regulatory challenges. The digital world is increasingly vast, complex and ever-changing. Nonetheless, it is incumbent on industry, governments, and private citizens worldwide to continue to meet the challenge of ensuring safety and security in online environments.
5.2
The status quo is neither desirable nor sustainable. Australia, alongside the rest of the world, must address the causes and amplifiers of harm to ensure a safe and equitable internet, especially for the most vulnerable members of society.
5.3
Notwithstanding these principles, it is clear that while the technology industry and government both have roles to play in addressing online harm, broader society also has a critical role in ensuring online environments are safe. The Committee found during the inquiry that this third element is critical to the success of any online safety program. Without broader societal and cultural change to address these issues, any action taken by government or industry will be moot.
5.4
This chapter outlines cultural and societal approaches to addressing online harm. It examines how regulators can balance freedom of expression online with the need to prevent harm and keep users safe. It then examines potential methods of protecting users online, such as a statutory duty of care applied to online platforms. It also considers the need for increased education about online safety, examining which digital education providers currently exist and what gaps have been identified. The chapter concludes with the Committee’s comments and recommendations.

Balancing freedom online with harm prevention

5.5
In attempting to find solutions to online harm, it is critically important to recognise that any regulation in this area will necessarily constrain the operation of the online world and, in doing so, affect broader freedom of expression. Maintaining an appropriate balance is therefore a delicate and complex task.
5.6
The evidence presented to the Committee ultimately fell into two distinct schools of thought. One perspective suggested that a heavy emphasis on online safety had the potential to impact freedom of speech and expression on digital platforms. The alternative perspective suggested that the current models of co- and self-regulation by the digital industry are failing to protect vulnerable users from harms such as online abuse and child sexual abuse material.

Protecting freedom of speech

5.7
Some social media and digital platform providers indicated their concern that the regulation of online content would stymie users’ capacity to express themselves and would thereby reduce personal liberty.
5.8
Among the defenders of this principle, Twitter stated that it ‘recognises the need to balance tackling harm with protecting a free and secure Open Internet.’1 Twitter explained that, while it supported regulation to the extent that it would empower people who otherwise would not speak out due to fears of abuse (which it stated was done via its Rules), it believed that a balance needed to be struck between regulation and free expression.2 This sentiment was also expressed by Meta, which stated that it particularly considered the need for a balanced approach when it came to issues such as age verification.3
5.9
Twitter opined that the balance between harm reduction and freedom of expression was vital in discussions regarding online safety, stating that over-regulation could potentially lead to the unintended silencing of debate, particularly for marginalised groups.4
5.10
Twitter emphasised the need for governments to protect a ‘free and secure Open Internet’, which included ensuring equitable access and compliance with human rights norms such as freedom of expression:
Governments should prioritise policies, partnerships, and investments at home and abroad that support and defend the Open Internet, both through regulatory and standards bodies, as well as ensuring domestic regulation does not undermine global norms or set dangerous precedents. Open standards championed by these bodies will provide for greater interoperability, connection, and competition.5
5.11
Further, Twitter argued that online harm is a reflection of offline societal harm, and that increased content moderation would not make these issues disappear:
More broadly, the policy issues addressed are often rooted in complex societal challenges that exist in offline, as well as online, contexts. As we continue to work together in good faith on these complex issues, we emphasise that these challenges will not be resolved by the removal of content online alone. Bad actors seeking to exploit online services to undermine elections, spread disinformation, and harm others will not be deterred by their accounts being removed. Effective solutions demand a whole of society response that recognises the full scope of the problem being addressed.6

Improving online culture

5.12
Evidence to the Committee suggested that online culture is extremely toxic and lacks basic standards of civility and social decency. Witnesses argued that online culture is encouraging harmful standards of behaviour in broader society. Dr Kate Hall, Head of Mental Health and Wellbeing for the Australian Football League (AFL), stated:
As a psychologist, I think we're seeing social norms move further and further online, away from what is appropriate and socially normative in face-to-face. I think there are grave harms occurring, particularly for the mental health and wellbeing of our young people, as there is this progression away from what is socially normative behaviour in our schools, in our clubs, on the street and in public settings. Even in our stadiums these behaviours would never ever be acceptable, and any perpetrator would be addressed by all others sitting around them or walking past them; that nudges the socially normative behaviour back to a far safer and healthier environment. It doesn't mean every outlier then adheres to it, but the collective as a whole creates these expectations of those who are on that platform.7
5.13
One submitter stated that the nature of the social media industry’s business model encourages the proliferation of toxicity. They pointed out that extreme and toxic content is profitable for platforms, which use technological mechanisms such as algorithms that promote extreme and sensationalist content to maximise user engagement. This in turn fuels addiction-driven cognitive responses in users:
Platforms encourage addictive behaviour through positive intermittent reinforcement, such as limitless feed scrolling, ‘like’ and ‘share’ buttons and comment functions, creating an addictive buzz similar to poker machines. As Facebook’s founding president explained, the platform intends to ‘consume as much of your time and conscious attention as possible’: every interaction gives the user ‘a little dopamine hit’ to encourage addiction. Outrage and negativity equal more engagement, which means more dopamine rewarding potentially poor behaviour. Posting something abusive or defamatory thus acquires a seductive pull: the more extreme content gets more engagement, which humans are wired to crave.8
5.14
Further, the submitter argued that social media ‘supercharges polarisation and tribalism’, further amplifying toxicity.9 When a person is drawn towards a particular tribe they identify with, social media can ‘encourage group attacks, reinforcing tribal connection’.10 It was also asserted that the forms of online abuse that proliferate on social media sites would be unlikely to occur offline:
Social media ‘pile-ons’ can be devastating for the target, and the notifications of abuse are non-stop, direct to your phone. Such bullying would probably not occur in person. But online, we have fewer physical and visual cues to encourage empathy. Some argue social media has facilitated bullying of marginalised populations. Profiles on dating sites now regularly proclaim ‘No Indians!’ or ‘No Asians!’ – prejudices most people would not announce at a public bar, but happily broadcast online. Anonymity can further embolden abuse, hate speech and intolerance.11
5.15
Other witnesses suggested that the nature and extent of harm caused by online abuse is not broadly understood in the community. Ms Tanya Hosch of the AFL put the view that her interactions in her professional and personal capacities indicated that people are generally unaware of the extent or duration of the harm caused by bullying and discrimination, even in offline environments.12
5.16
Professor Amanda Third suggested that an important element of any attempt to address online harm was to focus on championing diversity and creating strong online communities:
It's very tempting to think of these issues in a really small way, in a way that focuses on individuals and how we teach individuals to behave online. This might be one strategy that we deploy, but we also need to think about how we create vibrant communities that have cultures of acceptance, understand diversity and are sympathetic to it, and can relate to other people as human beings. We need to do much more there to think more holistically about the ways we address these kinds of issues.13

Industry’s limited emphasis on harm prevention

5.17
The alternative view presented to the Committee was that current regulatory models, which were said to prioritise freedom of speech and enable digital platforms to co- or self-regulate their activities, have so far failed to protect users from harm.
5.18
eSafety expressed the view that the current regulatory system has placed too great an emphasis on users taking responsibility for their personal safety online:
The burden of responsibility needs to be flipped so that large tech companies take responsibility for online safety and embed safety features in the design and development of their products. It should not be left to children, parents and members of vulnerable communities who experience online abuse at rates much greater than the general population to protect themselves online from harms that are enabled by the design of services.14
5.19
eSafety outlined this in a diagram, provided at Figure 5.1. The figure reflects eSafety’s view that responsibility should shift away from children and their families, towards digital platforms and government regulators.

Figure 5.1:  eSafety’s proposed model of the burden of responsibility

Source: eSafety Commissioner, Submission 53, p. 62.
5.20
Many witnesses echoed the concerns of eSafety, similarly perceiving an imbalance in the responsibilities for keeping people safe online. Witnesses such as the Alannah and Madeleine Foundation (AMF) suggested that regulatory approaches had focused primarily on educating users on ways to remain safe, rather than placing the burden of ensuring safety on social media services.15 The AMF suggested this was due to the absence of safety-by-design principles when digital platforms were initially designed.16 The AMF stated:
An analogy is that, if you were going into a toy shop to buy a toy for your child, you would know that whatever you purchased had gone through some kind of quality control assessment. It was age appropriate. There would be no button batteries for young children or toddlers. There'd be no lead paint. You know that the system has actually put a safety-by-design element around that. That does not exist in the online world, and it should.17
5.21
This analogy was also used by Dr Michael Salter, who noted that most people have basic safety expectations of the toys being sold, and do not ‘spend our time educating parents on the pros and cons of every single toy on the shelf’.18 Dr Salter also pointed out that, like toys, digital platforms are being produced at enormous scale, and that it is not feasible to expect parents to be aware of safety considerations for every individual platform.19
5.22
Basic safety expectations were also highlighted as a critical component of adult online safety. Dr Hall expressed concerns that victims of online abuse are expected to take on the burden of reporting and seeking resolution for online abuse:
[T]he onus is still on the victim to have the psychological capacity, readiness and energy, and trust in the system, to meet any kind of resolution. That's not a very victim-centric or survivor-centric mindset. It doesn't empower people, particularly in our industry, to act, even though they're not bystanders; they do stand up and support each other.20
5.23
As discussed in Chapter 4, the eSafety Commissioner has created the voluntary Basic Online Safety Expectations (BOSE) framework as part of the implementation of the Online Safety Act 2021 (OSA). However, submitters raised concerns that the absence of a compulsory and enforceable duty to protect users would not encourage digital platforms to adequately protect their users.21 Further, it is not clear that breaches of the BOSE, and the eSafety Commissioner’s subsequent response, would be sufficient to prompt change.

A statutory duty of care

5.24
A number of submitters favoured a duty of care placed on social media platforms and other digital services that would be legally enforceable and compulsory for all digital participants.
5.25
Dr Michael Salter stated that a duty of care should be imposed on social media platforms, particularly in relation to children:
We need social media companies to accept that they have a special duty of care to children on their platforms. We should also be cognisant of the broader impacts of social media and online platforms on sex offending as a whole. Online networks and infrastructure are awash with child sexual abuse material because technology companies are not legally obliged to proactively detect or remove that material. We have online communities of child abusers numbering in the millions taking advantage of anonymity and encryption. Establishing a baseline of online safety for children is an important step forward, but it can't be the final word, and there is a lot more work to do to ensure that social media and the internet are functioning in the best interests of children.22
5.26
The Centre for Digital Wellbeing (the CDW) also recommended that the Australian Government implement a regulated duty of care, suggesting that it be attached to a licensing scheme for social media platforms.23 In support of this proposal, the CDW suggested that:
Over time, this would likely impact on design features and algorithmic functions that are harmful to all users, but particularly youth. A clear legislative requirement such as this would compel social media companies to act on evidence that any of its features are significantly damaging.24

The UK model of a statutory duty of care

5.27
The UK Government has proposed the establishment of a statutory duty of care in its draft legislation responding to online harm.
5.28
The CDW noted that the Online Harms White Paper produced by the UK Government in April 2019 suggested a single regulatory framework underpinned by a duty of care on digital platforms.25 The subsequent draft Online Safety Bill placed duties of care on digital companies providing content-sharing services and search functions to mitigate the risks of illegal content.26
5.29
Professor Lorna Woods stated that, during the development of the UK’s legislative response to online safety, the Carnegie Trust had recommended a statutory duty of care, supported by an independent regulator to monitor compliance and take enforcement action as required.27 Following on from this point, Baroness Beeban Kidron OBE, a member of the Joint Committee on the Draft Online Safety Bill (JCDOSB), argued that this approach gave digital platforms flexibility in protecting users online, allowing them to develop varying approaches to online safety depending on the functions and goals of the platform in question.28
5.30
Mr Damian Collins MP, Chair of the JCDOSB, explained that the role of the regulator was crucial to enforcement action against social media platforms in particular. He noted that, when drafting the JCDOSB’s report, it was important to link offences in the proposed online safety regime to existing offences in law. In doing so, the JCDOSB identified that many of the linking offences in other areas of law had been drafted without envisaging their use in a digital setting, or that the offending material would be hosted by a third party such as a digital platform. Mr Collins stated that this led the JCDOSB to consider the adoption of thresholds for offences online:
The question then is: how do you introduce the thresholds? Again, going back to racial abuse, UK courts have given rulings on racial abuse. One of the cases that we looked at and discussed quite a bit in the committee was the racial abuse directed towards England footballers after the final of the European Championship. Some people have been prosecuted for what they posted, sometime after the event, so the law has already created a threshold for understanding when we think abuse has taken place, and the question then for an online safety regime would be for the regulator, through its codes of practices, to set a bar that says to the online platforms, 'This is where we believe an offence is being committed; this is where we believe you should intervene and mitigate the content that's been posted which will clearly cause harm, based on guidance that already exists in law.' The job of the regulator through the codes of practices would be to say, 'Here are offences in law that already exist and that the UK parliament has already determined are offences; these are the thresholds where we expect you to mitigate and act on it; and here are the sanctions we could apply if you fail to do so.'29

The best interests of the child as a guiding principle

5.31
Children’s rights were suggested as a focal point for any potential reform of online safety. Ms Anne Hollonds, National Children’s Commissioner, Australian Human Rights Commission, argued that the child’s best interests principle should be at the forefront of all business in the digital industry:
The best interests of children should be the priority requirement for all internet based businesses. This would include strong default privacy settings and human rights by design requirements. There should be a requirement to comply with children's rights principles, such as demonstrated under the National Principles for Child Safe Organisations in the physical world. This would include a requirement to assess and report the impact on the rights of children at every stage of design, implementation and operation.30
5.32
The National Children’s Commissioner stated that social media and other digital platforms should be required to demonstrate that their services meet the best interests of the child principle.31 This would include considerations of privacy, security of personal data, protection from harm, a voice to express their views, and the ability to seek, receive and convey information.32 The National Children’s Commissioner further advocated that:
A best interests approach may require implementing clear boundaries to prevent practices that both infringe upon children’s rights and are contrary to their best interests, including by curtailing routine and indiscriminate digital surveillance measures. Practices such as online tracking, profiling, behavioural monitoring and ‘nudging’, the collection of biometric and geolocation data from children, automated decisions affecting children and the unjustifiable sale or transfer of children’s personal data to third parties should be banned or heavily restricted to protect children’s rights.33
5.33
Similarly, the AMF recommended that the ‘best interests of the child’ principle be adopted when considering changes to the regulation of online safety, stating that ‘If we do that, then all the steps that we take to uphold children's best interests on digital platforms will be as constructive and healthy as possible’.34 Other organisations supportive of implementing the best interests of the child principle included ReachOut35 and Orygen.36
5.34
The AMF also suggested that any changes uphold the rights contained in the United Nations Declaration of the Rights of the Child, including the rights to privacy, safety, dignity and expression.37 The National Children’s Commissioner also pointed to other international agreements outlining the rights of children, including the Convention on the Rights of the Child.38
5.35
The best interests of the child principle has been adopted in the proposed Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021 (the Bill). The Bill’s Explanatory Paper states that social media platforms pose a higher level of risk than other online services due to the popularity of social media services, the types of interactions that can occur on them, and the amount of personal information they hold.39 As a result, the Bill provides for the proposed Online Privacy code to require that social media and digital platforms:
Ensure that the collection, use or disclosure of a child’s personal information is fair and reasonable in the circumstances, with the best interests of the child being the primary consideration when determining what is fair and reasonable.40

Implementation of National Principles for Child Safe Organisations

5.36
In February 2019, the Council of Australian Governments endorsed the National Principles for Child Safe Organisations (the Child Safe Principles), aimed at providing a ‘nationally consistent approach to creating organisational cultures that foster child safety and wellbeing’.41 The principles respond to a recommendation of the Royal Commission into Institutional Responses to Child Sexual Abuse that all Australian institutions ‘that engage in child-related work’ be required to implement such standards.42 The Child Safe Principles broadly cover not only child sexual abuse but also other forms of harm to children and young people.
5.37
The Child Safe Principles consist of:
1. Child safety and wellbeing is embedded in organisational leadership, governance and culture;
2. Children and young people are informed about their rights, participate in decisions affecting them and are taken seriously;
3. Families and communities are informed and involved in promoting child safety and wellbeing;
4. Equity is upheld and diverse needs respected in policy and practice;
5. People working with children and young people are suitable and supported to reflect child safety and wellbeing values in practice;
6. Processes to respond to complaints and concerns are child focused;
7. Staff and volunteers are equipped with the knowledge, skills and awareness to keep children and young people safe through ongoing education and training;
8. Physical and online environments promote safety and wellbeing while minimising the opportunity for children and young people to be harmed;
9. Implementation of the national child safe principles is regularly reviewed and improved; and
10. Policies and procedures document how the organisation is safe for children and young people.43
5.38
Witnesses to the inquiry urged that digital platforms adopt the protections outlined in the National Principles for Child Safe Organisations. The National Children’s Commissioner explained that Principle 8 is specifically directed at the risks facing children in online environments, but that all of the principles are relevant to the digital industry.44 The AMF similarly recommended that digital platforms be required to align with the National Principles for Child Safe Organisations.45 This suggestion was also put by yourtown, which stated that the Child Safe Principles specifically refer to ‘not just safe physical environments but safe online environments for children under 18 years of age’.46
5.39
yourtown posited that, given that young people are spending increasing amounts of time online and that cyberbullying is increasing, online spaces which provide services to children should be required to adhere to child safety principles in a way that is understandable and accessible to their users.47 yourtown also suggested that incorporating the National Principles into the Safety by Design principles would promote child safety in the design and development of online platforms from the outset.48

Education and support

5.40
A key theme in evidence was the need for increased education for all citizens in relation to digital wellbeing, particularly for children and families. As discussed in Chapter 2 in the context of children and young people, however, education needs to be targeted to the needs of different groups to avoid oversaturation in certain areas while neglecting others.
5.41
eSafety currently has a legislative mandate to ‘coordinate education and online safety activities across the Commonwealth’ and uses a ‘holistic approach beyond prevention and protection’.49 eSafety stated its view that digital education is a ‘lifelong journey’, particularly as children begin using digital technology at ever younger ages.50
5.42
eSafety provides a suite of educational resources on its website, aimed at different groups, such as:
parents and carers as the front lines of defence, particularly in the early years
educators and schools to develop students’ critical skills across the 4 Rs (respect, resilience, responsibility and reasoning), to manage online safety incidents that may arise within the school community, and to support best practice online safety education
domestic and family violence frontline workers to upskill people who support those experiencing technology-facilitated abuse, and
specific diverse and vulnerable communities that our research shows are more likely to experience online harms.51
5.43
When reflecting on eSafety’s educational services, witnesses were almost uniformly supportive of the agency’s materials, confirming that they had been utilised successfully in varying contexts.52
5.44
Outside of government providers, multiple organisations deliver educational programs to children and young people on digital literacy, citizenship and safety. Providers who made submissions to the inquiry included:
The AMF;53
Body Safety Australia;54
The Carly Ryan Foundation (CRF);55
The Daniel Morcombe Foundation; and
Dolly’s Dream.56
5.45
Each of these groups brings a unique perspective on online safety, which flows through to its education programs. For example, the AMF currently provides educational programs for young people on digital literacy, which focus on digital resilience:
We deliberately take a strengths based approach because all of the evidence, not just here and not just recently but for decades around the world, around how you address disadvantage and vulnerability is about the importance of efficacy of building a strengths based approach. If you can help people identify and build their competencies, skills, strengths, resilience, EQ, IQ and now their digital quotient, DQ, it gives them the tools that mean they are much more likely to withstand the chipping away and undermining of their power and of their strengths to then succumb to what you've described around that coercion. That is deliberate, manipulative and very intentional. If you don't have, as a self, the recognition of those strengths that need then to be labelled as such and seen as such, it is much easier to chip away and undermine them. When we work in social change it is proven time and time again for individual wellbeing, resilience, strength and positive mental health that a strengths based approach is much more effective than talking to people about their deficits and problems.57
5.46
Education was seen by many witnesses as a critical component in reducing harm from online abuse. The Association of Heads of Independent Schools of Australia stated that education is a strong way of creating cultural change, in addition to providing information and support to individuals.58 The AMF too was supportive of a strong educational program focussing on digital literacy, aimed at age groups from pre-school to the end of secondary school with developmentally appropriate content.59 It also recommended digital literacy training for adults, particularly parents and carers, and for rural and regional communities.60
5.47
Other suggested focus points for digital literacy education included:
Disinformation and critical thinking skills when consuming online information;61
Addressing the needs and challenges of parents with little or no digital literacy;62
Educational materials for those with English as a second language or low proficiency in written English;63 and
Schools and teaching staff-specific training in dealing with situations involving social media or digital platforms.64

Developing appropriate and relevant educational programs

5.48
In expressing support for further education in relation to digital literacy, some providers cautioned that education on online safety matters needed to be designed carefully, with consideration of the developmental stage of the age group to which it was directed. The AMF noted that responding to online safety issues required consideration of:
the externalities or context within which children and young people are experiencing social media in a digital world, which is how essential it is to build strength in early childhood settings, school communities, families and those ancillary support services.65
5.49
Increased education for children and young people in relation to available redress and remediation was strongly recommended by yourtown. Ms Kathryn Mandla, Head of Advocacy and Research at yourtown, stated that ‘having accessible ways to raise a concern, provide feedback, self-advocate or make a complaint’ would enhance transparency for young people and their advocates.66
5.50
Body Safety Australia also noted that education providers needed to be conscious that a ‘one size fits all’ approach was not appropriate, and that programs needed to be carefully developed to meet the needs of the community:
Education for children, young people and their families/carers is an essential component of harm minimisation. It requires expert knowledge, a trauma-informed perspective and a child-first framework. We warn against a one-size-fits-all approach to child protection. Culturally appropriate and age-appropriate education is essential to an effective education program, as is a whole-of-community approach. Children, young people, educator, families/carers, and other community groups all need to be included in preventative education approaches.67
5.51
Ms Davies of the AMF stated that education focussed on digital ethics and citizenship has been found most effective:
I suppose one of our critical underlying philosophies is that consumers and people who use the digital world are not passive recipients. They are active agents in creating that space. So the more we can support children and young people to be positive digital citizens, coming back to that, understanding that everything they do and say not just curates their own experience but influences everyone else's and, build that sense that they're actually building other people's experience, then we are going to minimise and mitigate those experiences for other people. Recent research said that nine out of 10 Australian teens deliberately describe themselves choosing positive behaviours online. So the more we can get people to understand that it's not just their own experience that they're influencing and it's actually everyone else's, by building that awareness and that deeper understanding, I also think that then mitigates and minimises the negative.68
5.52
One potential model supported by some witnesses was the concept of a digital ‘licence’, directed at school-aged children, which would focus on providing digital literacy skills prior to the award of the licence. Ms Kate Everett, Founder of Dolly’s Dream, explained that the organisation had worked with the AMF to create a digital licence in addition to a ‘DigiPledge’, an educational program for families to complete together.69 She explained the concept of the digital licence, stating:
[W]e're not allowed to drive a car and we're not allowed to do so many other things in life without the appropriate education and licensing. It only seems logical that we would have something like this in the online world.70

Youth engagement in educational programs

5.53
Notwithstanding the need for effective educational strategies, evidence to the inquiry indicated that educational campaigns targeted at young people may not be as effective as they could be and require improvement.
5.54
Research commissioned by the eSafety Commissioner suggested that young people believe there is an ‘oversaturation’ of material at school and in the home, particularly in relation to cyberbullying.71 Similarly, research from the Young & Resilient Research Centre suggested that young people found educational campaigns tended to be overly prohibitive and misinformed, leaving them feeling ‘misunderstood and disempowered by online safety conversations’.72
5.55
The research further indicated the particular online safety issues of most concern to young people. The top three concerns reported were:
Interactions with people online (such as catfishing, fake accounts and contact from unknown people);
Privacy matters (including exposure of personal data, photos and stolen identities); and
Security issues (such as hacking, scams and malware).73
5.56
The findings identified six best-practice principles for youth engagement, as suggested by the participants of the study:
Incorporating diversity and inclusivity in engagement programs;
Ensuring programs are youth-led while being supported by adults where required;
Genuinely listening to young people and formulating active solutions or outcomes;
Collaboration with young people in designing, delivering and evaluating programs and policies targeted at their age group;
Providing benefits to engagement to promote learning, growth and development, in addition to fair compensation; and
Integrating fun, age-appropriate and accessible activities for youth engagement.74
5.57
The CRF provided anecdotal reports that its consultations with young people had identified that current educational methods do not adequately reflect or address young people’s concerns regarding online harm.75 For example, the CRF stated that it had found that young people often used online pornography to learn about relationships and sexual activity, often because the sexuality education provided in schools did not reflect their lived experiences or concerns. This had the potential to result in damaging understandings of sexuality and relationships, particularly in relation to violent or aggressive behaviour.76
5.58
The National Mental Health Commission (NMHC) similarly found that young people can often recognise danger, or situations where they feel unsafe, on online platforms, but that they are not being fully supported to manage the risks. The NMHC stated that, in looking at how best to support young people online, it takes a broad view of young people and how they use the internet:
This isn't just about how we are managing content online or how we are providing them with the equivalent of the old-fashioned media literacy. I think we need to go much deeper than that. We need to be asking: what patterns of behaviour are our young people exhibiting? Are they any different to how they've been before? What are the consequences on their behaviours of being in an online environment which has two attributes that may not have been there in generations past? One is a plethora of information in real time and much deeper, broader information than was available before. The second is an urgency and an immediacy about decision-making that has not been there in past generations. They're the types of issues that I think we need to look at through a professional discipline lens and ask what we could or should be doing in that space.77
5.59
eSafety stated that, in developing educational materials for young people, it takes a multifaceted approach targeting young people’s specific needs and wants in terms of digital literacy; it has a youth engagement strategy, in addition to its recently established Online Safety Youth Advisory Council.78 eSafety reported that its efforts to engage with children and young people appear to be making a difference in how young people address online safety:
Over the last several years, we have started to see evidence of real change to behaviours and attitudes, with children and young people taking multiple actions and accessing a range of tools and tactics in response to negative experiences. For example, eSafety’s 2021 survey of 3,600 young Australians aged 8-17 years-old found that 64 per cent of young people who have experienced negative online behaviour blocked or unfriended people who had bullied them online – a significant increase on 46 per cent of young people in 2017. The research also found that young people are increasingly reaching out to their parents and friends. Sixty-six per cent of young people who have experienced negative online behaviour told their parents (up from 55% in 2017) and 60% told their friends (up from 28% in 2017).79
5.60
The National Children’s Commissioner advocated that children and young people have the right to be consulted, and to have their views appropriately considered, in any reform process which impacts their lives.80 She explained that the UN has recognised the importance of adults not assuming what children’s needs are, and that an overly restrictive approach can stunt the development of autonomy and independence.81

Parental and community education of online risks

5.61
Witnesses advocated for greater emphasis to be placed on providing parents and carers with sufficient educational resources to effectively manage their children’s online usage and respond appropriately where issues arise.
5.62
A number of witnesses raised concerns that parents did not sufficiently understand the dangers or risks posed to online users. Dr Jessie Mitchell, Advocacy Manager at the AMF, pointed to recent research conducted by the Australian Centre to Counter Child Exploitation, which:
… still found quite low levels of understanding of how child sexual exploitation and abuse operates online, the severity of it and how easy that sometimes is to occur. There was a sense that, while parents were relatively across what they needed to do to keep children safe face to face, there was a certain lack of understanding of how those behaviours could function in a digital world.82
5.63
This point was similarly raised by the Council of Catholic School Parents NSW ACT (CCSP), which stated that parents do not necessarily have an appropriate understanding of the risks of online engagement for their children. The CCSP stated:
Not all parents are digitally literate or have a full understanding of the potential short- and long-term risks that children may face from using social media platforms. This extends to their digital identity/footprint, personal reputation, viewing of age-inappropriate content and unknowingly breaking the law by possessing and forwarding certain images.83
5.64
The CCSP further provided anecdotal detail of parents unwittingly assisting children to engage in risky behaviour, including setting up social media accounts for underage children and turning a ‘blind eye’ to how children are engaging with online platforms.84 In addition, the CCSP noted that some parents are unaware of how their own behaviour on social media may impact their children, such as publishing photos of their children on their personal accounts.85
5.65
Some witnesses argued that providing education to adults would assist children and young people in dealing with online abuse. The NMHC argued that, for young people in particular, engaging with their ‘circles of influence’ – parents, carers, family members, and others who impact them – and providing adequate resources and support for these groups would assist in reducing harm.86
5.66
Concerningly, witnesses told the Committee that large sections of the community were not aware of the powers of the eSafety Commissioner and other government agencies in addressing online harm. The AMF stated that there were low levels of understanding in the community about reporting mechanisms for online abuse, expressing the view that the work of eSafety and other regulatory bodies was not widely known or understood. Dr Mitchell stated:
I think part of the difficulty may be at the other end: the public's knowledge around the fact that there are places where they can report concerns, whether that's to the eSafety Commissioner or to the ACCCE or to Scamwatch or somewhere like that. There's still, I think, not sufficiently high and even understanding of those reporting options in the community, particularly amongst families where there might be lower levels of digital literacy and particularly if there's a family where perhaps the parents or carers are not very confident online and don't necessarily have the digital literacy to respond quickly when something goes wrong. So I think there is work to be done to make those avenues for reporting more accessible and recognised by all Australians.87
5.67
This statement was echoed by the eSafety Commissioner, who explained at a Senate Additional Estimates hearing on 15 February 2022 that many social media companies provide training to police forces on the criminal compliance provisions relevant to their platforms, but that this training does not necessarily reach the local level of law enforcement.88 The eSafety Commissioner further noted that eSafety works with law enforcement agencies at state and territory levels, and was working to establish agreements and education campaigns with police commissioners to improve awareness of eSafety’s powers.89
5.68
In relation to the levels of understanding among local law enforcement entities, Body Safety Australia recommended that educational programs be expanded to entities such as police and courts, to train law enforcement officials in identifying and appropriately managing online harm matters.90

Education is not the ‘silver bullet’ in addressing online harm

5.69
The enthusiasm demonstrated by witnesses for improved education was tempered by warnings not to treat education as a ‘silver bullet’ for addressing online harm; education is only part of the answer to increasing online safety. Dr Salter raised concerns that, while education (and in particular parental education) is important, it should not be considered the primary mode of defence against online harms, nor should it perpetuate the idea that young people and families are responsible for online safety:
Of course parental education will always be part of the online safety equation. There's no question that parents of course have a role in the safety of their children. It's also quite sensible to provide education to children about services and so on. The issue is that, over the last 25 years, that's been our primary bulwark to keep children safe. We know that it's not working, and we know that it cannot work. The example I give you is if we had no child safety standards for child playgrounds and if child playgrounds were filled with sharp edges, bits of metal and plastic that were unstable and so on. It wouldn't matter how much education we gave children and adults about how to play safe in playgrounds. If the playgrounds were not safe, children would come to harm in those playgrounds. That is the case in the online environment. These environments are fundamentally unsafe at the moment. So, unfortunately, the focus on parental education can act as a sort of a distraction from the structural issues that ensure that children are unsafe now and into the future.91
5.70
Dr Salter further put the view that education does not overcome the key problem that ‘these platforms are fundamentally unsafe’.92 He noted that relying on education does not reflect the reality that many parents are, for various reasons, unable to monitor their children’s online usage, or are themselves perpetrators of online child exploitation.93 He stated:
The other point I want to make is that when we look at child sex abuse material we see that a lot of that content is in fact produced in the home by sexually abusive fathers and family members. So what do we do for those kids who are unsafe because of their parents? It's their parent that is exploiting them. When we look on the dark web and online we see a lot of men who are abusers and are talking about the fact that they're abusing their child. As a community and as a country we have to get real about the range of risks that are posed to children. Absolutely we want to empower parents, but, unfortunately, they can't be the front line of defence. Child protection is a collective responsibility. That responsibility should be shared by online service providers and of course articulated and enforced by government.94

Government leadership in addressing online safety

5.71
Witnesses emphasised the need for government at all levels to lead in the promotion of online safety measures. yourtown stated that it viewed government as having multiple roles:
There is the regulatory role. There is a leadership role in setting the culture and the standard and role modelling best-practice behaviour and creating public awareness of the harms that can come about through poor safety design for children and young people and having poor processes and policies in place that can harm children and young people.95
5.72
yourtown also suggested that further collaboration and leadership between the Commonwealth and the state and territory governments would improve the effectiveness of programs. Dr Marion Byrne of yourtown stated that problems with implementing agreed principles at local levels, together with disconnected systems, could cause dysfunction in attempts to reduce harm.96

Committee comment

5.73
Social media platforms occupy an unprecedented position in modern society. As Mr Matt Berriman stated, social media platforms are no longer just a form of technology but an ‘infrastructure business’: they are no longer optional but a necessity for many users to conduct various aspects of their lives, and – if turned off – their absence can be extraordinarily detrimental to people’s personal and professional lives.97
5.74
One of the key challenges for all participants in this sector is balancing the freedom of online participants to engage in democratic debate and express their views against the need to protect the most vulnerable users of online spaces. The Committee is conscious that its observations and recommendations may have significant ramifications for this balance.
5.75
Having said that, it is clear that the status quo cannot continue. The Committee supports the view expressed by Ms Frances Haugen and others that social media platforms, in addition to other digital services, have not demonstrated a willingness to put the safety of their users before other considerations. While privacy concerns are critical to the rights of all internet users, the Committee does not believe that these issues outweigh the fundamental issue of ensuring safety in online environments.
5.76
Indeed, during the course of this inquiry, Meta released the company’s new values, one of which, ‘Move Fast’, includes the following:
Move Fast helps us to build and learn faster than anyone else. This means acting with urgency and not waiting until next week to do something you could do today. At our scale, this also means continuously working to increase the velocity of our highest priority initiatives by methodically removing barriers that get in the way.98
5.77
Despite assurances from the social media giants that gave evidence to this inquiry that they take safety concerns seriously, the Committee is unable to reconcile these statements with the demonstrated attitude of Meta and others in the social media industry, which appear to value progress, pace and profit above all other concerns.
5.78
The time has come to fundamentally shift the burden of responsibility for ensuring online safety. For too long, the onus of maintaining online safety has fallen on the most vulnerable users, including children and their parents. This is unacceptable and unsustainable in an environment where users such as children are exposed to the most risk online and suffer extreme forms of harm as a result.

Improving online discourse

5.79
When considering standards of acceptable behaviour and commentary online, the need to ensure freedom of speech must be weighed against a user’s right to be safe. Social media platforms ultimately make a judgement as to how these objectives are effectively balanced.
5.80
It is the Committee’s view that these companies do not always achieve the most appropriate outcome. Given the prevalence of online communication in the lives of Australians, this in turn impacts community standards of what conduct is deemed acceptable in society more broadly.
5.81
This can be better addressed if social media platforms place a stronger focus on a user’s right to be safe online, and clearly indicate that respectful dissent and disagreement will be tolerated, while abuse will not.

A statutory duty of care framework

5.82
A statutory duty of care was suggested by a number of witnesses and has previously been recommended by other parliamentary committees. The Committee notes the approach taken by the UK Government, which has ‘called time’ on social media companies’ system of co- and self-regulation, as is evident in the draft legislation governing its new approach to online safety matters. The Committee particularly notes the argument that a statutory duty of care would ensure that social media platforms are responsible for creating systems which are protective of, and responsive to, their users. This model has significant strengths, and flips the onus of providing and ensuring user safety back onto social media platforms.
5.83
The Committee supports the proposed model of a formalised duty of care framework and believes it would enhance the existing Australian regulatory framework of the Basic Online Safety Expectations. A duty of care framework would help ensure that digital platforms have an incentive to create systems, and improve current ones, that protect the safety of all users, particularly children, women and other vulnerable groups. A formalised duty of care would also ensure that such a model incorporates penalties for non-compliance.
5.84
The introduction of such a framework would require further amendments to the Basic Online Safety Expectations to ensure its strength as a key requirement for industry actors. However, the Committee is conscious of the OSA’s short time in operation, and of other ongoing legislative reviews. Further, the eSafety Commissioner’s Safety by Design program may go some way towards addressing the concerns raised regarding the need for compliance with a formalised duty of care framework.
5.85
It would therefore be appropriate to consider the implementation of such enhancements to the duty of care framework within the Basic Online Safety Expectations for social media and other digital platforms as part of a broad-scale review of the entire digital industry and its overarching legislative framework. The review could consider how the principles of a duty of care would address any gaps left unregulated by the OSA, and how the Basic Online Safety Expectations framework could incorporate the duty of care within its model.
5.86
Further, a formal duty of care framework should incorporate the best interests of the child principle as its guiding model, which will ensure online platforms place this concept at the forefront when designing new products and updating existing services.
5.87
The Committee also supports the implementation of the National Principles for Child Safe Organisations for digital media platforms; however, it was observed in evidence that these principles are already designed to apply to digital platforms. A more clearly articulated application of these principles should be considered in any further reform relating to the protection of children online.

Recommendation 20

5.88
The Committee recommends that the Digital Review include in its terms of reference:
The need to strengthen the Basic Online Safety Expectations to incorporate and formalise a statutory duty of care towards users;
The scope and nature of such a duty of care framework, including potential models of implementation and operation;
Potential methods of enforcement to ensure compliance, including penalties for non-compliance; and
The incorporation of the best interests of the child principle as an enforceable obligation on social media and other digital platforms, including potential reporting mechanisms.

Education

5.89
The Committee recognises the urgent need for education for large parts of the community, particularly the most vulnerable users. Having said that, the eSafety Commissioner, in addition to the non-government sector, already provides extensive educational materials. It is not enough, therefore, to state broadly that more education is needed. To be effective, educational campaigns must be targeted and carefully designed to avoid oversaturating recipients or focusing on material that is irrelevant to people’s lived experiences.
5.90
Young people in particular feel that they are oversaturated with material aimed at their age group, yet they are often the most at risk and experience high proportions of harm. This may indicate that current educational material does not sufficiently address the specific needs of young people in regard to online safety.
5.91
The eSafety Commissioner’s work to better understand the needs of young people, and how best to engage with them, is therefore encouraging. The implementation of these findings in eSafety’s educational campaigns will, in the Committee’s view, indicate where further work is needed.

Recommendation 21

5.92
The Committee recommends that the eSafety Commissioner:
Increase the reach of educational programs aimed at young people regarding online harms, with a particular focus on reporting mechanisms and on the fact that some online harms constitute criminal offences;
Formalise a consultation and engagement model with young people, through the Australian Government’s Youth Advisory Council, in regard to educational themes and program delivery; and
Report to the Parliament on the operation and outcomes of the program, including research identifying whether this has resulted in a reduction in online harm for young people.
5.93
The Committee also agrees with the concerns raised by many witnesses to the inquiry that education more broadly does not sufficiently address digital literacy and safety. It also notes the observations of witnesses that education needs to begin in early childhood and continue beyond the secondary schooling years. Separate to these concerns, however, it is clear that a large majority of the adult population similarly does not have the digital literacy skills needed to manage online interactions safely.
5.94
It is a necessary and elemental step in Australia’s fight against online harm to invest in and provide digital education for all Australians, in all community groups, particularly as the digital world develops. Educational programs should be expanded to a much wider range of groups, including the most vulnerable users in society.
5.95
Given that education is required across all age groups and areas of society, the methods by which education is provided should be considered carefully, with an understanding of intersectional factors such as developmental stages, cultural differences, language barriers and access to technology. The eSafety Commissioner should work with relevant government agencies and departments to identify the most appropriate forms of delivering educational campaigns, including incorporation into the National Curriculum.
5.96
While the Committee heard evidence on the prospect of the widescale implementation of a digital licence for children and young people, it does not support making such a licence a mandatory condition of children’s access to the internet. It does, however, believe that a digital licence could serve as an educational tool for school-aged children, in much the same way that children are issued ‘pen licences’.
5.97
The Committee also observed that many people in the community were unaware of the eSafety Commissioner’s powers to remove harmful content. This should be rectified as a matter of urgency, as these powers are arguably of little use if people do not know they exist.
5.98
The Committee further suggests that the Australian Government continue to play a leadership role in working with states and territories to implement digital education programs at the local level.

Recommendation 22

5.99
The Committee recommends that the eSafety Commissioner work in consultation with the Department of Education, Skills and Employment to design and implement a national strategy on online safety education for early childhood, primary school-aged children and secondary school-aged young people, including:
A proposed curriculum, informed by developmental stages and other relevant factors;
Potential methods of rollout, including consultation and engagement with children, young people, child development and psychology experts, digital education experts and other specialists in online harm; and
A roadmap for parents of children in these age groups, detailing methods of addressing online harm.

Recommendation 23

5.100
The Committee recommends that the eSafety Commissioner design and administer an education and awareness campaign aimed at adults, particularly in relation to vulnerable groups such as women, migrant and refugee groups, and people with disabilities, with a focus on the eSafety Commissioner’s powers to remove harmful content and the mechanisms through which people can report harmful content and online abuse.

Recommendation 24

5.101
The Committee recommends that the Australian Government work with states and territories to ensure that relevant law enforcement agencies are appropriately trained on how to support victims of online harm. This should include trauma-informed approaches as well as a comprehensive understanding of police powers and other relevant avenues, such as the relevant powers of the eSafety Commissioner.

Recommendation 25

5.102
The Committee recommends that the Australian Government review funding to the eSafety Commissioner within twelve (12) months to ensure that any of the Committee’s recommendations that are agreed to by the Government and implemented by the Office of the eSafety Commissioner are adequately and appropriately funded for any increased resource requirements.

Engagement with young people

Recommendation 26

5.103
The Committee recommends that the Online Safety Youth Advisory Council, via the eSafety Commissioner, provide a response to this report and its recommendations within six (6) months of its establishment and full membership.
Lucy Wicks MP
Chair
11 March 2022

1 Twitter, Submission 50, p. 3.
2 Twitter, Submission 50, p. 11.
3 Meta, Submission 49, p. 30.
4 Ms Kathleen Reen, Senior Director of Public Policy, Asia-Pacific, Twitter, Committee Hansard, 21 January 2022, p. 21.
5 Twitter, Submission 50, pp 3-4.
6 Twitter, Submission 50, p. 3.
7 Dr Kate Hall, Head of Mental Health and Wellbeing, Australian Football League (AFL), Committee Hansard, 1 February 2022, p. 13.
8 Name Withheld, Submission 60, p. 17.
9 Name Withheld, Submission 60, pp 17-18.
10 Name Withheld, Submission 60, p. 18.
11 Name Withheld, Submission 60, p. 18.
12 Ms Tanya Hosch, Executive General Manager Inclusion and Social Policy, AFL, Committee Hansard, 1 February 2022, p. 17.
13 Amanda Third, Professorial Research Fellow, Institute for Culture and Society, Western Sydney University; Co-Director, Young and Resilient Research Centre, Western Sydney University, Committee Hansard, 21 December 2021, p. 30.
14 eSafety Commissioner, Submission 53, p. 61.
15 Ms Sarah Davies, Chief Executive Officer, Alannah and Madeline Foundation (AMF), Committee Hansard, 21 December 2021, p. 18.
16 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 18.
17 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 18.
18 Dr Michael Salter, Committee Hansard, 18 January 2022, p. 16.
19 Dr Michael Salter, Committee Hansard, 18 January 2022, p. 16.
20 Dr Kate Hall, AFL, Committee Hansard, 1 February 2022, p. 12.
21 Ms Frances Haugen, Committee Hansard, 3 February 2022, p. 1.
22 Dr Michael Salter, Committee Hansard, 18 January 2022, p. 11.
23 Centre for Digital Wellbeing (CDW), Submission 47, p. 5.
24 CDW, Submission 47, p. 5.
25 CDW, Submission 47, p. 51.
26 CDW, Submission 47, p. 51.
27 Professor Lorna Woods, Professor of Internet Law, University of Essex, Committee Hansard, 27 January 2022, p. 34.
28 Baroness Beeban Kidron OBE, Peer, House of Lords, United Kingdom, Committee Hansard, 27 January 2022, p. 36.
29 Mr Damian Collins MP, Chair, Joint Committee on the Draft Online Safety Bill, House of Commons, United Kingdom, Committee Hansard, 27 January 2022, p. 37.
30 Ms Anne Hollands, National Children’s Commissioner, Australian Human Rights Commission (AHRC), Committee Hansard, 2 March 2022, pp 1-2.
31 Ms Anne Hollands, National Children’s Commissioner, AHRC, Committee Hansard, 2 March 2022, p. 3.
32 National Children’s Commissioner, Submission 64, p. 4.
33 National Children’s Commissioner, Submission 64, p. 5.
34 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 16.
35 ReachOut Australia, Submission 36, p. 11.
36 Orygen, Submission 27.1, p. 2.
37 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 16.
38 Ms Anne Hollands, National Children’s Commissioner, AHRC, Committee Hansard, 2 March 2022, p. 1.
39 Attorney-General’s Department (AGD), Explanatory paper – Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021, October 2021, available at: https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/user_uploads/online-privacy-bill-explanatory-paper.pdf (accessed 9 February 2022), p. 11.
40 AGD, Explanatory paper – Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021, October 2021, available at: https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/user_uploads/online-privacy-bill-explanatory-paper.pdf (accessed 9 February 2022), p. 11.
41 AHRC, About the National Principles, available at: https://childsafe.humanrights.gov.au/national-principles/about-national-principles (accessed 28 January 2022).
42 AHRC, About the National Principles, available at: https://childsafe.humanrights.gov.au/national-principles/about-national-principles (accessed 28 January 2022).
43 AHRC, About the National Principles, available at: https://childsafe.humanrights.gov.au/national-principles/about-national-principles (accessed 28 January 2022).
44 Ms Anne Hollands, National Children’s Commissioner, AHRC, Committee Hansard, 2 March 2022, p. 1.
45 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 17.
46 Ms Kathryn Mandla, Head, Advocacy and Research, yourtown, Committee Hansard, 21 December 2021, p. 33.
47 Ms Kathryn Mandla, yourtown, Committee Hansard, 21 December 2021, p. 33.
48 Ms Kathryn Mandla, yourtown, Committee Hansard, 21 December 2021, p. 33.
49 eSafety Commissioner, Submission 53, p. 3.
50 eSafety Commissioner, Submission 53, p. 7.
51 eSafety Commissioner, Submission 53, p. 8.
52 The Association of Heads of Independent Schools Australia, Submission 24, p. 6.
53 AMF, Submission 2.
54 Body Safety Australia, Submission 59.
55 The Carly Ryan Foundation (CRF), Submission 54.
56 Dolly’s Dream, Submission 4.
57 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 17.
58 The Association of Heads of Independent Schools Australia, Submission 24, p. 5.
59 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 21.
60 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 21.
61 CDW, Submission 47, p. 5.
62 Council of Catholic School Parents NSW ACT (CCSP), Submission 32, p. 3.
63 CCSP, Submission 32, p. 3.
64 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 20.
65 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 17.
66 Ms Kathryn Mandla, yourtown, Committee Hansard, 21 December 2021, p. 33.
67 Body Safety Australia, Submission 59, p. 4.
68 Ms Sarah Davies, AMF, Committee Hansard, 21 December 2021, p. 20.
69 Ms Kate Everett, Founder, Dolly’s Dream, Committee Hansard, 27 January 2022, p. 5.
70 Ms Kate Everett, Founder, Dolly’s Dream, Committee Hansard, 27 January 2022, p. 5.
71 Young & Resilient Research Centre, Consultations with young people to inform the eSafety Commissioner’s Engagement Strategy for Young People, 2021, available at: https://www.esafety.gov.au/sites/default/files/2022-01/YRRC%20Research%20Report%20eSafety%202021_web%20V06%20-%20publishing_1.pdf (accessed 28 January 2022), p. 6.
72 Young & Resilient Research Centre, Consultations with young people to inform the eSafety Commissioner’s Engagement Strategy for Young People, 2021, available at: https://www.esafety.gov.au/sites/default/files/2022-01/YRRC%20Research%20Report%20eSafety%202021_web%20V06%20-%20publishing_1.pdf (accessed 28 January 2022), p. 5.
73 Young & Resilient Research Centre, Consultations with young people to inform the eSafety Commissioner’s Engagement Strategy for Young People, 2021, available at: https://www.esafety.gov.au/sites/default/files/2022-01/YRRC%20Research%20Report%20eSafety%202021_web%20V06%20-%20publishing_1.pdf (accessed 28 January 2022), p. 12.
74 Young & Resilient Research Centre, Consultations with young people to inform the eSafety Commissioner’s Engagement Strategy for Young People, 2021, available at: https://www.esafety.gov.au/sites/default/files/2022-01/YRRC%20Research%20Report%20eSafety%202021_web%20V06%20-%20publishing_1.pdf (accessed 28 January 2022), p. 6.
75 CRF, Submission 54, p. 8.
76 CRF, Submission 54, p. 8.
77 Ms Christine Morgan, Chief Executive Officer and Prime Minister’s National Suicide Prevention Adviser, National Mental Health Commission (NMHC), Committee Hansard, 21 January 2022, p. 9.
78 eSafety Commissioner, Submission 53, p. 8.
79 eSafety Commissioner, Submission 53, pp 8-9.
80 National Children’s Commissioner, Submission 64, p. 5.
81 National Children’s Commissioner, Submission 64, p. 5.
82 Dr Jessie Mitchell, Advocacy Manager, AMF, Committee Hansard, 21 December 2021, p. 18.
83 CCSP, Submission 32, p. 3.
84 CCSP, Submission 32, p. 3.
85 CCSP, Submission 32, p. 3.
86 Ms Christine Morgan, NMHC, Committee Hansard, 21 January 2022, p. 6.
87 Dr Jessie Mitchell, AMF, Committee Hansard, 21 December 2021, p. 18.
88 Ms Julie Inman Grant, eSafety Commissioner, Office of the eSafety Commissioner, Additional Estimates 2021-22, Senate Environment and Communications Legislation Committee, Committee Hansard, 15 February 2022, p. 25.
89 Ms Julie Inman Grant, eSafety Commissioner, Office of the eSafety Commissioner, Additional Estimates 2021-22, Senate Environment and Communications Legislation Committee, Committee Hansard, 15 February 2022, p. 29.
90 Body Safety Australia, Submission 59, p. 4.
91 Dr Michael Salter, Committee Hansard, 18 January 2022, p. 14.
92 Dr Michael Salter, Committee Hansard, 18 January 2022, p. 14.
93 Dr Michael Salter, Committee Hansard, 18 January 2022, pp 14-15.
94 Dr Michael Salter, Committee Hansard, 18 January 2022, p. 14.
95 Ms Kathryn Mandla, yourtown, Committee Hansard, 21 December 2021, p. 33.
96 Dr Marion Byrne, Manager, Advocacy, Research and Innovation, yourtown, Committee Hansard, 21 December 2021, p. 33.
97 Mr Matt Berriman, Chair, Mental Health Australia, Committee Hansard, 21 January 2022, p. 22.
98 Meta, Culture at Meta, available at: https://www.metacareers.com/facebook-life/ (accessed 28 February 2022).
