Chapter 2

Key issues

2.1
Overwhelmingly, submitters and witnesses expressed support for the objectives and many provisions of the OS Bill.1 They agreed that the Australian Government should be improving and promoting online safety for Australians, especially children and young people.2
2.2
Most submitters appreciated the public consultations conducted by the department but raised concerns about matters that they argued have not been addressed in the OS Bill and which some considered long overdue for reform.
2.3
This chapter outlines some of the key concerns, being matters regarding:
the functions, powers and oversight of the eSafety Commissioner (Commissioner);
the clarity and breadth of the basic online safety expectations (BOSE);
services in scope of the online content regulation schemes;
the clarity and proportionality of the blocking scheme;
the appropriate basis for the Online Content Scheme;
the reduced response time for removal notices; and
the public consultation processes associated with the bills.

Functions, powers and oversight of the Commissioner

2.4
Many submitters and witnesses questioned the proposed provisions in the OS Bill relating to the functions, powers and oversight of the Commissioner.3 In particular, they argued that there is no clear justification for the expansion of the Commissioner's powers given that, for example:
existing powers have been infrequently used;4 and
providers' actions toward combatting cyber-bullying, cyber-abuse, the non-consensual sharing of images and abhorrent violent conduct render the proposed powers redundant.5
2.5
Submitters and witnesses supported the addition of further provisions in the OS Bill that would increase transparency regarding the operation of the Commissioner:
the Interactive Games & Entertainment Association commented that 'there are very few transparency or reporting obligations imposed on the Commissioner, even at a basic statistical level';6
Assembly Four suggested that there should be a 'transparent reporting process on a semi-annual basis';7
Electronic Frontiers Australia added that comprehensive reporting would provide 'a more accurate picture of the effectiveness of the powers granted to the eSafety Commissioner in achieving [its] stated objectives'.8
2.6
Some submitters voiced concerns about the proposal to invest significant and broad powers in a single, unelected official. Scarlet Alliance, for example, expressed its concern about giving:
…disproportionate power to an unelected public bureaucrat to decide what types of content are viewable to Australian end-users... This power, combined with both a lack of oversight or rigorous public reporting requirements, leaves room for the individual holding this position to approach their task with a great deal of subjectivity and little accountability.9
2.7
For these reasons, several submitters contended that the OS Bill would benefit from the inclusion of additional checks and balances, for example, a multi-stakeholder body to oversee the operations of the Commissioner or more regular review or parliamentary scrutiny.10
2.8
Electronic Frontiers Australia indicated that neither of these options would satisfy its concerns, as, in its view, it should not be the role of the Commissioner to decide limitations on freedom of expression:
While it is appropriate for there to be targeted, carefully designed limitations to freedom of speech, the exact limits on those freedoms and where they should lie is a matter of intense, perpetual, debate. It is for Parliament to decide, after due consideration, whether to move the boundaries. It is anathema to both freedom of speech and the rule of law for Parliament to delegate to a regulator the power to set limits on individual speech.11
2.9
The committee notes that the OS Bill provides for an independent review of its operations within three years of the commencement of the Act.12

Commissioner response

2.10
Ms Julie Inman Grant, the Commissioner, strongly supported the OS Bill:
…the online safety legislation that is proposed will enhance our ability to protect Australians of all ages from online harms, and this is precisely our primary mission: to enable us to build on our almost six years of unique regulatory experience in tackling targeted online abuse and to provide us with a new set of tools to offer Australians online safety protection that no other country in the world has afforded its citizens.13
2.11
In performing this function, the Commissioner told the committee that her office must be as transparent and as accountable as possible. She noted that there are existing mechanisms for public accountability:
We already openly publish our research, investigative trends, corporate strategy and produce an annual report. We do release our reporting numbers and are currently building our in-house "Insights capability" to release as much relevant data as we can. I also appear before estimates and other committees and have accountabilities under the [Public Governance, Performance and Accountability Act 2013].14
2.12
Ms Inman Grant highlighted, however, that there are important constraints on transparency, such as privacy, operational and resourcing considerations:
We remain a very small but nimble agency that receives a comparatively high number of complaints, so we also need to ensure that reporting requirements are not so onerous that they undermine our ability to provide compassionate citizen service. If the proposed legislation is enacted, we will release detailed regulatory guidance and publicly release our regulatory priorities.15

Internal review and appeal mechanisms

2.13
Some submitters focussed on the issue of legislative safeguards. In particular, they argued that the OS Bill should include improved review and/or appeal mechanisms for providers or people affected by the exercise of the Commissioner's powers, noting that the Commissioner would be indemnified in civil proceedings and for damages.16
2.14
The Communications Alliance submitted, for example, that appeals would only be possible to the Administrative Appeals Tribunal (AAT):
This limited avenue for recourse for providers (and end-users) is disproportionate to the civil penalties envisaged... It would be more appropriate and practical (also for the AAT) if the Online Safety Act provided for an internal review of a decision by the eSafety Commissioner. Internal review processes are not unusual – for example, the National Disability Insurance Scheme requires that an internal review be undertaken prior to a matter being escalated to the AAT.17
2.15
Scarlet Alliance questioned whether the right to appeal to the AAT would be of any value given the breadth of the Commissioner's proposed powers and the burden placed upon an applicant for review. Its submission argued that there is a lack of procedural fairness implicit in, for example, the proposed Online Content Scheme:
Within the Online Content Scheme, there is no process outlined for users to be notified that their material may be removed, no notice period offered to them, and no opportunity for them to be given a hearing to speak or write back in relation to the complaint. The expedited 24 hour time frame removes any opportunity for the Commissioner to develop or communicate considered reasons. That the office of the Commissioner is indemnified from liability for civil damages to those suffering the consequences of inappropriate or even illegal notices issued by the Commissioner is cause for further concern.18
2.16
Twitter indicated that concerns regarding administrative review must be viewed in the context of the OS Bill as a whole, where there is no concomitant level of oversight of the Commissioner:
While section 220 of the Bill allows for an application to the Administrative Appeals Tribunal…to review a specific decision by the eSafety Commissioner, there are no regular or routine avenues for recourse, oversight processes, or transparency structures to counterbalance the extraordinary discretion vested within the eSafety Commissioner.19

Commissioner response

2.17
In addition to the transparency and accountability mechanisms outlined above, the Commissioner noted that the OS Bill provides for various types of review:
Key decisions under all of our statutory schemes may be referred to the Administrative Appeals Tribunal for review… We will also be subject to parliamentary scrutiny with respect to the Basic Online Safety Expectations and also with respect to our [abhorrent violent material] and ISP blocking powers. Any government decision can always be referred to the Commonwealth Ombudsman. And in certain circumstances, a party may seek judicial review of a decision made by a Commonwealth agency in the Federal Court or the High Court.20

Basic Online Safety Expectations

2.18
Submitters supported provisions in the OS Bill that would give the Minister the power to determine the BOSE.21 However, some submitters expressed concerns about when that might happen and what could be included in a determination.22
2.19
The Australian Lawyers Alliance argued, for example, that the core expectations that must be included in any determination are so broad that 'there is a significant risk that this will result in excessive proactive monitoring and removal of content that falls under Class 1 and 2 of the National Classification Code' (NCC) (see also Online Content Scheme below).23
2.20
The Australian Lawyers Alliance went on to say that it would not be appropriate, indeed it would be dangerous, to vest such broad discretion over those materials in a single individual, who would then be empowered to determine community expectations.24
2.21
Global Partners Digital considered that it would be preferable if the Minister were required to consider specific factors when making a BOSE determination, and that the determination should be as narrow as possible: 'This would help ensure that companies are only required to satisfy the [BOSE] where there is clear evidence or risk of harm'.25

Commissioner and departmental response

2.22
The Commissioner said that the BOSE is a key feature of the OS Bill: 'The time has come to be very clear with internet companies about what we expect from them when operating in this country'. She noted that the provisions in the OS Bill also provide powerful transparency tools to hold digital platforms to account.26
2.23
A representative from the department reminded the committee that the Minister's determination will be a legislative instrument for the purposes of the Legislation Act 2003. This means that it will have to be registered in the Federal Register of Legislation, tabled in the Parliament and subject to Parliamentary disallowance.27
2.24
Ms Bridget Gannon, also from the department, further highlighted that the primary legislation—that is, the OS Bill once enacted—will set out the core expectations and therefore provide guidance as to what will comprise the BOSE.28

Online content regulation schemes

2.25
Submitters broadly endorsed the harmonisation of online content regulation schemes, as proposed in the OS Bill.29 Twitter, for example, submitted:
We are supportive of the Bill's overarching purpose to harmonise and achieve better consistency across the five different content schemes under the eSafety Commissioner's purview including: the existing Cyberbullying Scheme for children in Part 5; the existing Image-based Abuse Scheme in Part 6; the new proposed Cyber-abuse Scheme for adults in Part 7; the new proposed Abhorrent Violent Material Blocking Scheme in Part 8; and the new Online Content Scheme in Part 9 originally under schedules 5 and 7 of the Broadcasting Services Act 1992.30
2.26
The Queensland Family and Child Commission explicitly endorsed the enhanced cyber-bullying scheme for children and young people, highlighting that there is an ongoing need for such a scheme:
Cyberbullying remains a serious and pervasive issue for children and young people, and it is important that online providers take proactive steps and act quickly to minimise the trauma experienced by children and young people who are experiencing cyberbullying.31

2.27
Submitters raised concerns, however, about the scope of services captured by the proposed provisions for the complaints-based schemes. In their view, a 'one-size-fits-all' approach is not appropriate and would create significant business and compliance challenges for many services.32
2.28
The Communications Alliance, for example, questioned what would be an appropriate outcome for services that were unable to comply with the regulatory requirements:
For example, messaging services (e.g. WhatsApp, Signal, Telegram) are often end-to-end encrypted and may not offer an option for removal of individual parts of a conversation. Does this mean that user accounts would be required to be suspended, restricted or terminated when a complaint (that has been found valid) about cyber-abuse material has been received? It is not clear that wholesale suspension from a messaging service is a proportionate response to a report of bullying and harassment – especially given how nuanced and complex private conversations between adults can be.33
2.29
DIGI noted the different approach in overseas jurisdictions, namely, the United Kingdom's decision to develop a framework 'designed to reduce the burden on UK business by focussing on the areas that present the greatest risk of harm' and Germany's Netzwerkdurchsetzungsgesetz (NetzDG) law that exempts certain services (such as messaging services).34
2.30
A number of submitters commented on potential jurisdictional impacts of the OS Bill, for example:
the Interactive Games & Entertainment Association submitted that providers should not be compelled to provide personal data on end-users based outside of Australia, which would involve complex legal considerations;35 and
the Uniting Church in Australia, Synod of Victoria and Tasmania submitted that the Commissioner should be empowered to address complaints whenever an Australian is the alleged perpetrator of cyber-bullying, regardless of the location or visa status of the victim:
If a tourist in Australia is subject to cyber-bullying by an Australian resident, we would expect that the e-Safety Commissioner should deal with such a case. We would hope by setting this example, if an Australian living temporarily or visiting an overseas location was subjected to cyber-bullying by a resident of that jurisdiction, the local law enforcement bodies would assist the Australian.36

Commissioner response

2.31
The Commissioner emphasised that Australia's youth-based cyber-bullying scheme was 'a world first' and that the adult-based cyber-abuse scheme similarly aims to be 'a world first'. Ms Inman Grant highlighted that a significant feature of the OS Bill is its wide scope:
[A] really important feature of the proposed legislation is that it would expand our remit to cover a range of online services that have been used to harm Australian citizens. Sadly, online abuse doesn't just happen on social media. It happens everywhere, from domestic violence victims receiving harassing emails, texts or direct messages to children receiving unwanted contact on dating apps or gaming platforms.37

Blocking scheme

2.32
Many submitters acknowledged the need to manage violent content online (such as the live-streaming of the shootings in Christchurch). However, some did not support the proposal to give the Commissioner the power to request or require ISPs to block access to abhorrent violent material, and others contended that the proposed blocking scheme is not the appropriate solution as it is too broad.38
2.33
The Australian Lawyers Alliance, for example, submitted:
…the proposed scheme has significant overreaches and thus fails to strike the appropriate balance between protection against abhorrent violent material and due process for determining whether content comes within that classification.39
2.34
The Australian Lawyers Alliance noted particularly that the Commissioner is not required to afford any procedural fairness in the exercise of the blocking power and, similar to other submitters (see Functions, powers and oversight of the Commissioner above), contended that there should be some form of internal review and appeal.40

2.35
Submitters noted also that, while there are exemptions to the blocking power, the exemptions have some limitations. For example, the exemption for material relating to news or current affairs reporting relies upon the Commissioner's determination of what is 'in the public interest', which the Australian Lawyers Alliance submitted could be interpreted in such a way as to undermine 'these critical avenues for accountability'.41
2.36
The Communications Alliance noted that there is no exemption for material produced by bystanders, which Digital Rights Watch argued is critical:
In some circumstances, violent acts captured and shared online can be of vital importance to hold those in power accountable, to shine the light on otherwise hidden human rights violations, and be the catalyst for social change.42

The new Online Content Scheme

2.37
A high proportion of submitters did not support some of the provisions that would establish the proposed Online Content Scheme.43 Many argued that these provisions—primarily the key definitions of 'Class 1 material'44 and 'Class 2 material'45—encompass consensual adult content that cannot be considered harmful and therefore falls outside the objectives of the OS Bill.46
2.38
The Eros Association, for example, submitted that 'the role of the eSafety Commissioner should be to focus on non-consensual, abusive and harmful content and not imagery of consensual sexual activity between adults'.47
2.39
Scarlet Alliance echoed these views and added that, by adopting definitions from the NCC, 'the Bill replicates and reproduces the problems of existing classification and broadcasting laws, which have long been critiqued by industry and civil society stakeholders'.48
2.40
Digital Rights Watch added: 'Further using [those definitions] as the basis of broad and discretionary powers entrusted to an administrative official with no appropriate accountability mechanisms and no judicial oversight is a concerning development'.49
2.41
Other submitters clearly distinguished between Class 1 material and Class 2 material. For example, the Interactive Games & Entertainment Association considered that Class 2 material should not be included in the proposed Online Content Scheme:
There is a significant gulf in both policy and practical terms between material that has or would be classified RC (and even X18+) on one hand, and R18+ material on the other. R18+ material is legal and permissible material that is already regulated under the National Classification Scheme. Including R18+ material within the scope of the Online Content Scheme will result in double regulation.50

Classification Review

2.42
A number of submitters highlighted a current review into the development of a modern framework for classification to reflect the digital environment, meet the needs of industry, and provide appropriate information and protections for Australians (the Classification Review).51
2.43
Sex Work Law Reform Victoria submitted that 'the [Classification Review] is the appropriate avenue to consider changes to classification of online content and changes to the Online Content Scheme'.52
2.44
Alternatively, other submitters argued that the OS Bill should not proceed until the outcomes of the Classification Review are known:
the Eros Association argued that these outcomes might shortly render key terms within the OS Bill ('Class 1 material' and 'Class 2 material') 'obsolete';53
the Communications Alliance more broadly agreed that 'it is not clear how potential findings of [the] review will interact with the proposed new Online Safety Act';54 and
the Australian Lawyers Alliance submitted that '[the OS] Bill should not be reliant on…an outdated classification system'.55
2.45
Submitters noted that the proposed provisions for the Online Content Scheme do not incorporate elements of procedural fairness (such as prior notice of an impending takedown). The Victorian Pride Lobby suggested that this situation could lead Australian businesses to host adult content on overseas platforms, denying custom to Australian web-hosting businesses and placing the content beyond Australia's jurisdiction.56

Commissioner response

2.46
The Commissioner noted that, as an independent statutory authority, she is bound to follow the legislative and regulatory framework in making decisions: 'it has to be a test using [that framework], and it has to stand up to AAT review and in a court of law'.57
2.47
Ms Inman Grant reiterated that a goal of the proposed Online Content Scheme is to modernise and pivot the Broadcasting Services Act 1992, which was:
…based on a morality and obscenity code—to one that uses an online harms lens, which we think is a more accurate and a better way to not put values and morality into the decision-making process and really look at the tweeter, the post, and the damage and harm it's causing to the person on its face. And so that's why I think updating and modernising the scheme is really important, because community standards have changed over that time as well.58

Impacts on stakeholders in the online adult industry

2.48
Several submitters considered that the proposed Online Content Scheme would adversely impact a broad range of industries and people associated with the online adult industry, including producers, sex workers, retailers and entertainment venues, as well as specific population groups, such as netporn consumers, LGBTQIA+ and disabled communities.59
2.49
Ms Pixie Styx, who works within the online adult industry, illustrated the personal circumstances of many workers:
I live with several disabilities… I am also a full-time student and sex work has given me the ability to attend my classes and schedule shifts at times that suit me. As a member of the LGBTQIA+ community, sex work also gives me the freedom of self-expression I require to assist with my gender presentation and the easing of dysphoria associated with said expression. It is imperative that I be able to continue my work in this industry as it is one of the few that provides the flexibility, freedom and financial security I require while living under my current circumstances.60
2.50
Dr John O'Brien, an expert in workers' rights and industrial regulation, submitted that there is no justification for creating an environment that restricts sex workers' capacity to work:
Other businesses use social media and online platforms to advertise their products and services. Why, then, should the services of sex workers be afforded fewer rights than other businesses? The sex work is largely lawful (but regulated) in Australia and thus should not be subject to discriminatory regulation.61
2.51
Some submitters contended specifically that the proposed provisions would have human rights implications, including for the rights to privacy and freedom of expression guaranteed under Articles 17 and 19 of the International Covenant on Civil and Political Rights respectively, and would be contrary to Australia's international human rights obligations.62
2.52
The Victorian Pride Lobby submitted, for example, that '[Classifications can lead] to pornographers making compromises in terms of aesthetics and ethics – or in other words, self-censoring – in order to meet classification requirements and avoid law enforcement'.63
2.53
Numerous submitters referenced legislation in the United States that aimed to combat sex trafficking by penalising online publishers of adult content connected to sex trafficking.64 These submitters argued that the US legislation—which is similar in some respects to the OS Bill—significantly impacted the well-being and livelihoods of sex workers, and led to risk-averse behaviours from providers and producers.65

Cessation of services

2.54
Some submitters noted provisions in the OS Bill that would enable the Commissioner to seek an order from the Federal Court of Australia, in certain circumstances, for the cessation of services in Australia by a provider.66 For example, Twitch submitted:
…any scheme that justifies mandating the complete removal of a service on the basis of its non-compliance with notices should also take considerable steps to establish confidence that the service is demonstrating actual noncompliance, before proceeding to upstream disruption powers. It is desirable that the drafting is amended to better address issues of procedural fairness (i.e. confirming the notice has been sent to the right person and having an opportunity to work with the Commissioner before a Court application is made) and validity (i.e. there should be a mechanism by which a service can clarify or object regarding the issuing of a notice).67

Departmental response

2.55
A departmental representative noted that the Commissioner would not herself be able to effect the cessation of a provider's services in Australia. Instead, a judicial body would be given the power to make that decision in certain circumstances and only as a last resort.68
2.56
Ms Gannon acknowledged that both parties in the process would have the opportunity to present their case and there would be quite a 'high test', whereby the provider must represent a significant community safety risk: 'It's not merely two contraventions of the legislation'.69

Industry codes

2.57
A few submitters commented on provisions in the OS Bill that would enable bodies and associations that represent sections of the online industry to develop industry codes. They noted particularly that an industry code should be registered under the Act within six months of commencement.70
2.58
While welcoming these provisions, the Communications Alliance expressed concerns about the proposed six-month timeframe:
Over the past two decades, Communications Alliance has developed and revised hundreds of industry codes and standards for various elements of the communications industry and the related consumer experience environment. Drawing on this experience, we are concerned that the proposed timeframe of 6 months for registration (as opposed to development of a draft for consideration for registration) of an industry code, is extremely short or indeed unrealistic. This timeframe becomes even more unrealistic considering that several codes are likely to be required to cater for the needs of the different sections of the online industry.71
2.59
Mr John Stanton, Chief Executive Officer of the Communications Alliance, highlighted the complexities of developing codes across industry and indicated that nine to 12 months might be sufficient for industry to provide a draft code to the Commissioner: 'for a complex code, guaranteeing that it's done within six months, including registration, is certainly very challenging, unless it's a very simple code, which these ones will not be'.72
2.60
DIGI agreed that six months is an unrealistic timeframe and 12 or more months would be necessary to ensure proper compliance with the new requirement:
…the development of an industry code comes with a significant investment of time, expense and adjustment of other priorities... Based on our direct experience, we suggest that [the proposed] timeframe is unrealistic. Firstly, it may not allow enough time for the [Commissioner] to develop clear written guidance to the industry about the scope of the code. It is critical that there is a clear understanding of the specific responsibilities, deliverables and lead times required by both regulators and industry participants if a set timetable is to be achieved with the development of an industry code. DIGI's experience is that a workable code requires a minimum of 12 months to develop, from the time that the Commissioner releases written guidance on the code for the industry.73

Commissioner's response

2.61
Mrs Maria Vassiliadis, representing the Commissioner, pointed out that the relevant provision for registration of an industry code refers to 'reasonable efforts' being made and that, accordingly, 'our position would be that there is scope there to potentially make it longer than the way it's structured'.74
2.62
Ms Inman Grant added that there are complexities across industry and that developing industry codes will be time and labour intensive; however, 'we want to make sure we get them right'.75

Reduced response time for removal notices

2.63
Many services highlighted their commitment to protecting their users online, for example, by ensuring that illegal and harmful content is dealt with as quickly as possible or is proactively prevented through online safety features, awareness-raising and education.76
2.64
The Communications Alliance submitted, for example:
…social media platforms and search engines dedicate vast amounts of time and resources to minimise abuse of their services and potential harm that may result from content that is accessible through their services. The overwhelming majority of abuses are detected and removed by the major platforms proactively and without requiring or using an internal or external escalation mechanism.77
2.65
Microsoft acknowledged however that 'voluntary, industry efforts are necessary but not always sufficient to address the full range of harms online'. Its submission supported further action by all stakeholder groups, with 'principled and carefully calibrated regulatory efforts' an important part of ensuring online safety.78
2.66
On this point, some submitters noted collaborative efforts among government and industry to enhance online safety—such as the Safety By Design initiative79—while others highlighted initiatives that they employ across their global operations—such as Microsoft's harmonised safety principles, which inform its thinking on regulatory developments across jurisdictions.80

24-hour turnaround

2.67
Submitters strongly agreed that online content that harms children and young people should be taken down expeditiously. The Queensland Family and Child Commission, for example, endorsed provisions in the OS Bill that would require providers to remove harmful material within 24 hours:
The increase of children and young people accessing the internet has seen a corresponding upward trend in cases of online child sexual exploitation, including image-based abuse and online grooming... Harmful online content has the power to dramatically affect the lives of children and young people, and it is imperative that harmful content is taken down as quickly as possible.81
2.68
The Uniting Church in Australia, Synod of Victoria and Tasmania similarly submitted that a mandatory provision is both desirable and necessary:
Even for child sexual abuse material, providers have been slow at removing illegal content and have contested its removal… Given the delay and contestation of illegal material concerning child sexual abuse material, it is likely there will be many providers that would seek to contest or delay the removal of other material such as cyber-bullying material, abhorrent violent material and intimate images posted without consent unless compelled to comply by law and under threat of penalty.82
2.69
Some submitters questioned, however, the need to reduce the current 48-hour turnaround. Google Australia, for example, observed that 'the eSafety Commissioner has made repeated references to the fact that most platforms remove content upon receiving a request from her Office very promptly'.83
2.70
In these circumstances, the Communications Alliance argued:
It appears unwarranted to shorten the timeframes and, at the same time, to expand the scope of services and providers captured under the scheme to include a wide variety of (often very small) services… Should Government proceed with a reduction to a 24-hour time period, we believe there should be exceptions where an investigation requires more time to determine the nature and circumstances of the content, or where consideration of an appeal from the party whose content is to be removed is required.84

2.71
Other submitters concurred that not all takedown requests can be expeditiously or fairly accommodated and, in those cases, argued that there is a need for legislative flexibility. In their view, the OS Bill should stipulate a more practical requirement, for example: 'with all due speed'; 'without undue delay'; 'as soon as practicable'; or 'expeditiously'.85
2.72
Some submitters referenced overseas regulatory frameworks, particularly Germany's NetzDG law (where a notice can require the removal of illegal online content within 24 hours). However, these submitters explicitly rejected the suggestion that Germany's law is consistent with the 24-hour turnaround proposed in the OS Bill.86 DIGI, for example, submitted:
…the Netzwerkdurchsetzungsgesetz ("NetzDG") law in Germany relates only to "illegal" content by cross-referencing the German Criminal Code… Content that is not clearly illegal under NetzDG is subject to a seven day review period. There is no subjective discretion for any regulatory body in Germany to demand content be removed, nor to demand that such content is removed within 24 hours on that basis.87

Commissioner's response

2.73
Ms Inman Grant described the 24-hour turnaround as an important inclusion in the OS Bill:
One important thing we've learned over the past several years…is that the quicker we get this harmful content taken down…the quicker we relieve the trauma and stress the victim experiences because of this abuse. This is really at its heart a victim-focussed provision.88
2.74
She added:
Moving the deadline from 48 to 24 hours should not be a big burden for big tech. Twenty-four hours is the standard time frame for removal of [Child Sexual Abuse Material] and terrorist content in other countries, such as Germany.89
2.75
Ms Inman Grant acknowledged that, in some cases, flexibility will be required: 'if a particularly complex case arose and a platform told us they would need more time, we would of course take that into consideration'.90

Public consultation processes associated with the bills

2.76
Numerous submitters and witnesses expressed a view that the first public consultation process for online safety reform, which commenced in December 2019, was productive. Submitters noted that several suggestions were incorporated into the exposure draft of the OS Bill.91
2.77
Public consultations on the exposure draft of the OS Bill, which commenced in December 2020, closed on 14 February 2021, ten days prior to the OS Bill's introduction into the Parliament. Many submitters commented on this short timeframe, as well as the timeframe allowed for this inquiry. For example, Digital Rights Watch stated:
…we are submitting to your Committee the same concerns and recommendations as we did to the government's initial consultation on the draft Bill, just over two weeks ago. We are extremely concerned that in spite of receiving over 370 submissions, the government moved the Bill to the Parliament with no notable amendments or alterations, and we are further concerned regarding the time pressure on your Committee to produce a final report by 11 March 2021. This consultation process is disproportionate to the large-scale impact of the powers presented in the draft Bill and sets up a disastrous precedent globally for how such legislation may be designed in democratic institutions.92
2.78
Similarly, Microsoft made the following comments:
The passage of the Bill through the Parliament has been completed at pace, with little opportunity for analysis and comment. Consultation on the exposure draft closed on 14 February, and the revised Bill was introduced to the House on 24 February, before being referred to this Committee on 25 February. Little time has been available for providers and other stakeholders to understand the changes that have been made to the Bill, and the potential implications of these before submitting to this Committee.93

Commissioner and departmental response

2.79
The department provided the committee with detailed information on the conduct of the two public consultation processes. In particular, it outlined the extent of stakeholder engagement and indicated the number of submissions received throughout the consultations (see Appendix 3).94
2.80
A departmental representative outlined the manner in which submissions had been received and published online for the consultation that had commenced in December 2020. With respect to this feedback, the representative advised that the Minister had further considered 56 matters and made seven technical amendments to the OS Bill:
…many of those issues were issues that the government had made a deliberate policy decision on as part of previous decision-making processes, informed by previous consultation and informed by the government's election commitment at the last election.95
2.81
Ms Gannon noted that the amendments were also informed by advice from the Commissioner and the Attorney-General's Department.96

Committee view

2.82
The committee recognises that the internet has brought substantial and widespread benefits to Australians, while also increasing opportunities for online harm.
2.83
Since at least 2018, the Australian Government has acknowledged the need to consider improvements to online safety. In December 2019, the government began extensive consultations with affected stakeholders and has expeditiously introduced the bills following those consultations.
2.84
The committee welcomes the government's attention to this important issue and notes that submitters and witnesses strongly supported the bills' objectives.
2.85
The committee accepts that, in giving effect to Australian Government policy, the department has for well over 12 months engaged in extensive consultations, which in the committee's view have provided sufficient opportunity to understand the provisions set out in the OS Bill and to provide the government with feedback.
2.86
The committee also notes that the Australian Government and the department considered, in detail, feedback received throughout the public consultations. The committee acknowledges that this might not have been apparent to all stakeholders; however, the committee particularly notes the department's advice that the consultations have informed the OS Bill.
2.87
Turning to the provisions of the OS Bill, the committee heard concerns about a lack of transparency, accountability and review in relation to the exercise of the Commissioner's proposed powers. The committee notes that there are extensive existing mechanisms for public and parliamentary scrutiny, as well as provision for statutory independent review of the operation of the OS Act, and administrative and judicial review of individual decisions made by the Commissioner. Despite this, some committee members continue to share the particular concerns expressly identified by some submitters and witnesses regarding transparency and reporting in the exercise of the Commissioner's powers.
2.88
The committee also heard concerns about the potential breadth of the BOSE. The committee notes the broad BOSE framework set out in the primary legislation and that the Parliament will have oversight of the Minister's BOSE determination, which will be a disallowable legislative instrument.
2.89
In relation to the online content regulation schemes, the committee supports the harmonisation and extension of the arrangements designed to mitigate online harm. As noted by the Commissioner, and others, there is an increasing need to cover a broad range of digital services, although, as apparent in information provided to the committee, this will be an ongoing challenge in such a complex and dynamic environment.
2.90
The committee recognises that there is also a need to mitigate the potential harm caused by abhorrent violent content online. The committee notes that some submitters did not agree with the proposed blocking scheme on the grounds that it does not sufficiently provide for procedural fairness. The committee notes however that ultimately these matters would be independently determined through judicial or administrative processes, where both parties would have the opportunity to present their case.
2.91
Many submitters and witnesses voiced concerns about the proposed Online Content Scheme. Critically, these concerns stem from key definitions which were said to be outdated and which are part of a current review to examine modernising the classification scheme. In this respect, the committee notes the Commissioner's view that the new scheme is intended to focus on online harm, which the committee considers is an appropriate approach and which would not capture the significant majority of adult content online.
2.92
The committee heard submitter concerns regarding the provisions within the proposed Online Content Scheme that express an expectation that sections of the online industry and the Commissioner should formulate and register industry codes within six months. The committee considers that these provisions do not require registration of the codes within that timeframe and that there is scope to extend this timeframe.

Recommendation 1

2.93
The committee recommends that the Australian Government consider amending the Explanatory Memorandum to the Online Safety Bill 2021 to clarify that the requirement for an industry code to be registered within six months is for best endeavours and that the Commissioner has the discretion to work with industry over whatever timeframe is deemed necessary to achieve an effective outcome.
2.94
The committee supports the provisions of the OS Bill that enable the removal of harmful content online within 24 hours. The committee agrees that the Commissioner should have the power to remove such material expeditiously, noting that the Commissioner would be amenable and empowered to grant time extensions where warranted in the circumstances.

Recommendation 2

2.95
The committee recommends that the bills be passed.
Senator the Hon David Fawcett
Chair

  • 1
    Note: this report focuses on the Online Safety Bill 2021, as submissions and evidence predominantly raised concerns with that bill. The committee recognises that this does not mean that there are no concerns in relation to the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021.
  • 2
    On this last point, see for example: Australian Christian Lobby, Submission 22, p. 3; Queensland Family and Child Commission, Submission 25, pp. 6–7.
  • 3
    See, for example: proposed Parts 3 and 11–15 of the OS Bill.
  • 4
    See, for example: Interactive Games & Entertainment Association, Submission 3, p. 6; Electronic Frontiers Australia, Submission 30, p. 3, which called for a review of the operation of the Commissioner before the OS Bill progresses any further.
  • 5
    See, for example: Twitch, Submission 15, p. 2.
  • 6
    Interactive Games & Entertainment Association, Submission 3, p. 10.
  • 7
    Assembly Four, Submission 34, p. 1.
  • 8
    Electronic Frontiers Australia, Submission 30, pp. 6–7. Also see: Centre for Responsible Technology, The Australia Institute, Submission 7, p. 11, which argued that publication of up-to-date statistics would also serve an important educative function.
  • 9
    Scarlet Alliance, Submission 36, p. 3.
  • 10
See, for example: Google Australia, Submission 1, p. [8]; Australian Lawyers Alliance, Submission 9, p. 9; Microsoft, Submission 26, p. 6; Digital Rights Watch, Submission 27, p. 7; Twitter, Submission 32, p. 8; Scarlet Alliance, Submission 36, pp. 2–3.
  • 11
    Electronic Frontiers Australia, Submission 30, p. 3.
  • 12
    Proposed section 239A of the OS Bill.
  • 13
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 5 March 2021, p. 23.
  • 14
    Ms Julie Inman Grant, eSafety Commissioner, Opening Statement, p. 3.
  • 15
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 5 March 2021, p. 24.
  • 16
    See, for example: Digital Media Research Centre, Queensland University of Technology, Submission 38, p. 6; proposed sections 220–222 of the OS Bill.
  • 17
    Communications Alliance, Submission 18, p. 13. Also see: proposed section 220 of the OS Bill.
  • 18
    Scarlet Alliance, Submission 36, p. 4. Also see: Digital Rights Watch, Submission 27, p. 3.
  • 19
    Twitter, Submission 32, p. 8.
  • 20
    Ms Julie Inman Grant, eSafety Commissioner, Opening Statement, p. 3.
  • 21
    Proposed Part 4 of the OS Bill. Also see, for example: Reset Australia, Submission 20, p. 11.
  • 22
    See, for example: Microsoft, Submission 26, p. 5, which questioned also how the obligations imposed on providers could unilaterally change over time.
  • 23
    Australian Lawyers Alliance, Submission 9, p. 5. Also see: Digital Rights Watch, Submission 27, p. 5; Digital Media Research Centre, Queensland University of Technology, Submission 38, p. 4, which argued that Class 1 and Class 2 materials should not be included as part of the core expectations. Note: the core expectations are set out in proposed section 46 of the OS Bill.
  • 24
    Australian Lawyers Alliance, Submission 9, pp. 5–6.
  • 25
    Global Partners Digital, Submission 16, p. 3.
  • 26
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 5 March 2021, p. 24.
  • 27
    Ms Pauline Sullivan, First Assistant Secretary, Content Division, Department, Committee Hansard, 5 March 2021, p. 29. Also see: Explanatory Memorandum, Online Safety Bill 2021, p. 29.
  • 28
    Ms Bridget Gannon, Assistant Secretary, Digital Platforms and Online Safety, Department, Committee Hansard, 5 March 2021, p. 29. Also see: proposed section 46 of the OS Bill.
  • 29
    Proposed Parts 3 and 5–9 of the OS Bill.
  • 30
    Twitter, Submission 32, p. 5.
  • 31
    Queensland Family and Child Commission, Submission 25, p. 4. Also see: Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 5 March 2021, p. 23; proposed Parts 3 and 5 of the OS Bill.
  • 32
    See, for example: Google Australia, Submission 1, pp. [4–5], which suggested that only content sharing services be within scope of the bill; Interactive Games & Entertainment Association, Submission 3, pp. 6 and 19; BSA | The Software Alliance, Submission 23, p. 2; Microsoft, Submission 26, p. 4.
  • 33
Communications Alliance, Submission 18, p. 8. Note: the submission expressed similar concerns in respect of other services (e.g. internet service providers and B2B services).
  • 34
    DIGI, Submission 29, p. 5.
  • 35
    Interactive Games & Entertainment Association, Submission 3, p. 9.
  • 36
    Uniting Church in Australia, Synod of Victoria and Tasmania, Submission 14, pp. 1 and 4–6.
  • 37
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 5 March 2021, p. 23.
  • 38
    See, for example: Communications Alliance, Submission 18, p. 12; Digital Rights Watch, Submission 27, p. 4. Also see: proposed Part 8 of the OS Bill.
  • 39
    Australian Lawyers Alliance, Submission 9, p. 6.
  • 40
    Australian Lawyers Alliance, Submission 9, p. 6. Note: the submission made similar arguments in respect of removal notices under the proposed new Online Content Scheme: p. 8.
  • 41
    Australian Lawyers Alliance, Submission 9, p. 7; Communications Alliance, Submission 18, p. 12; Digital Rights Watch, Submission 27, p. 4; Australia's Right To Know, Submission 41, pp. 1–2, which argued that news and current affairs reporting in text and audio visual should also be exempt. Also see: proposed section 104 of the OS Bill.
  • 42
    Digital Rights Watch, Submission 27, p. 4. Also see: Communications Alliance, Submission 18, p. 12.
  • 43
    Proposed Part 9 of the OS Bill.
  • 44
    The term 'Class 1 material' is defined in proposed section 106 and would include material classified as Refused Classification (RC) under the NCC.
  • 45
    The term 'Class 2 material' is defined in proposed section 107 and would include material classified as X 18+ or Category 2 Restricted under the NCC.
  • 46
    See, for example: Scarlet Alliance, Submission 36, p. 2, which described the OS Bill as failing to differentiate between 'actual harm and a subjective, moralistic construction of harm'.
  • 47
    Eros Association, Submission 2, p. 1. Also see, for example: Victorian Pride Lobby, Submission 10, p. 4; Sex Work Law Reform Victoria, Submission 12, p. 1; PivotNine Pty Ltd, Submission 31, p. 3.
  • 48
    Scarlet Alliance, Submission 36, p. 5.
  • 49
    Digital Rights Watch, Submission 27, p. 2.
  • 50
    Interactive Games & Entertainment Association, Submission 3, p. 10. Also see: Twitch, Submission 15, p. 3.
  • 51
    Department, 'Review of Australian Classification Regulation', www.communications.gov.au/have-your-say/review-australian-classification-regulation (accessed 7 March 2021). Note: there is also a current review into the Privacy Act 1988: Attorney-General's Department, 'Review of the Privacy Act 1988', www.ag.gov.au/integrity/consultations/review-privacy-act-1988 (accessed 7 March 2021).
  • 52
    Sex Work Law Reform Victoria, Submission 12, p. 2.
  • 53
    Eros Association, Submission 2, p. 3.
  • 54
    Communications Alliance, Submission 18, p. 5.
  • 55
    Australian Lawyers Alliance, Submission 9, p. 8. Also see: Twitch, Submission 15, p. 3, which noted that classification is 'difficult and fluid'.
  • 56
    Victorian Pride Lobby, Submission 10, p. 5.
  • 57
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 5 March 2021, p. 28.
  • 58
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 5 March 2021, p. 28.
  • 59
    See, for example: Eros Association, Submission 2, p. 3; Behind Closed Doors Radio Show, Submission 8, p. 1; Sex Work Law Reform Victoria, Submission 12, p. 3; Somatic Sex Educators Association of Australasia, Submission 24, p. 1; Assembly Four, Submission 34, p. 1.
  • 60
    Ms Pixie Styx, Submission 62, p. 1.
  • 61
    Dr John O'Brien, Submission 95, p. 2.
  • 62
    See, for example: Eros Association, Submission 2, p. 3; Global Partners Digital, Submission 16, p. 1.
  • 63
    Victorian Pride Lobby, Submission 10, p. 4.
  • 64
    Note: this legislation was the Stop Enabling Sex Traffickers Act of 2017 (SESTA) and Allow States and Victims to Fight Online Sex Trafficking Act of 2017 (FOSTA).
  • 65
    See, for example: Eros Association, Submission 2, p. 4; Victorian Pride Lobby, Submission 10, p. 4; Scarlet Alliance, Submission 36, pp. 2 and 7.
  • 66
    Proposed Division 9 of Part 9 of the OS Bill. Note: the circumstances are that the provider has failed to comply with a civil penalty provision more than twice in the past 12 months and represents a significant community safety risk.
  • 67
    Twitch, Submission 15, p. 4.
  • 68
    Ms Bridget Gannon, Assistant Secretary, Digital Platforms and Online Safety, Department, Committee Hansard, 5 March 2021, p. 31.
  • 69
    Ms Bridget Gannon, Assistant Secretary, Digital Platforms and Online Safety, Department, Committee Hansard, 5 March 2021, pp. 31–32.
  • 70
    Proposed Division 7 of Part 9 of the OS Bill; proposed section 137 of the OS Bill.
  • 71
    Communications Alliance, Submission 18, p. 6.
  • 72
    Mr John Stanton, Chief Executive Officer, Communications Alliance, Committee Hansard, 5 March 2021, p. 9.
  • 73
    DIGI, Submission 29, p. 15.
  • 74
    Mrs Maria Vassiliadis, Executive Manager, Office of the eSafety Commissioner, Committee Hansard, 5 March 2021, p. 32. Also see: Department, answers to questions on notice, p. 1 (received 10 March 2021); Ms Bridget Gannon, Assistant Secretary, Digital Platforms and Online Safety, Department, Committee Hansard, 5 March 2021, p. 32, who noted that the OS Bill frames this not as a requirement but as an expectation.
  • 75
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 5 March 2021, p. 32.
  • 76
    See, for example: Google Australia, Submission 1, pp. [2] and [3–4]; Interactive Games & Entertainment Association, Submission 3, pp. 3–4; Twitch, Submission 15, pp. 1–2; Microsoft, Submission 26, p. 1; DIGI, Submission 29, p. 1. Note: the Communications Alliance suggested that, on account of the preventative approach, there is greater scope for end users to also be within scope of the OS Bill: Submission 18, p. 8.
  • 77
    Communications Alliance, Submission 18, p. 4. Also see: Digital Rights Watch, Submission 27, p. 5; Digital Media Research Centre, Submission 38, p. 5, which both submitted that providers' automated moderation tools 'frequently make mistakes' when flagging and removing sexual content online.
  • 78
    Microsoft, Submission 26, p. 2.
  • 79
  • 80
    See, for example: Interactive Games & Entertainment Association, Submission 3, p. 5; Microsoft, Submission 26, pp. 2–3.
  • 81
    Queensland Family and Child Commission, Submission 25, p. 5.
  • 82
    Uniting Church in Australia, Synod of Victoria and Tasmania, Submission 14, p. 6.
  • 83
    Google Australia, Submission 1, p. [5]. Also see: Twitch, Submission 15, p. 4, which expressed a preference for the Commissioner to continue to use these informal processes; Twitter, Submission 32, p. 6.
  • 84
    Communications Alliance, Submission 18, p. 10. Also see: Google Australia, Submission 1, p. [6].
  • 85
    See, for example: Google Australia, Submission 1, p. [6]; BSA | The Software Alliance, Submission 23, p. 2; Twitter, Submission 32, pp. 6–7.
  • 86
    See, for example: Google Australia, Submission 1, pp. [6–7]; Communications Alliance, Submission 18, p. 10; Twitter, Submission 32, p. 6.
  • 87
    DIGI, Submission 29, p. 10.
  • 88
    Ms Julie Inman Grant, eSafety Commissioner, Opening Statement, p. 2.
  • 89
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 5 March 2021, p. 24.
  • 90
    Ms Julie Inman Grant, eSafety Commissioner, Committee Hansard, 5 March 2021, p. 24.
  • 91
    See, for example: Google Australia, Submission 1, pp. [3–4].
  • 92
    Digital Rights Watch Inc, Submission 27, p. 1. Also see, for example: Interactive Games & Entertainment Association, Submission 3, pp. 7–8; Communications Alliance, Submission 18, p. 1; Pivot Nine Pty Ltd, Submission 31, p. 1, which commented on the minimal difference between the exposure draft and the Online Safety Bill 2021; Assembly Four, Submission 34, p. 1.
  • 93
    Microsoft, Submission 26, p. 1.
  • 94
    Department, Additional Information, received 5 March 2021, pp. 1–2.
  • 95
    Ms Bridget Gannon, Assistant Secretary, Digital Platforms and Online Safety, Department, Committee Hansard, 5 March 2021, p. 25. Also see: Department, answers to questions on notice, Attachment 1 (received 9 March 2021), which identified amendments to the OS Bill arising from the second round of public consultation.
  • 96
    Ms Bridget Gannon, Assistant Secretary, Digital Platforms and Online Safety, Department, Committee Hansard, 5 March 2021, p. 26. Also see: Department, answers to questions on notice, p. 2 (received 9 March 2021), which identified the matters raised by the Attorney-General's Department.
