Chapter 2

Introduction

2.1        Industry groups were generally supportive of the bill, arguing that it provides an appropriate framework for regulating content services across convergent devices and sufficient flexibility to accommodate technological change in the sector.[1] These groups suggested that specific areas of concern relating to particular parts of the bill could be addressed through minor amendments.

2.2        Other groups, such as the NSW Council for Civil Liberties and Electronic Frontiers Australia, opposed the bill on civil liberties grounds.[2] The Festival of Light supported the protections for children in the bill but pointed to inadequacies in the policy approach it adopts.[3] The Arts Law Centre of Australia argued that the bill does not adequately take into account the needs of filmmakers and multimedia and digital artists.[4]

Issues

The general approach to restricting access to services

2.3        The main focus of the bill is to extend the general approach the Government has adopted for content services to those services where it considers adequate safeguards are not currently in place. Much of the content for these new services is likely to be based on content created for supply through a range of other existing media services. The new regulatory regime will be aligned, as far as possible, with the regulation of traditional media content, while taking account of the technical and other differences in the delivery of content on these new platforms. The proposed regulatory framework requires pre-assessment of content, access restrictions or prohibition where appropriate, and complaints-handling processes.[5]

2.4        Some submissions argued that the proposed regulatory regime will prove ineffective in preventing children from gaining access to unsuitable content. These submissions contended that the only effective technology available to protect children from such content is filtering on end-user devices, such as home computers.[6]

2.5        The NSW Council for Civil Liberties cited evidence from the United States in support of the use of filters:

We base that on a recent US decision where...his Honour Judge Lowell Reed of the District Court in Philadelphia undertook a trial...He took evidence on this from experts from the industry; interested groups including the ACLU—the American Civil Liberties Union—groups with opposing views; and experts from the US government. After hearing all the evidence, his Honour concluded that content filtering was the only proven technology and the only effective technology to ensure that children were protected from offensive material.[7]

2.6        The Government is supporting initiatives in this area, complementing the approach in the bill. DCITA noted that:

The government has put in place the Protecting Australian Families Online program, which will essentially take the issue of filtering content in the home a considerable number of steps further and draw much greater attention to it. At the same time we know that convergent devices are evolving as much more significant players enter the marketplace. This legislation is recognition of that. We also know that at the same time there is a whole lot of activity underway to try and put facilities on mobile phone and conversion devices to make them much more amenable to security and other aspects.[8]

2.7        Filtering technology is not currently available for devices like mobile phones. DCITA noted, however, that:

...as those things [security on mobile phones] are developed, you get much better access to a range of ways of more effectively providing these sorts of access regimes...when you are talking about access to content for children, there is a significant requirement on behalf of parents and guardians to take an active role.[9]

2.8        The committee recognises that home filtering will be an important vehicle for users managing content. However, the scheme in the bill is also consistent with accepted community expectations about regulating content, as reflected in the longstanding practices of film and literature classification and in the existing schemes for other media.

2.9        Some submissions also argued that the proposed regulatory regime is deficient in that it cannot prohibit or restrict content unsuitable for children where that content is hosted overseas.[10] DCITA noted, however, that the regulation of content needs to adopt a multi-faceted approach, including international initiatives:

...the regulation of content needs to be seen as a holistic exercise which is tackled on a number of fronts. While there are obvious jurisdictional issues in terms of us regulating content that might come out of, say, the United States or Russia, the government’s other arm to this is the use of filtering technology, through Protecting Australian Families Online, which empowers people to filter out overseas content. It is a multifaceted content strategy which is a mixture of regulatory tools and empowerment of families to take steps to protect themselves against technology.

There is also an emerging concern internationally about the need to try and get more cross-border understanding of these issues. Australia is one of the leading countries in dealing with the problem as we see it here.[11]

Operation of take-down notice scheme

2.10      Schedule 1, Part 3, Division 3 of the bill sets out actions that can be taken in relation to hosting services and provides the Australian Communications and Media Authority (ACMA) with the necessary powers to issue hosting services with a 'take-down notice'. This notice directs the service to remove material that has been judged to be prohibited content under the classification guidelines (see chapter 1 for an overview). The provisions of the bill relate only to hosting services with an 'Australian connection'; clause 3 provides that a hosting service has an 'Australian connection' if, and only if, any of the content it hosts is hosted in Australia.[12]

2.11      Alternatively, where content has not yet been classified but is considered to be 'potential prohibited content', ACMA may assess whether, if the material in question were classified by the Classification Board, it would be likely to be classified RC, X18+, category 2 restricted, R18+ or MA15+. If ACMA assesses that the content would be likely to be classified as such, it must issue the hosting service with an interim take-down notice, which applies until the Classification Board has classified the content.[13]

2.12      If the material is subsequently classified as prohibited content, ACMA is then required to issue the hosting service with a final take-down notice. However, for the lower classifications of R18+ or MA15+, hosting service providers may be allowed to continue hosting the classified material provided it is subject to a restricted access system.[14]

2.13      The bill provides for anti-avoidance measures to ensure that a hosting service that is the subject of an interim or final take-down notice does not host content that is the same as, or similar to, the prohibited content. ACMA may issue a special take-down notice if it believes that the hosting service is hosting, or intending to host, such content, directing the service provider to take down the material as under the earlier provisions of the bill.[15]

2.14      If ACMA issues a service provider with an interim, final or special take-down notice, or a link-deletion notice, the provider may, under Part 8 of the bill, apply to have ACMA's decision reviewed by the Administrative Appeals Tribunal (AAT).[16] This gives the service provider some recourse if it considers that ACMA's original decision to issue the notice was inappropriate.

2.15      A hosting service may voluntarily stop hosting content while subject to an interim take-down notice. In that case, ACMA may accept this course of action, revoke the interim notice and not proceed with seeking classification of the content.[17]

2.16      Similar provisions apply to live content services and links services. In the case of prohibited or potential prohibited live content, ACMA may issue interim or final service-cessation notices; however, there are no anti-avoidance measures (that is, no special take-down notices) applicable to live content services as there are for hosted content and links services. In relation to links services, ACMA may issue an interim or final link-deletion notice where the link is provided by a links service with an Australian connection. These actions are subject to the same classification assessments and AAT review mechanisms that apply to content hosting services.[18]

2.17      In the event that a content service provider fails to comply with a take-down, service-cessation or link-deletion notice (including where, in ACMA's opinion, it supplies content that is substantially similar to content which is already subject to such a notice), civil or criminal penalties may be pursued.[19]

2.18      AMTA and the IIA expressed some concern about the operation of special take-down notices.[20] They sought more clarity about how such notices would operate, particularly regarding the meaning of 'similar content', and queried the practical operation of the system in relation to user-generated content:

Satisfying a special take-down notice will be difficult without the content service provider having to monitor the service to ensure that it has not been reposted by the user or another user under a different guise; the threshold for substantially similar content is potentially very low and we are concerned that content service providers will be forced to implement monitoring and examination of all content after a special takedown notice has been issued.[21]

2.19      ACMA responded that its understanding was that special take-down notices were meant to be highly targeted, rather than set in the broad terms that appeared to be causing concern to industry participants.[22] ACMA drew attention to the second reading speech, which noted the intention of making it possible to deal with 'repeated and deliberate offences'.[23] ACMA also pointed out that subclause 53(4) of the bill 'provides a defence if the hosting service provider proves that it did not know and could not, with reasonable diligence have ascertained that the content was prohibited or potential prohibited content'.

Implementation of remedial situations (clause 47)

2.20      Under clause 47 of the bill, ACMA can, following an investigation, direct a hosting service provider to take steps so that content is no longer publicly available (what the bill terms Type A and Type B remedial situations). The Internet Industry Association (IIA) and the Australian Mobile Telecommunications Association (AMTA) argued that the definitions of these situations, contained in clauses 47(6) and (7) of the bill, do not make clear what needs to be done to ensure compliance.[24] They noted that the requirement in both cases that the provider not 'host' the content is not supported by a definition of 'hosting', and argued that this could be taken to imply that a firm would need to delete all copies of the content in its possession, which could be problematic if the firm needs to retain a copy as evidence or wishes to contest a classification decision. The bill also states that these remedial situations will only be satisfied when the content 'is not provided by a content service provided to the public'. The industry associations suggested this could be difficult to satisfy, as any given hosting service provider can only cease to provide the material itself – it cannot stop the content creator from seeking another service provider to host the material on the Internet.

2.21      DCITA responded by pointing out that, though the term 'host' may not be defined in clause 47, clause 4 makes the intent clear by defining a hosting service as one in which content is provided 'to the public'. As a consequence, the department argued, 'there is nothing in clause 47(6) or (7) that would prevent a hosting service provider from retaining the potential prohibited or prohibited content for the purposes of evidence'.[25] DCITA also stated that any ACMA notice 'would apply only to the hosting service provider in relation to the content hosted by its hosting service which is the subject of the notice'.[26]

Content creators

2.22      Some witnesses and submissions expressed concern about the likely impact of the bill on content creators. In relation to artists, the Arts Law Centre stated that the bill is likely to detrimentally affect artists' use of convergent content technologies for artistic expression, as well as their ability to disseminate such works through that technology.[27]

2.23      The Arts Law Centre pointed to some of the potential problems for artists:

Some of our further concerns are that it is difficult for an artist or a content creator to know when their work will potentially be rated X18+. How will an artist, a content creator, know when their work is likely to be rated R18+? It also appears that artists who create anything that might be suitable for an MA15+ or above audience would be seriously disadvantaged under the bill. This is unsuitable when artists and content creators are creating material that might deal with serious dramatic scenes that question society or refer to drugs, sex or violence in any way that is beyond the scope of an MA15+ rating. When material is removed from a carriage service provider, who will decide that it is potentially prohibited content? Can this decision be appealed by the content creator? Who will decide that there is a substantial likelihood that the content would be prohibited content if it were classified?[28]

2.24      The Arts Law Centre added that:

...it creates a complex environment for artists who upload their work to carriage service providers, whether that is by mobile phone distribution or the internet. It means that they will need to seek more advice in relation to their work before it goes up and...the present system which is put forward in the bill would be difficult for them to negotiate. The lack of an appeals procedure would mean that they may have no recourse if their work is taken down.[29] 

2.25      DCITA argued that artists as content creators will need to work closely with content deliverers but it is not envisaged that the processes will be excessively onerous:

We discussed....the process that is involved in the classification, and you can appeal against that process if you believe that your material has been incorrectly classified. The artist, the original content creator, will need to work with the people who actually make their content available to the public to set up their relationship there...The objective of the bill is to do that [protect children] while trying to provide an environment for content creators and content developers where there is incentive to continue to work with new media. It will always be a balancing act. While we are conscious of the concerns of the group you talked about, there does not seem to be, in the way we intend the system to operate, something which could cause enormous constraints.[30]

2.26      Microsoft argued that the application of the bill to user-generated content represents a significant departure from the approach taken in other jurisdictions. In particular, Microsoft pointed out that under the European Commission's (EC) Audiovisual Media Services Without Frontiers Directive:

..."private websites and services consisting of the provision or distribution of audiovisual content generated by private users for the purposes of sharing and exchange within communities of interest" fall outside the scope of regulation.[31]

2.27      The EC's Audiovisual Media Services Without Frontiers Directive was agreed to on 24 May 2007. The Directive will replace the Television without Frontiers Directive and aims to provide a modern, pro-competitive legal framework covering audiovisual media services, while continuing to ensure a high level of consumer protection, including appropriate measures to protect minors.[32] As Microsoft pointed out, the new Directive limits the scope of the definition of audiovisual media services. In particular, the definition does not cover:

...activities which are primarily non-economic and which are not in competition with television broadcasting, such as private websites and services consisting of the provision or distribution of audiovisual content generated by private users for the purposes of sharing and exchange within communities of interest.[33]

2.28      In its submission, Microsoft strongly supported the EC's approach. To ensure consistency with the EC, Microsoft suggested that the bill be amended to introduce a 'user generated content' exception to the definition of 'content service' so that all content that is substantially generated by an end user of a content service is excluded from regulation.[34] 

2.29      DMG Radio also argued that user-generated content should be exempted from the bill, as commercial radio providers do not control that content and any requirements for them to pre-vet that content would lessen their ability to communicate with users 'live' and in 'real time'.[35]

2.30      DCITA noted that 'artistic content' will be treated no differently to other forms of content under the bill:

...government policy has sought to implement a content neutral approach. So the regulatory scheme is aimed at content services in a fairly generic sense without identifying particular groups of the community who might be treated differently from other groups. If artistic content or any other content meets the criteria of prohibited content, it will be considered prohibited content; if it does not then it will not. We do not treat ‘artistic content’ any differently from any other form of content.[36]

Restricted access system

2.31      The use of a restricted access system is required where content is, or is likely to be, classified R18+. It is also required, subject to certain exemptions, where content is, or is likely to be, classified MA15+.

2.32      ACMA may, by legislative instrument, declare a specified access-control system to be a restricted access system for the purposes of Schedule 7. The instrument may make different provision for R18+ content and MA15+ content. ACMA must have regard to the objective of protecting children who have not reached 15 years of age from exposure to unsuitable content.[37]

2.33      AMTA and IIA argued that a 'restricted access system' is not defined in the bill, although clause 14 gives ACMA the power to declare that a specified access-control system is a restricted access system for the purposes of the bill:

The problem for industry is that there is no certainty as to what ACMA may declare as a suitable access-control system. In effect, industry must obtain ACMA’s approval to whatever system industry wishes to use. There is also no requirement for ACMA to consider the practicality of the system it endorses nor the suitability of access-control systems already implemented by industry, for example, as a result of the MPSI scheme or, by most ISPs, through requiring a credit card to be used for on-line services.[38]

2.34      AMTA and IIA suggested that ACMA should be required, under the bill, to approve an access-control system that is consistent with current industry practice, is easily understood by consumers and can be readily implemented by industry.[39] In their view, the only practical way currently available to restrict access to Internet content services to persons over 18 years, and thus achieve the policy objectives of the bill, is to require production of a credit card. ASTRA also argued that ACMA needs to clarify what progress it is making with the development of the instrument that will define 'restricted access systems'.[40]

2.35      In response to these concerns ACMA advised that:

In developing a restricted access system declaration, ACMA will consider the requirements of the legislation, take into account the different business practices of service providers affected by the declaration, and take into account the methods they currently employ for age verification. As is its usual practice, ACMA will consult closely with industry in developing the declaration.[41]

2.36      DCITA noted that it would 'expect ACMA to consult with the industry prior to the making of the legislative instrument. Note, however, that a failure to consult will not affect the validity or enforceability of a legislative instrument'.[42]

Relationship to the Mobile Premium Services Determination (MPSD)

2.37      AMTA and IIA argued that it is unclear how the bill will affect the regulatory scheme operating under the MPSD. They argued that certain provisions of the bill conflict with the MPSD and would override the relevant MPSD provisions to the extent of the inconsistency.[43]

There are no provisions addressing transition from the MPSD (and the related industry Mobile Premium Services Industry (MPSI) Scheme). This leaves as uncertain the regulatory status of a number of services that are not covered as comprehensively in the Bill, such as chat services. In addition, unlike the MPSD, the Bill is not limited to mobile content provided via premium services or proprietary network services.[44]

2.38      AMTA and IIA suggested that, if there is to be one regulatory regime governing both mobile and internet content, the Content Services Bill regime is preferable, in that it provides a comprehensive scheme for all new media, including content transmitted over the internet and mobile networks.

2.39      ACMA stated that it is aware of industry concerns and would be reviewing the MPSD once the bill is passed:

We are aware of AMTA's concerns about the application of the mobile premium services determination and the industry scheme that is underneath that. Once the legislation is passed we would move to look at that determination again to see if the bill has picked up the content matters under the original determination and to review the determination to make sure that we were not adding to the confusion in the industry between having an existing determination and new legislation.[45]

2.40      DCITA stated that the MPSD was implemented as an interim regulatory measure pending the development and implementation of the new legislation proposed through the Content Services Bill – 'the MPSD is a legislative instrument made by ACMA, and it is preferable to amend the instrument to comply with the new legislation rather than the other way round'. The Department noted that ACMA advised the committee that it would be reviewing the MPSD and its relationship to the requirements of the new legislation once it is passed – 'there is thus no need to amend primary legislation to ensure that it is compatible with the MPSI'.[46]

Australian connection

2.41      The existence of an 'Australian connection' is integral to the terms 'content service' and 'hosting service'. A content service will only be subject to the Schedule 7 regulatory scheme to the extent that it has an Australian connection.

2.42       An Australian connection is established if one or more of the following situations exists (clause 3):  

  1. any of the content provided by the content service is hosted in Australia;
  2. any live content provided by the content service originates in Australia;
  3. in the case of a content service supplied by way of a voice call or video call using a carriage service, any of the participants in the call, other than an end-user of the service, are physically present in Australia.[47]

2.43      The Explanatory Memorandum stated that live content originates in Australia if the content is located in Australia. A hosting service has an Australian connection for the purposes of the Schedule if, and only if, any of the content hosted by the hosting service is hosted in Australia.[48]

2.44      AMTA and IIA argued that the application of the 'Australian connection' test in relation to links services is unclear:

On looking at the interaction between the definition of links services (i.e. a content service that provides one or more links to content and is provided to the public whether on payment of fee or not) and Australian connection, it would seem that the content it is linked to has to be hosted in Australia for there to be an Australian connection. Accordingly, if the link is on an Australian-hosted site, then the link-deletion notice should be addressed to the host of that site. [49]

2.45      AMTA and IIA suggested that, for clarity, it would be more reasonable if the links themselves had to be hosted in Australia, irrespective of where the linked content is hosted.[50]

2.46      DCITA stated that 'the existing provisions achieve what is being suggested' by AMTA and IIA. The Department noted that:

There are two elements that need to be satisfied to establish an Australian connection with respect to a links service.

Firstly, in accordance with clause 3(a), the link (which would fall within the definition of ‘content’ under clause 2) to the content will need to be hosted in Australia.

Secondly, to be subject to an ACMA notice, the links service must provide end-users in Australia with a link to another content service which specialises in prohibited or potential prohibited content (refer clause 8). The prohibited content itself may be in Australia or overseas. The enforcement action taken by ACMA under Division 5 of Schedule 7 in relation to an Australian-hosted link to potential prohibited or prohibited content would be by way of a link deletion notice issued to the links service provider.  However, if the prohibited content itself is hosted in Australia, ACMA can direct a take down notice to the hosting service (as opposed to the link provider) ie the server which actually hosts the content.[51]

2.47      Microsoft argued that the 'Australian connection' test for live content may operate to bring offshore content service providers within the ambit of proposed Schedule 7 in a manner that is not intended by the Government. In particular, Microsoft noted that the bill does not specify how to identify the place where live content originates.[52]

Definition of 'content service'

2.48      'Content service' under the bill (Schedule 7, clause 2) is defined as:

  1. a service that delivers content to persons having equipment appropriate for receiving that content, where the delivery of that content is by means of a carriage service; or
  2. a service that allows end-users to access content using a carriage service.

2.49      The definition of 'content service' is broad in scope to ensure that all of the categories of content providers are covered. The bill also provides for 22 specific exemptions. In addition to the specified exemptions, the definition of 'content service' is further narrowed by reference to its Australian connection (clause 3); the provision of content by a content service (clause 6); and the provision of a content service to the public (clause 7).[53]

2.50      AMTA and IIA argued that, by referring to a service that 'delivers' content to end-users, the definition of a content service appears to confuse the roles of a content service provider and a carriage service provider.[54] AMTA and IIA considered that the definition is confusing in the context of new media technologies, where there is a clear distinction between content and carriage:

In industry’s view, a content service provider is not responsible for the delivery of content but rather it makes content available to the public. This is distinct from a carriage service provider which has no control over the content which it carries over its network. The definition of a content service in the Bill could potentially include activities of carriage service providers, Internet service providers, premium service providers and content hosts. In this context, the carriage service provider exception in clause 5 gives no protection for a carriage service provider who may provide more than a mere carriage service eg. billing of services, collection of revenues or accepting advertising in conjunction with a content service, but nonetheless has no role in securing that the content is made available. [55]

2.51      AMTA and IIA suggested that the word 'delivers' in paragraph (a) of the definition of content service should be replaced with the words 'makes available'. This would adopt the technology-neutral concepts used in the Copyright Act 1968 in relation to communicating content to the public.

2.52      DCITA stated that it is further considering the definitional issues around 'content service':

Senator WEBBER—Has any consideration been given to further refining the bill to address those issues? They seemed to still think that they have got problems with no definition of ‘host’ and the definition, with content service, around ‘delivers’.

Dr Pelling—We are looking closely at those and are seeking the advice of our own legal staff. Once we have worked our way through those, we will prepare advice for the minister.[56]

2.53      DCITA further advised that the definition of ‘content service’ has been drafted broadly 'to capture all possible categories of content services.  The definition is then limited by certain exemptions and by reference to clauses 3, 6 and 7'.

The definition of ‘content service’ may extend to certain carriage service providers that are in the business of both providing a carriage service and also providing a content service.  These types of carriage services will be subject to Schedule 7 to the extent that they provide content services.  The intent of the scheme is to regulate all content services irrespective of whether or not they are also carriage services.

Schedule 7’s application, however, is limited by clause 5, which provides that, for the purposes of Schedule 7, a person does not provide a content service merely because the person supplies a carriage service that enables content to be delivered or accessed.  Clause 5 does not exclude a carriage service that also collects revenue in relation to the provision of the carriage service.  The issue that will trigger the application of Schedule 7 is whether the carriage service is also a ‘content service’, not whether the carriage service bills for its services.[57]

2.54      Given the nascent state of the content services industry in Australia and around the world, Microsoft queried the suitability of a definition of 'content service' that requires 22 specific exceptions. Microsoft argued that this regulatory approach is likely to have significant unintended consequences as new services evolve and attempts are made to apply the definition of 'content service' to them.

2.55      Microsoft suggested that the government adopt a narrower basic definition of 'content service' that delineates the nature of the content services that are intended to be regulated, in addition to the means by which those services are delivered or made available. In its view, this approach is preferable to defining what constitutes a content service by way of exception.[58]

2.56      As discussed above, DCITA stated that the definition of ‘content service’ has been drafted broadly to capture all possible categories of content services. There is also the ability to add new types of services:

Senator WORTLEY—In the definition of ‘content services’ in the bill, there are lots of references to exempt services—for example, an exempt internet directory service. Will the exempt services that are not defined in the bill be spelt out in the regulations for the bill?

Dr Pelling—There is no intention to define services further than is in clause 2 of the bill. The bill has a list under the definition of content service and then a number of those are defined by specific definitions. The only matter that I would add to that really is that paragraph 10 of the definition does allow a service to be specified in the regulations. So there is capacity to add to that list if a new type of service becomes available.[59]

2.57      ASTRA raised the concern that subscription television's 'on-demand' services, which are available on the FOXTEL (and OPTUS) digital platform and will be available on AUSTAR once deployed, may fall within the definition of a 'content service', and that this may be an unintended consequence of the bill.[60]

2.58      DCITA stated that such a service appears to be a broadcasting service and is therefore outside the regulatory framework of the bill:

We are in discussion with ASTRA about the interpretation of ‘broadcasting service’ around that. The initial advice we have is that it is a broadcasting service, but we will continue to look at the ramifications of that to ensure that services that are essentially designed to be part of, say, the Foxtel pay TV service, for example, or the other pay TV service operators, and are designed to operate in that regulatory environment, are not inadvertently caught up in this regime.[61]

Conclusion

2.59      The committee believes that young and vulnerable children must be protected from unsuitable content available on the internet and via mobile phones and other technologies. It considers that the bill strikes a balance between adequate protection of consumers, especially children, and fair access rights for adults.

2.60      The committee is satisfied with the bill as a whole. The committee is pleased to note that certain technical amendments proposed by industry stakeholders are being considered by the government, and is confident that these proposals will be addressed in any instances where fine-tuning of the bill is needed.

Recommendation 1

2.61      The committee recommends that the bill be passed.

 

Senator Alan Eggleston

Chair
