Chapter 5 - Data

Overview

5.1 This chapter examines privacy and competition concerns arising from the data collection practices of Big Tech, and proposed solutions to address those concerns.

Data collection and collation

5.2 Data in this context refers to information about consumers. Data can include sensitive information such as a consumer’s full name, address, phone numbers, driver’s licence details, income, occupation and educational background. It may be collected in many ways, including online through the use of websites or in person through store loyalty cards.

5.3 Data collation, aggregation or linking refers to the combining and organising of data within a database. Data collected from multiple sources, such as browsing activity, signing up for a service, public records and financial records, may be collated to build detailed profiles of users.
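
The mechanics of collation can be pictured with a short sketch. The sources, field names and matching key below are invented for illustration; collation as practised by platforms and brokers uses far richer identifiers and probabilistic matching.

```python
# A minimal, hypothetical sketch of data collation: records from three
# invented sources are linked on a shared identifier (an email address)
# and merged into a single consumer profile. All names are illustrative.

from collections import defaultdict

browsing = [{"email": "jo@example.com", "pages": ["car-loans", "fitness"]}]
signups = [{"email": "jo@example.com", "name": "Jo Citizen", "postcode": "2600"}]
loyalty = [{"email": "jo@example.com", "purchases": ["protein powder"]}]

profiles = defaultdict(dict)
for source in (browsing, signups, loyalty):
    for record in source:
        # Link on the shared key, then merge the remaining attributes.
        profiles[record["email"]].update(
            {k: v for k, v in record.items() if k != "email"}
        )

print(profiles["jo@example.com"])
# {'pages': ['car-loans', 'fitness'], 'name': 'Jo Citizen',
#  'postcode': '2600', 'purchases': ['protein powder']}
```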

5.4 The Human Rights Law Centre (HRLC) explained profiling:

Profiling refers to the platforms’ practice of building a ‘profile’ of a person’s personal attributes and interests through tracking their behaviour over time, which can then be used for targeted advertising and personalised recommender systems.[1]

5.5 These profiles are extremely valuable to companies, which use them to target ads, sell products and influence user behaviour.[2] The potential harms to consumers from data collection and collation are discussed below. The particular harms and safety concerns arising from the collection and collation of children’s data are considered in Chapter 8: Online safety.

Concerns with data collection

5.6 Submissions raised concerns that consumer data may be used to manipulate user behaviour and/or put consumers at risk of price discrimination.[3]

5.7 The Foundation for Alcohol Research and Education explained how profiles made through data collection are used to tailor marketing to influence consumer purchases:

By design, people who purchase harmful and addictive products the most are also targeted by digital marketing models the most. Extensive data collection allows digital platforms to develop detailed psychometric profiles that are combined with detailed accounts of people’s browsing behaviour. These insights are used to tailor marketing activities, including content and messaging, towards an individual’s specific susceptibilities. In the case of alcohol marketing, this ability to prey on people’s susceptibilities is particularly harmful because it can disproportionately target people experiencing alcohol dependence.[4]

5.8 The Consumer Policy Research Centre (CPRC) identified that digital platforms may use and link data to unfairly exclude consumers from accessing certain products and services, or to target consumers’ vulnerabilities for commercially beneficial outcomes.[5] Profiles made through data aggregation:

… effectively “score” their value – with a view to identifying and retaining profitable customers through advertisements (and avoiding those who are not profitable). A lack of transparency and accountability within such processes means it is difficult for consumers to see how their profile is produced; understand the impact it will have on them; or influence, appeal or correct assumptions based on wrong information. Profiles can also be used to set prices, leading to some groups of consumers paying more for the same service.[6]

5.9 CHOICE detailed its findings on the use of facial recognition technology by major Australian businesses, employed without informed consumer consent or with only inconspicuous disclosure. CHOICE noted the risks of this practice included data breaches involving biometric data, inaccurate assessments, and the potential hard-coding of biases and discrimination.[7]

Data brokers

5.10 Businesses also generate revenue by selling the data they collect to data brokers: organisations that collect and buy vast amounts of data to aggregate and on-sell to other companies. The aggregated data is bought and used for commercial purposes, such as targeted advertising.

5.11 Mr Rob James, Principal Consultant and Chief Executive Officer, Rob James Consulting Pty Ltd, explained that data brokerage is a $200 billion industry with significant power:

… that industry has spent over $56 million lobbying the US government for regulation in its own favour. That's more than Facebook, Apple, Microsoft and Google combined have spent lobbying the US government. So it's a huge market sector, extremely valuable, and it does impact Australian citizens' data and the privacy of their information. The depth of the data that is held by these brokers is quite broad, and, if that data were to be breached, the malicious activity—what we've seen historically from hacks—could be quite significant to us.[8]

5.12 Digital Rights Watch described how digital platforms have shifted their focus from providing services to collecting data, creating privacy risks:

This shift of focus away from the service or product itself and towards the commodification of data enables and encourages a data-gluttonous logic in which data is collected for the sake of it, rather than to meet a specific functional or practical necessity. In turn, this amplifies invasion of privacy, broadens the risks associated with compromised digital security, and, critically, creates a dynamic in which people are data subjects but never data agents. By and large, people generate an immense amount of data to the benefit of a handful of corporations, which is in turn used to fuel further profits by way of targeted advertising and the manipulation of attention, as something to be bought and sold in the data broker industry, or used to build more products divorced from the underlying wants and needs of the people from whom the data was extracted.[9]

5.13 The committee notes that the next report in the Australian Competition and Consumer Commission’s (ACCC) ongoing Digital Platform Services Inquiry is considering the supply of data broker services in Australia.

Lack of user control

5.14 Many submissions were concerned that companies obtain data without the active participation, or even the awareness, of the people from whom it is extracted.[10]

5.15 The Attorney-General’s Department stated:

Submitters [to the Privacy Act Review] noted that targeting has the potential to cause significant harm when individuals have limited awareness of why and how they are being targeted and no control over it, and where targeted content and advertising may be used to manipulate, discriminate, exclude and exploit individuals based on their vulnerabilities.[11]

5.16 The CPRC’s research indicated:

… consumers are uncomfortable with the amount of information collected about them and would prefer to have greater control over that data collection. Control is particularly lacking given that personal data can often be traded between firms deeply embedded in supply chains without a direct link to consumers or even the basic service they’d signed up for. In addition, it can be difficult for consumers to know where and how to remove their associated data from brokers’ holdings.[12]

5.17 Professor Toby Walsh, Chief Scientist, AI Institute, University of New South Wales, explained how data is collected even when consumers are offline:

We need stronger privacy laws to protect what might be called our “analog privacy”. Increasingly, our digital devices are collecting information about our non-digital selves. I can give many examples. Fitbit monitors your heartbeat. Your Android smart-watch tracks your every movement in the real world. 23andMe collects information about your genotype. All of this is information about our “offline” analog selves rather than our online digital selves. With digital information, we can refuse cookies, connect with a VPN, or find other ways to hide our digital footprint. But analog information is much harder to hide. FitBit’s term of service mean that FitBit own your heart-beat. 23andMe claim a non-exclusive license to your genes. As most of these devices collecting analog information about us are not considered to be medical, such analog information is currently not treated with the care and sensitivity it deserves.[13]

5.18 The United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression outlined that:

The systematic collection of behavioural data and targeted advertising can violate the right to freedom of opinion; and the lack of transparency around platforms’ amplification of content online ‘points towards an unacceptable level of intrusion into individuals’ right to form their ideas free from manipulation and right to privacy’.[14]

Ineffective consent procedures

5.19 Submissions suggested that processes for obtaining user consent to data collection are inadequate. Consent is typically bundled with the decision to use a particular product, and may not be considered free and informed because users tend not to understand the quantity of their data that will be obtained or how it will be used.[15]

5.20 The CPRC argued that consent processes need reform:

Australia’s privacy law still relies on notification and consent as the primary means of protecting consumers. By forcing consumers into a situation where they “decide once” about whether to share their data but bear the consequences potentially for the remainder of their life is not a fair trade. This starkly contrasts with the knowledge and capability of firms to understand the value and potential use of data.[16]

5.21 Submissions highlighted that terms and conditions are ineffective at helping consumers understand what data is being collected.[17]

5.22 The Office of the Australian Information Commissioner’s (OAIC) Australian Community Attitudes to Privacy Survey 2020 found that 69 per cent of individuals do not read the privacy policies attached to internet sites, largely due to their length and complexity. The OAIC commented:

Even where individuals do read privacy policies and APP 5 notices, they may feel resigned to consent to the use of their information to access online services because they do not feel there is an alternative. As digital products and services become more entrenched in individuals’ lives as the way in which we work, study and engage socially it is increasingly difficult to avoid pervasive tracking and data handling practices that do not align with their preferences.[18]

5.23 The Australian Media Literacy Alliance (AMLA) stated:

Currently only a quarter (26%) of adults are confident they understand the terms and conditions of social media platforms, including what data is being collected, and by implication how that is being used. The increased proliferation of AI generated content, where it is unclear what underlying data is being drawn upon and how answers are being created, only increases these challenges.[19]

5.24 The committee noted that under the Australian Privacy Principles (APPs), APP entities[20] must collect personal information about an individual directly from that individual unless it is unreasonable or impracticable to do so. Data collation or profiling using information beyond that provided directly by a user is therefore unlawful where a company could ‘reasonably and practicably’ have collected the information from the user themselves.[21]

5.25 Circumstances where direct collection may be unreasonable or impracticable include ‘collection by a law enforcement agency of personal information about an individual who is under investigation’, where direct collection may jeopardise the investigation, or where updated address details are needed to deliver legal or other official documents.[22] Privacy policy terms discussing collection from third parties or disclosure of data to ‘trusted partners’ do not create an exception to the direct collection rule. However, the provision is rarely enforced.[23]
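
Expressed as a rule, the principle amounts to a simple check over collection events: information obtained from a third party is suspect whenever direct collection from the individual was practicable. The sketch below is a hypothetical illustration of that logic only; it does not describe how the OAIC actually assesses compliance.

```python
# Hypothetical sketch of the direct-collection rule as a check: flag any
# record obtained from a third party where collecting it directly from
# the individual was reasonable and practicable. Names are illustrative.

def flag_indirect_collection(events: list[dict]) -> list[dict]:
    """Return collection events that appear inconsistent with the rule."""
    return [
        e for e in events
        if e["source"] != "individual" and e["direct_collection_practicable"]
    ]

events = [
    {"attribute": "email", "source": "individual",
     "direct_collection_practicable": True},
    {"attribute": "income", "source": "data broker",
     "direct_collection_practicable": True},
]
print(flag_indirect_collection(events))  # flags only the 'income' record
```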

Privacy and security concerns

5.26 Submissions asserted that privacy is a human right which needs to be protected.[24]

5.27 With the widespread collection and processing of personal data by digital platforms, individual privacy concerns are growing.[25] Submitters were concerned about the use of their data for profiling and manipulation, as well as risks from potential data breaches, such as identity theft or online harassment.[26]

The Privacy Act

5.28 The Privacy Act 1988 (Privacy Act) contains a set of principles that outline how organisations are permitted to handle personal information. These include obligations relating to the collection, use and disclosure of personal information, the adequate protection of personal information, and transparency about information handling practices.

5.29 The OAIC highlighted that ‘[t]he flexible principles-based nature of the Privacy Act also means it is adaptable to changing technology, and able to complement other legislation or regulatory frameworks that deal with related issues’.[27]

The Privacy Act Review

5.30 The Privacy Act Review commenced in October 2020, following the ACCC’s 2019 Digital Platforms Inquiry final report, which recommended broad reform of Australia’s privacy framework. The review aimed to modernise the Privacy Act to keep pace with the increased volume and granularity of personal data being collected by companies, including digital platforms.

5.31 The Privacy Act Review Report 2022 makes 116 proposals for reform, designed to better align Australia’s laws with global standards and improve the protection of Australians’ privacy. The proposals fall into three categories:

  • Information protections: these proposals include recognising the public interest to society of protecting individuals’ privacy, regulating ‘targeting’ of individuals based on information which relates to them but that may not uniquely identify them, and enabling privacy codes to be made by the Information Commissioner in certain circumstances.
  • Privacy protections: these proposals include strengthening privacy protections for children, improving individuals’ control over their personal information, including through a right to seek erasure of personal information, and giving individuals more transparency and control over direct marketing.
  • Enforcement: these proposals include equipping the OAIC with more options to enforce privacy breaches, enhancing the OAIC’s ability to proactively identify and address privacy breaches, and providing new pathways for individuals to seek redress in the courts for privacy breaches, including through a new tort for serious invasions of privacy.[28]

Security concerns

5.32 As companies increase the amount of data they hold, consumer concerns about privacy and cyber security are growing.

5.33 Vault Cloud commented on cyber security as an increasing concern:

The lack of transparency and control over how personal data is being used is a critical concern that needs to be addressed … The increasing reliance on IT has created new security challenges, such as cyberattacks and data breaches. These threats can cause significant damage to individuals, businesses, and even entire countries. There is a need for stronger regulations that can protect against these threats and ensure that appropriate cybersecurity measures are in place.[29]

5.34 CHOICE discussed how the increased collection of data has led to privacy breaches:

Data breaches can have devastating impacts on Australian consumers, exposing people to financial loss, emotional distress and loss of trust in private markets. In light of recent major data breaches affecting millions of people, the case for strengthening Australia’s privacy laws and regulatory enforcement powers has never been clearer. Businesses profit from monetising consumer data – and not just technology businesses. They often collect unnecessary amounts of data to exploit and on-sell to data brokers. Recent data breaches in Australia were a result of the vast amount of data collected by big and small businesses in recent years, as well as outdated regulations governing data collection.[30]

Australian government data and use of cloud services

5.35 The Department of Home Affairs noted ‘[t]he rising use of online and digital services by Australians requires increased efforts to manage government systems and data holdings, effectively and securely’. The Hosting Certification Framework (HCF) is one policy designed to ensure data is ‘hosted with the appropriate level of privacy, sovereignty, and security controls’.[31]

5.36 Submitters commented on current controls in place to safeguard Australian Government data, including the HCF, in addition to concerns around data sovereignty and localisation.

Hosting Certification Framework

5.37 Certification under the HCF is required for businesses wanting to enter cloud contracts with government. Under the HCF, hosting providers must demonstrate that data will only move between customers and geo-locked, certified strategic data centre facilities, and will not leave Australia at any point.[32]
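
The substance of the geo-locking requirement can be pictured as a simple policy check. The sketch below is hypothetical and forms no part of the HCF itself: it assumes a deployment manifest listing where each service stores data, and rejects any region outside an approved Australian set. All region names are invented.

```python
# Hypothetical policy check in the spirit of the HCF's geo-locking
# requirement: every storage location in a deployment manifest must be
# an approved Australian region. Region names are illustrative only.

APPROVED_AU_REGIONS = {"au-canberra-1", "au-sydney-1", "au-melbourne-1"}

def validate_manifest(manifest: dict) -> list[str]:
    """Return a list of violations; an empty list means the manifest complies."""
    violations = []
    for service, region in manifest["storage_regions"].items():
        if region not in APPROVED_AU_REGIONS:
            violations.append(f"{service}: {region} is not a certified AU region")
    return violations

manifest = {"storage_regions": {"database": "au-canberra-1", "backups": "us-east-1"}}
for problem in validate_manifest(manifest):
    print(problem)  # backups: us-east-1 is not a certified AU region
```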

5.38 The Digital Transformation Agency (DTA) stated:

The DTA has found that global service providers, with workforces and facilities located outside of Australia, have had challenges in complying with some of the control objectives under the HCF … An issue is ensuring personnel that would have unescorted physical or logical access to sensitive or classified government data, obtain Australian Government Security Vetting Agency (AGSVA) security clearances. There are also requirements around executive control and influence of strategic decisions that have been challenging for global providers. The DTA is addressing these challenges in partnership with the relevant industry and government parties by applying interim controls to address the associated risks until the HCF requirements can be met.[33]

5.39 The Tech Council of Australia (TCA) recommended examining the impact of the HCF. It argued the framework may duplicate existing requirements already covered by other standards, such as the International Financial Reporting Standards or the European Union (EU) General Data Protection Regulation (GDPR). This disproportionately disadvantages small businesses when competing with larger companies that may be better placed to respond to regulatory uncertainty and absorb costs.[34]

5.40 The TCA stated there are high barriers to entry for tech businesses wanting to enter into government contracts generally, and that there is a need to make procurement systems more open and transparent.[35]

Data sovereignty and localisation concerns

5.41 Data sovereignty is a growing concern for Australians: the OAIC’s Australian Community Attitudes to Privacy Survey 2020 found that 74 per cent of Australians consider it to be ‘a misuse of personal information’ if their data is accessed for processing overseas, an increase from 68 per cent in 2013.[36]

5.42 Vault Cloud provided a helpful definition:

Data sovereignty refers to the right of a nation to control and manage its own data, regardless of where that data originated and [is] stored. This means that a country has the authority to determine how its data is collected, processed, and shared, as well as enforce its own laws and regulations related to data protection and privacy. Data sovereignty is often linked to national security, as countries may be concerned about foreign access to sensitive data.

Data localisation, on the other hand, refers to the requirement that data be stored within a particular country’s borders. This does not mean that a country has full control of the data, as the laws of other countries may also apply.[37]

5.43 Many currently available cloud services are hosted overseas, with one submitter suggesting ‘[n]o cloud provider has any interest in expanding certain types of cloud services into Australia’.[38]

5.44 Multiple submissions raised privacy concerns about Australian data being stored overseas, especially sensitive government data.[39] For example, Digital Rights Watch (DRW) raised concerns with the public sector storing data in private cloud services:

… particularly if there is a risk that those companies might exit the Australian market, if those services are hosted overseas and subject to different regulatory environments, or if those companies display poor workplace behaviours or corporate citizenship.[40]

5.45 Vault Cloud argued both data sovereignty and data localisation are important to protect citizen data collected by government:

Often, there is little to no choice in what personal information is stored by the Government. The public, therefore, has a higher standard for Government when it comes to the management of personal data. When people provide personal data to the Government, there is an expectation that this data will be stored and managed within Australia …

Data sovereignty is important for protecting the privacy of individuals and safeguarding against the misuse of personal data. It is essential to ensure that the data of Australian citizens is safeguarded, and that the data is not used for purposes that are detrimental to the public interest.[41]

5.46 Submitters raised concerns that data stored overseas may be subject to foreign regulations and authorities, and to any governments with which those authorities have agreements.[42]

5.47 Vault Cloud argued:

When in country data is stored on services, which are subject to foreign laws, an organisation retains substantial legal obligations concerning that data’s protection. However, the information may no longer be under their control and could be impacted by the laws and actions of a foreign country. This includes the future (as yet unwritten) laws of a foreign country. While the privacy laws of foreign countries may align to Australia’s today, there is no certainty that they will do so in the future … It is essential to ensure that the data of Australian citizens is safeguarded, and that the data is not used for purposes that are detrimental to the public interest.[43]

5.48 Vault Cloud highlighted that the US does not use public clouds for sensitive data and instead uses special sovereign variants known as ‘Government Cloud’, ‘Community Cloud’, ‘Sovereign Cloud’ or ‘Secure Cloud’.[44]

5.49 On the other hand, some submitters argued against data sovereignty and localisation requirements, framing them as unnecessary and a hindrance to business innovation.[45] For example, Meta stated:

Data localisation policies in particular, not only impact the foundations o[f] the open internet, but also impose unnecessary costs and technical challenges on what should be efficiency-based decision-making processes, making them market blockers rather than the drivers of economic growth some imagine them to be.[46]

5.50 Microsoft outlined the importance of non-localised cloud infrastructure to protect information during the Ukraine war:

… defence against a military invasion now requires for most countries the ability to disburse and distribute digital operations and data assets across borders and into other countries.[47]

5.51 The Australian Institute of Company Directors (AICD) ‘caution[ed] around any further recommendations on data localisation’.[48] Mr Simon Mitchell, Senior Policy Adviser, the AICD, stated:

… we have seen insufficient evidence that new comprehensive data localisation obligations are necessary or will improve the overall cyberresilience and data management practices of Australian companies. It may be the case that requiring Australian companies to find domestic alternatives to a big tech cloud provider, by way of example, could deny businesses cost-effective, secure and innovative data protection solutions. The unintended consequence of such a policy could be to make individual Australians' data less secure.[49]

5.52 The Developers Alliance argued:

Regulating data management on a globally dispersed platform presents multiple challenges. Firstly, cloud providers seldom have rights to access the data and processes they host for their customers. Secondly, what data they can access is often encrypted. Thirdly, the idea of where data is “located” is often a complex question. Fourthly, cloud providers can find themselves bound by conflicting laws where international authorities seek access to data from third jurisdictions or by acting extraterritorially. Mandating the localization of data for regulatory reasons is often used as a pretext for digital trade barriers, inviting reciprocity.[50]

Competition concerns

5.53 Big Tech use their dominant market positions to collect vast quantities of user data and make the resulting datasets exclusively available through their own products and services.[51] For instance, Google bundles use of the data it collects within its own products in related markets across the digital advertising supply chain.[52]

5.54 Submissions indicated that lack of access to relevant data held by Big Tech creates a substantial barrier to entry and growth for competing businesses, because Big Tech’s wealth of data enables them to target ads to specific consumers.[53] For example, Free TV Australia (Free TV) advised that data creates:

… an insurmountable barrier to entry (and expansion) in the market for the provision of ad tech services. It is not practically feasible, in the short to medium term, for any other ad tech services providers to collect such broad ranging and unique data sets in relation to users to compete effectively with Google. Given this, a stark choice exists, either regulatory intervention occurs or Google will continue to dominate the ad tech services market in Australia.[54]

5.55 Submitters suggested Big Tech may also use third-party generated data for anticompetitive purposes, such as purchasing competitor products or creating similar products or features.[55] For instance, the Centre for AI and Digital Ethics stated Amazon is known to use seller data to create its own cheaper alternatives.[56]

5.56 The Commonwealth Bank of Australia stated this data advantage is exemplified by the Consumer Data Right (CDR), as most companies covered are Australian businesses:

In the context of digital platforms, the unintended consequences of this is that domestic companies would be required to share their data with many platforms today even though they are competing directly with them. This is further compounded by the lack of a broad definition of reciprocity. That is, under a broad definition, a party that becomes accredited to receive CDR data would also have an obligation to share their data. The lack of a broad definition only adds to the competitive advantage that platforms have when it comes to Australians’ data.[57]

Proposed solutions

5.57 This section explores a range of proposed solutions to the data issues raised throughout this chapter. Options raised by submitters include implementing the proposed Privacy Act reforms, a right to delete data, a statutory tort, obligations for companies to handle data fairly, prohibitions on targeted advertising, limits on data use and aggregation, and engaging with businesses.

5.58 Submitters also raised the importance of effective enforcement and international alignment.

The Privacy Act Review

5.59 Multiple submissions supported proposed reforms to the Privacy Act.[58] For example, the Australian Communications Consumer Action Network stated:

We consider that it's appropriate that consumers have greater rights to not have their data collected, have greater rights to expressed consent to control how their data is collected and used by businesses. We're supportive of consumers having broader rights with respect to seeking redress where there have been breaches of their privacy. But, broadly speaking, we're quite supportive of the general thrust of the overall reforms.[59]

5.60 The AICD stated it supported modernising the Privacy Act, but raised concerns about some proposed reforms:

… we are concerned that many of the proposed reforms are being advanced by a perception that Australia’s privacy laws are weak and poorly performing, therefore warranting existing elements to be strengthened and made more prescriptive. Such a significant policy case needs to be comprehensively tested from a cost benefit perspective to ensure that the likely benefits will outweigh costs, for instance to innovation and business competitiveness with global counterparts.[60]

5.61 As one example, the AICD called for greater analysis of the proposed removal of the existing small business exemption under the Privacy Act. It advised:

This would be a significant change that would impose sweeping obligations on millions of small businesses and would come with material compliance costs at both an individual and aggregate business level. Our understanding is that under the proposed reforms the full suite of Privacy Act obligations will be imposed on small businesses, including for example the requirement to have a nominated senior manager be accountable for privacy. This approach, on first reading, appears disproportionate to the risk posed by many small businesses in mishandling personal information and will involve extensive compliance costs for potentially limited public benefit. Rather we would recommend that the Government focus on how such small businesses can be best supported, including in terms of building their cyber resilience and data management practices.[61]

Right to delete data

5.62 The Privacy Act Review proposed a right of erasure. This proposal was supported by the Association of Heads of Independent Schools of Australia:

Young people should not be burdened throughout their lifetime by a public profile generated by the malice of others or their own immaturity.[62]

5.63 Similarly, Mr James stated:

… if we as Australians want to enforce our own privacy rights to have that data deleted, it's close to impossible or extremely expensive to do. If we had some mechanisms to enable consumers to go through a process to work with those data brokers and have their data removed, that's something that I think would go a long way towards protecting citizens' information.[63]
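
Operationally, a right of erasure obliges an organisation to locate and delete a person’s records across its systems, and to pass the request on to parties the data was shared with, such as brokers. The sketch below is a minimal, hypothetical outline of that workflow; the store names, receipt format and notification step are assumptions, not features of the proposed reform.

```python
# Hypothetical sketch of an erasure-request workflow: delete the subject's
# records from each internal store, note which downstream recipients
# (e.g. data brokers) must be asked to do the same, and keep a receipt.

class Store:
    """A stand-in for an internal system holding personal records."""
    def __init__(self, name: str):
        self.name, self.records = name, {}

    def delete(self, user_id: str) -> None:
        self.records.pop(user_id, None)

def handle_erasure_request(user_id, stores, downstream_partners):
    receipt = {"user_id": user_id, "deleted_from": [], "notified": []}
    for store in stores:
        store.delete(user_id)
        receipt["deleted_from"].append(store.name)
    for partner in downstream_partners:
        # A real scheme would call each partner's deletion interface here.
        receipt["notified"].append(partner)
    return receipt  # retained as evidence the request was actioned

crm, analytics = Store("crm"), Store("analytics")
crm.records["u123"] = {"name": "Jo Citizen"}
print(handle_erasure_request("u123", [crm, analytics], ["broker-a"]))
# {'user_id': 'u123', 'deleted_from': ['crm', 'analytics'], 'notified': ['broker-a']}
```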

Statutory tort

5.64 Several submissions expressed support for the introduction of a statutory tort as proposed in the Privacy Act Review.[64] Mr Mark Nottingham, expert advisor to the UK Competition and Markets Authority's Digital Markets Unit, stated this ‘would help to give the Privacy Act the teeth that it is so often missing’.[65]

5.65 Children and Media Australia (CMA) commented in support:

CMA can see the advantage of a statutory tort in that companies would be forced to take the risk of litigation into account when determining their practices, which means a significant chance of potential accountability shaping those choices. This means of deterring undesirable behaviour would be a useful addition to mechanisms such as market forces and negative publicity; and may be all the more helpful considering that under the digital platform business model there is no particular need to please the user/consumer.[66]

5.66 Submissions suggested there should be adequate options for consumers who identify misuse of their data that do not require going through the courts, which raises access to justice issues.[67] For example, CMA suggested:

… there are risks associated with relying entirely on aggrieved consumers to undertake litigation: it can lead to significant burdens on individuals to prosecute the common good, and windfall gains at the other end. Therefore CMA submits that the ACCC should be empowered and resourced to bring representative actions under any statutory tort.[68]

Obligations to handle data fairly

5.67 Evidence to the committee supported the implementation of a positive obligation for digital platforms to handle data fairly or in the best interests of users.[69]

5.68 The OAIC supported the proposed reform in the Privacy Act Review to establish a positive obligation on organisations to handle personal information fairly and reasonably. It suggested:

… a positive obligation for organisations to handle data fairly and reasonably would give individuals greater confidence that they will be treated fairly when they choose to engage with a service. This would prevent consent being used to legitimise handling of personal information in a manner that, objectively, is unfair or unreasonable.[70]

5.69 CHOICE stated a duty of care requirement should be implemented to ‘require entities to take reasonable care not to cause foreseeable harm to consumers through collection, handling and use of their data’.[71]

5.70 The CPRC argued in favour of a provision making companies responsible for delivering safe, secure data-driven products and services:

Incorporating a duty of care or best-interests duty (similar to a fiduciary duty), especially for how consumer data is treated and how choice architecture is presented and implemented on digital platforms, can help add a level of accountability on digital platforms that could significantly reduce the likelihood of consumer harm. It could also lead to pro-business benefits by increasing consumer trust in those platforms that actively build this into their business model. The idea of a best interests duty for consumer data is relatively new and unexplored in the Australian context. As a next step, CPRC recommends an inquiry to explore how to construct and implement positive obligations on businesses to use data in consumers’ interests.[72]

5.71 The Centre for AI and Digital Ethics stated it is insufficient to rely on notice and consent regimes to address privacy harms. It noted that individuals ‘cannot be meaningfully expected to spend the large amount of time necessary to analyse and ruminate over consent for each service for which they engage’.[73] It supported:

… a regime that requires platforms to adhere to pro-privacy. We support regimes to allow individuals to correct errors in their collected information, withdraw consent for process, and to mandate subsequent erasure. There are additional safeguards that protect consumers despite the illusory nature of online consent. We support standardised and simple consent onboarding, with standard iconographs and layouts, to help minimise consumer consent fatigue.[74]

Prohibitions on targeted advertising

5.72 Submissions supported the introduction of a prohibition on targeted advertising without free and informed consent from users.[75]

5.73 The Obesity Policy Coalition stated this is particularly important for the marketing of harmful products, such as unhealthy food and alcohol. It recommended:

… that express consent be required for digital platforms, including social media services, to collect, use or disclose an individual’s personal information or data for commercial marketing purposes, particularly in terms of marketing for unhealthy food and drinks. These protections should enable individuals to effectively opt-out of commercial marketing, and in particular harmful industry marketing.[76]

5.74 The HRLC expanded:

Instead of permitting profiling by default and allowing users to opt out, regulation in Australia should go further by requiring default settings to not be based on profiling. This would ensure that users who are less aware of the operation of recommender systems will not be treated less favourably and would limit the role of personalised content recommendation systems in amplifying disinformation and hate speech.[77]
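
The HRLC’s suggestion is, in effect, a change of default in the choice architecture: personalised ranking runs only for users who have actively opted in. A minimal sketch of that default, with invented settings and field names:

```python
# Hypothetical sketch of a profiling-off default: content is ranked
# neutrally (here, newest first) unless the user has actively opted in
# to personalisation. Settings and field names are invented.

def rank_feed(items, user_settings, profile_score=None):
    if user_settings.get("personalisation_opt_in", False) and profile_score:
        # Personalised ranking only for users who explicitly opted in.
        return sorted(items, key=profile_score, reverse=True)
    # Default: no profiling; a neutral reverse-chronological order.
    return sorted(items, key=lambda item: item["published"], reverse=True)

items = [{"id": 1, "published": 10}, {"id": 2, "published": 20}]
print([i["id"] for i in rank_feed(items, {})])  # [2, 1]: no profiling by default
```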

Limits on data use and aggregation

5.75 Multiple submissions suggested limiting data use and aggregation by digital platforms.[78]

5.76 The ACCC recommended obligations to address barriers to entry and expansion caused by the market power of Big Tech. Subject to privacy considerations, it suggested the following (the portability requirement is illustrated in the sketch after this list):

  • data access requirements, which require designated platforms to provide access to specific data sources on an agreed basis to rivals;
  • data portability requirements, which would allow a consumer to request designated platforms transfer their data to them or a third party; and
  • data use limitations, which would place restrictions on how a designated platform collects, stores, or uses certain data.[79]
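
As a concrete illustration of the portability requirement above, the hypothetical sketch below bundles everything a platform holds about one consumer into a machine-readable document that a rival service could ingest; the dataset names and fields are invented.

```python
# Hypothetical sketch of data portability: on request, serialise every
# record held about a consumer into a machine-readable bundle that a
# third party could import. Dataset names and fields are invented.

import json

def export_user_data(user_id: str, datasets: dict) -> str:
    bundle = {
        "subject": user_id,
        "data": {name: rows.get(user_id, []) for name, rows in datasets.items()},
    }
    return json.dumps(bundle, indent=2)

datasets = {
    "posts": {"u123": [{"text": "hello"}]},
    "ad_interests": {"u123": ["cycling"]},
}
print(export_user_data("u123", datasets))
```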

5.77 Free TV suggested regulation to limit data use, to strike a balance between privacy harm minimisation and promoting competition:

… Free TV submits that given the legitimate privacy concerns raised by these [ACCC] approaches, the only effective way to remedy the identified competition harms at the current time would be to limit data use by designated entities. This would be privacy enhancing, in that it would limit the use of data about individuals as compared to data portability or interoperability arrangements, which would increase the use of such data. The pro-competitive effects of limiting the ability of designated entities to leverage their data advantages would far outweigh the decreases in efficiency for designated entities caused by the implementation of these measures.[80]

Business engagement

5.78 Several submissions suggested the government partner with industry to ensure that companies have the knowledge and expertise to combat cyber security threats.

5.79 The AICD suggested the government partner with industry to ensure greater coordination across relevant agencies, clarity on regulator responsibilities, and proactive threat and intelligence sharing. Additionally, it recommended:

  • a safe harbour or protected information mechanism where an organisation can share information of a significant cyber incident with a regulator(s) to assist in response and recovery without concern that the information will subsequently be used in enforcement action;
  • consideration of how existing reporting and notification obligations (e.g. SOCI [Security of Critical Infrastructure] Act obligations, Notifiable Data Breaches Scheme) can be harmonised or streamlined with the goal that an organisation only needs to report or notify to the Government once;
  • targeted support for SME [small to medium enterprises] and NFPs [not for profits] to build cyber security resilience and improve data management practices, education, information sharing and guidance in the event of experiencing and recovering from a cyber security incident...[81]

5.80 The Council of Small Business Organisations Australia (COSBOA) also recommended the government focus on educating and upskilling small businesses so that they feel empowered to take ownership of risk mitigation, voluntarily upskill their staff, and ensure safe data collection:

The introduction of a Small Business Privacy Code, including a best practice guide and checklist for compliance, would be a helpful solution. Ideally, small businesses would be supported through a program such as COSBOA’s Cyber Wardens pilot program which aims to become Australia’s first cyber safety workplace certification or micro-credential for the small business sector. The program is designed by small business for small business, and aims to upskill the nation’s small business workforce to give owners and employees the knowledge and tools they need to safely engage in the digital economy. This program would complement the Small Business Privacy Code, helping small businesses understand and mitigate risk when engaging with big technology companies, achieve best practice and ensure compliance with regulatory requirements.[82]

5.81 BSA – The Software Alliance noted the Department of Home Affairs’ Trusted Information Sharing Network (TISN) is a positive example of an effective and innovative public-private partnership mechanism:

The TISN is comprised of representatives from different critical infrastructure sectors, and each sector is supported by an Australian Government agency — usually the agency that has regulatory responsibility for that sector. Under the TISN Data Sector Group, data storage and processing service providers, which include cloud service providers, work together with government agencies to: a) identify and manage risks to critical infrastructure; b) address security gaps within sectors and implement mitigation strategies; c) inform future policy and programs to further support critical infrastructure resilience; and d) achieve the objectives of the Critical Infrastructure Resilience Strategy.[83]

Enforcement

5.82 Multiple submissions argued that the existing provisions of the Privacy Act need to be better enforced.[84] For example, the CPRC commented:

For legislation and its respective penalties to be effective, they need to be supported by regular surveillance and enforcement by the regulator to educate and shift the market towards a more consumer-centric approach to the digital economy.[85]

5.83 CHOICE argued the OAIC should be strengthened:

OAIC is under-resourced and lacks many of the regulatory powers of its consumer protection counterparts, including the ACCC and the Australian Securities and Investment Commission (ASIC). A permanent increase in funding will provide OAIC the resources needed to investigate other breaches. It will also allow OAIC to take preventative measures to mitigate the risk of future data breaches.[86]

5.84 The OAIC agreed, stating it is seeking expanded powers including:

… a new mid-tier civil penalty provision for interference with privacy that doesn't meet the threshold of 'serious and repeated' and a low-level tier of civil penalty provisions for administrative breaches of the act, with some infringement notice powers.[87]

5.85 Several submissions supported increasing penalties to incentivise digital platforms to address privacy risks.[88]

5.86 The Centre for AI and Digital Ethics stated:

Stronger penalties increase the likelihood that a CISO [Chief Information Security Officer] can get access to decision-makers with budget assigning powers (particularly the Board) and helps a CISO to make the case for diverting funds to improving cyber security. This is justifiable as organisational failures in cybersecurity impose substantial financial and non-financial externalities on ordinary Australians who suffer privacy loss, identity theft, and financial fraud as the result of hacks.[89]

5.87 The Centre for AI and Digital Ethics recommended penalties proportional to the amount of sensitive information an organisation holds, rather than to the size of the organisation:

Doing so could create a powerful incentive for smaller and medium sized organisations (who generally have fewer resources to devote to cyber security and data protection) to avoid collecting and storing too much sensitive information. However, there may be diminishing returns thus extreme penalties may be no additional help and may simply be viewed as unduly harsh. Further, such unreasonably harsh penalties may incentivise organisations or staff to hide problems rather than dealing with them openly, thus weakening the response capability.[90]

5.88 The Law Institute of Victoria suggested capping penalties at a percentage of revenue ‘to incentivise regulatory compliance in industry, whilst ensuring that small and medium companies are not disproportionately affected by overwhelming penalties’.[91]
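
The effect of such a cap can be shown with simple arithmetic. The 4 per cent rate and the dollar figures below are invented for illustration; neither the Privacy Act nor the Review Report prescribes them.

```python
# Illustrative arithmetic only: a penalty capped at a percentage of
# annual revenue, per the Law Institute of Victoria's suggestion. The
# 4% cap rate and all dollar figures are invented, not legislated.

def capped_penalty(assessed: float, annual_revenue: float,
                   cap_rate: float = 0.04) -> float:
    return min(assessed, cap_rate * annual_revenue)

# A small firm: a $10m assessed penalty collapses to 4% of $2m revenue.
print(capped_penalty(10_000_000, 2_000_000))      # 80000.0
# A large firm: the same penalty sits under its cap and applies in full.
print(capped_penalty(10_000_000, 1_000_000_000))  # 10000000
```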

International alignment

5.89 Multiple submissions supported international alignment of data laws, including implementing an approach similar to the EU GDPR (see Box 5.1).[92] For example, the Developers Alliance commented:

On data privacy, the EU’s GDPR provides a valuable test case. While most of the regulation has seen strong industry support, its data export provisions have proven unworkable in part because they create localized exceptions which are not extended internationally. This lack of international comity has resulted in rules which promote internet fragmentation, an outcome we fear is inevitable as extra-territorial regulations multiply around the world. We highlight internet fragmentation due to incompatible international regulations as the single greatest threat to Australia’s digital economy.[93]

5.90 The Irish Council for Civil Liberties supported implementation of the GDPR, but highlighted the importance of enforcement, as the GDPR has frequently been infringed without action by regulating authorities.[94]

5.91 The AICD cautioned the committee on implementing the GDPR in Australia:

More generally, we understand that research conducted into the GDPR in the EU has identified legitimate questions about its effectiveness in improving privacy and has had potential detrimental impacts on innovation. These studies point to the clear need for the Government to comprehensively consider the appropriateness of broadly adopting the GDPR model in Australia.[95]

Box 5.1 The EU General Data Protection Regulation

The GDPR established additional obligations for businesses and rights for individuals. Examples of requirements include:

  • Businesses must inform consumers about data collected.
  • Consumers can request a record of data held by businesses and have options to have it deleted.
  • Businesses cannot combine data collected from different parts of their business.

The GDPR also grants national authorities additional investigative and sanctioning powers, including the ability to raid organisations, compel information, impose significant fines, and ‘block data use, which is the ultimate sanction for international digital platforms’.

Source: General Data Protection Regulation 2016 (EU); Irish Council for Civil Liberties, Submission 36, p. 1.

Footnotes

[1] Human Rights Law Centre (HRLC), Submission 50, p. 12.

[2] See, for example, Australian Competition and Consumer Commission (ACCC), Digital platform services inquiry, Interim report No. 5 – Regulatory reform, September 2022, p. 31; Free TV Australia, Submission 17, p. 9; Commonwealth Bank of Australia, Submission 71, p. 4; Mr Joshua Zubak, Submission 27, p. 1.

[3] See, for example, CHOICE, Submission 54, p. 4; Consumer Policy Research Centre, Submission 60, p. 1; Mr Joshua Zubak, Submission 27, p. 1.

[4] Foundation for Alcohol Research and Education, Submission 33, p. 4.

[5] Consumer Policy Research Centre, Submission 60, p. 2.

[6] Consumer Policy Research Centre, Submission 60, p. 2.

[7] CHOICE, Submission 54, p. 4.

[8] Proof Committee Hansard, 26 July 2023, p. 34.

[9] Digital Rights Watch, Submission 68, p. 11.

[10] Mr Joshua Zubak, Submission 27, p. 1.

[11] Attorney-General’s Department, Submission 51, p. 16.

[12] Consumer Policy Research Centre, Submission 60, pp. 2–3.

[13] Professor Toby Walsh, Submission 42, [p. 3].

[14] Human Rights Law Centre, Submission 50, p. 12. See also United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Disinformation and freedom of opinion and expression, 13 April 2021, UN Doc A/HRC/47/25, pp. 14–15.

[15] See, for example, Professor Jeannie Marie Paterson, Professor of Consumer Law, Director, Centre for AI and Digital Ethics, Proof Committee Hansard, 26 July 2023, p. 46; Mr Joshua Zubak, Submission 27, p. 3; Mr Tom Leuner, Executive General Manager, Mergers, Exemptions and Digital, ACCC, Proof Committee Hansard, 22 August 2023, p. 34.

[16] Consumer Policy Research Centre, Submission 60, p. 6.

[17] See, for example, Consumer Policy Research Centre, Submission 60, pp. 2–3; Australian Media Literacy Alliance (AMLA), Submission 55, [p. 3].

[18] Office of the Australian Information Commissioner (OAIC), Submission 61, p. 3.

[19] AMLA, Submission 55, [p. 3].

[20] The OAIC states: ‘Australian Government agencies (and the Norfolk Island administration) and organisations with an annual turnover of more than $3 million have responsibilities under the Privacy Act, subject to some exceptions’. See OAIC, Rights and responsibilities, www.oaic.gov.au/privacy/privacy-legislation/the-privacy-act/rights-and-responsibilities (accessed 15 November 2023).

[21] Katharine Kemp, ‘This law makes it illegal for companies to collect third-party data to profile you’, The Conversation, 21 September 2022.

[22] OAIC, Chapter 3: APP 3 Collection of solicited personal information, 22 July 2019, www.oaic.gov.au/privacy/australian-privacy-principles/australian-privacy-principles-guidelines/chapter-3-app-3-collection-of-solicited-personal-information (accessed 15 November 2023).

[23] Katharine Kemp, ‘This law makes it illegal for companies to collect third-party data to profile you’, The Conversation, 21 September 2022.

[24] See, for example, Digital Rights Watch, Submission 68, p. 4; Apple, Submission 70, p. 1.

[25] Vault Cloud, Submission 38, [p. 3].

[26] See, for example, CHOICE, Submission 54, p. 4; Consumer Policy Research Centre, Submission 60, p. 1; Mr Joshua Zubak, Submission 27, p. 1.

[27] OAIC, Submission 61, p. 2.

[28] Attorney-General’s Department, Privacy Act Review Report, 16 February 2023, www.ag.gov.au/rights-and-protections/publications/privacy-act-review-report (accessed 6 November 2023).

[29] Vault Cloud, Submission 38, [p. 3].

[30] CHOICE, Submission 54, p. 3.

[31] Department of Home Affairs, Safeguarding Australian Government data, www.hostingcertification.gov.au (accessed 7 November 2023).

[32] Digital Transformation Agency, Submission 7, p. 3.

[33] Digital Transformation Agency, Submission 7, p. 3.

[34] Tech Council of Australia, Submission 63, p. 9.

[35] Tech Council of Australia, Submission 63, p. 9.

[36] Vault Cloud, Submission 38, [p. 5].

[37] Vault Cloud, Submission 38, [p. 7].

[38] Mr Cian Byrne, Submission 75, [p. 3].

[39] See, for example, Digital Rights Watch, Submission 68, p. 19; OVHcloud, Submission 72, p. 7; Vault Cloud, Submission 38, [p. 7].

[40] Digital Rights Watch, Submission 68, p. 19.

[41] Vault Cloud, Submission 38, [p. 7].

[42] See, for example, Vault Cloud, Submission 38, [p. 7]; OVHcloud, Submission 72, p. 7; Digital Rights Watch, Submission 68, p. 19.

[43] Vault Cloud, Submission 38, [p. 6].

[44] Vault Cloud, Submission 38, [p. 6].

[45] See, for example, Microsoft, Submission 47, p. 8; Mr Simon Mitchell, Senior Policy Adviser, Australian Institute of Company Directors, Proof Committee Hansard, 26 July 2023, p. 5.

[46] Meta, Submission 69, p. 73.

[47] Microsoft, Submission 47, p. 8.

[48] Proof Committee Hansard, 26 July 2023, p. 8.

[49] Proof Committee Hansard, 26 July 2023, p. 5.

[50] Developers Alliance, Submission 35, [p. 3].

[51] Free TV Australia, Submission 17, p. 9.

[52] Free TV Australia, Submission 17, p. 9.

[53] See, for example, Commonwealth Bank of Australia, Submission 71, p. 4; Free TV Australia, Submission 17; Law Institute of Victoria, Submission 12, [p. 3]; ACCC, Digital platform services inquiry, Interim report No. 5 – Regulatory reform, September 2022, p. 166.

[54] Free TV Australia, Submission 17, p. 9.

[55] See, for example, Law Institute of Victoria, Submission 12, [p. 3]; Centre for AI and Digital Ethics, Submission 23, [p. 12].

[56] Centre for AI and Digital Ethics, Submission 23, [p. 12].

[57] Commonwealth Bank of Australia, Submission 71, p. 4.

[58] See, for example, Australian Communications Consumer Action Network, Submission 20, p. 37; Digital Industry Group Inc., Submission 65, p. 1; Mr Roger Somerville, Head, Australia and New Zealand Public Policy, Amazon Web Services, Proof Committee Hansard, 3 October 2023, p. 4.

[59] Australian Communications Consumer Action Network, Submission 20, p. 37.

[60] Australian Institute of Company Directors (AICD), Submission 28, [p. 4].

[61] AICD, Submission 28, [p. 4].

[62] Association of Heads of Independent Schools of Australia, Submission 13, p. 3.

[63] Mr Rob James, Principal Consultant and Chief Executive Officer, Rob James Consulting Pty Ltd, Proof Committee Hansard, 26 July 2023, p. 34.

[64] See, for example, Mr Mark Nottingham, Submission 37, p. 4; Children and Media Australia, Submission 53, p. 2; Ms Elizabeth Hampton, Deputy Commissioner, OAIC, Proof Committee Hansard, 22 August 2023, p. 29.

[65] Mr Mark Nottingham, Submission 37, p. 4.

[66] Children and Media Australia, Submission 53, p. 2.

[67] See, for example, Dr Shaanan Cohney, Senior Lecturer, School of Computing and Information Systems, private capacity and Researcher, Centre for AI and Digital Ethics, Proof Committee Hansard, 26 July 2023, p. 49; Children and Media Australia, Submission 53, p. 3.

[68] Children and Media Australia, Submission 53, p. 2.

[69] See, for example, CHOICE, Submission 54, p. 5; OAIC, Submission 61, p. 3.

[70] OAIC, Submission 61, p. 3.

[71] CHOICE, Submission 54, p. 5.

[72] Consumer Policy Research Centre, Submission 60, p. 7.

[73] Centre for AI and Digital Ethics, Submission 23, [p. 5].

[74] Centre for AI and Digital Ethics, Submission 23, [p. 5].

[75] See, for example, HRLC, Submission 50, p. 12; Foundation for Alcohol Research and Education, Submission 33, p. 5; Obesity Policy Coalition, Submission 19, p. 5.

[76] Obesity Policy Coalition, Submission 19, p. 5.

[77] HRLC, Submission 50, p. 12.

[78] See, for example, Free TV Australia, Submission 17, p. 20; ACCC, Digital platform services inquiry, Interim report No. 5 – Regulatory reform, September 2022, p. 168; Mr Joshua Zubak, Submission 27, p. 5.

[80] Free TV Australia, Submission 17, p. 20.

[81] AICD, Submission 28, p. 3.

[82] Council of Small Business Organisations Australia, Submission 59, p. 2.

[83] BSA – The Software Alliance, Submission 32, pp. 6–7.

[84] See, for example, Mr Mark Nottingham, Submission 37, p. 4; Consumer Policy Research Centre, Submission 60, [p. 6].

[85] Consumer Policy Research Centre, Submission 60, [p. 5].

[86] CHOICE, Submission 54, p. 5.

[87] Ms Elizabeth Hampton, Deputy Commissioner, OAIC, Proof Committee Hansard, 22 August 2023, p. 29.

[88] See, for example, Consumer Policy Research Centre, Submission 60, [p. 5]; Law Institute of Victoria, Submission 12, [p. 7]; Centre for AI and Digital Ethics, Submission 23, [p. 3].

[89] Centre for AI and Digital Ethics, Submission 23, [pp. 3–4].

[90] Centre for AI and Digital Ethics, Submission 23, [p. 4].

[91] Law Institute of Victoria, Submission 12, [p. 7].

[92] See, for example, Developers Alliance, Submission 35, [p. 4]; Irish Council for Civil Liberties, Submission 36, p. 3; Mr Roger Somerville, Head, Australia and New Zealand Public Policy, Amazon Web Services, Proof Committee Hansard, 3 October 2023, p. 4; Australian Medical Association, Submission 66, p. 3; Microsoft, Submission 47, p. 14.

[93] Developers Alliance, Submission 35, [p. 4].

[94] Irish Council for Civil Liberties, Submission 36, p. 3.

[95] AICD, Submission 28, [p. 5].