Chapter 10 - The way forward

Overview

10.1 This chapter explores evidence received about regulatory fragmentation. It goes on to outline the committee’s views on the evidence discussed throughout this report and make recommendations.

Regulatory fragmentation

10.2 Submitters raised concerns that regulation in the digital platforms sphere is duplicated and overlapping, and that a coordinated approach to digital regulation is needed.[1]

10.3 BSA – The Software Alliance (BSA) suggested ‘the Committee consider the broader issue of how to reduce regulatory overlap, including by promoting improved coordination between regulators, policymakers, and the private sector’.[2]

10.4 The Tech Council of Australia stated that regulation:

… should aim to deliver a more coordinated and cohesive approach to digital regulation that enables long-term growth of the Australian technology sector in the national interest, including by avoiding overly broad or piecemeal approaches to regulation, which our research has found to be a key barrier to innovation and capturing the benefits of new technologies.[3]

10.5 Significant regulatory gaps were also highlighted in evidence to the committee. For example, the committee questioned members of the Digital Platform Regulators Forum (DP-REG) about which agency held responsibility for protecting Australians from spending on unused subscription fees. The DP-REG members were unable to point to an agency that held that remit.

10.6 Senator Shoebridge asked:

Isn't this one of the problems in this whole space? There's no lead agency. There's no-one who's ultimately responsible. It must frustrate you no end, which is one of the reasons you've brought together this informal forum. There's just no lead agency, is there?[4]

10.7 Ms Elizabeth Hampton, Deputy Commissioner, Office of the Australian Information Commissioner (OAIC), responded:

I don't agree that it's the frustration around a lack of a lead agency that caused us to coalesce and come together. Instead, we've reflected on the fact that we each have an important different lens to bring to a set of issues, and it's the coordination of those different lenses that results in a really good outcome for Australians.[5]

10.8 In response to committee concerns that no agency holds ultimate responsibility when regulatory gaps are identified, Ms Creina Chapman, Deputy Chair, Australian Communications and Media Authority, explained:

The gap is not in the regulators; the gap is not in the fact that there is not a regulator that has responsibility for it. If there is a gap, it is a legal gap.[6]

Proposed solutions

Upskill and empower existing regulators

10.9 Submissions recommended upskilling existing regulators, such as the Australian Competition and Consumer Commission (ACCC) or the OAIC, so that they have adequate skills and resources to regulate the behaviour of digital platforms.[7]

10.10 The Consumer Policy Research Centre stated that regulators need specific expertise to regulate digital platforms:

Monitoring and surveillance by regulators in this complex environment needs a diverse workforce that not only understands the implications of the law but also the technical architecture on which these business models are built upon. Experts such as data scientists, artificial intelligence engineers, information security analysts and other technical professionals need to be in the mix to support upstream regulation and mitigate the risk to consumers, potentially before widespread harm has occurred.[8]

10.11 Similarly, the Human Rights Law Centre (HRLC) supported a comprehensive regulatory framework that includes ‘broad information-gathering and enforcement powers for an independent, well-resourced and integrated regulator’.[9] The HRLC told the committee that this regulator should be properly empowered, with robust information-gathering powers.[10]

Better coordination between regulators and policy makers

10.12 Evidence to the committee also recommended the creation of a new model of coordination between existing regulators and policy makers.[11]

10.13 .au Domain Administration Ltd (auDA) told the committee that closer engagement with stakeholders is needed at all stages of policy development. It advocated for coordinated efforts between regulators, policy makers, the private sector, the technical community, academia, and civil society. It recommended:

… all relevant regulators and government departments actively participate in a multi-stakeholder policy development approach. This would help to avoid siloes and overlapping consultation processes facilitated by different government entities and drive greater certainty amongst industry and consumers.[12]

10.14 Similarly, BSA stated that DP-REG comprises only regulators, with policy makers and industry representatives absent from the conversation. It recommended considering the model proposed by the Australian National University’s Tech Policy Design Centre (ANU Tech) to increase involvement from industry representatives and independent technical expertise. BSA argued:

The increased involvement of industry representatives will provide the government with access to independent technical expertise and a regular platform for consultations. More importantly, it will discourage taking a reactionary approach when addressing emerging concerns and ultimately will pave the way for a more certain regulatory environment.[13]

10.15 ANU Tech’s proposed ‘Tech policy coordination model’ includes the following layers of coordination:

The Tech Policy Ministerial Coordination Meeting is the peak Ministerial coordination body in the Australian tech-ecosystem. Its objective is to facilitate cross-portfolio Ministerial coordination before tech policy proposals are taken to Cabinet.

The Tech Policy Council is the peak senior officials’ coordination body in the Australian tech-ecosystem. Its objective is to improve coordination among and between policymakers and regulators.

The Tech Regulators Forum is the peak regulator coordination body in the Australian tech-ecosystem. Its objective is to improve coordination among tech regulators.[14]

10.16 auDA supported ANU Tech’s model as it ‘does not change any existing mandates of Ministers, departments or agencies, but helps cultivating coordination at all stages of tech policy development’.[15]

10.17 The Australian Information Industry Association noted ANU Tech’s proposal and similarly recommended establishing a Council of Tech Regulators which:

… would work to a similar model as the Council of Financial Service Regulators and be comprised of authorities such as the eSafety Commissioner, the Australian Information Commissioner, the Digital Transformation Agency, the Department of Home Affairs, Treasury, the Attorney General’s Department and the Australian Cyber Security Centre. The Council would ensure that, as far as possible, regulation is streamlined and rationalised to mitigate overregulation, red tape, duplicative reporting requirements and parallel consultation timeframes. Breaking down silos and ensuring that in respect of technology – the all-pervasive, innovative and value-creating engine at the heart of the economy – the left hand of government knows what the right is doing as far as regulation and reporting is concerned, and regulatory impost is contained as far as possible.[16]

Parliamentary committee

10.18 The HRLC recommended that a dedicated Parliamentary committee on digital matters be established, in recognition of the ongoing attention required on emerging tech issues and policy coordination across Government.[17]

10.19 A joint submission from multiple research organisations similarly proposed that Parliament establish a Joint Standing Committee on Digital Affairs. The submitters stated:

A dedicated standing Committee would allow for a better allocation of time, resources and expertise and help develop a more sophisticated understanding of digital and technology policy. Existing portfolio committees are overworked and their broad remits mean that they neither have the capacity nor time to proactively interrogate emerging tech issues.[18]

A digital platforms-specific body

10.20 Some submissions supported the creation of a new digital platforms-specific body.

10.21 Ben Blackburn Racing recommended consideration of ‘the introduction of a new Australian Government agency which could bring more independence to oversight of the influence and decision-making structures of Big Tech companies and their impacts in Australia’.[19]

10.22 Mr Rupert Taylor-Price, Chief Executive Officer, Vault Cloud, argued that there is no clear regulator in the digital platforms space, and that one is needed:

It's a bit like when you get on a plane. To some degree, you don't have to worry too much about who's providing you that service. You know that it's a well-regulated industry. You know that there's a degree of safety by getting on that plane. That's what CASA and other regulators in that space affect in the outcome that they get for their citizens. In the technology space, say that you didn't like the way an algorithm had worked for you in some way on one of these platforms. How do you deal with that? If you go to a bank, you go to APRA. If you get on a plane, you go to CASA. Who do you go to as a citizen when you have an issue with a technology platform?[20]

10.23 The Law Institute of Victoria recommended:

… the introduction of a new government regulatory authority, or the establishment of a collaborative team across existing regulatory bodies, tasked with overseeing the regulation of Big Tech companies specifically … [it] would need to be sufficiently resourced in order to provide any meaningful opportunity for appropriate regulation.[21]

10.24 Ms Kate Pounder, Chief Executive Officer, Tech Council of Australia, stated that the US National Institute of Standards and Technology (NIST) could be examined as a model that brings together competition, consumer and data issues. NIST is an agency of the US Department of Commerce that produces expert standards and guidelines. Ms Pounder stated:

… often in these new areas, particularly when technology is moving fast, there's not a high degree of expertise. So I think centralising that in one body, which can provide expert guidance to governments and work fairly rapidly to get standards and guidance material out, is vital. It can take a science based and evidence based model. Often the work of NIST ends up being utilised in other markets. I think there's also an opportunity for Australia to simply leverage that a bit better and aim for coherence with some of the guidelines that come out there. It often tends to happen in the private sector, because an Australian company that's successful in the tech sector will be selling globally, so they might look to those guidelines and try to adhere to them.[22]

10.25 Digital Rights Watch recommended that a Minister for Digital Capabilities be appointed.[23]

Committee view

10.26 This section provides the committee’s views on key themes and concerns raised throughout the inquiry, together with the committee’s recommendations.

Regulation

10.27 Throughout this report, and particularly earlier in this chapter, evidence was presented that the current regulatory system is not working effectively. Regulation of digital platforms is split across various agencies, in some cases with competing priorities.

10.28 The committee found that the current legislative and regulatory framework is not sufficient to ensure positive outcomes for consumers and competition. In short, it is fragmented.

10.29 The committee acknowledges the importance of well-resourced and appropriately skilled regulators in ensuring that enforcement efforts achieve the desired outcomes. However, the committee is concerned that upskilling existing regulators alone will not resolve regulatory gaps or provide the expertise needed to address emerging competition and consumer risks.

10.30 Stakeholders highlighted that, despite the market power of Big Tech and its potential for harm, digital platforms are not regulated like other significant industries, such as banks, telecommunications providers and airlines. The committee considers that a new regulatory regime could address fragmentation and bolster regulatory efficacy.

10.31 Evidence to the committee also highlighted the need for better coordination between regulatory bodies and policymakers. Improved coordination would streamline legislation and regulatory efforts. Further, a coordinating body would give consumers and digital platforms certainty about where to turn when issues arise.

10.32 Accordingly, the committee recommends that a new coordination body be established. This body would not alter or acquire the day-to-day functions of the four main DP-REG agencies, but would coordinate collaboration efforts, common responsibilities and tasks.

Recommendation 1

10.33 The committee recommends that the Australian Government establish a digital platforms coordination body.

Competition

10.34 Chapters 3 and 4 considered issues that have arisen from the concentrated market power of Big Tech. The committee heard evidence that the dominant market power of Big Tech has allowed these firms to engage in anticompetitive behaviours and exploit power imbalances to the detriment of small businesses and consumers.

10.35 A range of submitters told the committee that the market power of Big Tech allows these firms to engage in anticompetitive tying and self-preferencing. These practices make it difficult for other companies, particularly small businesses, to compete, resulting in reduced competition, less choice for consumers and increased prices.

10.36 The committee has heard that Big Tech platforms may impede consumers from switching products or services through tying practices that lock consumers into one provider.

10.37 Submissions raised concerns that app store providers tie the use of app store services to the use of their in-app payment (IAP) services. App stores take up to a 30 per cent commission on every IAP and restrict app developers from providing their own IAP mechanisms.

10.38 The committee is concerned that the tying of IAPs creates a barrier to entry for competitors and limits the choices available to consumers. Further, the committee believes there is a lack of transparency in how commission fees are determined and how app stores use the IAP data they collect.

10.39 Furthermore, the committee has heard that regulation of near-field communication mobile device components and mobile wallets is needed to ensure consumers have rights against large digital platforms similar to those they have against regulated financial institutions that provide payment services.

10.40 Other jurisdictions, such as the European Union (EU) and South Korea, have introduced measures that require major app store operators such as Apple and Google to unbundle the use of their proprietary in-app payment systems from the use of app distribution services.

10.41 Accordingly, the committee supports the introduction of legislation to address anti-competitive tying by Big Tech platforms and ensure a level and competitive playing field.

Recommendation 2

10.42 The committee recommends that the Australian Government introduce legislation to prevent anti-competitive practices through the bundling of payment services and products by large digital platforms.

10.43 The committee is concerned that self-preferencing conduct may be anticompetitive and create barriers to entry for small businesses.

10.44 Multiple submissions called for regulation that tackles anti-competitive self-preferencing by gatekeeper companies and referred to international approaches that could be adopted. For instance, the United Kingdom (UK) has proposed a pro-competition regime for digital markets. This regime will include measures to address anti-competitive self-preferencing by requiring digital platforms not to influence competitive processes or outcomes in a way that unduly self-preferences a platform’s own services over those of its rivals.

10.45 The committee is of the view that there needs to be greater transparency on the part of large digital platforms regarding the practice of self-preferencing their own products.

10.46 The committee believes this warrants mandatory public disclosure by large international platforms when they engage in self-preferencing behaviour for their own products on app stores and other digital markets. Furthermore, large digital platforms should disclose aggregate information on the data collected from customers and business users for reasons other than the app review process.

Recommendation 3

10.47 The committee recommends that the Australian Government require mandatory disclosure by large digital platforms of self-preferencing conduct.

Dispute resolution

10.48 In Chapter 4, the committee considered consumer redress options within the digital economy. While Big Tech firms invest in a range of mechanisms to prevent and minimise problems for consumers, a significant number of problems and disputes cannot be resolved within existing systems.

10.49 Internal dispute resolution mechanisms provided by digital platforms are an important first point of redress. However, consumers encounter many difficulties navigating these mechanisms, and the power imbalance between Big Tech providers and consumers is evident.

10.50 The committee supports the introduction of mandatory internal dispute resolution standards for digital platforms.

Recommendation 4

10.51 The committee recommends that the Australian Government implement mandatory dispute resolution requirements for large digital platforms via regulation.

10.52 Judicial escalation of disputes with digital platforms is not financially accessible for most consumers, nor expeditious enough to address problems before serious harm occurs. Small businesses and consumers are therefore reliant on a regulator choosing to prosecute their case; however, regulators such as the ACCC focus their resources on systemic issues.

10.53 The committee is concerned that consumers are left with no realistic escalation options once business-to-business dispute resolution, perhaps with the assistance of an independent advocate or mediator, has been exhausted.

10.54 The committee considers that the proposal for a judicial escalation option, akin to a state-level small claims tribunal, has merit.

Recommendation 5

10.55 The committee recommends that the Australian Government establish a tribunal for small disputes with digital platforms.

Transparency

10.56 Chapters 5 and 6 highlighted concerns about the transparency of data use by Big Tech, including through algorithms and automated decision-making.

10.57 Data collection by digital platforms occurs on a grand scale, often without explicit consent from users. Data brokers aggregate data to on-sell for commercial use, such as targeted advertising. Submissions raised concerns that consumer data can be used for profiling and discrimination, without consumers being aware that their data was collected.

10.58 The committee suggests that measures be implemented to ensure customers are aware of what personal data is being collected by digital platforms and what it is used for. A greater effort should be made by digital platforms and the Australian Government to ensure the personal data of individuals is adequately protected.

10.59 The committee proposes the implementation of a public data reporting regime requiring Big Tech firms to:

provide details of the targeting criteria for advertising and data determining which users are exposed to particular ads; and

provide key metrics on demographic data collected for the purposes of targeting advertising, particularly children’s data.

10.60 The committee notes that the EU Digital Services Act requires platforms that display advertising on their online interfaces to ensure users can identify, for each advertisement displayed, that the information is an advertisement, on whose behalf it is presented, and the parameters used to select its recipients.[24] Some digital platforms have responded by creating an online repository of advertisers.[25] The government could consider this model.

10.61 Mandatory reporting of data collection by digital platforms should be modelled on the obligations imposed on superannuation funds to disclose certain information in notices for annual members’ meetings.

10.62 Chapter 6 discussed concerns that algorithms used by digital platforms may not operate in a way that adequately supports community values, such as fairness, accuracy, privacy and user safety.

10.63 Evidence supported international approaches to strengthening the transparency of algorithm use by digital platforms. In particular, the UK and the EU have implemented transparency standards for the use of algorithmic tools.

10.64 Large digital platforms should be subject to data access obligations and transparency measures which extend to algorithms used for content recommendation and for targeted marketing.

10.65 The committee supports the development of a risk-based regulatory framework by the proposed digital platforms coordination body. The framework should place the onus on digital platforms to identify risks created by their use of algorithms and outline how they will address those risks.

Recommendation 6

10.66 The committee recommends that the Australian Government implement a requirement for designated digital platforms, based on turnover, to report advertising material via a public register, and that it implement mandatory reporting on algorithm transparency, data collection and profiling by very large platforms, particularly identifying what personal data is collected and how it is used.

10.67 The committee notes the Privacy Act Review proposal to create a right of data erasure.

10.68 Submissions highlighted that individuals have limited rights when it comes to how their data is used. A right to erase personal data would give individuals more control over their own information when engaging with digital platforms.

10.69 The committee notes that any right of erasure must extend beyond an individual’s ability to delete data they have voluntarily shared online, such as photos or posts, to also encompass biographical information, geolocation data, browsing habits, ‘likes’ and other data surreptitiously collected and collated by digital platforms.

Recommendation 7

10.70 The committee recommends that the Australian Government regulate an individual’s right to delete personal data.

Children’s data

10.71 As highlighted in Chapter 8, the collection of children’s data online raises particular security and personal risks. Evidence suggested that the changes to digital platforms’ practices required to protect children online will only occur when mandatory codes with penalties for non-compliance are introduced and enforced.

10.72 The committee considers that additional regulation of children’s data protection and privacy rights is necessary. The committee recommends implementing a mandatory code for the protection of children online, addressing regulatory fragmentation and aligning the rights of Australian children with those in international jurisdictions.

Recommendation 8

10.73 The committee recommends that the Australian Government legislate for mandatory industry codes on the collection, use and retention of children’s data.

Senator Andrew Bragg

Chair

Liberal Senator for New South Wales

Footnotes

[1] See, for example, BSA – The Software Alliance, Submission 32, p. 2; Tech Council of Australia (TCA), Submission 63, p. 4; Australian Information Industry Association, Submission 16, [p. 3]; Tech Policy Design Centre, Submission 22, p. 1.

[2] BSA – The Software Alliance, Submission 32, p. 3.

[3] TCA, Submission 63, p. 4.

[4] Proof Committee Hansard, 22 August 2023, p. 33.

[5] Proof Committee Hansard, 22 August 2023, p. 33.

[6] Proof Committee Hansard, 22 August 2023, p. 33.

[7] See, for example, Centre for AI and Digital Ethics, Submission 23, [p. 3]; CHOICE, Submission 54, p. 3; Tech Policy Design Centre, Submission 22, p. 1; Consumer Policy Research Centre, Submission 60, p. 6; Australian Small Business and Family Enterprise Ombudsman, Submission 39, p. 3.

[8] Consumer Policy Research Centre, Submission 60, p. 6.

[9] Human Rights Law Centre (HRLC), Submission 50, p. 5.

[10] HRLC, Submission 50, p. 11.

[11] See, for example, Tech Policy Design Centre, Submission 22, p. 2; Communications Alliance, Submission 58, p. 7; TCA, Submission 63, p. 5.

[12] .au Domain Administration Ltd (auDA), Submission 26, p. 8.

[13] BSA – The Software Alliance, Submission 32, p. 7.

[14] Tech Policy Design Centre, Submission 22, p. 2.

[15] auDA, Submission 26, pp. 5–9.

[16] Australian Information Industry Association, Submission 16, pp. 4–5.

[17] HRLC, Submission 50, p. 9.

[18] Joint submission from Digital Rights Watch, Electronic Frontiers Australia, Tech Policy Design Centre, Centre for New Industry, Professionals Australia and the Australia Institute, Submission 62, [p. 1]. See also Digital Rights Watch, Submission 68, p. 40.

[19] Ben Blackburn Racing, Submission 1, p. 9.

[20] Proof Committee Hansard, 26 July 2023, p. 22.

[21] Law Institute of Victoria, Submission 12, [pp. 3–4].

[22] Proof Committee Hansard, 27 July 2023, p. 44.

[23] Digital Rights Watch, Submission 68, p. 40.

[24] Guide to the Digital Services Act, Article 24 – Online advertising transparency, https://digitalservicesact.cc/dsa/art24.html (accessed 21 November 2023).

[25] For example, Google Ads Transparency Centre, https://adstransparency.google.com/?region=AU (accessed 21 November 2023).