Chapter 7 - Consumer harms


Scams, harmful apps and fake reviews

Overview

7.1 Increasing use of online platforms has provided a low-cost avenue for scammers and harmful apps to reach substantial numbers of consumers and businesses.

7.2 Submissions raised significant concern about the growing proliferation of online scams and other harms.[1] The committee was advised that:

Trust and confidence in the digital economy are essential. Consumers and businesses will only embrace digital opportunities if they are confident they can trust the technologies and the entities with which they interact online. Recent rapid growth in scams and fraud is undermining this confidence.[2]

7.3 This chapter considers the nature and scale of intentionally harmful activities taking place on digital platforms, current regulatory measures, the roles and responsibilities of digital platforms and the adequacy of their current approaches, and potential mechanisms to address growing concerns.

7.4 Online safety risks beyond scams, harmful apps and fake reviews, particularly risks for children, are discussed in Chapter 8: Online safety.

7.5 The committee notes the considerable body of investigative work undertaken by the Australian Competition and Consumer Commission (ACCC) in this field and draws on its research.

Scams

7.6 The committee was advised that the impact on Australians of scams originating from digital platforms is ‘disproportionately higher than other channels’.[3] The Commonwealth Bank of Australia highlighted:

Recent analysis from the Australian Financial Crimes Exchange shows that digital platforms, whether they be web or app based, account for close to half of scams exposures yet only approximately 20% of all scams origination.[4]

7.7 Given the broad range of online scams and the variety of approaches they use, anyone can be a victim. These approaches may include:

… dating and romance scams, identity theft, unexpected money or winnings, threats and extortion, job recruitment, investments, charities, phishing, hacking, remote access scams and attempts to gain personal information, e-commerce scams, and telephone and messaging scams.[5]

7.8 The ACCC highlighted how scams often utilise multiple services to defraud victims, with digital platforms, like telecommunications services, situated at the start of the ‘scam chain of events’. It found:

The ACCC has received increasing numbers of reports to Scamwatch where victims were targeted via a digital platform service, then drawn to an encrypted messaging app, before being induced to make payments through a bank or cryptocurrency service.[6]

Scale of the problem

7.9 Financial losses from online scams, such as those conducted via social media platforms and mobile apps, are currently responsible for a small proportion of the total losses from scams but continue to grow.[7]

7.10 The ACCC outlined that financial losses reported to Scamwatch significantly increased between 2020 ($49 million) and 2021 ($92 million). Similarly, the ACCC’s review of Scamwatch data ‘shows that reported losses to scams in 2022 on social networks increased by approximately 42% and on mobile apps by approximately 98%’.[8] Actual losses are likely much higher, given only 13 per cent of victims are estimated to report their losses to Scamwatch.[9] Box 7.1 elaborates on cryptocurrency scams.

Box 7.1 Case study: cryptocurrency scams

Cryptocurrency scams are particularly on the rise, with consumer reports to the ACCC and Scamwatch data suggesting many investment scams use digital platform services to target victims.[10]

The Centre for AI and Digital Ethics highlighted:

… cryptocurrency investment scams were the "main driver" of the sharp 35% increase in investment scam losses in 2021 from the previous year, with Australians reporting $99 million lost to these scams.[11]

The ACCC noted ‘cryptocurrency was also the most common payment method for investment scams’.[12]

Fake advertising is also a feature of cryptocurrency scams. Free TV Australia advised that images of well-known TV presenters such as Karl Stefanovic and David Koch have been used in fake endorsements to lure social media users into scam cryptocurrency investments.[13]

The Centre for AI and Digital Ethics proposed that cryptocurrency investments and investments in blockchain products be restricted to ‘sophisticated investors’:

… where such investments have not met a bar similar to those faced by ordinary issuers of securities, to avoid the proliferation of scams and fraudulent activity that current[ly] characterises the market for these digital assets.[14]

The Developers Alliance noted:

Cryptocurrency regulation is rapidly evolving as fraud and speculation emerge as fundamental drivers of adoption. We would simply highlight that virtual transactions and payments are here to stay, and that confidence in these processes and systems is fundamental to economic stability and growth.[15]

Fake reviews

7.11 Fake or manipulated reviews are also a cause of significant harm to consumers and small businesses. The ACCC noted in its Regulatory Reform Report:

… as Australians spend more time and money online, consumers and small businesses are more reliant on online reviews and more vulnerable to harms from fake or manipulated reviews.[16]

7.12 Fake reviews are instigated by a range of actors, from malicious past employees or competitors[17] through to commercial service providers that generate hostile fake reviews against existing providers to build consumer traction for new entrants.[18]

7.13 Fake reviews have significant impacts on the businesses, products and markets they target, such as:

undermining consumer choice;[19]

distorting competition;[20]

causing financial losses for small business owners if they are targeted;[21]

causing reputational damage and impacting future customers;[22]

undermining credibility of legitimate small businesses;[23] and

small businesses being held to ransom by scammers seeking payment for review removal.[24]

Current regulation

Limitations of current regulations

7.14 The committee was advised that current regulations do not adequately capture digital platforms’ role with respect to scams, fake reviews and harmful apps. The regulations have not kept pace as scammers pivot from phone call and text message scams to social media and other applications.[25]

7.15 Given the scale of harm from online scams, harmful apps and fake reviews, the ACCC noted that additional measures, more specifically targeted at digital platforms, are required to supplement the essential role of the Australian Consumer Law (ACL).[26]

7.16 The ACCC further highlighted that the Competition and Consumer Act 2010 and the ACL ‘are not well-suited’ to digital platform services given ‘[e]nforcement of these laws is also necessarily retrospective, addressing particular instances of conduct on a case-by-case basis after harms have already occurred’.[27]

7.17 Lengthy redress processes cause further damage to small businesses where fake reviews or fraudulent misrepresentation of a business remain visible on a platform while complaints are investigated.[28] The Australian Small Business and Family Enterprise Ombudsman (ASBFEO) noted:

This can impact not only business viability but the mental health of the small business operator and their employees.[29]

7.18 New protections that came into effect in 2022 ‘requiring telecommunications providers to identify, trace and block SMS scams’ saw around 90 million scam SMS messages blocked in the first six months of operation.[30] However, these new protections do not apply to digital platforms and cannot be extended to them under current laws.[31] The ACMA advised:

Digital platforms and messaging applications that are not required to prevent scams on their service will become an increasingly attractive target for scammers.[32]

7.19 Small businesses are particularly vulnerable to intentional online harms, as they lack the resources to identify and counter scams[33] and encounter difficulties accessing processes to verify and remove fake reviews.[34]

National Anti-Scam Centre

7.20 The Australian Government has committed to introducing strengthened measures to combat scams.[35]

7.21 Establishment of the National Anti-Scam Centre (NASC) as part of the ACCC in July 2023 is one such initiative, helping to coordinate efforts across those government agencies with a role in preventing scams. The NASC noted that it works ‘together with government, industry, other regulators, law enforcement bodies and community organisations to make it more difficult to scam Australians’.[36]

7.22 Noting the significant losses from investment scams, the first NASC ‘fusion cell’ (a time-limited taskforce) focuses on these scams and aims to identify methods for disrupting them and minimising losses. Future fusion cells will target other scam types.[37]

Scamwatch

7.23 Scamwatch is a program run by the NASC to collect reports about scams from businesses and consumers. It uses this information to help issue warnings and to take action to stop scams.

7.24 Scamwatch also provides guidance and up-to-date information to help the community identify and avoid scams,[38] such as advice for dealing with impersonation of a business online.[39]

Platform responsibilities and practices

7.25 The committee heard from some digital platforms about the systems they have in place to combat scams and harmful apps.

7.26 Ms Mia Garlick, Regional Director of Policy, Meta, advised that Meta works hard to combat scams at all levels, including by working with regulators in different countries to share information and seek redress for customers.[40] Meta stated:

In October 2022, we reported that we had identified more than 400 malicious android and iOS apps that were designed to steal Facebook login information and compromise people’s accounts. These apps were listed on the Google Play Store and Apple’s App Store and disguised as photo editors, games, VPN services, business apps and other utilities to trick people into downloading them.

We’ve reported these malicious apps to our peers at Apple and Google and they have been taken down from both app stores. We also alerted people who may have unknowingly self-compromised their accounts by downloading these apps and sharing their credentials, and are helping them to secure their accounts.[41]

7.27 Meta outlined how its Community Standards ‘prohibit inauthentic accounts or behaviour that intends to mislead users’,[42] stating:

… we use a combination of system and human review to detect and enforce against those who perpetrate cyber security risks.

As bad actors have become more sophisticated, so too have our efforts to detect and enforce against them. In recent years, we have invested significantly in artificial intelligence to detect harmful content and accounts, before a user needs to see it.[43]

7.28 Meta believes it has greater than 99 per cent efficacy in removing fake accounts before they are publicly identified.[44]

7.29 Apple highlighted the safeguards provided by its developer verification process:

To develop and install apps on iOS or iPadOS, developers must register with Apple giving their real-world identity. This ensures that apps on the App Stores are submitted by identifiable persons or organisations and deters the creation of malicious apps.[45]

Inadequacy of current approaches

7.30 The ACCC's Regulatory Reform Report noted that current action by digital platforms against scams, harmful apps and fake reviews is not adequate. It stated:

Digital platforms that host or otherwise act as intermediaries between scammers and their victims are in a unique position to identify and stop scams and harmful apps, and are well placed to remove harmful apps. However, platforms are relatively free to choose how they deal with these issues, and the ACCC considers that platforms could do more to protect consumers. This includes providers of search, social media, online private messaging, app store, online retail marketplace and digital advertising services.[46]

7.31 The ACCC highlighted that it was particularly concerned about the following failings in current digital platform processes:

Failure to act on user reports: platforms have at times failed to remove scams, harmful apps and fake reviews when notified by consumers, businesses, media, and other concerned parties (for example, public figures whose identities have been misused).

Inadequate business user verification systems: scammers continue to proliferate fraudulent pages on digital platforms, including pages impersonating public figures and legitimate businesses. Not only does this harm consumers, but it also harms those public figures and businesses that have been impersonated.

Platforms hosting ads for investment scams: digital platforms continue to host insufficiently vetted ads that direct consumers to investment scams.

Platforms providing insufficient detail about what verification steps they use for reviews, if any: many platforms do not inform consumers about whether they have measures to check or verify the legitimacy of reviews and if so, what those measures are. This prevents consumers from making informed choices based on the most reliable sources.

Inconsistent and vague transparency reporting by digital platforms: digital platforms’ voluntary transparency reports do not allow consumer advocacy groups or regulators to effectively evaluate their consumer protection strategies or provide sufficient accountability to users.[47]

7.32 Many of these concerns were echoed in submissions to the committee.

7.33 Free TV Australia (Free TV) emphasised the need for digital platforms, particularly social media platforms, to take more responsibility to ensure ‘material which they have the ability to control (and accordingly which they have the ability to remove from their sites) is not fake, damaging, misleading or defamatory’.[48]

7.34 Digital platforms have drawn criticism and been subject to legal proceedings in relation to inadequate takedown processes for fake and misleading advertising.[49] For example, the ACCC commenced proceedings against Meta in 2022 in relation to the publication of scam ads featuring prominent Australians without their consent. Free TV highlighted that digital platform takedown processes remain inadequate despite this action.[50]

7.35 Further, Free TV advised that platforms are persistently slow to respond to takedown requests.[51] It submitted:

Fake ads continue to quickly reappear after they are taken down. These inadequate takedown processes damage the business reputations of broadcasters and also the personal reputations of the celebrities and media personalities that are misrepresented.[52]

7.36 The NSW Small Business Commissioner similarly noted:

Requiring digital platforms to prevent and remove fake reviews, scams and harmful apps in a timely fashion would be an important step in ensuring digital platforms provide a credible space for small businesses to sell goods and services. Stronger protections requiring platforms to do so [are] justified given they hold a gatekeeper role and are the only party that is able [to] remediate a fake, misleading or deceptive review. The Commission has heard from many small businesses who have faced long delays in their attempts to have fake [reviews] removed and difficulty in locating who to speak to within a platform to make such requests.[53]

7.37 Match Group (Match) asserted that the dominant positions of Apple and Google in the provision of in-app payment processing services created ‘little incentive to develop new features to combat scams or otherwise protect consumers’.[54]

7.38 Match further advised that mandatory in-app payment system tying by Apple and Google restricts the user data available to app developers, thereby hindering their ability to detect and respond to scams and keep bad actors off their services.[55]

7.39 The Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA) also raised concerns that platforms and digital services can ‘inadvertently profit from scams occurring across their services, either directly through the sale of ad space for fraudulent products or services, or indirectly through commissions on apps and sales’.[56]

Possible solutions

7.40 Some submissions provided overarching commentary and highlighted additional considerations when reflecting on how best to tackle intentional online harms.

7.41 Meta called for the government and regulators to act against scammers on online platforms and other communications services, particularly by pursuing legal action against them.[57] It noted that ‘creating real world consequences for scam advertisers and other bad actors … is important to maintain the integrity of our services’.[58]

7.42 The CBA noted the need for shared responsibility by all industry players to prevent and mitigate harms from scams.[59]

7.43 Finally, Match emphasised the need for consultation with the eSafety Commissioner on any measures to address scams, harmful apps and fake reviews.[60]

7.44 In addition to measures to protect digital platform users where a power imbalance exists, such as adequate internal dispute resolution processes and escalation options[61] (discussed in Chapter 4: Bargaining imbalances), the ACCC emphasised in its Regulatory Reform Report that mandatory processes should apply to all relevant digital platforms ‘to prevent and remove scams, harmful apps, and fake reviews on the platforms’ services’.[62]

7.45 The ACCC outlined that mandatory processes should include:

a notice-and-action mechanism

verification of certain business users

additional verification of advertisers of financial services and products

improved review verification disclosures

public reporting on mitigation efforts.[63]

7.46 It further outlined that these measures should apply, at a minimum, to:

search, social media, online private messaging, app store, online retail marketplace, and digital advertising services, in respect of scams

app stores in respect of harmful apps

search, social media, app stores, online retail marketplace, and digital advertising services, in respect of fake reviews.[64]

7.47 The ACMA and the Australian Communications Consumer Action Network (ACCAN) expressed support for the introduction of new legislation[65] requiring ‘digital platforms and messaging applications to identify and block scam activities, as is required for telecommunications providers’.[66]

Notice-and-action mechanism

7.48 Submissions supported the implementation of the ACCC’s recommendation[67] for a mandatory ‘notice-and-action’ mechanism enabling any individual or entity to report a scam, illegal content or harmful app and obliging the digital platforms receiving the report to take appropriate action in response.[68]

7.49 The ACCC advised:

Verification of advertisers, app developers and merchants would reduce the prevalence of scams and harmful apps, better protecting would-be victims from monetary losses and psychological impacts, and additional verification of advertisers of financial services and products would better protect consumers from predatory parties.[69]

7.50 Notice-and-action mechanisms will soon be required in Europe under the Digital Services Act and are being considered for digital platforms operating in the United Kingdom.[70]

Codes

7.51 Some submissions suggested the ACCC’s proposal for additional measures to promote consumer safety could be achieved through sector-specific codes, such as codes for marketplace services and social media services.[71]

7.52 DITRDCA highlighted that the Treasury is consulting on a possible Government response to the ACCC report and advised that it ‘is actively working with the Treasury and the ACMA to shape advice to the Government in relation to measures to address online scams’.[72]

Footnotes

[1]See, for example, Free TV Australia, Submission 17, p. 13; Match Group, Submission 73, p. 6; Commonwealth Bank of Australia (CBA), Submission 71, p. 4; Australian Small Business and Family Enterprise Ombudsman (ASBFEO), Submission 39; Meta, Submission 69, p. 40.

[2]CBA, Submission 71, p. 4.

[3]CBA, Submission 71, p. 4.

[4]CBA, Submission 71, p. 4.

[5]Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA), Submission 9, p. 10.

[6]Australian Competition and Consumer Commission (ACCC), Digital platform services inquiry, Interim report No. 5 – Regulatory reform, September 2022, p. 74.

[7]DITRDCA, Submission 9, p. 10.

[8]Australian Communications and Media Authority (ACMA), Submission 24, p. 4.

[9]ACCC, Submission 8, p. 6; Australian Communications Consumer Action Network (ACCAN), Submission 20, p. 1.

[11]Centre for AI and Digital Ethics, Submission 23, [p. 7].

[13]Free TV Australia, Submission 17, pp. 13–14. Also see Casey Briggs, ‘Inside the world of fake ad scams stealing the identities of Kochie and celebrities like him around the world’, ABC News, 6 November 2023.

[14]Centre for AI and Digital Ethics, Submission 23, [p. 2].

[15]Developers Alliance, Submission 35, [p. 5].

[17]NSW Small Business Commissioner, Submission 6, p. 2.

[18]ASBFEO, Submission 39, [p. 2].

[19]ACCC, Submission 8, p. 6.

[20]ACCC, Submission 8, p. 6.

[21]ACCC, Submission 8, p. 6.

[22]NSW Small Business Commissioner, Submission 6, p. 2.

[23]NSW Small Business Commissioner, Submission 6, p. 2.

[24]ASBFEO, Submission 39, [p. 2].

[25]ACMA, Submission 24, p. 4.

[27]ACCC, Submission 8, p. 4.

[28]ASBFEO, Submission 39, [p. 3].

[29]ASBFEO, Submission 39, [p. 3].

[30]ACMA, Submission 24, p. 4.

[31]ACMA, Submission 24, p. 4.

[32]ACMA, Submission 24, p. 4.

[33]ASBFEO, Submission 39, [pp. 1–2].

[34]NSW Small Business Commissioner, Submission 6, p. 2.

[35]DITRDCA, Submission 9, p. 11.

[36]Australian Government, National Anti-Scam Centre, Scamwatch, ‘About us’, www.scamwatch.gov.au/about-us (accessed 2 November 2023).

[37]ACCC, ‘National Anti-Scam Centre's first fusion cell to disrupt investment scams’, www.accc.gov.au/media-release/national-anti-scam-centres-first-fusion-cell-to-disrupt-investment-scams (accessed 2 November 2023).

[38]Australian Government, National Anti-Scam Centre, Scamwatch, ‘About us’, www.scamwatch.gov.au/about-us (accessed 2 November 2023).

[39]Australian Government, National Anti-Scam Centre, Scamwatch, ‘Advice for dealing with impersonation of your business online’, www.scamwatch.gov.au/research-and-resources/resources/advice-for-dealing-with-impersonation-of-your-business-online (accessed 2 November 2023).

[40]Proof Committee Hansard, 22 August 2023, p. 18.

[41]Meta, Submission 69, p. 41.

[42]Meta, Submission 69, p. 40.

[43]Meta, Submission 69, p. 40.

[44]Ms Mia Garlick, Regional Director of Policy, Meta, Proof Committee Hansard, 22 August 2023, p. 18.

[45]Apple, Submission 70, p. 9.

[48]Free TV Australia, Submission 17, p. 23.

[49]Free TV Australia, Submission 17, p. 13.

[50]Free TV Australia, Submission 17, p. 23; Casey Briggs, ‘Inside the world of fake ad scams stealing the identities of Kochie and celebrities like him around the world’, ABC News, 6 November 2023.

[51]Free TV Australia, Submission 17, pp. 13–14.

[52]Free TV Australia, Submission 17, p. 13.

[53]NSW Small Business Commissioner, Submission 6, pp. 2–3.

[54]Match Group, Submission 73, p. 13.

[55]Match Group, Submission 73, Appendix 2 (Match Group, ‘Response to the Government consultation on the ACCC’s regulatory reform recommendations for digital platforms’, Submission by Match Group Inc. to Treasury), p. 3.

[56]DITRDCA, Submission 9, p. 10.

[57]Meta, Submission 69, p. 42.

[58]Meta, Submission 69, p. 42.

[59]CBA, Submission 71, p. 4.

[60]Match Group, Submission 73, p. 6.

[61]ACCC, Submission 8, pp. 6–7; Dr Gareth Downing, Deputy CEO, ACCAN, Proof Committee Hansard, 26 July 2023, p. 36.

[65]Dr Gareth Downing, Deputy CEO, ACCAN, Proof Committee Hansard, 26 July 2023, pp. 35–36.

[66]ACMA, Submission 24, p. 4.

[68]See, for example, ASBFEO, Submission 39, [p. 2]; CBA, Submission 71, p. 4; Match Group, Submission 73, p. 13; DITRDCA, Submission 9, p. 11.

[69]ACCC, Submission 8, p. 7.

[71]See, for example, Match Group, Submission 73, p. 13; Free TV Australia, Submission 17, p. 23; Dr Gareth Downing, Deputy CEO, ACCAN, Proof Committee Hansard, 26 July 2023, pp. 36–37.

[72]DITRDCA, Submission 9, p. 11.