Chapter 7: Consumer harms
Scams, harmful apps and fake reviews
Overview
7.1 Increasing use of online platforms has provided a low-cost avenue for scammers and harmful apps to reach substantial numbers of consumers and businesses.
7.2 Submissions raised significant concern about the growing proliferation of online scams and other harms. The committee was advised that:
Trust and confidence in the digital economy are essential. Consumers and businesses will only embrace digital opportunities if they are confident they can trust the technologies and the entities with which they interact online. Recent rapid growth in scams and fraud is undermining this confidence.
7.3 This chapter considers the nature and scale of intentionally harmful activities taking place on digital platforms, current regulatory measures, the responsibilities and roles of digital platforms and adequacy of their current approaches, and potential mechanisms to address growing concerns.
7.4 Online safety risks beyond scams, harmful apps and fake reviews, particularly risks for children, are discussed in Chapter 8: Online safety.
7.5 The committee notes the considerable body of investigative work undertaken by the Australian Competition and Consumer Commission (ACCC) in this field and draws on its research.
Scams
7.6 The committee was advised that the impact on Australians of scams originating from digital platforms is ‘disproportionately higher than other channels’. The Commonwealth Bank of Australia (CBA) highlighted:
Recent analysis from the Australian Financial Crimes Exchange shows that digital platforms, whether they be web or app based, account for close to half of scams exposures yet only approximately 20% of all scams origination.
7.7 Given the broad range of online scams and the variety of approaches they employ, anyone can be a victim. Approaches may include:
… dating and romance scams, identity theft, unexpected money or winnings, threats and extortion, job recruitment, investments, charities, phishing, hacking, remote access scams and attempts to gain personal information, e-commerce scams, and telephone and messaging scams.
7.8 The ACCC highlighted how scams often utilise multiple services to defraud victims, with digital platforms, like telecommunications services, situated at the start of the ‘scam chain of events’. It found:
The ACCC has received increasing numbers of reports to Scamwatch where victims were targeted via a digital platform service, then drawn to an encrypted messaging app, before being induced to make payments through a bank or cryptocurrency service.
Scale of the problem
7.9 Financial losses from online scams, such as those conducted via social media platforms and mobile apps, are currently responsible for a small proportion of the total losses from scams but continue to grow.
7.10 The ACCC outlined that financial losses reported to Scamwatch significantly increased between 2020 ($49 million) and 2021 ($92 million). Similarly, the ACCC’s review of Scamwatch data ‘shows that reported losses to scams in 2022 on social networks increased by approximately 42% and on mobile apps by approximately 98%’. Actual losses are likely much higher, given only 13 per cent of victims are estimated to report their losses to Scamwatch. Box 7.1 elaborates on cryptocurrency scams.
Box 7.1 Case study: cryptocurrency scams

Cryptocurrency scams are particularly on the rise, with consumer reports to the ACCC and Scamwatch data suggesting many investment scams use digital platform services to target victims. The Centre for AI and Digital Ethics highlighted:

… cryptocurrency investment scams were the "main driver" of the sharp 35% increase in investment scam losses in 2021 from the previous year, with Australians reporting $99 million lost to these scams.

The ACCC noted ‘cryptocurrency was also the most common payment method for investment scams’.

Fake advertising is also a feature of cryptocurrency scams. Free TV Australia advised that images of well-known TV presenters such as Karl Stefanovic and David Koch have been used in fake endorsements to lure social media users into scam cryptocurrency investments.

The Centre for AI and Digital Ethics proposed that cryptocurrency investments and investments in blockchain products be restricted to ‘sophisticated investors’:

… where such investments have not met a bar similar to those faced by ordinary issuers of securities, to avoid the proliferation of scams and fraudulent activity that currently characterises the market for these digital assets.

The Developers Alliance noted:

Cryptocurrency regulation is rapidly evolving as fraud and speculation emerge as fundamental drivers of adoption. We would simply highlight that virtual transactions and payments are here to stay, and that confidence in these processes and systems is fundamental to economic stability and growth.
Fake reviews
7.11 Fake or manipulated reviews are also a cause of significant harm to consumers and small businesses. The ACCC noted in its Regulatory Reform Report:
… as Australians spend more time and money online, consumers and small businesses are more reliant on online reviews and more vulnerable to harms from fake or manipulated reviews.
7.12 Fake reviews are instigated by a range of actors, from past employees or competitors acting maliciously, through to commercial service providers that generate hostile fake reviews about established providers in order to build consumer traction for new entrants.
7.13 Fake reviews have significant impacts on the businesses, products and markets they target, such as:
undermining consumer choice;
distorting competition;
causing financial losses for small business owners if they are targeted;
causing reputational damage and impacting future customers;
undermining credibility of legitimate small businesses; and
exposing small businesses to ransom demands from scammers seeking payment for review removal.
Current regulation
Limitation of current regulations
7.14 The committee was advised that current regulations do not adequately capture digital platforms’ role with respect to scams, fake reviews and harmful apps. The regulations have not kept pace as scammers pivot from phone call and text message scams to social media and other applications.
7.15 The ACCC noted that, in light of the scale of harm from online scams, harmful apps and fake reviews, additional measures more targeted to digital platforms are required to supplement the essential role of the Australian Consumer Law (ACL).
7.16 The ACCC further highlighted that the Competition and Consumer Act 2010 and the ACL ‘are not well-suited’ to digital platform services given ‘[e]nforcement of these laws is also necessarily retrospective, addressing particular instances of conduct on a case-by-case basis after harms have already occurred’.
7.17 Lengthy redress processes cause further damage to small businesses where fake reviews or fraudulent misrepresentations of a business remain visible on a platform during investigation processes. The Australian Small Business and Family Enterprise Ombudsman (ASBFEO) noted:
This can impact not only business viability but the mental health of the small business operator and their employees.
7.18 New protections which came into effect in 2022 ‘requiring telecommunications providers to identify, trace and block SMS scams’ saw around 90 million scam SMS messages blocked in the first six months of operation. However, these new protections do not apply to digital platforms and cannot be extended under current laws. The ACMA advised:
Digital platforms and messaging applications that are not required to prevent scams on their service will become an increasingly attractive target for scammers.
7.19 Small businesses are particularly vulnerable to intentional online harms as they lack resources to identify and counter scams and encounter difficulties accessing processes to verify and remove fake reviews.
National Anti-Scams Centre
7.20 The Australian Government has committed to introducing strengthened measures to combat scams.
7.21 Establishment of the National Anti-Scams Centre (NASC) as part of the ACCC in July 2023 is one such initiative, helping to coordinate efforts across those government agencies with a role in preventing scams. The NASC noted that it works ‘together with government, industry, other regulators, law enforcement bodies and community organisations to make it more difficult to scam Australians’.
7.22 Noting the significant losses from investment scams, the first NASC ‘fusion cell’ (a time-limited taskforce) will focus on investment scams, aiming to identify methods for disrupting them and minimising losses. Future fusion cells will target other particular scam types.
Scamwatch
7.23 Scamwatch is a program run by the NASC to collect reports about scams from businesses and consumers. Scamwatch uses this information to help issue warnings and to take action to stop scams.
7.24 Scamwatch also provides guidance and up-to-date information to help the community identify and avoid scams, such as advice for dealing with online impersonation of a business.
Platform responsibilities and practices
7.25 The committee heard from some digital platforms about the systems they have in place to combat scams and harmful apps.
7.26 Ms Mia Garlick, Regional Director of Policy, Meta, advised that Meta works to combat scams at all levels, including by working with regulators in different countries to share information and seek redress for customers. Meta stated:
In October 2022, we reported that we had identified more than 400 malicious android and iOS apps that were designed to steal Facebook login information and compromise people’s accounts. These apps were listed on the Google Play Store and Apple’s App Store and disguised as photo editors, games, VPN services, business apps and other utilities to trick people into downloading them.
We’ve reported these malicious apps to our peers at Apple and Google and they have been taken down from both app stores. We also alerted people who may have unknowingly self-compromised their accounts by downloading these apps and sharing their credentials, and are helping them to secure their accounts.
7.27 Meta outlined how its Community Standards ‘prohibit inauthentic accounts or behaviour that intends to mislead users’, stating:
… we use a combination of system and human review to detect and enforce against those who perpetrate cyber security risks.
As bad actors have become more sophisticated, so too have our efforts to detect and enforce against them. In recent years, we have invested significantly in artificial intelligence to detect harmful content and accounts, before a user needs to see it.
7.28 Meta believes it achieves greater than 99 per cent efficacy in removing fake accounts before they are publicly identified.
7.29 Apple highlighted the safeguards provided by its developer verification process:
To develop and install apps on iOS or iPadOS, developers must register with Apple giving their real-world identity. This ensures that apps on the App Stores are submitted by identifiable persons or organisations and deters the creation of malicious apps.
Inadequacy of current approaches
7.30 The ACCC's Regulatory Reform Report noted that current action by digital platforms against scams, harmful apps and fake reviews is inadequate. It stated:
Digital platforms that host or otherwise act as intermediaries between scammers and their victims are in a unique position to identify and stop scams and harmful apps, and are well placed to remove harmful apps. However, platforms are relatively free to choose how they deal with these issues, and the ACCC considers that platforms could do more to protect consumers. This includes providers of search, social media, online private messaging, app store, online retail marketplace and digital advertising services.
7.31 The ACCC highlighted that it was particularly concerned about the following failings in current digital platform processes:
Failure to act on user reports: platforms have at times failed to remove scams, harmful apps and fake reviews when notified by consumers, businesses, media, and other concerned parties (for example, public figures whose identities have been misused).
Inadequate business user verification systems: scammers continue to proliferate fraudulent pages on digital platforms, including pages impersonating public figures and legitimate businesses. Not only does this harm consumers, but it also harms those public figures and businesses that have been impersonated.
Platforms hosting ads for investment scams: digital platforms continue to host insufficiently vetted ads that direct consumers to investment scams.
Platforms providing insufficient detail about what verification steps they use for reviews, if any: many platforms do not inform consumers about whether they have measures to check or verify the legitimacy of reviews and if so, what those measures are. This prevents consumers from making informed choices based on the most reliable sources.
Inconsistent and vague transparency reporting by digital platforms: digital platforms’ voluntary transparency reports do not allow consumer advocacy groups or regulators to effectively evaluate their consumer protection strategies or provide sufficient accountability to users.
7.32 Many of these concerns were echoed in submissions to the committee.
7.33 Free TV Australia (Free TV) emphasised the need for digital platforms, particularly social media platforms, to take more responsibility to ensure ‘material which they have the ability to control (and accordingly which they have the ability to remove from their sites) is not fake, damaging, misleading or defamatory’.
7.34 Digital platforms have drawn criticism and been subject to legal proceedings over inadequate takedown processes for fake and misleading advertising. For example, the ACCC commenced proceedings against Meta in 2022 over the publication of scam ads featuring prominent Australians without their consent. Free TV highlighted that digital platform takedown processes remain inadequate despite this action.
7.35 Further, Free TV advised that platforms are persistently slow to respond to takedown requests. It submitted:
Fake ads continue to quickly reappear after they are taken down. These inadequate takedown processes damage the business reputations of broadcasters and also the personal reputations of the celebrities and media personalities that are misrepresented.
7.36 The NSW Small Business Commissioner similarly noted:
Requiring digital platforms to prevent and remove fake reviews, scams and harmful apps in a timely fashion would be an important step in ensuring digital platforms provide a credible space for small businesses to sell goods and services. Stronger protections requiring platforms to do so are justified given they hold a gatekeeper role and are the only party that is able to remediate a fake, misleading or deceptive review. The Commission has heard from many small businesses who have faced long delays in their attempts to have fake reviews removed and difficulty in locating who to speak to within a platform to make such requests.
7.37 Match Group (Match) asserted that the dominant positions of Apple and Google in the provision of in-app payment processing services created ‘little incentive to develop new features to combat scams or otherwise protect consumers’.
7.38 Match further advised that mandatory in-app payment system tying by Apple and Google restricts the user data available to app developers, thereby hindering the ability of app developers to detect and respond to scams and keep bad actors off their services.
7.39 The Department of Infrastructure, Transport, Regional Development, Communications and the Arts (DITRDCA) also raised concerns that platforms and digital services can ‘inadvertently profit from scams occurring across their services, either directly through the sale of ad space for fraudulent products or services, or indirectly through commissions on apps and sales’.
Possible solutions
7.40 Some submissions provided overarching commentary and highlighted additional considerations when reflecting on how best to tackle intentional online harms.
7.41 Meta called for action by the government and regulators against scammers on online platforms and other communications services, particularly in pursuing legal action against scammers. It noted that ‘creating real world consequences for scam advertisers and other bad actors … is important to maintain the integrity of our services’.
7.42 The CBA noted the need for shared responsibility by all industry players to prevent and mitigate harms from scams.
7.43 Finally, Match emphasised the need for consultation with the eSafety Commissioner on any measures to address scams, harmful apps and fake reviews.
7.44 In addition to measures to protect digital platform users where a power imbalance exists, such as adequate internal dispute resolution processes and escalation options (discussed in Chapter 4: Bargaining imbalances), the ACCC emphasised in its Regulatory Reform Report that mandatory processes should apply to all relevant digital platforms ‘to prevent and remove scams, harmful apps, and fake reviews on the platforms’ services’.
7.45 The ACCC outlined that mandatory processes should include:
a notice-and-action mechanism
verification of certain business users
additional verification of advertisers of financial services and products
improved review verification disclosures
public reporting on mitigation efforts.
7.46 It further outlined that these measures should apply, at a minimum, to:
search, social media, online private messaging, app store, online retail marketplace, and digital advertising services, in respect of scams
app stores in respect of harmful apps
search, social media, app stores, online retail marketplace, and digital advertising services, in respect of fake reviews.
7.47 The ACMA and the Australian Communications Consumer Action Network expressed support for the introduction of new legislation requiring ‘digital platforms and messaging applications to identify and block scam activities, as is required for telecommunications providers’.
Notice-and-action mechanism
7.48 Submissions supported the implementation of the ACCC’s recommendation for a mandatory ‘notice-and-action’ mechanism enabling any individual or entity to report a scam, illegal content or harmful app, and obliging the digital platform receiving the report to take appropriate action in response.
7.49 The ACCC advised:
Verification of advertisers, app developers and merchants would reduce the prevalence of scams and harmful apps, better protecting would-be victims from monetary losses and psychological impacts, and additional verification of advertisers of financial services and products would better protect consumers from predatory parties.
7.50 Notice-and-action mechanisms will soon be required in Europe under the Digital Services Act and are being considered for digital platforms operating in the United Kingdom.
Codes
7.51 Some submissions suggested the ACCC’s proposal for additional measures to promote consumer safety could be achieved with sector-specific codes, such as for marketplace services and social media services.
7.52 DITRDCA highlighted that the Treasury is consulting on a possible Government response to the ACCC report and advised that it ‘is actively working with the Treasury and the ACMA to shape advice to the Government in relation to measures to address online scams’.