Australian Greens' additional comments

1.1 After months of evidence from experts, parents, young people, organisations and community members, this Joint Committee Inquiry has not recommended an age ban on social media. It is important to note this at the outset, because this is a well-informed majority decision. Unlike the Albanese Government’s recent announcement of a ban for those under 16, which has been supported by Peter Dutton and the Coalition, the Committee’s decision is supported by evidence and facts. Significantly, the recommendations in the majority report are designed to make platforms safer for all Australians, with fit-for-purpose regulation that will disrupt the predatory business models of the tech giants and help to hold them accountable. The recommendations are intended to empower and educate young people, and all social media users, rather than punish them, and they place the onus on the tech companies making enormous profits from hate, outrage and targeted advertising to act with a duty of care and be held accountable.

1.2 In a rapidly digitised environment, Australia is lagging behind other countries when it comes to regulating offshore digital giants. Overwhelmingly, the inquiry heard that comprehensive reform and regulation are needed urgently to hold online giants responsible for making their online spaces safe – but the Parliament must get that regulation right. Bringing these multi-billion-dollar tech companies under Australian jurisdiction is complex, but the safety of Australians, especially our kids, must be the priority going forward. Unless companies are forced to onshore their business operations so they fall under Australian domestic, corporate, criminal and consumer laws, any reform will be difficult, if not impossible, to enforce.

1.3 The Albanese Government’s announcement of plans to legislate an age ban on social media for those under 16 in this term of Parliament, ahead of the finalisation of this inquiry, ahead of the release of the Online Safety Act Review, and ahead of the results of the age assurance trial currently underway, was disappointing and concerning. Given this legislation would be a world first, it is more important than ever that the detail is properly thought through, yet seemingly no evidence or expert consultation has been taken into account.

1.4 In this context, the government’s proposed ban on social media appears to be a knee-jerk reaction to a complex problem. Parents are rightfully worried about the safety of their kids online, but they also know that unless platforms are forced to clean up their act, their child won’t be safe online when they turn 16 either.

1.5 No legislative detail has been made clear, including which platforms will and won’t fall under the ban. Experts are concerned about unintended consequences, such as pushing young people onto even less regulated platforms, isolating them from family and friends, privacy risks from providing personal identification to global platforms, and uncertainty about how parents and educators will be supported to navigate the change – all outlined in evidence to this inquiry that the government and the Coalition appear to want to ignore.

1.6 It was made abundantly clear through this inquiry that an age ban alone will not make the platforms safer or age appropriate, nor will it change the culture that informs the unsafe behaviours and harmful content people are targeted with on these platforms, from eating disorders to gambling. Rather than banning young people altogether, we need to tackle the predatory business models of the tech giants, including the poisonous algorithms that fuel extremism, mental health problems and division in our democracy. The government’s own online safety expert, the eSafety Commissioner, has recommended a multi-pronged approach that encourages platforms to be safe by design.

1.7 If the government wants to protect the safety of young people, it must ban platforms from harvesting young people’s data and targeting them with toxic algorithms and advertising to make massive profits. All users must have the ability to switch off or turn down the algorithms that push unwanted content into their feeds. Recent research found that platforms like Facebook identify people who are at risk of harm from alcohol and gambling and target them with advertising, while alcohol and gambling companies share data on those at risk to fuel targeted advertising and increase their profits.[1] This is a toxic, predatory system, which vulnerable people have no way of escaping without better regulation.

1.8 Overarchingly, privacy reforms are also long overdue in this country. The protection of all users’ data is vital to keeping people safer both on and offline, and such reform is critical to breaking the predatory business models of the platforms. There is also growing concern about the unabated use of users’ data by tech companies to train their AI, without consent, knowledge or compensation. This theft of intellectual and creative work is causing harm, particularly to journalists and artists; it is impacting culture, undermining trust and amplifying misinformation and disinformation. In the European Union (EU), the likes of Meta have been forced to provide an opt-out option for users; at a minimum, Australia must force companies to do the same here.

1.9 Comprehensive reform will be needed to bring global giants under Australian jurisdiction and enforce their responsibility to make their platforms safe. While this is a complex problem to resolve, we have seen effective laws in the EU and the United Kingdom that make platforms safer not only for young people but for all of us. This includes not only regulating tech giants but also implementing a tax to ensure these big corporations reinvest the money they make off Australians back into our communities.

1.10 To accompany critical regulatory reforms, the government should fund online safety education, in particular for young people and parents, and reinvest in children’s content, which has dropped off in recent years. As many noted throughout the inquiry, online spaces are important for people to connect and to learn – they have many benefits for young people, particularly marginalised or isolated youth. Rather than removing young people from these important places, we should be proactively teaching them how to stay safe.

1.11 It is positive to see the Minister for Communications move in the right direction by announcing their intention to legislate a digital duty of care. Hopefully the government will take guidance from the evidence-based recommendations of the majority report, particularly in relation to imposing safety-by-design responsibilities on platforms and prioritising implementation of proposals from the Privacy Act Review Report to protect Australians. In addition to those in the majority report, the Australian Greens recommend:

Recommendation 1

1.12 Immediately release the Online Safety Act Review.

Recommendation 2

1.13 Prohibit platforms from harvesting and exploiting the data of minors, and protect young people from targeted, unsolicited advertisements and algorithms as a matter of priority, with a view to extending these protections to all users in the long term to protect all Australians’ safety and privacy.

Recommendation 3

1.14 Invest in education for young people and their families to help develop digital literacy and online safety skills, and equip them with the tools and resources they need for positive and responsible online use.

Recommendation 4

1.15 Consider implementing a digital services tax, similar to those implemented in countries like France and Canada, to ensure global giants pay their fair share back to the communities they profit from.

Senator Sarah Hanson-Young

Deputy Chair

Senator for South Australia

Footnotes

[1] Foundation for Alcohol Research and Education, Submission 170.1, [p. 2].