Foreword
Generative AI (GenAI) presents exciting opportunities, yet also high-stakes risks, for the Australian education system. It offers data-driven insights and administrative efficiencies, as well as the potential for enhanced educational experiences and outcomes for all students, including students from low socio-economic or English as a second language (ESL) backgrounds; students with a disability or learning difficulty; students from regional or remote areas; and First Nations people. Australia must forge ahead to safely and ethically maximise the benefits of this technology while recognising and mitigating the associated risks.
This inquiry considered how GenAI could be used as an educational tool, and what this should look like in an Australian context. It is the Committee’s view that GenAI in education should be a national priority, with a focus on equitable access for all students and educators to high-quality and suitable GenAI education products. If managed correctly, GenAI in the Australian education system will be a valuable study buddy and not an algorithmic influencer.
The Committee found that the best way to implement GenAI education tools, such as study buddies, into the school system is by integrating them into the national curriculum, creating and implementing guidelines and policies like the Australian GenAI in Schools Framework, and providing product standards like those being developed by Education Services Australia. Furthermore, to make GenAI education tools fit for purpose in Australian schools, foundation models, especially large language models (LLMs), should be trained on data based on the national curriculum. This would localise datasets to Australia and make them inclusive, for example by being sensitive to gender and culture.
The safety of users was a leading theme of this inquiry, especially in relation to the vulnerabilities of minors. The Australian Government is already creating protections and safeguards in the education space by rolling out technology-related reforms to help protect the safety and wellbeing of children, such as those addressing deepfakes and cyberbullying.
Further risks and challenges arise in the education space regarding GenAI. These relate to the technology itself, the ways it is used, and the data inputs and outputs. Key concerns exist around the potential for over-reliance on GenAI, mis- and disinformation, algorithmic bias, data protection, and transparency.
The Committee shares the concerns raised by some that, without strong guardrails, GenAI tools could cause great harm to individuals, as they can induce a variety of biases and potentially perpetuate unfairness or even unlawful discrimination. Furthermore, it is paramount that educational providers do not select GenAI tools that involve the storage of users’ data offshore or the sale of data to third parties.
The Committee recognises the urgent need to create, implement and enforce mandatory guardrails to help manage the use of GenAI in education. The Australian Government can lead in mitigating the challenges by taking a coordinated and proactive approach, especially with state and territory governments, regulators, industry, educational institutions, educators, and international partners.
The Australian Government also needs to ensure that students in schools, TAFEs, and universities have equitable opportunities to understand and use GenAI tools ethically, safely, and responsibly. Equity and access issues also include having the infrastructure and hardware to enable the use of GenAI, ensuring GenAI is integrated into educational institutions, and providing training in its use. A major national uplift is required, including training for both pre-service and existing teachers.
GenAI is also having considerable impacts on the broader education workforce, the design and implementation of assessments, and academic and research integrity. These impacts will require adjustments to education policy and practice. The Tertiary Education Quality and Standards Agency (TEQSA) is actively promoting greater consistency in standards for GenAI in higher education, including tough consequences for students and academics who misuse GenAI technology.
The Committee has made 25 recommendations in this report that focus on:
- maximising the opportunities of GenAI education-specific tools;
- promoting quality education products;
- supporting the implementation of the Australian GenAI in Schools Framework;
- integrating AI literacy across all subjects in the next school curriculum review cycle; and
- promoting a range of safeguards and developing standards and frameworks.
While the recommendations put forward in this report are fit to regulate the application of AI in the education sector today, the Committee recognises that this technology may rapidly outpace the parameters of its terms of reference. LLMs in their present form, and the accelerating multimodal capabilities set for imminent public release, hold the potential to meaningfully improve educational outcomes if applied safely. These benefits are foreseeable; however, the trajectory of how AI will advance is not. The frontier models of today may, in retrospect, be viewed as a technology still in its fledgling stage.
Should future frontier AI architectures be non-LLM based, or should LLMs one day demonstrate capabilities such as advanced or general-purpose reasoning, the Committee’s recommendations may need to be reviewed. A framework designed to regulate GenAI may be unable to withstand, or effectively scale to, powerful advancements in which the primary function of the technology is no longer simply to generate content. Government needs to remain aware that regulation may become over-committed to the opportunities and risks apparent at such an early stage in AI’s lifespan.
The Committee’s report intersects with the findings and recommendations pursued through other inquiry and reform processes, especially the Department of Education’s Australian Framework for Generative AI in Schools, and the Department of Industry, Science and Resources’ Supporting responsible AI: discussion paper and the Australian Government’s Interim Response. It also recognises the work underway by the Attorney-General’s Department, and regulators like TEQSA, ACECQA, and the eSafety Commissioner. The Committee’s findings and recommendations should be considered alongside those processes.
I would like to thank my parliamentary colleagues on the Committee for their engagement over the course of this inquiry, as well as the Committee’s expert panel—Dr James Curran, Professor Nicholas Davis, Professor Leslie Loble AM and Associate Professor Julia Powles—who shared valuable insights and expertise on the use of GenAI in education.
I also thank the individuals and organisations who provided submissions and appeared at public hearings, including current students who shared experiences and suggested ways forward.
Ms Lisa Chesters MP
Chair