Edtech advocates promote generative artificial intelligence (GenAI) tools as transformative, offering personalised, scalable, and interactive learning experiences. That only works for some schools. While some experiment with AI-driven platforms and policies, others lack basic digital infrastructure. The risk is clear: GenAI may not democratise education. Instead, it may deepen existing divides unless we have targeted policies and equitable implementation.
From “AI for All” to Unequal Access
GenAI has potential. It can adapt explanations to student needs, offer 24/7 academic support, and automate repetitive tasks that consume teachers’ time. But its use rests on a major assumption: that all students and teachers have access to the technology, the skills to use it, and the literacy to question it.
UNESCO’s Global Education Monitoring Report 2023, Technology in Education: A Tool on Whose Terms?, reveals that access to AI-enhanced learning tools remains highly unequal across socioeconomic lines. Many students in rural and remote communities, as well as those from culturally and linguistically diverse backgrounds, are less likely to benefit. Reasons? Limited access to devices, patchy internet, and language bias in AI systems. Low Earth Orbit (LEO) satellite internet, once installed, offers a promising way to bridge the digital divide in rural areas.
The OECD Education Policy Outlook 2024 reports on building equitable societies, including using AI to help manage teacher workload. However, it also suggests that students without digital literacy may rely on AI outputs without critical understanding, leading to superficial or inaccurate learning. Australia’s digital divide is already pronounced between urban and regional schools. Some private and well-funded schools are integrating AI literacy into their classroom practices, while public schools in lower SES areas are still navigating foundational digital inclusion.
What’s Happening in Australian Classrooms?
A range of state-level and system-wide GenAI pilot programs are already underway. These examples illustrate both the innovation and the inequality.
In Western Australia, a $4.7 million AI pilot program funded by the federal and state governments is exploring how AI can reduce teacher workload through lesson planning and administrative support. Eight public schools are involved in this initial phase.
In South Australia, the Department of Education partnered with Microsoft to launch EdChat, a generative chatbot trialled in eight high schools, with the initial trial completed in August 2023. Designed to support inquiry-based learning, the chatbot raised questions around student data privacy, accuracy of feedback, and classroom integration.
In NSW, the government developed NSWEduChat, an AI assistant being trialled in 16 public schools. Unlike ChatGPT, it prompts students to reflect and reason, rather than delivering direct answers. The tool aims to align with pedagogical goals, but requires teachers to mediate use and guide students’ understanding.
In Queensland, Brisbane Catholic Education developed Catholic CoPilot, grounded in Catholic teaching, tradition and theology. Teachers use it for lesson planning, report writing, and generating resources, showing that customised AI is feasible within institutional values.
In Victoria, an independent school in Melbourne, Haileybury Keysborough, embraced ChatGPT and created school-wide protocols to teach students ethical and effective AI use. These include critical thinking tasks and assessments designed to discourage overreliance on AI.
These examples show growing momentum but also the risk of a two-speed system: well-resourced schools move quickly while underfunded ones lag behind, potentially widening the gap in both learning outcomes and digital literacy.
AI Literacy: The New Divide
One of the most overlooked challenges in this debate is the literacy gap around AI. Knowing how to access GenAI is not enough. Students and teachers need to understand how these tools work, what their limitations are, and how to verify the output they generate.
A 2024 survey by the Australian Education Union (AEU) Victorian Branch identified 1,560 underfunded public schools in Victoria. Public schools, particularly in regional and low-income areas, have limited opportunities to build AI capability. According to the AEU’s 2024 article on future skills for educators, teachers are “left to navigate the ethical and pedagogical risks of AI on their own”, often without clear national guidance, curriculum-aligned training, or the digital infrastructure to experiment safely.
This leads to a two-tier system. In one tier, students and teachers are supported to use AI thoughtfully as a scaffold for learning, collaboration, and innovation. In the other, they are either excluded from AI use altogether, or exposed to it in ways that lack context, clarity, critical literacy, or alignment with pedagogy. This new tier of inequality will produce two kinds of students: those who can interrogate technology critically, and those who treat it as an unquestioned authority.
Even more concerning, the AEU notes that “students already at a disadvantage are most at risk of falling further behind” if AI adoption is left to market forces or uneven state-by-state initiatives.
Design, Governance, and Inclusion
GenAI tools are not culturally neutral. They reflect the data they are trained on: mostly English-language, Western-centric internet sources. Without careful consideration, they can reinforce linguistic, cultural, and cognitive biases.
Bender and her co-authors, in On the Dangers of Stochastic Parrots, warn that the large language models (LLMs) on which GenAI is based often reproduce harmful stereotypes and misinformation unless explicitly mitigated. This risk is amplified in educational settings, where students may lack the critical skills to identify inaccuracies.
Equity in AI use means more than access; it demands representation, transparency, and contextual sensitivity. We need AI tools aligned to local curricula, respectful of cultural knowledge systems, and available in accessible formats.
What Can Be Done?
Invest in infrastructure for underserved schools to ensure that all public schools, particularly in regional, remote, and low SES areas, have reliable internet, updated devices, and tech support.
Move beyond one-off briefings. Teachers need ongoing training that is curriculum-aligned, classroom-tested, and critically reflective. Professional learning communities and microcredentials in AI pedagogy could help bridge the gap.
Engage learners, families, and communities in conversations about AI use in education, and invest in open-source AI that reflects Australia’s educational and cultural diversity.
As educators, researchers, and policymakers, we have a choice. We can let technology set the pace, or we can slow down, ask critical questions, and build systems that centre human dignity and learning equity. Let us ensure that GenAI supports the public good, not just private innovation.
Let us ensure no learner is left behind in the age of artificial intelligence.

Meena Jha is an accomplished researcher, educator, and leader in the field of computer science and information technology, currently serving as Head of the Technology and Pedagogy Cluster at Central Queensland University (CQU), Sydney, Australia.