From Adversary to Ally: Rethinking AI in Schools
Written by Ryan Elwell, Director, Digital Education, ACT Education Directorate
Part 1: This time feels different.
In just a few short months, generative AI has burst onto the scene and promised to upend education as we know it. Headlines declared that teachers were relics of the past—AI could now teach our students, finally realising the long-elusive dream of personalised learning for all. This time is different!
So, if this is truly different, why does it feel so…familiar?
Clicks were baited and fortunes were made on unfulfilled promises. And as always, teachers and schools are left to foot the bill. AI has become ubiquitous in conversations about teaching and learning, but meaningful, implementable guidance on how to use it in classrooms remains scarce. At the end of the day, no matter how advanced or disruptive a new tool may be, it's teachers who are tasked with making it work.
So what can we do as educators when faced with a technology that seems to render traditional assessments obsolete before our very eyes? We can start by looking inward—and finish by leading through example.
Part 2: Assessment integrity: Why we can’t go adversarial
Sidebar: AI detectors should not be used. Full stop. These black-box tools are not accurate, and they contain uncontrollable biases based on race, gender, socioeconomic status, country of birth, and more. They are not safe to use at this time, so please do not use them.
Now, let’s sort out this issue of assessment integrity: How can we stop students from cheating with AI?
Assessment integrity simply means that submitted work was produced by students without outside assistance, and critically, it's very, very dead. Honestly, it has been very, very dead for some time, yet it's often propped up like a bad Weekend at Bernie's remake no one asked for.
Let’s let it go.
When a task is out of sight of the teacher, it's vulnerable. This has prompted some interesting decisions, including a troubling resurgence of pen-and-paper assessments. No judgement: these types of assessments certainly have a time and a place, but as with any assessment, they aren't perfect.
For instance, they can:
consume considerable class time that could be spent teaching,
consume considerable time to mark,
minimise opportunities for ongoing feedback,
present possible accessibility issues,
be really, really boring, and finally,
inadvertently assess things not in the curriculum, like writing speed.
Sure, sometimes it can work, but this can’t be the solution to AI cheating, right?
Perhaps we keep finding ourselves in this adversarial position because education has so far maintained a tenuous, white-knuckled grip on its traditional approach to assessment. If we choose to go adversarial as we have so many times before, with bans and a return to pen and paper, I just don't see teachers winning that battle.
AI is different, and we need to do better this time.
Let’s shift our focus from what we can’t control (students' use of AI outside our classrooms) to what we can control (using AI to reimagine assessment itself). The institution isn’t changing any time soon, but teachers have a lot of power here.
Part 3: Talking, Reflecting, and Rethinking
One important shift in mindset is moving from seeing AI as a cheating tool to recognising its potential as a teacher amplification tool, and taking this AI journey alongside our students.
You’re now asking yourself with some impatience: “Journey alongside my students? What the heck does that mean!?”, and that is a pretty fair question, so it would be rude not to answer.
Taking the AI journey with our students means engaging in open conversations about AI (talking), modelling AI use (modelling), and adapting our assessment (rethinking).
Talking
While all technology benefits from reflective use, with AI the stakes are higher. Teachers need to be very intentional about what they're doing and why they're doing it. Before all the prompt engineering and the clicking of keys, we need that intentionality; without it, there is a risk that AI ends up in control of the content.
We can ask ourselves: What is a core priority for me as a teacher?
Knowing that in advance helps us be intentional about our AI use before we start, and keeps us from giving away the best parts of teaching unintentionally. When I look back at my career, I will likely not cherish my deftness at creating question banks, so AI can help me with that. Using AI for marking and learning feedback? Not me: getting to know my students and how they learn is a core priority I do cherish, so I am going to hold onto it tightly.
This intentionality keeps control in the teacher's hands and stops us from handing AI the parts of teaching that make it so special.
Modelling
Modelling transparency is critical, and it offers a valuable opportunity to set an example for students.
A general rule: if something created with AI ends up in front of a student in a learning or assessment context, they deserve to know. From K-12 and beyond. This also helps teachers stay honest in the face of a very tempting technology. Feeling uncomfortable with labelling AI use may indicate we are giving technology some key parts of teaching that we should hold on to. Besides, students who can be accused of misusing AI have a right to know when their teachers are using those same tools.
Did I upload a worksheet and ask for five different versions? After checking the content, I tag the sheets as "Created with AI." There is no shame in this. My students know I'm busy, and not tagging creates exactly the adversarial dynamic we're trying to avoid.
Seeing us tag our content and model where and when it is okay to use AI helps clarify expectations for our students while keeping them accountable. What if I'm not sure about using AI for feedback? Discuss it as a class: what should feedback look like, what type of feedback could AI give, and what should come from a human?
Another sidebar: I feel that feedback from AI shouldn't include the advice that guides a student's learning. That is core teacher stuff. But mechanical feedback on grammar, vocabulary, spelling, and the like? Sure. AI handles that, I get more time for the important feedback, and students get better feedback in a timely manner. AI is not for everything, but intentional use of AI for feedback may have a place in your practice as well.
Rethinking
If we’re thinking about all the ways AI can help students cheat, I worry that puts us at a disadvantage. Let’s take a step back for a moment.
Good, now one more…
Now if I just open these blinds and let some sunlight in….perfect!
Here we're looking at AI from a new angle, and we start to see that it's the most powerful digital tool teachers have ever had. AI can give us the time we've always wished we had to create the dynamic assessments we know we're capable of.
AI isn’t a student cheating tool; it’s a teacher amplification tool.
Let’s try to shift our thinking from what we can’t control, which is how students use AI outside of our classroom, to what we can control, which is how we use AI to support our practice.
Ask AI to chunk that task, help differentiate, make an older test multimodal, add formative check-ins, and link things across the curriculum. AI can make a great product, but it often struggles with a process over time, so ask it where to insert little process check-ins that help you see the project come together.
Using AI to suggest how to chunk, differentiate, scaffold, identify bias, insert teachable moments, and boost accessibility supercharges our assessments. Our expertise and time are better spent getting to know our students, so why not let AI help? Supercharged assessments are in our control and are more fun and relevant for our students.
Conclusion
To wrap up, by reframing our thinking about AI from a student cheating machine to a time-saving teacher amplification tool, we can focus on tasks that require critical thinking, creativity, and authentic application of knowledge. AI gives us the capacity to make our assessments and tasks engaging and accessible and gives us time to build meaningful relationships with students, foster deep learning experiences, and empower the next generation to thrive in an AI-driven world.