How Schools Can Detect AI Use Across Year Groups
When a teacher in the English department suspects that a Year 11 essay was written by ChatGPT, they might run a quick check. When a History teacher down the corridor has the same suspicion about a different student, they might handle it entirely differently. If the Geography team has no detection process at all, and the Science department is relying on a paid external tool that teachers fund out of their own pockets, you end up with a system where the response to AI misuse depends almost entirely on which classroom a student is in.
That inconsistency is a problem — not just for academic integrity, but for fairness. Students deserve to know that the same standards apply across their entire school. School leaders have a responsibility to make that happen.
This guide is for heads of department, curriculum leaders, and senior leadership teams thinking about how to build a consistent, school-wide approach to AI detection.
Why Patchwork Detection Creates Unfairness
In the absence of a school-wide policy, AI detection becomes informal and inconsistent. Some teachers will flag AI use; others will not notice it or will lack the tools to investigate. Some departments will have clear processes for escalating concerns; others will handle it privately and inconsistently. Some students will face consequences; others, doing exactly the same thing, will not.
This is not a hypothetical problem. As AI writing tools become more sophisticated and more widely used, the gap between departments that have a structured response and those that do not will widen. The students most likely to slip through are those in year groups or subjects where AI detection is not yet standard practice — which is an equity issue, not just an operational one.
A whole-school approach does not mean treating every high detection score as a disciplinary matter. It means ensuring that every teacher has access to the same tools, every student is held to the same standards, and every escalation follows the same documented process.
What a Whole-School AI Detection Policy Should Include
An effective AI detection policy does not need to be lengthy, but it does need to be clear on several key points.
First, it should define what constitutes AI misuse at your school. There is a meaningful difference between a student who uses an AI tool to generate ideas and one who submits AI-generated text as their own work. Your policy should make this distinction explicit, and it should align with the guidance from relevant exam boards — AQA, Edexcel, OCR, and others have all issued or are issuing guidance on AI in assessed work.
Second, the policy should specify which types of work are in scope for detection checks. Coursework and extended homework assignments are the obvious candidates. Some schools are also extending this to mock exam responses, particularly where students complete these under low-supervision conditions at home.
Third, there should be a clear escalation pathway. A detection score — however high — is not by itself evidence of misconduct. The policy should set out what happens after a score is generated: who reviews it, what conversation takes place with the student, what documentation is required, and at what point the matter is referred to a head of year, SENCO, or SLT.
Finally, the policy should be communicated clearly to students and parents. Detection works best when it also acts as a deterrent: students who know their work may be checked are less likely to submit AI-generated content in the first place.
How GradeOrbit Supports Detection Across Departments
GradeOrbit includes a built-in AI detection tool designed for classroom use, but it scales naturally to department and school level. Any teacher at your school with a GradeOrbit account can submit student work for analysis and receive a likelihood score from 0% to 100%, representing how consistent the writing is with patterns typical of AI-generated text.
Each submission also returns a confidence label (Low, Medium, or High), a list of specific linguistic signals that contributed to the score, and a short reasoning paragraph. This means teachers are not just receiving a number — they are receiving context that helps them make an informed professional decision.
The tool offers two detection depths. The 1-credit model is designed for routine checks across a class set, providing a fast scan that surfaces any submissions worth a closer look. The 3-credit model runs a deeper analysis using a more capable model and returns a more detailed assessment. It is the appropriate choice when a teacher has already identified a concern and wants stronger evidence before initiating a conversation with the student or escalating to SLT.
Because all teachers access the same tool and receive the same type of output, GradeOrbit makes it straightforward for schools to standardise how detection is conducted and documented across departments.
Giving Every Teacher the Same Starting Point
One of the less visible inequities in schools is that the quality of a teacher's response to AI misuse often depends on their personal familiarity with detection tools and their confidence in interpreting results. An experienced teacher who has read widely about AI in education may handle a concern very differently from an ECT who has never encountered this before.
A school-wide tool with consistent outputs removes some of this variation. When every teacher uses the same detection process and receives the same type of scored report, it is easier to provide training on what the scores mean, how to interpret them, and how to use them as part of a broader professional assessment. SLT can also monitor patterns across the school — identifying year groups, subjects, or assignment types where AI misuse appears more prevalent — rather than relying on individual teachers to flag concerns in isolation.
This also protects teachers. A documented detection score, alongside a record of the conversation that followed, provides evidence that a concern was handled appropriately and professionally. That matters if a parent later challenges the process.
Handling Escalations Consistently
Even with a consistent detection tool, the human steps that follow a result need to be standardised too. Schools should agree on a clear sequence for what happens after a high likelihood score is recorded.
The first step is always a review, not a consequence. The teacher should look at the flagged work alongside the student's previous submissions. Is there a change in voice, vocabulary, or structural sophistication that is difficult to explain? Are there other factors — a student who has recently received additional support, or one who has worked with a tutor — that might explain improved quality?
If concern persists, the next step is a short, private conversation with the student. The aim is not to accuse but to understand. Asking a student to talk through their work, explain their research process, or respond to a follow-up question verbally will usually reveal whether they genuinely engaged with the task. For more detail on how to approach this conversation, see our guide on how to talk to students about AI detection results.
From there, your school's academic integrity policy should determine what documentation is required and when a formal referral is appropriate. Consistency at every stage of this process is what protects students, protects teachers, and protects the school.
Try GradeOrbit Across Your School
GradeOrbit is built for UK secondary schools and is used by teachers across a range of subjects and year groups. Its AI detection feature is available to every teacher on the platform, making it straightforward to implement as a consistent whole-school tool without requiring specialist technical knowledge or significant staff training.
If you are a head of department, curriculum lead, or member of SLT looking to establish a consistent approach to AI detection, GradeOrbit gives you the infrastructure to do it — and the confidence that every teacher is working from the same evidence base.
Sign up to GradeOrbit and explore how AI detection can work consistently across your school.