How AI Detection Works on Handwritten Student Work
Most teachers who hear the phrase "AI detection" immediately picture typed text — a paragraph pasted into a website, a score returned, a decision to be made. But the reality of UK secondary school assessment is that the vast majority of student work is handwritten. Exercise book responses, mock exam scripts, timed essays, end-of-unit assessments — none of these arrive as a neatly formatted digital document. They arrive as a pile of books or a folder of papers, and that physical reality has historically made AI detection tools almost entirely useless for the contexts where teachers spend most of their time.
GradeOrbit is built to work with handwritten work as well as typed text. This guide explains how AI detection works on handwritten student submissions, what the workflow looks like in practice, and what the likelihood scores it returns actually mean.
Why Handwriting Makes AI Detection Harder
AI detection tools work by analysing the statistical and linguistic patterns of a piece of text. They compare the structure, rhythm, and vocabulary of a submission against the known patterns of AI-generated output. But to do that analysis, they need the text in a readable digital form. Handwriting is not readable by a language model in the same way that typed text is — which is why most AI detection tools simply do not offer a pathway for handwritten work at all.
The solution is to add a transcription step before the detection analysis. A handwriting recognition system reads the image of the student's work and converts it to plain text, which the detection model then analyses. The quality of the transcription matters enormously: if the transcription is inaccurate, the detection analysis runs on corrupted text and the resulting likelihood score is unreliable.
GradeOrbit uses Google Cloud Vision for handwriting transcription. This gives it a level of accuracy that is robust across the range of handwriting styles typically seen in a UK secondary school classroom — including cursive, print, and mixed styles — though very unclear handwriting will always present more of a challenge. You can review the transcription before the detection runs, so you can catch and correct any significant errors before they affect the output.
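The two-step pipeline described above — transcribe first, review, then analyse — can be sketched in outline. This is an illustrative sketch only: the function names, the stub return values, and the pass-through review step are all stand-ins, not GradeOrbit's actual code or API.

```python
# Illustrative sketch of a transcribe-then-detect pipeline.
# All three functions are stand-ins, not GradeOrbit's real implementation.

def transcribe(image_bytes: bytes) -> str:
    """Stand-in for a handwriting-recognition call on the scanned image."""
    return "It was the best of times, it was the worst of times."

def review_transcript(transcript: str) -> str:
    """In practice the teacher corrects OCR errors at this step;
    this stub simply passes the text through unchanged."""
    return transcript

def detect_ai_likelihood(text: str) -> float:
    """Stand-in for the detection model; returns a score from 0.0 to 1.0."""
    return 0.12

def run_detection(image_bytes: bytes) -> float:
    # Step 1: convert the handwritten image to plain text.
    transcript = transcribe(image_bytes)
    # Step 2: let the teacher review and correct the transcription,
    # so OCR errors do not corrupt the analysis.
    transcript = review_transcript(transcript)
    # Step 3: run the detection analysis on the corrected text.
    return detect_ai_likelihood(transcript)

score = run_detection(b"...image bytes...")
print(f"AI likelihood: {score:.0%}")
```

The point of the sketch is the ordering: the detection model only ever sees the reviewed transcript, which is why correcting the transcription before the analysis runs matters.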
How to Submit Handwritten Work for Detection in GradeOrbit
There are two ways to get a handwritten document into GradeOrbit for AI detection.
Uploading a Scanned Image
If you have access to a scanner — or a scanning app on your phone — you can scan the student's work and upload the resulting image file directly into GradeOrbit. JPEG and PNG files are both accepted. For multi-page submissions, you can upload multiple images and they will be processed together. This is the most straightforward route for teachers who already have a scanning workflow for assessment.
Using the QR Code Camera Feature
GradeOrbit includes a QR code feature that lets you use your mobile phone as a camera without needing a separate scanning app. In the GradeOrbit dashboard, you generate a QR code that links your mobile browser session to your desktop session. You then use your phone to photograph the handwritten pages, and the images are sent directly to your GradeOrbit session on the desktop. This is a practical option for teachers who do not have access to a scanner but want to photograph physical work quickly.
Redacting Student Information Before Detection Runs
A common concern when submitting student work to any AI tool is data privacy. GradeOrbit allows you to redact identifying information — typically a student's name written at the top of the page — before anything is sent for analysis. You draw a box over the relevant area directly in the GradeOrbit interface, and that region is blacked out at the image level before the transcription or detection processes run. The AI model never sees the student's name.
This redaction happens client-side, in your browser, rather than on GradeOrbit's servers. The original image is not stored. GradeOrbit does not retain student work after processing — the content is analysed and then discarded.
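Conceptually, image-level redaction is just overwriting the pixels inside the selected box before the image is sent anywhere. A minimal sketch of the idea follows, using a toy greyscale pixel grid rather than GradeOrbit's actual browser code (which would work on an HTML canvas):

```python
# Illustrative sketch of image-level redaction: black out a
# rectangular region of a pixel grid before it is sent anywhere.
# This toy version uses a plain 2D list of greyscale values
# (255 = white, 0 = black).

def redact_region(pixels, top, left, height, width):
    """Overwrite the given rectangle with black (0) pixels, in place."""
    for row in range(top, top + height):
        for col in range(left, left + width):
            pixels[row][col] = 0
    return pixels

# A 4x6 "page" of white pixels, with the student's name imagined
# to sit in the top-left corner.
page = [[255] * 6 for _ in range(4)]
redact_region(page, top=0, left=0, height=1, width=4)
# The first four pixels of the top row are now black;
# everything else is untouched.
```

Because the overwrite happens at the pixel level before upload, there is nothing for a later transcription or detection step to recover from the redacted region.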
Understanding the Likelihood Score for Handwritten Work
Once the transcription is complete and you have confirmed it is accurate, GradeOrbit runs the detection analysis and returns a likelihood score between 0% and 100%. A higher score indicates that the transcribed text more closely resembles the statistical patterns of AI-generated writing. A lower score suggests it is more consistent with human writing.
There is an important nuance for handwritten submissions specifically: the act of handwriting something does not make it less likely to trigger a high AI detection score if the original composition was AI-generated. Students who generate text using ChatGPT or Claude and then copy it out by hand are still submitting AI-generated content, and that content will still carry the linguistic patterns that detection tools look for. Handwriting is not a bypass.
Conversely, a student who writes fluently — with clear structure, confident vocabulary, and a well-organised argument — may also produce text that scores higher than their peers, simply because well-crafted human writing shares some characteristics with AI output. This is why the likelihood score is best treated as one input to your professional judgement, not a standalone verdict. Our guide on how to handle AI detection scores covers this in detail and offers a practical framework for what to do when a score is high.
The 1-Credit and 3-Credit Detection Models
GradeOrbit offers two detection modes. The standard analysis uses 1 credit and is suited to routine checks — when you want a quick indication of whether a piece of work is worth a closer look. The in-depth analysis uses 3 credits and runs a more thorough examination, returning a more detailed breakdown of the linguistic signals that contributed to the score.
For high-stakes assessments — coursework components, controlled assessment, or any work where the outcome has significant consequences — the 3-credit model is generally worth using. The additional detail it provides gives you more to work with when making a professional judgement about next steps. For quick checks of lower-stakes class work, the 1-credit model is usually sufficient.
Your model preference is saved between sessions, so you do not need to reconfigure it each time you run a detection check.
What AI Detection Cannot Tell You About Handwritten Work
It is worth being explicit about what the detection workflow cannot do, even when it works well. A high likelihood score on a piece of handwritten work does not prove that the student used AI. It tells you that the text, once transcribed, resembles AI-generated writing in its statistical patterns. That resemblance might reflect AI use. It might also reflect a highly proficient student, a student who drafted digitally and wrote by hand, or a student who received substantial editing support that changed the register of their writing.
A low likelihood score does not prove the work is authentic either. A student who heavily edits AI-generated text — restructuring sentences, adding personal voice, inserting subject-specific examples — can produce output that scores much lower than the original AI draft.
Detection is most useful as a flag that prompts further investigation, not as a mechanism that makes decisions. The most productive follow-up to a high score is a direct conversation with the student: ask them to talk through their argument, explain their evidence, or write a short response to the same question in class. A student who wrote the work genuinely will be able to engage with it. A student who submitted AI-generated content will often struggle to defend it in their own words.
Try GradeOrbit's AI Detection Feature
GradeOrbit's AI detection tool is built into the same dashboard where you mark student work — there is no separate platform to learn or switch between. Whether a student has submitted typed text, a scanned handwritten essay, or a phone photograph of a paper script, the detection workflow handles all three.
If you are dealing with handwritten coursework submissions and want a practical, privacy-conscious way to run a detection check, try GradeOrbit today and see how the workflow fits into your existing assessment practice.