How to Grade a Student Math Explanation Video (With a Rubric)
Grading a student math explanation video is different from grading a written test, and most teachers don't have a clear system for it. Here's a rubric-based approach that makes the process consistent, fair, and actually useful for students.
When a student records themselves explaining a math concept, they're doing something a worksheet can't capture: they're showing you how they think. But that also means the grading criteria are different. You're not just checking answers. You're evaluating communication, reasoning, vocabulary, and mathematical process — all at once.
Without a clear rubric, grading these videos becomes subjective and time-consuming. With one, it becomes one of the most revealing assessments you can give.
Why Video Explanations Reveal More Than Written Work
A student can copy steps from a textbook without understanding them. They can circle the right answer by elimination. But they can't fake an explanation.
When a student has to speak through a problem — narrate their reasoning, name the steps, explain why they're doing what they're doing — gaps in understanding surface immediately. A student who says "and then you just... move this over" has told you something important. So has a student who says "I divide both sides by two because I need to isolate the variable."
This is why video explanation assignments are increasingly used in math classrooms, from middle school through community college, as both formative checks and summative assessments.
The 5 Dimensions of a Strong Math Explanation
Effective rubrics for math explanation videos evaluate five distinct areas. These aren't arbitrary — they map to a hierarchical model of mathematical understanding supported by education research.
1. Routine (Procedural Accuracy)
Did the student follow the correct mathematical process? Are the steps in the right order? This is the foundation — a student can communicate beautifully but still get the procedure wrong.
What to look for: Correct sequence of steps, no skipped or invented operations, accurate arithmetic at each stage.
2. Structure (Organization and Clarity)
Does the explanation have a clear beginning, middle, and end? Can a listener follow it without seeing the written work?
What to look for: Introduction of the problem, logical progression through steps, clear conclusion that states the final answer.
3. Reflection (Conceptual Understanding)
Does the student explain why each step is taken, not just what they're doing? This is what separates surface-level memorization from genuine understanding.
What to look for: Language like "because," "in order to," "this tells me that." Students who only describe what they're doing — without explaining why — are showing procedural fluency without conceptual depth.
4. Lingo (Mathematical Vocabulary)
Does the student use correct mathematical terminology? Vocabulary use reveals whether a student is thinking in the language of mathematics or translating into informal language to compensate for uncertainty.
What to look for: Accurate use of domain-specific terms (variable, coefficient, equation, inverse operation, etc.). Watch for avoidance — students who talk around terms they don't know.
5. Correctness (Final Answer Accuracy)
Did the student arrive at the right answer, and did they state it clearly? This overlaps with Routine but focuses specifically on the endpoint.
What to look for: Correct final answer, clearly stated, in appropriate form (simplified, labeled with units where applicable).
Scoring: each of the five dimensions is scored from 0 to 2, with 2 representing Proficient work, for a total of 10 points.
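If you track scores in a spreadsheet or gradebook script, the rubric maps naturally onto a small data structure. Here's a minimal sketch in Python, assuming the 0–2 scale described above; the dimension names come from this post, but the class and field names are illustrative, not part of any particular tool:

```python
from dataclasses import dataclass, field

# The five dimensions described above, each scored 0, 1, or 2.
DIMENSIONS = ["Routine", "Structure", "Reflection", "Lingo", "Correctness"]

@dataclass
class VideoScore:
    student: str
    scores: dict[str, int] = field(default_factory=dict)  # dimension -> 0..2

    def set(self, dimension: str, points: int) -> None:
        if dimension not in DIMENSIONS:
            raise ValueError(f"Unknown dimension: {dimension}")
        if points not in (0, 1, 2):
            raise ValueError("Each dimension is scored 0, 1, or 2")
        self.scores[dimension] = points

    @property
    def total(self) -> int:
        return sum(self.scores.values())  # out of 10

# Example: scoring one hypothetical video
s = VideoScore("Jordan")
s.set("Routine", 2)
s.set("Structure", 2)
s.set("Reflection", 1)   # described the steps, rarely explained why
s.set("Lingo", 1)        # said "move the number over" instead of naming the operation
s.set("Correctness", 2)
print(s.student, s.total, "/ 10")  # Jordan 8 / 10
```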
This rubric scales well across grade levels — the vocabulary and reasoning expected from an Algebra 1 student look different from what you'd expect in Precalculus, but the dimensions stay consistent.
Tips for Applying the Rubric
Watch once before scoring. Let the full video play before you evaluate anything. First impressions matter — you'll often notice things on the first pass that get lost when you're focused on a specific dimension.
Score one dimension at a time. It's tempting to score all five dimensions in a single pass. Resist this. After your first viewing, watch again with a specific question in mind: "Am I hearing mathematical vocabulary?" Then again: "Is this student explaining why?"
Take notes on timestamps. If you're giving written feedback, note the specific moment in the video ("around 1:30, you said 'move the number over' — the more precise term here is 'subtract from both sides'"). Students find timestamped feedback far more actionable than general comments.
Look for discrepancies between written work and verbal explanation. Sometimes a student writes the correct steps but says something different out loud. The verbal explanation is the assessment — what they say is what they know.
How to Give Feedback That Actually Changes What Students Do Next
A rubric score alone doesn't help students improve. For each dimension where a student scored below Proficient, effective feedback answers three questions:
What specifically happened? ("At 0:47, you skipped from step 2 directly to the final answer.")
What does Proficient look like? ("A complete explanation would show the intermediate step of dividing both sides by 2.")
What should they practice? ("Try re-recording just that section and say each step out loud as you write it.")
This kind of specific, timestamped, actionable feedback is what research shows students actually respond to — and it's what separates a meaningful assessment from a grade that gets glanced at and forgotten.
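One way to keep every comment in this three-question shape is to store feedback as a structured record and render it the same way every time. A hedged sketch, reusing the example above; the field names and output format are my own, not a prescribed standard:

```python
from dataclasses import dataclass

@dataclass
class FeedbackNote:
    timestamp: str      # moment in the video, e.g. "0:47"
    dimension: str      # which rubric dimension this addresses
    what_happened: str  # what specifically happened
    proficient: str     # what Proficient looks like
    practice: str       # what the student should practice next

    def render(self) -> str:
        return (
            f"[{self.timestamp}] {self.dimension}: {self.what_happened} "
            f"A Proficient explanation would {self.proficient} "
            f"Try this: {self.practice}"
        )

note = FeedbackNote(
    timestamp="0:47",
    dimension="Routine",
    what_happened="You skipped from step 2 directly to the final answer.",
    proficient="show the intermediate step of dividing both sides by 2.",
    practice="Re-record just that section and say each step out loud as you write it.",
)
print(note.render())
```

Whether you fill this in by hand or generate it from your grading notes, the structure forces each comment to answer all three questions.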
Making This Scalable
The honest challenge with video explanation assessment is time. Watching, evaluating, and writing feedback for 30 videos — even short ones — takes hours. This is why many teachers who want to use explanation videos limit their frequency or abandon them after one semester.
A few approaches that help:
Batch-grade by dimension. Score all students on Routine, then all on Lingo, and so on. This is cognitively more efficient than switching criteria between students.
Use a shared rubric with students before they record. When students know exactly what they're being evaluated on, average scores improve and feedback becomes a conversation rather than a verdict.
Consider AI-assisted analysis. Tools like Capture Thought AI analyze student math explanation videos automatically — transcribing the audio, evaluating each dimension against the rubric framework, and generating timestamped, personalized feedback for each student. For teachers with large class loads, this can make video explanation assessment realistic at scale.
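Even without a dedicated tool, a rough first pass can be automated if you have a timestamped transcript, which most speech-to-text services produce. The sketch below shows the general idea of scanning a transcript for the Reflection and Lingo markers described earlier — it is not how Capture Thought AI or any other product works internally, and the marker lists are starting points, not exhaustive:

```python
# First-pass scan of a timestamped transcript: flag reasoning language
# (Reflection) and math vocabulary (Lingo). Illustrative only.
REASONING_MARKERS = ["because", "in order to", "this tells me"]
MATH_VOCAB = ["variable", "coefficient", "equation", "inverse operation"]

def scan_transcript(segments: list[tuple[str, str]]) -> None:
    """segments: list of (timestamp, spoken text) pairs."""
    for timestamp, text in segments:
        lowered = text.lower()
        reasons = [m for m in REASONING_MARKERS if m in lowered]
        vocab = [t for t in MATH_VOCAB if t in lowered]
        if reasons or vocab:
            print(f"[{timestamp}] reasoning: {reasons or '-'}  vocab: {vocab or '-'}")

scan_transcript([
    ("0:12", "First I write the equation down."),
    ("0:31", "I divide both sides by two because I need to isolate the variable."),
    ("1:30", "And then you just move the number over."),  # no markers: worth a closer look
])
```

A scan like this only surfaces moments worth reviewing; the rubric judgment still belongs to the teacher.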
The Bottom Line
Grading student math explanation videos doesn't have to be guesswork. A five-dimension rubric — Routine, Structure, Reflection, Lingo, and Correctness — gives you a consistent framework that works across topics and grade levels. Pair it with specific, timestamped feedback and students have what they actually need to improve.
The goal isn't to make grading easier for its own sake. It's to make the feedback good enough that students watch it, think about it, and do something different next time.
Capture Thought AI helps math teachers assess student explanation videos at scale — using the same rubric framework described in this post. See how it works →