Use this skill whenever a user wants to grade student homework from handwritten images — whether scanned, photographed, or taken on an iPad. Triggers include: "grade these homeworks", "mark student submissions", "check student answers against the solution key", "parse handwritten homework", "score student assignments", or any request involving images of handwritten student work that should be compared to a solution key. Also trigger when the user provides multiple student image sets alongside a PDF or image solution key and wants a graded spreadsheet output. Subject areas include geophysics, earth science, and math (including equation-based problems). Always use this skill when student work is handwritten and needs structured, scored grading output — even if the user just says "can you grade these?" with images attached.
Grades multi-page handwritten student homework submissions against a solution key. Outputs a consolidated Excel spreadsheet with per-question scores, partial credit, mismatch flags, and a total score per student.
Read references/parsing-strategy.md for detailed image parsing and OCR guidance.
Read references/matching-and-grading.md for question matching logic and scoring rubrics.
Read references/spreadsheet-schema.md for the exact output spreadsheet format.
Ask the user for the following if not already provided:
- The solution key (PDF or images).
- The student submission images, grouped by student.
- Per-question point values. Defaults are defined in references/matching-and-grading.md. Ask only if the user wants to override.

If the user uploads files directly in the chat, accept them. If files are on disk, they will be at /mnt/user-data/uploads/.
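As a sketch of the file-discovery step, the pages for each student can be grouped by filename prefix. The `<student>_p<page>` naming convention below is an assumption based on the example filenames elsewhere in this skill, not a guaranteed format:

```python
import re
from pathlib import Path

def group_student_pages(upload_dir="/mnt/user-data/uploads"):
    """Group image files by student prefix, e.g. student1_p2.jpg -> student1.

    Assumes a <student>_p<page> naming convention; fall back to asking the
    user when filenames do not follow it.
    """
    groups = {}
    for path in sorted(Path(upload_dir).glob("*")):
        if path.suffix.lower() not in {".jpg", ".jpeg", ".png", ".heic"}:
            continue  # skip non-image files such as the solution-key PDF
        m = re.match(r"(.+?)_p?(\d+)$", path.stem)
        student = m.group(1) if m else path.stem
        groups.setdefault(student, []).append(path)
    return groups
```

Sorting before grouping keeps pages in filename order within each student's list.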
Follow the detailed parsing guidance in references/parsing-strategy.md, Section 1.
Quick summary:
- PDF solution key: extract text with pdfplumber (preferred) or pymupdf. Fall back to vision if extraction yields garbled text (scanned PDF).
- Image solution key: parse with vision, following references/parsing-strategy.md.

Example solution key format:
Q1: Define P-wave velocity. Answer: Speed of compressional waves through a medium, ~6 km/s in continental crust.
Q2: Calculate depth given two-way travel time of 1.2s and v=3000 m/s. Answer: depth = (v × t) / 2 = 1800 m
...
Store this as solution_key: list[dict] with fields q_num, q_text, answer_text, points.
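A minimal sketch of the `solution_key` structure described above, using the two example questions. The point values are illustrative placeholders, not values from the actual rubric:

```python
# Sketch of solution_key: list[dict]; "points" values here are assumed
# defaults for illustration only.
solution_key = [
    {
        "q_num": 1,
        "q_text": "Define P-wave velocity.",
        "answer_text": "Speed of compressional waves through a medium, "
                       "~6 km/s in continental crust.",
        "points": 4,
    },
    {
        "q_num": 2,
        "q_text": "Calculate depth given two-way travel time of 1.2 s "
                  "and v = 3000 m/s.",
        "answer_text": "depth = (v * t) / 2 = 1800 m",
        "points": 4,
    },
]
```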
For each student, process their image set in order. Follow references/parsing-strategy.md, Section 2.
Key rules:
- Process pages in filename order when the filenames encode it (e.g. student1_p1.jpg, student1_p2.jpg); otherwise ask the user for page order.

Output per student: raw_text: str, plus detected_answers: list[dict] with fields detected_q_label (what the student wrote, e.g. "Q3", "3.", "iii", or None if no label), answer_text, page_number.
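An illustrative per-student parse result with the fields named above (the answer texts are made up for the example):

```python
# Example of the per-student output structure; content is illustrative.
student_parse = {
    "raw_text": "3. depth = 3000 * 1.2 / 2 = 1800 m\nP-waves are compressional.",
    "detected_answers": [
        {
            "detected_q_label": "3.",   # exactly as the student wrote it
            "answer_text": "depth = 3000 * 1.2 / 2 = 1800 m",
            "page_number": 1,
        },
        {
            "detected_q_label": None,   # unlabeled answer, resolved in matching
            "answer_text": "P-waves are compressional.",
            "page_number": 2,
        },
    ],
}
```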
Follow references/matching-and-grading.md, Section 1 for full logic.
Two-pass matching strategy:
Pass 1 — Auto-detect: Try to map detected_q_label to q_num in the solution key.
Pass 2 — Content anchor fallback: If auto-detect fails (missing labels, ambiguous numbering, <70% coverage):
- Match answer_text content to answer_text in the solution key by similarity (thresholds in references/matching-and-grading.md).

Flag the following as mismatches:
- MISSING: A solution key question has no matched student answer
- UNMATCHED: A student answer has no plausible match in the solution key
- AMBIGUOUS: Two or more student answers could match the same solution key question
- REORDERED: Answer matched via content (not label) — note the original label vs. matched label

Follow references/matching-and-grading.md, Section 2 for rubric details.
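The two-pass strategy can be sketched as follows. This is a simplified illustration: it uses `difflib.SequenceMatcher` as a stand-in for the real similarity measure, the `sim_threshold` value is assumed, and AMBIGUOUS detection is omitted for brevity — the authoritative logic lives in references/matching-and-grading.md:

```python
import re
from difflib import SequenceMatcher

def normalize_label(label):
    """Extract a question number from labels like 'Q3', '3.', or 'iii'."""
    if not label:
        return None
    m = re.search(r"\d+", label)
    if m:
        return int(m.group())
    romans = {"i": 1, "ii": 2, "iii": 3, "iv": 4, "v": 5}
    return romans.get(label.strip(". ").lower())

def match_answers(detected_answers, solution_key, sim_threshold=0.4):
    """Pass 1: map labels to q_num. Pass 2: content-similarity fallback."""
    matches, flags, unmatched = {}, [], []
    key_by_num = {q["q_num"]: q for q in solution_key}
    for ans in detected_answers:
        q_num = normalize_label(ans.get("detected_q_label"))
        if q_num in key_by_num and q_num not in matches:
            matches[q_num] = ans                      # Pass 1: auto-detect
        else:
            unmatched.append(ans)
    for ans in unmatched:                             # Pass 2: content anchor
        best, best_score = None, sim_threshold
        for q in solution_key:
            if q["q_num"] in matches:
                continue
            score = SequenceMatcher(None, ans["answer_text"].lower(),
                                    q["answer_text"].lower()).ratio()
            if score > best_score:
                best, best_score = q["q_num"], score
        if best is not None:
            matches[best] = ans
            flags.append(("REORDERED", ans.get("detected_q_label"), best))
        else:
            flags.append(("UNMATCHED", ans.get("detected_q_label"), None))
    for q in solution_key:                            # anything left unmatched
        if q["q_num"] not in matches:
            flags.append(("MISSING", None, q["q_num"]))
    return matches, flags
```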
Score each matched answer on a 0–4 scale (then scale to point value):
| Score | Meaning |
|---|---|
| 4 | Fully correct — key facts, values, units all present and correct |
| 3 | Mostly correct — minor error, missing unit, small numeric rounding |
| 2 | Partially correct — correct approach, significant error or missing step |
| 1 | Minimal credit — relevant attempt, mostly wrong |
| 0 | Incorrect, blank, or unmatched |
For math/equations: apply the equation-specific criteria (final values, units, method) from references/matching-and-grading.md, Section 2.
For geoscience/written answers: apply the written-answer criteria (key facts and terminology) from the same section.
MISSING questions → score 0 automatically.
UNMATCHED answers → score 0, flag for instructor review.
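The scaling from the 0–4 rubric to point values, and the automatic zero for MISSING questions, can be sketched as below (the two-decimal rounding is an assumption):

```python
def points_awarded(rubric_score, max_points):
    """Scale a 0-4 rubric score to the question's point value."""
    return round(max_points * rubric_score / 4, 2)

def student_total(rubric_scores, solution_key):
    """Sum points per student.

    rubric_scores maps q_num -> 0-4 score; MISSING questions are simply
    absent from the mapping and default to 0, per the rules above.
    """
    return sum(points_awarded(rubric_scores.get(q["q_num"], 0), q["points"])
               for q in solution_key)
```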
Read /mnt/skills/public/xlsx/SKILL.md before writing the spreadsheet.
Follow the schema in references/spreadsheet-schema.md exactly.
Sheet 1: Grade Summary
Sheet 2: Mismatch Flags
Sheet 3: Parsed Answers (Detail)
Apply the conditional formatting rules specified in references/spreadsheet-schema.md.
Save to /mnt/user-data/outputs/homework_grades.xlsx and present to user.
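A minimal openpyxl sketch of the three-sheet workbook. The column layouts and the low-score highlight rule shown here are assumptions for illustration; the authoritative layout and formatting rules are in references/spreadsheet-schema.md and the xlsx skill:

```python
from openpyxl import Workbook
from openpyxl.styles import PatternFill

def write_gradebook(path, summary_rows, flag_rows, detail_rows):
    """Write the three sheets; columns here are illustrative only."""
    wb = Workbook()
    ws = wb.active
    ws.title = "Grade Summary"
    ws.append(["Student", "Total", "Max Points"])
    for row in summary_rows:
        ws.append(row)
    flags = wb.create_sheet("Mismatch Flags")
    flags.append(["Student", "Flag", "Detail"])
    for row in flag_rows:
        flags.append(row)
    detail = wb.create_sheet("Parsed Answers (Detail)")
    detail.append(["Student", "Q#", "Detected Label", "Answer Text", "Score"])
    for row in detail_rows:
        detail.append(row)
    # Assumed highlight rule for illustration: totals below 50% in red.
    red = PatternFill("solid", start_color="FFC7CE")
    for r in range(2, ws.max_row + 1):
        total, max_pts = ws.cell(r, 2).value, ws.cell(r, 3).value
        if total is not None and max_pts and total < 0.5 * max_pts:
            ws.cell(r, 2).fill = red
    wb.save(path)
```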
After writing the file, give the user a brief summary: how many students were graded, the score range, and any mismatch flags that need instructor review.
Call present_files with the xlsx path.
Edge cases:
- Illegible handwriting: mark the answer ILLEGIBLE, score 0, flag for manual review.
- Answer written in another language: flag LANGUAGE_MISMATCH, attempt translation, note uncertainty.
- Duplicate question labels from one student: append _dup to the second occurrence.