Grade student weekly summaries against a rubric. Extracts text from submissions (PDF/DOCX/PPTX), evaluates against lecture content and midterm key, and generates formatted reports with scores and feedback. Use when grading student weekly summaries for a course.
Ask these questions IN ORDER:
Priority order: PPT → Midterm → Student Submissions
```bash
# 1. Extract PPT (MUST specify page range)
python3 ~/.gemini/antigravity/skills/grading-assistant/scripts/extract_ppt.py "<PPT>" <START> <END> > lecture_content.txt

# 2. Extract midterm (if provided)
# [Use appropriate text extraction method]

# 3. Extract student submissions
python3 ~/.gemini/antigravity/skills/grading-assistant/scripts/process_submissions.py "<FOLDER>" > extracted_submissions.txt
```
Content Source Rules:
Generate proposed_coverage.md in the submission folder with 3-5 BIG concepts (not granular details).
```markdown
# Proposed Coverage for [Week Name]
**Lecture Slides**: [START]-[END]

## Core Concepts (3-5 big concepts)
### 1. [Big Concept Name] (mentioned by X/Y students)
- Definition: [from PPT]
- Key sub-topics: [list related topics grouped here]
- Midterm reasoning: [if applicable, how this concept is tested]
### 2. [Big Concept Name] (mentioned by X/Y students)
...

## Midterm Terms/Reasoning (for scoring)
[Key terms from midterm that students should define/explain correctly]
[Student gets Midterm point if they demonstrate understanding of ANY of these]
```
Examples:
- Isostasy: lithosphere floats on asthenosphere; uniform pressure at depth
- Lithosphere: rigid outer part, T < 1300°C
- Asthenosphere: low viscosity mantle, T > 1300°C
- Moho: seismic velocity change at crust-mantle boundary
## Midterm-Relevant Facts (for TA hints ONLY)
[Specific values students must MEMORIZE - used for "I suggest you review..." hints]
[Student does NOT need these for Midterm point if they have Terms above]
✅ Facts to include:
- Oceanic crust thickness: 7 km
- Continental crust thickness: 40 km
- Lithosphere boundary: 1300°C
- Reference thicknesses: 21 km oceanic, 19 km continental
❌ Do NOT include:
- Densities (given in exam)
- Scaling ratios (derived in exam)
- Geological examples
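The scoring rule above (Midterm point for ANY demonstrated term; memorize-only facts feed TA hints, never the score) can be sketched as a small check. The term and fact lists and the function name are illustrative, not part of the skill's actual scripts:

```python
# Hypothetical sketch of the Midterm scoring rule: a student earns the
# Midterm point if their summary demonstrates ANY approved term; the
# memorize-only facts only generate "I suggest you review..." hints.

MIDTERM_TERMS = ["isostasy", "lithosphere", "asthenosphere", "moho"]
HINT_FACTS = {
    "oceanic crust thickness": "7 km",
    "continental crust thickness": "40 km",
    "lithosphere boundary": "1300 C",
}

def score_midterm(summary_text):
    """Return (midterm_point, hint_list) for one student's summary."""
    text = summary_text.lower()
    point = 1 if any(term in text for term in MIDTERM_TERMS) else 0
    # Hints are suggestions only; they never gate the Midterm point.
    hints = [f"I suggest you review: {fact} = {value}"
             for fact, value in HINT_FACTS.items() if fact not in text]
    return point, hints
```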
STOP and wait for user approval.
⚠️ MANDATORY: Grade ALL students with FULL details. No exceptions.
Track progress with a checklist over extracted_submissions.txt: [ ] Student 1, [ ] Student 2, ... [ ] Student N. Score each student strictly against references/rubric.md. Do not rely on assumptions or intuition.
Tie midterm feedback to the approved proposed_coverage.md. Do not just list random facts. If no midterm topics relate to their summary, explicitly state "No relevant topics found for this week's midterm" and automatically give them 1 point for the Midterm category. Format each entry per references/grading_report_template.md.

If there are more than 10 students, process in batches:
Batch 1: Students 1-10 → Write to report, save
Batch 2: Students 11-20 → Append to report, save
Batch 3: Students 21-N → Append to report, save
After each batch, verify the report file contains all graded students.
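The batch-then-verify loop above can be sketched as follows; the batch size matches the workflow, but the function and helper names are illustrative, not part of the skill's scripts:

```python
# Hypothetical sketch of batched grading with a per-batch verification
# pass: grade 10 students, append to the report, confirm on disk.

def batches(students, size=10):
    """Yield consecutive groups of at most `size` students."""
    for start in range(0, len(students), size):
        yield students[start:start + size]

def grade_in_batches(students, grade_one, report_path):
    """Grade all students in batches, appending each batch to the report
    and verifying the file contains every student graded so far."""
    graded = []
    with open(report_path, "a", encoding="utf-8") as report:
        for group in batches(students):
            for name in group:
                report.write(grade_one(name) + "\n")
                graded.append(name)
            report.flush()
            # Re-read the report from disk after each batch.
            text = open(report_path, encoding="utf-8").read()
            missing = [s for s in graded if s not in text]
            assert not missing, f"Report missing: {missing}"
```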
Save the report as {foldername}_grading_report.md in the grading folder (next to the submissions/ subfolder). Start with a summary table:

# Grade Summary
| Student | Coverage | Logic | Personal | Creativity | Midterm | Total | Brief Note |
|---------|----------|-------|----------|------------|---------|-------|------------|
| Name 1 | 4 | 3 | 1 | 2 | 1 | 11/11 | Excellent reasoning + research |
| Name 2 | 3 | 2 | 0 | 1 | 0 | 6/11 | Missing midterm terms, needs more personal connection |
...
Final Check: Before finishing, verify report contains summary table + N individual student entries.
⚠️ IMPORTANT: Collect ALL questions from ALL students.
Scan extracted_submissions.txt for "?" and other question indicators. Compile what you find under:

## Questions for Professor
### [Topic 1] (X questions)
- **[Student Name]**: [Question text]
- **[Student Name]**: [Question text]
### [Topic 2] (X questions)
- **[Student Name]**: [Question text]
...
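The question-collection pass above can be sketched as a scan over each student's extracted text. The indicator pattern and function name are illustrative assumptions, not the skill's actual implementation:

```python
import re

# Hypothetical sketch: flag sentences containing "?" or starting with a
# common question word as candidate questions for the professor.
QUESTION_HINTS = re.compile(
    r"\?|^\s*(why|how|what|could|would|is it|does)\b", re.IGNORECASE)

def collect_questions(submissions):
    """submissions: dict of {student_name: extracted_text}.
    Returns a list of (student, sentence) candidate questions,
    to be manually verified as genuine questions for the professor."""
    candidates = []
    for student, text in submissions.items():
        for sentence in re.split(r"(?<=[.!?])\s+", text):
            if QUESTION_HINTS.search(sentence):
                candidates.append((student, sentence.strip()))
    return candidates
```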
After identifying potential questions, verify each is a genuine question for the professor:
Example questions to look for:
Resources:
- references/rubric.md: Detailed scoring criteria with decision tables
- references/grading_report_template.md: TA comment structure and examples

Scoring reminders:
- Apply the references/rubric.md decision tables, not memory.
- Judge Coverage against proposed_coverage.md (located in the submission folder), not absolute completeness.
- If a summary covers material outside the proposed_coverage.md Core Concepts (i.e., content from a different week's lecture):
- If extracted_submissions.txt shows [TEXT EXTRACTION FAILED] for a student, use the view_file tool to read the PDF directly from the path provided.
- Write math as plain text (e.g., h/b = 1/7), no LaTeX.