This skill should be used when a learner requests a quiz or test on material they have studied, saying things like 'test me', 'quiz me', 'check my understanding', or 'run a mastery check'. Administers an adaptive 5-question assessment covering recall, explanation, application, misconception detection, and confidence calibration.
Administer an adaptive 5-question assessment to evaluate a learner's understanding of a previously studied topic. This skill activates only when the learner explicitly requests testing — never trigger it automatically after course delivery. The assessment follows a structured question sequence designed to probe different depths of understanding, from basic recall through application and critical evaluation.
For the detailed scoring rubric with per-question point values, see references/rubric.md.
Before proceeding, confirm that the learner has explicitly asked for a mastery check, quiz, test, or assessment. Valid triggers include phrases like "test me", "quiz me", "check my understanding", and "run a mastery check".
If the learner has not explicitly requested testing, do not proceed. Instead, ask: "Would you like me to test your understanding of [topic]? I can run a 5-question mastery check whenever you are ready."
Accept or reconstruct the following:
Generate all five questions before presenting any of them. This ensures the full assessment covers the breadth of the course material. However, present questions one at a time to the learner — never reveal all questions upfront, as this allows the learner to preview and pre-plan answers, which undermines authentic assessment.
Design each question to target a different lesson or learning objective when possible. Avoid testing the same concept twice.
Present the questions one at a time. After the learner answers each question, provide brief feedback before moving to the next. Use this exact sequence:
Test whether the learner can retrieve a key concept from memory without any hints or context.
Format: "From what you studied about [topic], [ask them to state, list, or define a core concept]."
Do not provide multiple choice options. The learner must produce the answer from memory. Accept answers that are substantively correct even if they use different wording than the course material.
Feedback: State whether the answer is correct or incorrect. If incorrect, provide the correct answer in one sentence. If partially correct, acknowledge what they got right and fill in what was missing.
Test whether the learner has internalized the concept well enough to teach it. This is the core Feynman test.
Format: "Explain [concept] as if you were teaching a friend who has never heard of it. Use plain language — no jargon."
Evaluate the explanation on three criteria:
Feedback: Rate their explanation (strong / adequate / needs work) and note any inaccuracies or missing elements. If they used jargon without explaining it, point that out — it signals they may be reciting rather than understanding.
Test whether the learner can apply their knowledge to a concrete, realistic scenario they have not seen before.
Format: Present a short scenario (2-4 sentences) that requires the learner to use a concept from the course to solve a problem, make a decision, or predict an outcome. The scenario should be different from any examples used in the course material.
Feedback: Evaluate the reasoning process, not only the final answer. A learner who reaches the wrong conclusion through sound reasoning has more mastery than one who guesses the right answer. Explain the correct approach step by step.
Test whether the learner has overcome the common misconceptions identified in the learner profile.
Format: Present a statement that sounds plausible but is incorrect (drawn from the mapped misconceptions). Ask: "True or false: [plausible but incorrect statement]. Explain your reasoning."
The statement must be genuinely tempting — it should be something that someone with a surface-level understanding would accept as true. Requiring them to explain their reasoning prevents lucky guesses from inflating the score.
Feedback: If they correctly identified the statement as false and gave a sound explanation, confirm and reinforce why it is false. If they fell for the trap, explain the misconception clearly and reference which lesson addressed it.
Shift from testing knowledge to testing metacognition — the learner's awareness of what they do and do not know.
Format: List the 3-5 main subtopics from the course. Ask the learner to rate their confidence on each from 1 (not confident at all) to 5 (could teach this to someone else). Then ask them to identify which single subtopic they feel least confident about and explain why.
This question is not scored as correct/incorrect. Instead, use the learner's self-assessment to:
Feedback: Acknowledge their self-assessment. If you noticed a mismatch between their confidence and their performance, gently flag it: "You rated [subtopic] as a 4, but your answer to Question 3 suggests there may be a gap in [specific area]. Consider reviewing Lesson N."
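The confidence-versus-performance comparison described above can be sketched as follows. This is an illustrative assumption, not part of the skill: the function name, the 0.4 mismatch threshold, and the 0.0-1.0 performance fractions are all hypothetical, while the 1-5 confidence scale comes from the question format.

```python
def calibration_mismatches(confidence: dict[str, int],
                           performance: dict[str, float],
                           gap: float = 0.4) -> list[str]:
    """Flag subtopics where self-rated confidence (1-5) diverges from
    observed performance (fraction of points earned, 0.0-1.0).

    The 0.4 threshold is an illustrative assumption, not from the skill.
    """
    flagged = []
    for subtopic, rating in confidence.items():
        observed = performance.get(subtopic)
        if observed is None:  # no scored question touched this subtopic
            continue
        # Normalize the 1-5 rating onto the same 0-1 scale as performance.
        expected = (rating - 1) / 4
        if abs(expected - observed) >= gap:
            flagged.append(subtopic)
    return flagged

# Example: high confidence but weak performance on "recursion" gets flagged.
print(calibration_mismatches({"recursion": 4, "iteration": 2},
                             {"recursion": 0.2, "iteration": 0.3}))
# → ['recursion']
```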
After all five questions are complete, compile the results.
Score each question on a 100-point scale with the following point values:
For each scored question, award full points for a correct answer with sound reasoning, partial points for a partially correct answer or for a correct conclusion reached through flawed reasoning, and zero points for an incorrect answer. Question 5 is not graded as correct or incorrect; see references/rubric.md for how it is handled.
Map the total score to a level:
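The authoritative per-question point values and level thresholds live in references/rubric.md. As a minimal sketch of the tallying arithmetic, assuming hypothetical equal 25-point weights for Questions 1-4 and illustrative level names and cut-offs:

```python
# Sketch only: point values and level cut-offs below are HYPOTHETICAL.
# The authoritative numbers are defined in references/rubric.md.

# Illustrative weighting: Questions 1-4 worth 25 points each; Question 5
# contributes no points here (assumption).
POINT_VALUES = {1: 25, 2: 25, 3: 25, 4: 25}

# Illustrative level names and thresholds (assumption, not from the rubric).
LEVELS = [(90, "Mastered"), (70, "Proficient"), (50, "Developing"), (0, "Beginning")]

def total_score(awarded: dict[int, float]) -> float:
    """Sum awarded points, capping each question at its maximum value."""
    return sum(min(points, POINT_VALUES[q]) for q, points in awarded.items())

def mastery_level(score: float) -> str:
    """Map a 0-100 score to the first level whose threshold it meets."""
    for threshold, level in LEVELS:
        if score >= threshold:
            return level
    return LEVELS[-1][1]

# Example: full credit on Q1-Q3, partial credit on Q4.
score = total_score({1: 25, 2: 25, 3: 25, 4: 10})
print(score, mastery_level(score))  # → 85 Proficient
```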
Present the results in this format:
MASTERY CHECK RESULTS
=====================
Topic: [topic]
Score: [X / 100]
Mastery Level: [level]
STRENGTHS
- [specific concept or skill they demonstrated well]
- [another strength]
GAPS IDENTIFIED
- [specific concept or skill that needs work]
- [another gap]
RECOMMENDATIONS
- [specific action, e.g., "Revisit Lesson 3 on [topic] — focus on [specific section]"]
- [another recommendation]
- [link to Notion lesson page if available]
Based on the mastery level, provide tailored next steps: