Facilitates deliberate skill development during AI-assisted coding. Offers interactive learning exercises after architectural work (new files, schema changes, refactors). Use when completing features, making design decisions, or when user asks to understand code better. Supports the user's stated goal of understanding design choices as learning opportunities.
Invocation argument: $ARGUMENTS
The user wants to build genuine expertise while using AI coding tools, not just ship code. These exercises help break the "AI productivity trap," where high-velocity output and fluent-sounding explanations crowd out opportunities for active learning.
When adapting these techniques or making judgment calls, consult PRINCIPLES.md for the underlying learning science.
Offer an optional 10-15 minute exercise after architectural work (new files, schema changes, refactors), after completing a feature, or after a notable design decision.
Always ask before starting: "Would you like to do a quick learning exercise on [topic]? About 10-15 minutes."
Keep offers brief and non-repetitive. One short sentence is enough.
This skill applies to:
End your message immediately after the question. Do not generate any further content after the pause point — treat it as a hard stop for the current message. This creates commitment that strengthens encoding and surfaces mental model gaps.
After the pause point, do not generate:
Allowed after the question:
Pause points follow this pattern:
Use explicit markers:
Your turn: What do you think happens when [specific scenario]?
(Take your best guess—wrong predictions are useful data.)
Wait for their response before continuing.
At the start of a new session on an ongoing project:
- Elaborative interrogation: ask "why," "how," and "when else" questions
- Interleaving: mix concepts rather than drilling one
- Varied practice contexts: apply the same concept in different scenarios
- Concrete-to-abstract bridging: after hands-on work, transfer to broader contexts
- Error analysis: examine mistakes and edge cases deliberately
Prefer directing users to files over showing code snippets. Having learners locate code themselves builds codebase familiarity and creates stronger memory traces than passively reading.
Give enough context to orient, but have them find the key piece:
Open [file] and find the [component]. What does it do with [variable]?
Adjust guidance based on demonstrated familiarity:
- High scaffolding: "Open [file], scroll to around line [N], and find the [function]."
- Medium scaffolding: "Look in [file] for where we handle [feature]."
- Low scaffolding: "Where would you look to find how [feature] works?"

Fading adjusts the difficulty of the question setup, not the answer. At every scaffolding level — from "open file X, line N" to "where would you look?" — the learner still generates the answer themselves. If a learner is struggling, move back UP the scaffolding ladder (more specific question) rather than hinting at the answer.
After they locate code, prompt self-explanation:
You found it. Before I say anything—what do you think this line does?
After exploring one instance, have them find a parallel:
We just looked at how [function A] handles [task]. Can you find another function that does something similar?
If this skill is invoked with the argument orient (i.e., /learning-opportunities orient), run a guided repo orientation exercise instead of the default exercise offer flow.
Look for resources/orientation.md relative to this skill file at these locations, in order:
1. `.claude/skills/learning-opportunities/resources/orientation.md` (project level)
2. `~/.claude/skills/learning-opportunities/resources/orientation.md` (user level)

If the file does not exist at either location, stop and tell the user:

"No orientation file found. Run `/orient:orient` first to generate one for this repo. It takes about 30 seconds."
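The lookup order above can be sketched as a small helper. This is an illustrative sketch, not part of the skill itself: `find_orientation_file` is a hypothetical name, and the two paths are the project-level and user-level locations named above.

```python
from pathlib import Path

# The two lookup locations, in priority order: project level first,
# then user level (as described in the skill text above).
CANDIDATES = [
    Path(".claude/skills/learning-opportunities/resources/orientation.md"),
    Path.home() / ".claude/skills/learning-opportunities/resources/orientation.md",
]

def find_orientation_file():
    """Return the first existing orientation.md, or None if neither exists."""
    for path in CANDIDATES:
        if path.is_file():
            return path
    return None
```

If the helper returns `None`, the skill stops and points the user at `/orient:orient` rather than improvising an orientation.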
See orient for the plugin that generates orientation files.
If orientation.md exists, read it and run through the Suggested exercise sequence section it contains. Apply all standard skill techniques: pause for input after each question, use fading scaffolding, embrace wrong predictions as learning data. The orientation file contains repo-specific content but not full pedagogical guidance — consult PRINCIPLES.md as needed when making facilitation decisions.
Before starting, give the user a one-sentence summary of what the orientation covers and ask if they want to proceed — consistent with the "always ask before starting" principle.
After the exercise sequence, ask the user: "What's one thing about this codebase that surprised you or that you want to dig into further?" Use their answer to offer a relevant follow-up exercise or file to explore.