Cognitive translation skill that transforms technical concepts into analogies, spatial maps, constraint boxes, and one-line handles — adapts to your cognitive suit.
Transform complex technical concepts into a format that matches how the user actually thinks. This skill teaches any agent HOW to explain things so the user absorbs and retains them. It does not change WHAT is communicated — only the format.
Before producing any Gabe Block, check for a profile at `~/.claude/gabe-lens-profile.md`. If it exists, read the `suit` field from its frontmatter, load the matching suit from SUITS.md, and adapt all output accordingly. If no profile exists, run `/gabe-lens calibrate`, which presents the same concept in 4 suits and saves the user's choice. Run `/gabe-lens calibrate reset` to redo calibration.

Available suits: Spatial-Analogical (default), Sequential-Procedural, Abstract-Structural, Narrative-Contextual. Full definitions are in SUITS.md.
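The profile check can be sketched as a small helper. The file path, the `suit` frontmatter key, and the default suit come from this skill; the parsing approach (a regex over YAML frontmatter) is an illustrative assumption, not a mandated implementation:

```python
import re
from pathlib import Path

DEFAULT_SUIT = "Spatial-Analogical"  # the skill's documented default

def load_suit(profile_path: str = "~/.claude/gabe-lens-profile.md") -> str:
    """Read the `suit` field from the profile's frontmatter.

    Falls back to the default suit when no profile exists or the
    field is missing (the agent would then prompt for calibration).
    """
    path = Path(profile_path).expanduser()
    if not path.exists():
        return DEFAULT_SUIT
    match = re.search(r"^suit:\s*(.+)$", path.read_text(), re.MULTILINE)
    return match.group(1).strip() if match else DEFAULT_SUIT
```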
When the learner arrives at a simple answer quickly, the mind says "this can't be right — too easy." They search for hidden complexity, don't find it, and lose energy.
Gabe Blocks are token-expensive. A full block costs ~200-350 tokens. In agent frameworks with limited context windows, this matters. Every block can be expressed at three fidelity levels. Use the leanest mode that serves the current need.
| Mode | Tokens | When to use |
|---|---|---|
| Full | ~200-350 | First encounter, documentation, teaching |
| Brief | ~40-80 | Referencing a known concept, warm context loading |
| Oneliner | ~5-15 | Compaction handoffs, session re-anchoring, summaries |
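As a sketch, mode selection could be a simple lookup keyed on the situation. The three modes and their uses come from the table above; the context labels and the lookup itself are hypothetical — in practice this is a judgment call, with "pick the leanest mode that serves the need" as the tiebreaker:

```python
# Hypothetical situation -> fidelity-mode mapping, mirroring the table above.
MODE_FOR_CONTEXT = {
    "first_encounter":    "full",      # ~200-350 tokens
    "documentation":      "full",
    "known_concept":      "brief",     # ~40-80 tokens
    "warm_context_load":  "brief",
    "compaction_handoff": "oneliner",  # ~5-15 tokens
    "summary":            "oneliner",
}

def pick_mode(context: str) -> str:
    """Pick the leanest mode that serves the need; default to full
    when the situation is unrecognized (first encounters dominate)."""
    return MODE_FOR_CONTEXT.get(context, "full")
```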
Full Block:
┌─── GABE BLOCK: Enforcement Tiers ────────────────────┐
│ THE PROBLEM │
│ Rules in docs get ignored under fatigue... │
│ THE ANALOGY │
│ Gravity vs. posted speed limits... │
│ THE MAP │
│ Tier 1: GRAVITY ══════► Always works │
│ ... │
│ CONSTRAINT BOX │
│ IS: A reliability classification for rules │
│ IS NOT: A quality judgment │
│ DECIDES: Where to invest enforcement effort │
│ ONE-LINE HANDLE │
│ "Hooks are gravity — docs are speed limit signs" │
│ ANALOGY LIMITS │
│ When reasoning about hook maintenance, gravity │
│ has no cost, but hooks do... │
│ SIGNAL: Quick check ✓ │
└───────────────────────────────────────────────────────┘
Brief:
ENFORCEMENT TIERS
IS: A reliability classification for rules
IS NOT: A quality judgment
DECIDES: Where to invest enforcement effort
HANDLE: "Hooks are gravity — docs are speed limit signs"
Oneliner:
"Hooks are gravity — docs are speed limit signs"
When a concept is complex or critical, produce a Gabe Block:
┌─── GABE BLOCK: [Concept Name] ───────────────────────┐
│ │
│ THE PROBLEM │
│ [1-2 sentences: why this exists, what breaks without │
│ it. Purpose-first, no mechanism yet.] │
│ │
│ THE ANALOGY │
│ [Physical system mapping. Choose from: mechanical, │
│ fluid dynamics, optical, chemical, electromagnetic, │
│ thermodynamic, biological. Must be a system the │
│ user can visualize spatially — not an abstract │
│ metaphor. Must capture the KEY TRADE-OFF.] │
│ │
│ THE MAP │
│ [ASCII spatial diagram showing relationships, flow, │
│ states, or architecture. Use boxes, arrows, layers. │
│ Max 15 lines. This is the visual anchor.] │
│ │
│ CONSTRAINT BOX │
│ IS: [what this concept/thing IS] │
│ IS NOT: [what it is NOT — prevents overthinking] │
│ DECIDES: [what trade-off or choice it resolves] │
│ │
│ ONE-LINE HANDLE │
│ [A memorable phrase — 5-10 words — that captures the │
│ essence. Must survive compaction, fatigue, and │
│ context loss. Think bumper sticker, not abstract.] │
│ │
│ ANALOGY LIMITS │
│ [1-3 sentences. Lead with the CASE: when does this │
│ analogy stop working? Then explain why it breaks. │
│ Case first, explanation second.] │
│ │
│ SIGNAL: Quick check ✓ | Deeper question ◆ │
│ [Quick check = trust first instinct. │
│ Deeper question (~5 min) = layers, but tractable. │
│ Deeper question (rethink) = changes your model.] │
│ │
└────────────────────────────────────────────────────────┘
Map arrow conventions: ───→ for data flow, - - → for optional/conditional flow, ══► for critical path.

Analogies are powerful, but they have a failure mode: analogy rot. An analogy that fits perfectly today can become misleading as the underlying system evolves.
Every block includes an ANALOGY LIMITS field that documents where the analogy breaks NOW, so readers know the boundaries upfront.
When revisiting a concept or updating documentation, check:
When an analogy no longer fits:
ALWAYS apply a Gabe Block for:
NEVER apply a Gabe Block for:
APPLY LIGHTLY (brief or oneliner) for:
When producing a compaction handoff note, include one-line handles for the key decisions made this session. These survive context compression and re-anchor the spatial model.
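A handoff note built from the session's one-line handles might look like the sketch below. The exact layout is an assumption; the skill only requires that the handles be included so they survive compression:

```python
def handoff_note(handles: list[str]) -> str:
    """Assemble a compaction handoff note from one-line handles.

    The header and quoting style are illustrative — any format works
    as long as the handles themselves are carried across verbatim.
    """
    lines = ["SESSION HANDLES (re-anchor the spatial model):"]
    lines += [f'  "{h}"' for h in handles]
    return "\n".join(lines)
```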
When the planner or architect agent makes a significant decision, apply gabe-lens to that decision before presenting it.
┌─── GABE BLOCK: Enforcement Tiers ────────────────────┐
│ │
│ THE PROBLEM │
│ Rules written in docs get ignored under fatigue and │
│ context loss. 19 files exceeded the 800-line limit │
│ despite the limit being documented everywhere. │
│ │
│ THE ANALOGY │
│ Think of it as gravity vs. posted speed limits. │
│ Gravity (Tier 1 hooks) works whether you're paying │
│ attention or not — drop a ball, it falls. Speed │
│ limits (Tier 3 docs) only work if the driver reads │
│ the sign AND chooses to comply. Tier 2 (workflows) │
│ is like a speed bump — it slows you down IF you │
│ drive over it, but you can take a different road. │
│ │
│ THE MAP │
│ │
│ Tier 1: GRAVITY ══════════════════► Always works │
│ (hooks) PreToolUse:Edit ─→ fires every edit │
│ pre-commit ─→ fires every commit │
│ CI pipeline ─→ fires every PR │
│ │
│ Tier 2: SPEED BUMPS ─ ─ ─ ─ ─ ─ ─► Works if used │
│ (workflows) /ecc-code-review ─→ only if invoked │
│ /workflow-close ─→ only if remembered │
│ │
│ Tier 3: POSTED SIGNS · · · · · · ·► Often ignored │
│ (docs) CLAUDE.md rules ─→ lost after compaction │
│ Retro lessons ─→ forgotten next sprint │
│ │
│ CONSTRAINT BOX │
│ IS: A reliability classification for rules │
│ IS NOT: A quality judgment (Tier 3 rules aren't │
│ bad rules — they're just badly placed) │
│ DECIDES: Where to invest enforcement effort — │
│ convert Tier 3 lessons into Tier 1 hooks │
│ │
│ ONE-LINE HANDLE │
│ "Hooks are gravity — docs are speed limit signs" │
│ │
│ ANALOGY LIMITS │
│ When reasoning about hook maintenance burden, gravity │
│ has no cost and is uniform, but real hooks can be │
│ misconfigured, disabled, or have false positives with │
│ execution overhead. │
│ │
│ SIGNAL: Quick check ✓ │
│ (The concept is intuitive once you see the tiers. │
│ Don't overthink — the tier system IS the insight.) │
└────────────────────────────────────────────────────────┘
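The block's DECIDES field — convert Tier 3 lessons into Tier 1 hooks — can be made concrete. Below is a sketch of turning the documented 800-line limit (the Tier 3 rule that 19 files violated) into a Tier 1 "gravity" check; the 800-line figure comes from the block, while the hook wiring (e.g. registering this as a pre-commit or PreToolUse script) is assumed:

```python
import sys
from pathlib import Path

LIMIT = 800  # the line limit that Tier 3 docs failed to enforce

def check(paths: list[str]) -> int:
    """Fail (exit 1) when any file exceeds LIMIT — gravity, not a sign."""
    over = [p for p in paths
            if len(Path(p).read_text().splitlines()) > LIMIT]
    for p in over:
        print(f"{p}: exceeds {LIMIT} lines")
    return 1 if over else 0

if __name__ == "__main__":
    # Hook harness would pass the edited/staged file paths as arguments.
    sys.exit(check(sys.argv[1:]))
```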
┌─── GABE BLOCK: Cache-Read Cost Spiral ───────────────┐
│ │
│ THE PROBLEM │
│ 65.9% of the $8,812 project cost ($5,808) came from │
│ the AI re-reading files it had already processed but │
│ forgotten after compaction. The AI is doing the same │
│ work over and over. │
│ │
│ THE ANALOGY │
│ A refrigerator with a door that opens every 27 │
│ minutes. Each time the door opens, warm air rushes │
│ in and the compressor has to run again to cool │
│ everything back down. The compressor is your cache- │
│ read cost. The door opening is compaction. If you │
│ open the door 6x per day, the compressor runs │
│ constantly. If you open it once, the food (context) │
│ stays cold (fresh) and the compressor barely runs. │
│ │
│ THE MAP │
│ │
│ Session start ──→ AI reads all files ──→ Context │
│ │ warm │
│ ↓ (27 min) │
│ Compaction ──→ Context compressed ──→ Details lost │
│ │ │
│ ↓ │
│ AI re-reads same files ──→ $1.875/1M tokens ──→ $$$│
│ │ │
│ ↓ (27 min) │
│ Compaction again ──→ RE-reads AGAIN ──→ $$$$$$ │
│ │ │
│ [repeat 6x per day = $5,808 over 18 days] │
│ │
│ CONSTRAINT BOX │
│ IS: A feedback loop where context loss causes │
│ expensive re-reads that cause more context │
│ bloat that causes more compaction │
│ IS NOT: A token pricing problem (rates are fine) │
│ DECIDES: Session length — shorter sessions with │
│ fresh context are cheaper than long ones │
│ │
│ ONE-LINE HANDLE │
│ "The refrigerator door opens every 27 minutes" │
│ │
│ ANALOGY LIMITS │
│ When deciding what to KEEP vs. DISCARD during │
│ compaction, opening a fridge doesn't lose food, just │
│ warms it, but compaction actually deletes detail. │
│ Also re-reading is not uniform like cooling; some │
│ files get re-read far more than others. │
│ │
│ SIGNAL: Quick check ✓ │
│ (The mechanism is simple — the numbers are shocking │
│ but the fix is obvious: close the door less often.) │
└────────────────────────────────────────────────────────┘
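The numbers in this block can be sanity-checked in a few lines. The figures ($8,812 total, 65.9% share, $1.875/1M tokens, 18 days, ~6 compactions/day) come from the block itself; the assumption that the cache-read rate applies uniformly to all re-read tokens is mine:

```python
total_cost_usd = 8812       # whole-project spend, from the block
cache_fraction = 0.659      # share attributed to cache re-reads
rate_usd_per_mtok = 1.875   # cache-read price, USD per 1M tokens
compactions = 18 * 6        # 18 days x ~6 door-openings per day

cache_cost_usd = total_cost_usd * cache_fraction   # ~5,807 (block rounds to $5,808)
mtok_reread = cache_cost_usd / rate_usd_per_mtok   # millions of tokens re-read
mtok_per_cycle = mtok_reread / compactions         # re-read volume per compaction

print(f"cache cost   ~ ${cache_cost_usd:,.0f}")
print(f"re-read load ~ {mtok_reread / 1000:.1f}B tokens total, "
      f"~{mtok_per_cycle:.0f}M per compaction")
```

The striking part is the last line: every door-opening triggers tens of millions of tokens of re-reading, which is why fewer, shorter sessions beat one long one.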