Challenges system assumptions against accumulated evidence: triages observations and tensions, detects patterns, and generates proposals. The scientific method applied to knowledge systems. Triggers on "/rethink", "review observations", "challenge assumptions", "what have I learned".
Read these files to configure domain-specific behavior:
- ops/derivation-manifest.md — vocabulary mapping, domain context
  - vocabulary.notes for the notes folder name
  - vocabulary.note for the note type name in output
  - vocabulary.rethink for the command name in output
  - vocabulary.topic_map for MOC references
  - vocabulary.cmd_reflect for connection-finding references
- ops/config.yaml — thresholds, processing preferences
  - self_evolution.observation_threshold: number of pending observations before suggesting rethink (default: 10)
  - self_evolution.tension_threshold: number of pending tensions before suggesting rethink (default: 5)
- ops/methodology/ — existing methodology notes (read all to understand current system self-knowledge)
If these files don't exist (pre-init invocation or standalone use), use universal defaults.
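A minimal sketch of how those thresholds could be checked, assuming pending evidence lives as .md files under ops/observations/ and ops/tensions/ (reading the values out of ops/config.yaml is assumed; the hard-coded numbers are the universal defaults above):

```sh
# Hypothetical sketch: suggest a rethink once pending evidence crosses
# the thresholds. Parsing them from config.yaml is assumed here.
OBS_THRESHOLD=10       # self_evolution.observation_threshold default
TENSION_THRESHOLD=5    # self_evolution.tension_threshold default
OBS_COUNT=$(ls ops/observations/*.md 2>/dev/null | wc -l)
TENSION_COUNT=$(ls ops/tensions/*.md 2>/dev/null | wc -l)
if [ "$OBS_COUNT" -ge "$OBS_THRESHOLD" ] || [ "$TENSION_COUNT" -ge "$TENSION_THRESHOLD" ]; then
  echo "Suggest rethink: $OBS_COUNT observations, $TENSION_COUNT tensions pending"
fi
```

If neither directory exists, both counts fall back to 0 and the check stays quiet.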
The command name itself transforms per domain. The derivation manifest maps the universal name to domain-native language. If no manifest exists, use "rethink" as the command name.
Target: $ARGUMENTS
Parse the target immediately, then START NOW. The reference below defines the six-phase workflow.
The system is not sacred. Evidence beats intuition.
Every rule in the context file, every workflow in a skill, every assumption baked into the architecture was a hypothesis at some point. Hypotheses need testing against reality. Observation notes in ops/observations/ capture friction from actual use. Tension notes in ops/tensions/ capture unresolved conflicts. Rethink first triages these individually (some become {DOMAIN:notes}, some become methodology updates, some get archived), then compares remaining evidence against what the system assumes and proposes changes when patterns emerge.
This is the scientific method applied to knowledge systems: hypothesize, implement, observe, revise.
Without this loop, generated systems ossify — they accumulate friction that never gets addressed, contradictions that never get resolved, and methodology learnings that never get elevated to system-level changes. /rethink is the immune system that prevents calcification.
Rule Zero: ops/methodology/ is the canonical specification of how this system operates. Before triaging observations, check whether the system has drifted from what the methodology says it should do.
```sh
# Get all methodology notes with their metadata
for f in ops/methodology/*.md; do
  echo "=== $f ==="
  head -20 "$f"  # frontmatter with category, created, updated, status
  echo ""
done
```
Read all methodology notes fully. Extract:
Read:

- ops/config.yaml — current configuration state
- ops/derivation-manifest.md — vocabulary and feature state

Type 1: Staleness
```sh
# Compare config.yaml modification time vs newest methodology note.
# GNU stat uses -c %Y; BSD/macOS stat uses -f %m. Try the GNU form
# first: on GNU systems, stat -f means --file-system, so the BSD form
# would "succeed" and print a mount point instead of an mtime.
CONFIG_MTIME=$(stat -c %Y ops/config.yaml 2>/dev/null || stat -f %m ops/config.yaml 2>/dev/null || echo 0)
NEWEST_METH=$(ls -t ops/methodology/*.md 2>/dev/null | head -1)
METH_MTIME=$(stat -c %Y "$NEWEST_METH" 2>/dev/null || stat -f %m "$NEWEST_METH" 2>/dev/null || echo 0)
```
If CONFIG_MTIME > METH_MTIME: config has changed since methodology was last updated. Flag as staleness drift.
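The comparison itself can be sketched as follows; in practice the two mtimes would come from the stat commands above, but hard-coded sample timestamps stand in here so the snippet is self-contained:

```sh
# Hypothetical sketch of the staleness test. The sample epoch values
# are assumptions standing in for real file mtimes.
CONFIG_MTIME=1700000200   # assumed: ops/config.yaml changed later
METH_MTIME=1700000100     # assumed: newest methodology note
if [ "$CONFIG_MTIME" -gt "$METH_MTIME" ]; then
  echo "DRIFT(staleness): config.yaml modified after newest methodology note"
fi
```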
Type 2: Coverage Gap
For each active feature in config.yaml (features with enabled: true or features present in the active configuration), check whether a corresponding methodology note exists. Features without methodology coverage represent gaps — the system does things it cannot explain to itself.
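A sketch of that check, under the assumption that features map to methodology filenames matching ops/methodology/*&lt;feature&gt;*.md — both the example feature names and the naming convention are assumptions, not canonical:

```sh
# Hypothetical sketch: flag enabled features with no methodology note.
# Feature names and the filename convention are assumed for illustration.
gaps=""
for feature in observations tensions; do
  if ls ops/methodology/*"$feature"*.md >/dev/null 2>&1; then
    echo "covered: $feature"
  else
    gaps="$gaps $feature"
  fi
done
if [ -n "$gaps" ]; then
  echo "Coverage gaps:$gaps"
fi
```

In a real run, the feature list would be extracted from config.yaml rather than hard-coded.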
Check these feature areas:
Type 3: Assertion Mismatch
For each methodology note that makes a behavioral assertion ("What to Do" section), check:
Report: which assertions align, which contradict, which have no corresponding system element.
For each drift finding, create an observation note in ops/observations/:
---