Product thinking and problem framing before code. Two modes: startup (six forcing questions) and builder (generative brainstorming). Produces a design doc with alternatives and adversarial review. Use when: "brainstorm this", "I have an idea", "help me think through this", "is this worth building", "think through", "scope this". Proactively suggest when the user describes a new product idea or is exploring whether something is worth building — before any code is written. Use before /plan.
Created: 2026-03-20-00-00 Last Updated: 2026-03-20-00-00
Inspired by gstack office-hours — adapted for CrossCheck.
Frame the problem, challenge premises, generate alternatives, and produce a design doc before writing a single line of code.
/think # Start thinking through current idea
/think build a calendar app # Start with description
/think --startup # Force startup mode
/think --builder # Force builder mode
Understand the project and the area the user wants to change.
Read context — CLAUDE.md, recent git log --oneline -20, relevant code areas
Check for prior design docs — look in the project root and ~/.crosscheck/designs/ for existing design docs related to this project
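A minimal shell sketch of these two context-gathering steps (the `~/.crosscheck/designs/` path is from this document; the `grep` filter on the repo name is illustrative):

```shell
# Recent history for context (harmless outside a git repo).
git log --oneline -20 2>/dev/null || true

# Prior design docs related to this repo, if any.
repo=$(basename "$(git rev-parse --show-toplevel 2>/dev/null || pwd)")
ls ~/.crosscheck/designs/ 2>/dev/null | grep -F "$repo" || true
```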
Ask the user's goal via AskUserQuestion:
Before we dig in — what's your goal with this?
- Building a startup (or thinking about it)
- Intrapreneurship — internal project at a company
- Hackathon / demo — time-boxed, need to impress
- Open source / research — building for a community or exploring
- Learning — teaching yourself, leveling up
- Side project — creative outlet, solving your own problem
Mode mapping:
For startup/intrapreneurship, assess product stage:
Output: "Here's what I understand about this project and the area you want to change: ..."
Ask ONE AT A TIME via AskUserQuestion. Push on each until the answer is specific and evidence-based.
Smart routing by product stage:
Ask: "What's the strongest evidence you have that someone actually wants this — not 'is interested,' but would be genuinely upset if it disappeared tomorrow?"
Push until you hear: Specific behavior. Someone paying. Someone whose workflow depends on it.
Red flags: "People say it's interesting." "We got waitlist signups." "VCs are excited."
Ask: "What are your users doing right now to solve this problem — even badly?"
Push until you hear: A specific workflow. Hours spent. Dollars wasted. Tools duct-taped together.
Red flags: "Nothing — there's no solution." If no one is doing anything, the problem probably isn't painful enough.
Ask: "Name the actual human who needs this most. What's their title? What gets them promoted? What keeps them up at night?"
Push until you hear: A name. A role. A specific consequence they face.
Red flags: Category-level answers. "Healthcare enterprises." "SMBs." "Marketing teams."
Ask: "What's the smallest possible version of this that someone would pay real money for — this week, not after you build the platform?"
Push until you hear: One feature. One workflow. Something shippable in days.
Red flags: "We need to build the full platform first." "It wouldn't be differentiated without X."
Ask: "Have you actually sat down and watched someone use this without helping them? What did they do that surprised you?"
Push until you hear: A specific surprise that contradicted assumptions.
Red flags: "We sent a survey." "Nothing surprising." Surveys lie. "As expected" means filtered through assumptions.
Ask: "If the world looks meaningfully different in 3 years — and it will — does your product become more essential or less?"
Push until you hear: A specific claim about how their users' world changes and why that makes the product more valuable.
Red flags: "The market is growing 20%/year." Growth rate is not a vision.
Smart-skip: If earlier answers already cover a later question, skip it.
Escape hatch: If the user says "just do it" or provides a fully formed plan → fast-track to Phase 4.
Ask ONE AT A TIME via AskUserQuestion:
Smart-skip: If the user's initial prompt already answers a question, skip it.
Escape hatch: If the user says "just do it" or provides a fully formed plan → fast-track to Phase 4.
Before proposing solutions, challenge the premises:
Output premises as clear statements the user must agree with:
PREMISES:
1. [statement] — agree/disagree?
2. [statement] — agree/disagree?
3. [statement] — agree/disagree?
Use AskUserQuestion to confirm. If the user disagrees, revise and loop back.
Produce 2-3 distinct implementation approaches. This is NOT optional.
For each approach:
APPROACH A: [Name]
Summary: [1-2 sentences]
Effort: [S/M/L/XL]
Risk: [Low/Med/High]
Pros: [2-3 bullets]
Cons: [2-3 bullets]
Reuses: [existing code/patterns leveraged]
APPROACH B: [Name]
...
Rules:
RECOMMENDATION: Choose [X] because [one-line reason].
Present via AskUserQuestion. Do NOT proceed without user approval.
Write the design document.
mkdir -p ~/.crosscheck/designs
Write to ~/.crosscheck/designs/{repo-name}-{branch}-{date}.md (sanitize the branch name: replace / with - to avoid creating nested directories):
# Design: {title}
Generated by /think on {date}
Branch: {branch}
Repo: {repo}
Status: DRAFT
Mode: Startup
## Problem Statement
{from Phase 2A}
## Demand Evidence
{from Q1}
## Status Quo
{from Q2}
## Target User & Narrowest Wedge
{from Q3 + Q4}
## Constraints
{from conversation}
## Premises
{from Phase 3}
## Approaches Considered
### Approach A: {name}
### Approach B: {name}
## Recommended Approach
{chosen approach with rationale}
## Open Questions
{unresolved questions}
## Success Criteria
{measurable criteria}
## Next Action
{one concrete thing to do next}
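The doc path above, including the branch sanitization, can be built like this — a sketch, where `detached` is an assumed fallback for when no branch name is available:

```shell
# Compose ~/.crosscheck/designs/{repo-name}-{branch}-{date}.md,
# replacing / in the branch name so "feat/x" can't nest directories.
repo=$(basename "$(git rev-parse --show-toplevel 2>/dev/null || pwd)")
branch=$(git branch --show-current 2>/dev/null || echo detached)
branch=$(printf '%s' "$branch" | tr '/' '-')
doc="$HOME/.crosscheck/designs/$repo-$branch-$(date +%Y-%m-%d).md"
mkdir -p "$HOME/.crosscheck/designs"
echo "$doc"
```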
For Builder mode, write this template instead:
# Design: {title}
Generated by /think on {date}
Branch: {branch}
Repo: {repo}
Status: DRAFT
Mode: Builder
## Problem Statement
{from Phase 2B}
## What Makes This Cool
{the core delight, novelty, or "whoa" factor}
## Constraints
{from conversation}
## Premises
{from Phase 3}
## Approaches Considered
### Approach A: {name}
### Approach B: {name}
## Recommended Approach
{chosen approach with rationale}
## Open Questions
{unresolved questions}
## Success Criteria
{what "done" looks like}
## Next Steps
{concrete build tasks — what to implement first, second, third}
Before presenting to the user, run an adversarial review.
Step 1: Dispatch a subagent (via Agent tool) to review the design doc independently.
Prompt the subagent with:
Dimensions:
Step 2: If the reviewer returns issues, fix them and re-dispatch. Max 3 iterations.
Convergence guard: If the same issues repeat, persist them as "## Reviewer Concerns" in the document rather than looping.
Step 3: Report to user: "Doc survived N rounds of review. M issues caught and fixed. Quality score: X/10."
If the subagent fails or is unavailable, skip the review and present the doc, telling the user it was not adversarially reviewed.
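The bounded loop in Steps 1–3 can be sketched as shell control flow. `review_doc` is a hypothetical stand-in for dispatching the reviewer subagent; here it returns the same issue every round so the convergence guard is visible:

```shell
# Control-flow sketch of the bounded review loop (max 3 rounds).
# review_doc is a hypothetical stand-in for the reviewer subagent;
# it repeats one issue here to show the convergence guard firing.
review_doc() { echo "issue: success criteria not measurable"; }

prev=""
rounds=0
for i in 1 2 3; do
  rounds=$i
  issues=$(review_doc)
  if [ -z "$issues" ]; then break; fi    # clean review: done
  if [ "$issues" = "$prev" ]; then       # repeated issues: persist, stop
    printf '## Reviewer Concerns\n%s\n' "$issues"
    break
  fi
  prev=$issues                           # ...fix issues, then re-dispatch...
done
echo "Doc survived $rounds rounds of review."
```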
Present the design doc to the user via AskUserQuestion:
Once approved, hand off to /plan and implementation. Tell the user:
"Design doc approved. Run /plan to create an implementation plan, or start coding."
Report using one of: