Produce a rigorous academic peer review of a research paper, manuscript, or preprint (PDF, markdown, TeX, DOCX, HTML, or EPUB) using the full coarse pipeline with the user's local Gemini CLI (Google AI subscription) doing all the LLM reasoning. Every pipeline stage — structure analysis, overview synthesis, per-section review, proof verification, editorial pass — is served by a headless `gemini -p` subprocess instead of a paid API. Use when the user asks to review, critique, referee, or provide feedback on an academic paper. Takes 10-25 minutes.
Runs the full coarse review pipeline on a paper using the local gemini -p CLI as the LLM backend. Every LLM call is served by a headless Gemini subprocess using the user's Google AI Pro / Ultra subscription. The only per-paper cost is the ~$0.05-0.15 Mistral OCR extraction, which uses the user's OpenRouter key locally.
uvx preferred, uv acceptable. First run:
command -v uvx || command -v uv
If neither is present, install uv and a Python 3.12 toolchain:
curl -LsSf https://astral.sh/uv/install.sh | sh
export PATH="$HOME/.local/bin:$PATH"
uv python install 3.12
If uv exists but uvx does not, replace `uvx --python 3.12 --from ...` below with `uv tool run --python 3.12 --from ...`.
Refresh the bundled coarse-review skill with an ephemeral install:
uvx --python 3.12 --from 'coarse-ink==1.3.0' coarse install-skills --all --force
(If that fails with `No such command 'install-skills'`, you're on a PyPI release that predates the command; upgrade or ignore, since the skill bundle is also loadable directly via `uvx --from` without install.)
OpenRouter API key required for Mistral OCR extraction (~$0.10 per paper). Prefer checking for the key with presence-only probes so you don't needlessly echo its value into the transcript, but if the user hands you the key directly just save it — don't lecture them.
Before running the review, check whether OPENROUTER_API_KEY is already configured:
- Environment: `test -n "$OPENROUTER_API_KEY" && echo "env: set" || echo "env: missing"`
- `.env` file in the current directory: `test -f .env && grep -q '^OPENROUTER_API_KEY=' .env && echo ".env: set" || echo ".env: missing"`
If neither probe reports "set", ask the user:
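The two probes can be folded into a single presence-only check; a minimal sketch (the function name is illustrative, not part of coarse):

```shell
# Presence-only probe: reports WHERE the key is configured, never its value.
probe_openrouter_key() {
  if [ -n "$OPENROUTER_API_KEY" ]; then
    echo "env: set"
  elif [ -f .env ] && grep -q '^OPENROUTER_API_KEY=' .env; then
    echo ".env: set"
  else
    echo "missing"
  fi
}
probe_openrouter_key
```

"missing" is the cue to ask the user; either "set" result means you can launch the review immediately.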
I need an OpenRouter API key for the OCR extraction step (~$0.10 per paper). A few options:
- Paste the key here and I'll save it to `~/.coarse/config.toml` via `uvx --python 3.12 --from 'coarse-ink==1.3.0' coarse setup`. Note the key passes through the LLM provider (Google) on its way to me, so treat it as slightly less private than one you typed into a local terminal; rotate at https://openrouter.ai/settings/keys if that worries you.
- Set it yourself in a separate terminal: `export OPENROUTER_API_KEY=sk-or-v1-...` or add it to `.env` in your current directory, then re-ask me.
- Run `uvx --python 3.12 --from 'coarse-ink==1.3.0' coarse setup` in a separate terminal yourself and paste the key into its interactive prompt; the key never touches this chat.
Which do you want?
If the user pastes a key here, save it via uvx --python 3.12 --from 'coarse-ink==1.3.0' coarse setup with the pasted value and confirm it's stored. Their chat, their choice.
gemini CLI signed in (first run prompts for OAuth).
CRITICAL — if the user gives you a handoff URL, the paper is REMOTE. Do NOT run FindFiles, glob, find, ls, or any filesystem search looking for a local PDF. Do NOT interpret the handoff URL or any paper ID as a local filename. Do NOT ask the user for a file path — the --handoff URL IS the paper source and coarse-review downloads it over the network itself. Run the command verbatim and let coarse-review handle the fetch.
Two-step launch-and-wait, not foreground. A full review takes 10-25 minutes, which exceeds Gemini CLI's default 5-minute tool timeout. Step 1 detaches the worker and returns in ~2 seconds. Step 2 uses --attach to block on the worker's PID file and stream the log, emitting a heartbeat every 30 seconds of idleness so Gemini's shell tool sees activity.
Use a per-review unique log file so parallel runs don't clobber each other's output:
LOG=/tmp/coarse-review-$(date +%s).log
# STEP 2a — launch (returns in ~2s with Review PID + Log file)
uvx --python 3.12 --from 'coarse-ink==1.3.0' \
coarse-review --detach --log-file "$LOG" \
<paper_path_or_handoff_url> --host gemini [--model gemini-3.1-pro-preview] [--effort high]
# STEP 2b — wait (one blocking call, ~10-25 min, emits heartbeats)
uvx --python 3.12 --from 'coarse-ink==1.3.0' \
coarse-review --attach "$LOG"
Run the attach call with a long tool timeout (≥45 minutes) so Gemini CLI doesn't kill the blocking command prematurely. 45 minutes leaves ~20 minutes of margin on top of the 10-25 minute review runtime for cold starts, slow models, long papers, and --effort max runs — 30 minutes is too tight because the tool timeout is a wall clock, not an idle-stream cap. Bump to 60 minutes for book-length papers or the largest models. Do NOT re-run the --detach command if attach returns early (that would spawn a second worker). Safe to Ctrl+C the attach: the watcher detaches but the worker keeps running. Attach exit codes: 0 complete, 1 failure marker, 2 silent crash, 3 missing pidfile, 124 attach's own 30-min timeout, 130 user interrupt.
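Those exit codes can be dispatched mechanically. A hypothetical helper (the messages are illustrative; the numeric codes are the documented ones above):

```shell
# Translate a coarse-review --attach exit code into a next action.
# Usage: describe_attach_exit "$?"  (immediately after the attach call)
describe_attach_exit() {
  case "$1" in
    0)   echo "complete: read artifact paths from the log" ;;
    1)   echo "failure marker: report the error from the log" ;;
    2)   echo "silent crash: inspect the log tail" ;;
    3)   echo "missing pidfile: was the --detach step run first?" ;;
    124) echo "attach timed out: re-attach, do NOT re-detach" ;;
    130) echo "interrupted: worker still running, re-attach later" ;;
    *)   echo "unexpected code $1" ;;
  esac
}
```

The 124 and 130 branches deliberately point back at re-attaching rather than relaunching, matching the rule above about never re-running --detach.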
When attach exits cleanly, use the final log lines as the authoritative artifact locations:
rg '^ view:|^ local:' "$LOG"
If local: is present, read that exact file. If view: is present, use that URL (it already includes the signed access token — use it as-is). Do not run broad filesystem searches trying to rediscover the review file.
If view: says unavailable, report the callback failure and use only the local: path.
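Extracting the two artifact lines can look like this; a sketch in which the log content is fabricated for illustration, while the ` view:` / ` local:` prefixes are the ones the pipeline emits:

```shell
# Parse the final artifact lines out of a finished review log.
LOG=$(mktemp)
printf ' local: /tmp/demo/review.md\n view: https://example.invalid/v/abc\n' > "$LOG"
local_path=$(sed -n 's/^ local: *//p' "$LOG" | head -n1)
view_url=$(sed -n 's/^ view: *//p' "$LOG" | head -n1)
echo "local=$local_path"
echo "view=$view_url"
```

An empty `view_url` corresponds to the callback-failure case: report it and fall back to `local_path` alone.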
Available models: gemini-3.1-pro-preview (default), gemini-3-flash-preview, gemini-3.1-flash-lite-preview.
Available effort levels: low, medium, high (default), max.
Handoff mode (when the user came from the coarse web form): the paper is a REMOTE resource at the handoff URL. Do NOT search for a local PDF. Same two-step launch+attach pattern — coarse-review fetches the paper over the network:
LOG=/tmp/coarse-review-$(date +%s).log
# STEP 2a — launch
uvx --python 3.12 --from 'coarse-ink==1.3.0' \
coarse-review --detach --log-file "$LOG" \
--handoff https://coarse.ink/h/<token> --host gemini
# STEP 2b — wait
uvx --python 3.12 --from 'coarse-ink==1.3.0' \
coarse-review --attach "$LOG"
When complete, show the user the output path, web URL (if present), recommendation, top issues, and comment count.
`uvx --python 3.12 --from ... coarse-review ...` runs coarse from a temporary environment, so the agent does not mutate the user's global tool install. coarse-review monkey-patches `coarse.llm.LLMClient` → `coarse.headless_clients.GeminiClient`, which spawns `gemini --approval-mode yolo --output-format text` and feeds the prompt via stdin.
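Each pipeline stage therefore reduces to "prompt on stdin, plain text on stdout". A self-contained sketch; `gemini` is stubbed as a shell function here so the snippet runs without the real CLI, which coarse-review invokes with these same flags:

```shell
# Stand-in for the real Gemini CLI so this sketch is self-contained;
# coarse-review spawns the genuine binary with the same flags.
gemini() { cat > /dev/null; echo "stub reply"; }

# Shape of one pipeline-stage call: prompt via stdin, text via stdout.
run_stage() {
  printf '%s' "$1" | gemini --approval-mode yolo --output-format text
}
run_stage "Summarize the methods section."
```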