Use when acting as a journal or grant reviewer and writing formal reviewer-side evaluations focused on methodology, statistics, reporting standards, reproducibility, and constructive feedback.
Peer review is a systematic process for evaluating scientific manuscripts. Assess methodology, statistics, design, reproducibility, ethics, and reporting standards. Apply this skill to manuscript and grant review across disciplines, delivering rigorous, constructive evaluation.
This skill should be used when:
- Reviewing a manuscript for a journal and writing a formal reviewer report
- Evaluating a grant proposal's methodology, feasibility, and rigor
- Assessing statistics, study design, reproducibility, ethics, or reporting standards
- Providing structured, constructive feedback to authors
When creating documents with this skill, consider adding diagrams when they clarify a workflow, architecture, or evaluation framework.
If the document does not already contain suitable figures, generate one with the inno-figure-gen skill. Example command:
uv run ~/.codex/skills/inno-figure-gen/scripts/generate_image.py \
  --prompt "Publication-style diagram of the review workflow; white background; clean labels; colorblind-friendly palette; high contrast" \
  --filename "figures/review-workflow.png" \
  --resolution 2K
Requires GEMINI_API_KEY or an explicit --api-key.
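The command above can be guarded so a missing key fails fast instead of erroring mid-run. A minimal sketch, assuming a POSIX shell; the wrapper function name is hypothetical and an `echo` stands in for the real `uv run` invocation:

```shell
#!/bin/sh
# Hedged sketch, not part of the skill itself: fail fast when no API key
# is configured. `generate_review_figure` is an illustrative name, and the
# final echo stands in for the documented `uv run` call.
generate_review_figure() {
  if [ -z "${GEMINI_API_KEY:-}" ]; then
    echo "error: set GEMINI_API_KEY or pass --api-key" >&2
    return 1
  fi
  # Real invocation (as documented above):
  # uv run ~/.codex/skills/inno-figure-gen/scripts/generate_image.py \
  #   --prompt "..." --filename "figures/review-workflow.png" --resolution 2K
  echo "ok: key present, invoking generator"
}

GEMINI_API_KEY="dummy" generate_review_figure
```

This keeps a clear error message close to the failure point rather than relying on the script's own key handling.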
When to add figures:
- A multi-stage workflow (such as the review stages below) is easier to follow visually
- An architecture or evaluation framework needs a structural overview
- A diagram would clarify relationships that prose describes at length
For detailed guidance on creating figures, refer to the inno-figure-gen skill documentation.
Conduct peer review systematically through the following stages, adapting depth and focus based on the manuscript type and discipline.
Begin with a high-level evaluation to determine the manuscript's scope, novelty, and overall quality.
Key Questions:
- Is the research question clearly stated and within scope?
- What does the work add beyond the prior literature (novelty)?
- Is the overall quality sufficient to justify a full, detailed review?
Output: Brief summary (2-3 sentences) capturing the manuscript's essence and initial impression.
Conduct a thorough evaluation of each manuscript section, documenting specific concerns and strengths.
Critical elements to verify:
- The methodology matches the stated research question and hypotheses
- Statistical analyses are appropriate and their assumptions are checked
- Methods and materials are reported in enough detail to be reproducible
- Ethics approvals, data availability, and conflicts of interest are disclosed

Common issues to identify:
- Conclusions that outrun the evidence presented
- Incomplete reporting of methods, data, or analysis code
- Inappropriate statistical tests or uncorrected multiple comparisons
- Selective reporting of favorable results

Red flags:
- Signs of data fabrication, image manipulation, or plagiarism
- Undisclosed conflicts of interest
- Missing ethics approval for human or animal research