Copilot for scientific paper review. Use when reviewing a research paper (PDF/LaTeX), guiding section-by-section analysis, logging issues, and generating structured review responses.
Guide user through structured paper review, logging issues and generating formal review response.
Locate paper from $ARGUMENTS:
- Given a PDF: create a <paper-name>-review/ folder, copy the PDF into it.
- Given LaTeX source: use the .tex first (prioritize over PDF).
- Neither found: AskUserQuestion: "No paper detected. Please provide path to paper."

Convert PDF → LaTeX (if no .tex exists):
# Check credentials
[ -n "$MATHPIX_APP_ID" ] && [ -n "$MATHPIX_API_KEY" ] && echo "OK" || echo "MISSING"
python ~/.claude/plugins/science-skill/skills/paper-review-helper/scripts/pdf2tex.py "<pdf_path>" "<paper-folder>"
Output structure: <paper-folder>/<pdf_id>/<pdf_id>.tex with images/ subfolder for figures.
Initialize workspace:
<paper-folder>/
├── <pdf_id>/ # Mathpix output (if converted)
│ ├── <pdf_id>.tex # Converted LaTeX
│ └── images/ # Extracted figures
├── artifact/
│ ├── review-log.md # Conversation log
│ ├── issues-major.md # Major issues
│ ├── issues-minor.md # Minor issues
│ └── programs/ # Math verification scripts
└── original.pdf # Source PDF (if applicable)
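The workspace skeleton above can be created with a small helper. A minimal sketch (hypothetical helper, not part of the plugin — folder names follow the tree above):

```python
from pathlib import Path

def init_workspace(paper_folder: str) -> Path:
    """Create the review workspace skeleton: artifact/ files and programs/ dir."""
    base = Path(paper_folder)
    # programs/ holds math verification scripts; mkdir -p semantics
    (base / "artifact" / "programs").mkdir(parents=True, exist_ok=True)
    # Empty issue/log files so later appends never fail
    for name in ("review-log.md", "issues-major.md", "issues-minor.md"):
        (base / "artifact" / name).touch(exist_ok=True)
    return base
```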
Parse LaTeX structure: \section, \subsection, \begin{abstract}. For each section:
Chunk appropriately: \subsection or paragraph groups.

Present section with:
[Figure X: images/<filename> - <caption>] with a path link to the file in the workspace.

Ask user via AskUserQuestion:
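The parse-and-chunk steps above can be sketched as a regex pass over the converted .tex (an illustrative sketch, not the plugin's actual parser; it ignores edge cases like nested braces):

```python
import re

# Top-level structure markers the skill looks for
SECTION_RE = re.compile(r"\\(section|subsection)\*?\{([^}]*)\}")

def outline(tex_source: str) -> list[tuple[str, str]]:
    """Return (level, title) pairs for review chunking; abstract detected separately."""
    items = []
    if re.search(r"\\begin\{abstract\}", tex_source):
        items.append(("abstract", "Abstract"))
    items.extend((m.group(1), m.group(2)) for m in SECTION_RE.finditer(tex_source))
    return items
```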
Respond to requests:
- Verify citations: WebSearch for DOI, author names, paper titles
- Explain concepts: WebSearch, WebFetch
- Verify math: write scripts to artifact/programs/, run with Python/SymPy
- Inspect figures: open the images/ folder, describe or use /vision skill

Log to artifact/review-log.md:
## [Section Name] - [Timestamp]
### User Questions
- Q: ...
- A: ...
### Issues Identified
- [MAJOR] ...
- [MINOR] ...
### Tools Used
- WebSearch: "query" → finding
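Appending an entry in the format above can be sketched as (hypothetical helper — the function name and signature are illustrative, not part of the plugin):

```python
from datetime import datetime, timezone
from pathlib import Path

def log_section(log_path, section, questions=(), issues=(), tools=()):
    """Append one review-log entry matching the template above."""
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M")
    lines = [f"## {section} - {stamp}", "### User Questions"]
    lines += [f"- Q: {q}\n- A: {a}" for q, a in questions]
    lines += ["### Issues Identified"]
    lines += [f"- [{severity}] {text}" for severity, text in issues]
    lines += ["### Tools Used"]
    lines += [f"- {tool}: {finding}" for tool, finding in tools]
    with open(log_path, "a", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n\n")
```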
- issues-major.md: methodology flaws, unsupported claims, logical errors
- issues-minor.md: grammar, typos, unclear wording

Skip Supplementary/Appendix unless user requests.
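A verification script written to artifact/programs/ can be as simple as a numeric spot-check of a claimed identity (a pure-Python sketch; SymPy would do this symbolically — the helper name and sample identity are illustrative):

```python
import math

def check_identity(f, g, samples):
    """Max absolute deviation between f and g over sample points (0.0 if they agree)."""
    return max(abs(f(x) - g(x)) for x in samples)

# Example: spot-check sin(2x) = 2 sin(x) cos(x) before raising an [EQ?] issue
dev = check_identity(lambda x: math.sin(2 * x),
                     lambda x: 2 * math.sin(x) * math.cos(x),
                     [0.1 * k for k in range(1, 50)])
assert dev < 1e-12
```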
Gather context via AskUserQuestion:
Generate review to artifact/REVIEW.md:
# Review of [Paper Title]
## Summary
[1-2 sentences]
## Overall Recommendation
[User's decision + justification]
## Major Issues
1. **Issue**: [description]
- **Location**: Section X, paragraph Y / Equation N
- **Impact**: [why this matters]
- **Suggestion**: [fix or note if unfixable]
## Minor Issues
[Grouped by type: grammar, clarity, formatting]
## User Misunderstandings Analysis
[If user had confusion during review:]
- **Confusion**: [what]
- **Cause**: paper vagueness / reader knowledge gap
- **Recommendation**: [should paper clarify?]
## Constructive Feedback
[Positives + specific improvements]
## Editor Questions Response
[If provided]
Inline markers when presenting text:
- [G: ...] - grammar error
- [C: ...] - clarity issue
- [?] - ambiguous/unsupported claim
- [REF?] - missing or questionable citation
- [EQ?] - equation to verify