Search for published papers that justify a specific methodological decision. Returns precedents diversified across empirical domains, with top papers read in full via PDF.
Find published papers that faced the same methodological decision and show how they handled it. This is NOT a literature review -- it is a targeted search for methodological precedents.
Input: $ARGUMENTS -- a description of the methodological decision, optionally prefixed with:

- `[section:filename]` -- read `draft/sections/<filename>.tex` for context (e.g., `[section:5-empirical-approach]`)
- `[domain: ...]` -- constrain the search domain (e.g., `[domain: economics, finance]`)

Extract these prefixes from $ARGUMENTS. If `[section:...]` is given, read that .tex file to understand the decision in context; use it to formulate better queries, but do NOT let existing text bias which papers you find or what conclusions you draw.

Then:

1. Read CLAUDE.md to identify the target journal and domain (if no explicit domain hint).
2. Grep through draft/references.bib for papers already cited that address this method. Note them, but do not let them anchor the search.
3. Generate 5-7 queries with built-in diversity:
| Slot | Purpose |
|---|---|
| 1-2 | Direct methodological query using user's terms |
| 3 | Same method + target journal domain terms |
| 4-5 | Same method + DIFFERENT empirical contexts |
| 6-7 | Surveys, handbook chapters, or review articles on this method |
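The slot scheme above could be sketched in code; every name here (function, arguments, query wording) is illustrative, not part of the command:

```python
def build_queries(method_terms, journal_domain, alt_contexts):
    """Fill the slots: direct queries, target-domain query,
    alternate-context queries, and a survey/review query."""
    queries = [
        method_terms,                                 # slot 1: user's terms
        f"{method_terms} justification",              # slot 2: direct variant
        f"{method_terms} {journal_domain}",           # slot 3: target domain
    ]
    for ctx in alt_contexts[:2]:                      # slots 4-5: other empirical contexts
        queries.append(f"{method_terms} {ctx}")
    queries.append(f"{method_terms} survey review")   # slots 6-7: surveys/handbooks
    return queries
```

The point of the structure is that diversity is baked in up front, rather than hoping one broad query happens to surface cross-domain precedents.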
Run searches sequentially (rate limit):

1. `search_by_topic`: run 4-5 of the formulated queries with `limit=10` each.
2. Collect all unique results. From MCP results, retain: title, authors, year, DOI, venue, abstract, TLDR, isOpenAccess, openAccessPdf URL, paperId.
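The collect-and-retain step might look like this sketch (the dict keys mirror the field list above and assume each MCP result arrives as a flat dict; adjust to the actual response shape):

```python
# Fields the workflow needs downstream (assumed key names).
KEEP = ("title", "authors", "year", "doi", "venue", "abstract",
        "tldr", "isOpenAccess", "openAccessPdf", "paperId")

def collect_unique(result_batches):
    """Merge results from all queries, deduplicating by paperId."""
    seen, papers = set(), []
    for batch in result_batches:
        for paper in batch:
            pid = paper.get("paperId")
            if pid is None or pid in seen:
                continue  # overlapping queries return duplicates
            seen.add(pid)
            # Keep only the retained fields; missing ones become None.
            papers.append({k: paper.get(k) for k in KEEP})
    return papers
```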
From the collected results (typically 20-40 papers), rank by relevance to the decision and select roughly the top 5 as Tier 1 (full-text analysis); the rest become Tier 2 (abstract-only).
For each Tier 1 paper:
1. Check whether an openAccessPdf URL exists in the API result. If not, use Playwright to navigate to https://www.semanticscholar.org/paper/{paperId}, take a snapshot, and extract the PDF link. If still not found, try WebSearch for an open-access version.
2. Download it: `curl -sL -o literature/{AuthorYear}.pdf "{pdf_url}"`
3. Read the PDF with the Read tool, using the pages parameter to target the methodology/empirical strategy sections (typically pages 5-15 for a 30-40 page paper; check the table of contents on pages 1-2 first). Extract their decision, their justification (quote where possible), and its relevance to our decision.
4. If the PDF download fails (paywall, broken link), demote the paper to Tier 2.
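A cheap guard for the failure case (a sketch, not part of the command spec): a download that "succeeded" but actually returned an HTML paywall page will not carry the PDF magic bytes, so check before reading:

```python
def is_valid_pdf(path):
    """True only if the file exists and starts with the %PDF- magic bytes."""
    try:
        with open(path, "rb") as f:
            return f.read(5) == b"%PDF-"
    except OSError:
        return False  # missing or unreadable file counts as a failed download
```

Anything failing this check is demoted to Tier 2 rather than fed to the Read tool.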
Save the report to `quality_reports/method_precedent_[sanitized_topic].md`.
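One way to derive the `[sanitized_topic]` slug (an illustrative helper, not mandated by the command):

```python
import re

def report_path(topic):
    # Lowercase, collapse anything outside [a-z0-9] to single underscores,
    # and trim leading/trailing underscores.
    slug = re.sub(r"[^a-z0-9]+", "_", topic.lower()).strip("_")
    return f"quality_reports/method_precedent_{slug}.md"
```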
# Methodological Precedent Search: [Decision]
**Date:** YYYY-MM-DD
**Query:** [User's original question]
**Target domain:** [Detected or specified]
**Domains represented:** [List across all results]
## Summary
[2-3 sentences: consensus approach, key variations, cross-domain patterns]
## Tier 1: Full-Text Analysis
### 1. Author (Year) -- Short Title
- **APA citation:** [From tool-verified metadata only]
- **Domain:** [Empirical domain]
- **Their decision:** [What they chose]
- **Their justification:** [Why -- quote where possible]
- **Relevance to our decision:** [How it informs what we should do]
## Tier 2: Abstract-Only (Verify Manually)
### 6. Author (Year) -- Short Title
- **APA citation:** [From tool-verified metadata only]
- **Domain / Venue:** [Field and journal]
- **Brief note:** [What the abstract suggests about their approach]
- **Status:** [No open-access PDF / paywall / etc.]
## Consensus & Variations
- **Most common approach:** [What most papers do]
- **Notable alternatives:** [Different paths and why]
- **Cross-domain pattern:** [Consistent across fields or domain-specific?]
## BibTeX Entries
[Verified papers only]
## Caveats
[Papers that could not be accessed; search limitations]