Use when the user wants to evaluate whether a paper or manuscript meets reproducibility standards and identify missing methodological details.
Evaluates the reproducibility of a paper or the team's own work against established reproducibility standards. Flags missing details, unavailable code/data, and underspecified methods before submission — catching issues reviewers will raise.
reproducibility_check.check_manuscript(
manuscript_text="[paste manuscript]",
target_venue="NeurIPS",
include_checklist=True
)
reproducibility_check.check_external(
doi="10.48550/arXiv.2310.xxxxx",
check_code_availability=True,
check_data_availability=True
)
reproducibility_check.checklist(
venue="NeurIPS",
year=2024,
format="markdown_checklist"
)
reproducibility_check.score(
doi="10.48550/arXiv.2310.xxxxx",
rubric="ml_reproducibility_challenge"
)
# Returns: {"score": 6.5, "max_score": 10, "breakdown": {...}, "critical_gaps": [...]}
Returns a structured checklist with pass/fail per item, critical gaps flagged in red, and an overall reproducibility score. Fix recommendations are actionable: "Add random seed to Section 3.2", not "improve reproducibility."
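The returned score object can be used to gate a submission. A minimal sketch, assuming the return shape shown above (`score`, `critical_gaps`); the `gate_submission` helper and the `threshold` parameter are hypothetical, not part of the tool:

```python
# Hypothetical helper: decide whether a manuscript is ready to submit,
# based on a result shaped like the documented return value.

def gate_submission(result: dict, threshold: float = 7.0) -> bool:
    """True only if the score meets the threshold and no critical gaps remain."""
    return result["score"] >= threshold and not result["critical_gaps"]

# Example with an assumed result dict (keys taken from the return shape above):
result = {
    "score": 6.5,
    "max_score": 10,
    "breakdown": {"random_seeds": "fail", "code_release": "pass"},
    "critical_gaps": ["Add random seed to Section 3.2"],
}
print(gate_submission(result))  # False: score below threshold and a gap remains
```

A team might run this in CI before each submission deadline, failing the build until critical gaps are cleared.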
method_reviewer for missing technical detail checks
experiment_skeptic for submission-readiness and reviewer-risk checks