Evaluate research rigor. Assess methodology, experimental design, statistical validity, biases, confounding, and evidence quality (GRADE, Cochrane RoB) for critical analysis of scientific claims.
Critical thinking is a systematic process for evaluating scientific rigor. Assess methodology, experimental design, statistical validity, biases, confounding, and evidence quality using the GRADE and Cochrane RoB frameworks. Apply this skill for critical analysis of scientific claims.
This skill should be used when:
Consider adding diagrams for critical analysis:
Note: Create diagrams manually using draw.io, Mermaid, or your preferred diagramming tool.
Evaluate research methodology for rigor, validity, and potential flaws.
Apply when:
Evaluation framework:
Study Design Assessment
Validity Analysis
Control and Blinding
Measurement Quality
Reference: See references/scientific_method.md for detailed principles and references/experimental_design.md for a comprehensive design checklist.
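The control and randomization checks above can be made concrete with a small sketch of permuted-block randomization, one common allocation scheme; the function and arm names are illustrative, not taken from any specific trial protocol:

```python
import random

def randomize_blocks(participant_ids, block_size=4, seed=None):
    """Assign participants to 'treatment'/'control' in permuted blocks.

    Permuted-block randomization keeps arm sizes balanced over time,
    which simple coin-flip randomization does not guarantee.
    """
    rng = random.Random(seed)
    arms = ["treatment", "control"]
    assignments = {}
    for start in range(0, len(participant_ids), block_size):
        block = participant_ids[start:start + block_size]
        # Each full block holds an equal mix of arms, in shuffled order.
        labels = (arms * (block_size // 2))[:len(block)]
        rng.shuffle(labels)
        for pid, arm in zip(block, labels):
            assignments[pid] = arm
    return assignments

alloc = randomize_blocks([f"P{i:03d}" for i in range(1, 13)], seed=42)
print(sum(a == "treatment" for a in alloc.values()),
      sum(a == "control" for a in alloc.values()))  # 6 6
```

In a real trial the assignment list would also be concealed from recruiters (allocation concealment), so knowledge of the next arm cannot influence enrollment.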
Identify and evaluate potential sources of bias that could distort findings.
Apply when:
Systematic bias review:
Cognitive Biases (Researcher)
Selection Biases
Measurement Biases
Analysis Biases
Confounding
Reference: See references/common_biases.md for a comprehensive bias taxonomy with detection and mitigation strategies.
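To make the confounding check concrete, here is a small simulation (all probabilities are invented for illustration) in which exposure and outcome are both driven by a confounder Z, so a crude association appears even though the exposure has no effect, and stratifying on Z makes it vanish:

```python
import random

rng = random.Random(0)
data = []
for _ in range(20_000):
    z = rng.random() < 0.5                    # confounder (e.g., age group)
    x = rng.random() < (0.7 if z else 0.3)    # exposure depends on Z
    y = rng.random() < (0.6 if z else 0.2)    # outcome depends ONLY on Z
    data.append((z, x, y))

def risk(rows, exposed):
    """P(outcome) among rows with the given exposure status."""
    sub = [y for z, x, y in rows if x == exposed]
    return sum(sub) / len(sub)

crude_rd = risk(data, True) - risk(data, False)   # spurious, ~+0.16
strat_rds = []
for level in (True, False):
    stratum = [r for r in data if r[0] == level]
    strat_rds.append(risk(stratum, True) - risk(stratum, False))  # ~0

print(f"crude risk difference: {crude_rd:+.3f}")
for level, rd in zip((True, False), strat_rds):
    print(f"  within Z={level}: {rd:+.3f}")
```

Stratification is the simplest adjustment; the same logic underlies regression adjustment and matching.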
Critically assess statistical methods, interpretation, and reporting.
Apply when:
Statistical review checklist:
Sample Size and Power
Statistical Tests
Multiple Comparisons
P-Value Interpretation
Effect Sizes and Confidence Intervals
Missing Data
Regression and Modeling
Common Pitfalls
Reference: See references/statistical_pitfalls.md for detailed pitfalls and correct practices.
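One checklist item above, multiple comparisons, has a standard correction that is short enough to sketch: the Benjamini–Hochberg step-up procedure for controlling the false discovery rate. The p-values below are made up for illustration:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha (BH step-up)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        # Reject the k smallest p-values, where k is the largest rank
        # whose p-value falls under the line (rank / m) * alpha.
        if pvals[i] <= rank / m * alpha:
            k_max = rank
    return sorted(order[:k_max])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.21, 0.59]
print(benjamini_hochberg(pvals))  # [0, 1]
# Bonferroni at 0.05 / 10 = 0.005 would reject only index 0: BH trades a
# stricter error metric (FDR, not family-wise error) for more power.
```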
Evaluate the strength and quality of evidence systematically.
Apply when:
Evidence evaluation framework:
Study Design Hierarchy
Important: Higher-level designs aren't always higher quality. A well-designed observational study can be stronger than a poorly conducted RCT.
Quality Within Design Type
GRADE Considerations (if applicable)
Convergence of Evidence
Contextual Factors
Reference: See references/evidence_hierarchy.md for the detailed hierarchy, the GRADE system, and quality assessment tools.
Detect and name logical errors in scientific arguments and claims.
Apply when:
Common fallacies in science:
Causation Fallacies
Generalization Fallacies
Authority and Source Fallacies
Statistical Fallacies
Structural Fallacies
Science-Specific Fallacies
When identifying fallacies:
Reference: See references/logical_fallacies.md for a comprehensive fallacy catalog with examples and detection strategies.
Provide constructive guidance for planning rigorous studies.
Apply when:
Design process:
Research Question Refinement
Design Selection
Bias Minimization Strategy
Sample Planning
Measurement Strategy
Analysis Planning
Transparency and Rigor
Reference: See references/experimental_design.md for a comprehensive design checklist covering all stages from question to dissemination.
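The sample-planning step above typically starts from a power calculation. This sketch uses the normal approximation to the two-sample t-test, which gives a rough planning figure slightly below the exact t-based answer:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size_d, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sided two-sample comparison.

    Normal approximation: n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2,
    where d is the standardized effect size (Cohen's d).
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)            # ~0.84 for 80% power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size_d) ** 2)

print(n_per_group(0.5))  # medium effect -> 63 per group
print(n_per_group(0.2))  # small effect  -> 393 per group
```

Note how quickly required n grows as the plausible effect shrinks; underpowered studies both miss real effects and inflate the effect sizes of the "significant" results they do find.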
Systematically evaluate scientific claims for validity and support.
Apply when:
Claim evaluation process:
Identify the Claim
Assess the Evidence
Check Logical Connection
Evaluate Proportionality
Check for Overgeneralization
Red Flags
Provide specific feedback:
Be Constructive
Be Specific
Be Proportionate
Apply Consistent Standards
Consider Context
Structure feedback as:
Use precise terminology:
This skill includes comprehensive reference materials that provide detailed frameworks for critical evaluation:
references/scientific_method.md - Core principles of scientific methodology, the scientific process, critical evaluation criteria, red flags in scientific claims, causal inference standards, peer review, and open science principles
references/common_biases.md - Comprehensive taxonomy of cognitive, experimental, methodological, statistical, and analysis biases with detection and mitigation strategies
references/statistical_pitfalls.md - Common statistical errors and misinterpretations including p-value misunderstandings, multiple comparisons problems, sample size issues, effect size mistakes, correlation/causation confusion, regression pitfalls, and meta-analysis issues
references/evidence_hierarchy.md - Traditional evidence hierarchy, GRADE system, study quality assessment criteria, domain-specific considerations, evidence synthesis principles, and practical decision frameworks
references/logical_fallacies.md - Logical fallacies common in scientific discourse organized by type (causation, generalization, authority, relevance, structure, statistical) with examples and detection strategies
references/experimental_design.md - Comprehensive experimental design checklist covering research questions, hypotheses, study design selection, variables, sampling, blinding, randomization, control groups, procedures, measurement, bias minimization, data management, statistical planning, ethical considerations, validity threats, and reporting standards
When to consult references:
grep -r "pattern" references/

Scientific critical thinking is about:
Always distinguish between:
Goals of critical thinking: