Evaluates the credibility and relevance of research sources. Use when the user asks to "check sources", "verify claims", "assess credibility", "fact-check", or when source quality needs evaluation before report writing. Also triggers on "is this reliable" or "can I trust this source".
Evaluate sources for credibility, bias, recency, and relevance.
**Tier 1 — Primary sources.** Government statistical agencies, central bank data, peer-reviewed journals, court filings, SEC filings, patent records, official specifications, direct participant testimony.
Signal: The source generated or collected the data itself.
**Tier 2 — High-quality secondary sources.** Major news outlets with original reporting (Reuters, AP, Bloomberg, NYT, WSJ), established research firms (Gartner, McKinsey, Pew), recognized industry analysts, university research reports.
Signal: Professional editorial standards, named authors, cited sources within their reporting.
**Tier 3 — Reputable aggregators.** Well-known publications summarizing others' work, expert blogs with citations, trade press with editorial oversight, Wikipedia (for non-controversial topics with good citations).
Signal: Aggregates but adds analysis; traceable to primary sources.
**Tier 4 — Use with caution.** Trade publications without clear sourcing, company blogs mixing data with marketing, survey-based content with unclear methodology, opinion pieces presented as analysis.
Signal: Some useful data, but motivations are mixed or unclear.
**Tier 5 — Generally avoid.** Anonymous blogs, SEO-optimized content farms, undated articles, press releases without independent verification, social media posts, forums, AI-generated content without attribution.
Signal: No editorial oversight, no cited sources, possible financial motivation.
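The five credibility levels above can be kept at hand as a small lookup; a minimal sketch, where the numbering (1 = strongest) and the short labels are my own condensation of the descriptions:

```python
# Hypothetical encoding of the five credibility levels (1 = strongest).
# Labels condense the descriptions above; they are not official names.
TIERS = {
    1: "Primary sources: the source generated or collected the data itself",
    2: "High-quality secondary: editorial standards, named authors, citations",
    3: "Reputable aggregators: adds analysis, traceable to primary sources",
    4: "Use with caution: some useful data, motivations mixed or unclear",
    5: "Generally avoid: no editorial oversight, no cited sources",
}

def tier_label(tier: int) -> str:
    """Return the human-readable label for a tier number."""
    return TIERS.get(tier, "unknown tier")
```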
Staleness is a key red flag. Check each source's age against these maximums:
| Topic Type | Maximum Source Age |
|---|---|
| Technology, startups, AI | 6 months |
| Policy, regulation | 12 months |
| Market data, economics | 12 months |
| Science, medicine | 2 years (unless foundational) |
| History, established facts | No limit |
When sources disagree, prefer the higher-credibility source; if credibility is comparable, prefer the more recent one. Document the resolution, or mark the claim as "unresolved — present both sides."
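The disagreement-handling logic can be sketched as a preference order. This assumes each source carries a numeric tier (1 = strongest) and a publication date, which is my framing rather than anything the document specifies:

```python
from datetime import date

def preferred_source(sources):
    """Pick which of several conflicting sources to trust.

    sources: list of (name, tier, published) tuples, tier 1 = strongest.
    Returns the winning name, or None when the top two candidates are
    indistinguishable -- in that case, mark the claim
    "unresolved -- present both sides".
    """
    # Sort by tier ascending (stronger first), then newest first.
    ranked = sorted(sources, key=lambda s: (s[1], -s[2].toordinal()))
    if len(ranked) > 1:
        top, runner = ranked[0], ranked[1]
        if top[1] == runner[1] and top[2] == runner[2]:
            return None  # same tier, same date: no basis to prefer either
    return ranked[0][0]
```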