Competitive Analysis Quality Assurance | Skills Pool
Competitive Analysis Quality Assurance
Systematic fact-checking, source verification, and quality assurance for competitive research deliverables. This skill ONLY activates when explicitly requested - it does not auto-trigger. Use this skill when you explicitly ask to review competitive research, fact-check claims, verify sources, validate consistency, or assess quality of completed analysis documents. Example requests that activate this skill: "review this competitive analysis", "fact-check these findings", "verify the sources", "quality check this report", "are these claims accurate", or "validate this analysis".
SeanECurrie | 0 stars | 25 Nov 2025
Categories: Academic
Skill Content
This skill provides systematic fact-checking, source verification, and quality assurance methodologies for competitive research deliverables. It works as a companion to the competitive-research-brightdata skill, ensuring findings meet enterprise consulting standards before final delivery.
When to Use This Skill
Use this skill AFTER initial research is complete, during the review and quality assurance phase. Specifically:
Fact-Checking: Verify specific claims, statistics, or facts from research
Source Validation: Check citations are accurate, accessible, and relevant
Consistency Review: Ensure information is consistent across multiple documents
Gap Identification: Find missing information or areas needing deeper research
Quality Assessment: Evaluate against consulting-grade standards
Pre-Delivery Review: Final QA before presenting to stakeholders
Typical Project Structure
When working with competitive research projects, documents are typically organized as:
Check current directory structure using ls or glob patterns
Look for common project directory patterns (both structures above)
Identify which structure variant is in use
Note available documents for review
Only ask clarifying questions after understanding what exists
When asked to review documents, check these directories first before asking the user for specific file paths. This makes QA more efficient because you already know where to look.
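The structure-discovery steps above could be sketched as follows. The folder names are taken from this skill's own examples; any real project may use a different variant, so treat the list as a starting point, not a fixed convention.

```python
from pathlib import Path

# Directory names from this skill's examples; extend for other project variants.
KNOWN_DIRS = [
    "01-COMPANY-PROFILES",
    "02-COMPETITOR-PROFILES",
    "03-COMPARATIVE-ANALYSIS",
]

def locate_project_structure(root="."):
    """Return the known research directories found under root, each mapped
    to a sorted list of its markdown documents."""
    found = {}
    for name in KNOWN_DIRS:
        d = Path(root) / name
        if d.is_dir():
            found[name] = sorted(str(p) for p in d.glob("*.md"))
    return found

if __name__ == "__main__":
    structure = locate_project_structure()
    if structure:
        for folder, docs in structure.items():
            print(f"{folder}: {len(docs)} document(s)")
    else:
        print("No known research directories found; ask the user for file paths.")
```

If nothing is found, fall back to asking the user, as the workflow above prescribes.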
Available Tools
Research & Verification Tools
web_search - Search the web to verify claims and find corroborating sources
Use to validate facts, statistics, company information
Cross-reference with original research sources
Find additional sources for under-supported claims
search_engine (Bright Data) - Professional-grade search across Google/Bing/Yandex
Use for deeper verification when web_search is insufficient
Access same tools used in original research
Validate hard-to-find or specialized information
web_fetch - Retrieve full webpage content to verify citations
Use to check if cited sources actually contain claimed information
Validate quote accuracy and context
Verify publication dates and authorship
scrape_as_markdown (Bright Data) - Extract clean content from websites
Use for detailed source verification
Check if company websites still contain cited information
Validate pricing, features, or specifications
Document Analysis Tools
Filesystem tools - Read and analyze research documents
Use to review company profiles, competitive analyses, reports
Cross-reference claims across multiple documents
Check for internal consistency
Workflow
Phase 1: Scope the Review
Locate Project Files First:
Before asking clarifying questions, check if you can identify the project structure:
Look for common research project directory patterns (see "Typical Project Structure" above)
Check filesystem for folders like 01-COMPANY-PROFILES/, 02-COMPETITOR-PROFILES/, 03-COMPARATIVE-ANALYSIS/
If found, note the structure and available documents
This allows more specific clarifying questions
Clarify Review Objectives:
Ask questions to understand what needs review:
Which documents need fact-checking? (Specific profiles, analyses, or full project)
What level of review? (Quick validation, comprehensive audit, or targeted fact-check)
Are there specific claims flagged for verification?
What's the timeline for review completion?
Are there known concerns or areas of uncertainty?
What quality standard should be met? (Internal review, client-ready, publication-grade)
Identify Review Priority:
High Priority: Executive summaries, key findings, strategic recommendations
Medium Priority: Detailed analyses, comparison matrices, company profiles
Phase 5.5: Citation Compliance Check
Detection Process:
Scan document for numeric patterns (regex: \$[\d,]+, \d+%, \d+/\d+, etc.)
For each match, check if (Source, YYYY) appears within 100 characters
Flag uncited claims for review
Generate compliance report
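The detection steps above could be sketched as a small scanner. The numeric patterns come from the checklist itself; the `(Source, YYYY)` citation regex and the 100-character window are assumptions matching the description, and would need tuning to a project's actual citation style.

```python
import re

# Numeric-claim patterns from the checklist: dollar amounts, percentages, ratios.
CLAIM_RE = re.compile(r"\$[\d,]+(?:\.\d+)?|\d+(?:\.\d+)?%|\b\d+/\d+\b")
# A "(Source, YYYY)" style citation; the exact format is an assumption.
CITATION_RE = re.compile(r"\([^()]*\b(?:19|20)\d{2}\)")

def check_citations(text, window=100):
    """Return (line_no, claim) pairs for numeric claims with no
    (Source, YYYY)-style citation within `window` characters."""
    uncited = []
    for m in CLAIM_RE.finditer(text):
        nearby = text[max(0, m.start() - window): m.end() + window]
        if not CITATION_RE.search(nearby):
            line_no = text.count("\n", 0, m.start()) + 1
            uncited.append((line_no, m.group()))
    return uncited
```

The flagged (line, claim) pairs feed directly into the "Uncited Claims Requiring Attention" table of the report format below.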
Citation Compliance Report Format:
## Citation Compliance Check
**Document:** [filename]
**Date:** [review date]
### Summary
- Total numeric claims found: X
- Claims with proper citations: Y (Z%)
- Claims missing citations: N
### Uncited Claims Requiring Attention
| Line | Claim | Type | Recommendation |
|------|-------|------|----------------|
| 45 | "$4.80 per student" | Pricing | Add source citation |
| 112 | "12% market share" | Market Share | Add source citation |
### Properly Cited Examples
[List 2-3 good examples as reference]
Integration with Workflow:
Run citation check BEFORE comprehensive fact-checking
Use findings to prioritize which claims need source verification
Include citation compliance score in final QA report
Phase 5.6: Marketing Language Detection
Objective: Identify and flag promotional language that should be replaced with factual statements.
Why This Matters:
Enterprise-grade competitive analysis must be objective and fact-based. Marketing language undermines credibility and can lead to biased decision-making.
Fix: Define specifically what is meant or cite user reviews
Unsubstantiated Comparisons:
"better than competitors", "outperforms alternatives"
"more intuitive", "easier to use", "faster"
Fix: Cite comparative studies, user ratings, or specific metrics
Emotional Appeals:
"trusted by thousands", "loved by students"
"empowering", "inspiring", "delighting"
Fix: Replace with factual satisfaction data (ratings, NPS)
Future-Tense Marketing:
"will revolutionize", "poised to dominate"
"expected to lead", "set to transform"
Fix: Focus on current capabilities, note roadmap items separately
Detection Process:
Scan document for marketing language patterns (use word list above)
For each match, evaluate context:
Is it a direct quote from source? (acceptable if attributed)
Is it author's own characterization? (flag for review)
Is it supported by evidence? (acceptable if cited)
Generate marketing language report
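A minimal version of this detection process might look like the sketch below. The phrase list is a small sample of the patterns named above, and the quote-detection heuristic is deliberately crude: it only flags whether a line contains double quotes, leaving the attribution judgment to the reviewer.

```python
# A small sample of the promotional phrases listed above; extend as needed.
MARKETING_PHRASES = [
    "industry-leading", "best-in-class", "seamless",
    "better than competitors", "trusted by thousands",
    "will revolutionize", "poised to dominate",
]

def find_marketing_language(text):
    """Return (line_no, phrase, has_quotes) for each marketing phrase found.
    has_quotes is a crude hint that the phrase may be an attributed quote."""
    hits = []
    for line_no, line in enumerate(text.splitlines(), start=1):
        lowered = line.lower()
        for phrase in MARKETING_PHRASES:
            if phrase in lowered:
                hits.append((line_no, phrase, '"' in line))
    return hits
```

Each hit still needs the manual context evaluation described above (direct quote, author characterization, or evidence-backed claim) before it goes into the report.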
Marketing Language Report Format:
## Marketing Language Review
**Document:** [filename]
**Date:** [review date]
### Summary
- Marketing phrases detected: X
- Requiring revision: Y
- Acceptable (quoted/supported): Z
### Phrases Requiring Revision
| Line | Phrase | Issue | Suggested Fix |
|------|--------|-------|---------------|
| 23 | "industry-leading platform" | Superlative without evidence | "platform serving X schools" |
| 89 | "seamless integration" | Vague claim | "integrates with X, Y, Z systems (Source, YYYY)" |
### Acceptable Usages
[List cases where marketing language is properly attributed or supported]
Acceptable Exceptions:
Direct quotes from company marketing (clearly attributed)
User review quotes (with source citation)
Industry analyst characterizations (with citation)
Historical statements about company positioning (with context)
Integration with Workflow:
Run marketing language check during initial document scan
Flag issues alongside citation problems
Include marketing language score in final QA report (target: <5 unattributed promotional phrases)
Phase 5.7: Cross-Document Consistency Check
Objective: Ensure key data points are consistent across all related documents.
Why This Matters:
Competitive analysis projects often have the same data (pricing, market share, customer counts) appearing in multiple documents. Inconsistencies undermine credibility and confuse stakeholders.
High-Risk Data Categories:
These data types commonly appear in multiple documents and require consistency verification:
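One way to automate this check is to extract the high-risk data points from each document and compare the value sets. The patterns below cover two of the categories named above (pricing and market share) and are illustrative only; real projects would add customer counts and other recurring figures.

```python
import re

# Illustrative patterns for two high-risk data categories; extend per project.
DATA_PATTERNS = {
    "price": re.compile(r"\$[\d,]+(?:\.\d+)?"),
    "market_share": re.compile(r"\d+(?:\.\d+)?%\s+market share", re.IGNORECASE),
}

def find_inconsistencies(doc_texts):
    """Given {doc_name: text}, return (category, {doc: values}) pairs for
    categories whose extracted values differ between documents."""
    issues = []
    for cat, pattern in DATA_PATTERNS.items():
        per_doc = {doc: set(pattern.findall(text)) for doc, text in doc_texts.items()}
        per_doc = {doc: vals for doc, vals in per_doc.items() if vals}
        if len(set(map(frozenset, per_doc.values()))) > 1:
            issues.append((cat, per_doc))
    return issues
```

A flagged category still needs the human follow-up described later: determine which figure is correct, or whether both are correct for different timeframes.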
Step 1: Apply Quality Criteria
Assess research against quality criteria from references/quality-standards.md:
Evidence Quality:
Are primary sources used appropriately?
Are secondary sources credible?
Is evidence recent and relevant?
Are claims properly supported?
Analysis Quality:
Are conclusions logical from evidence?
Is analysis sufficiently deep?
Are frameworks applied correctly?
Are limitations acknowledged?
Presentation Quality:
Is writing clear and professional?
Is formatting consistent?
Are tables/charts accurate and helpful?
Is document suitable for stated audience?
Step 2: Compare to Benchmarks
Evaluate against consulting industry standards:
McKinsey, BCG, Bain quality levels
Academic research standards
Industry analyst report standards (Gartner, Forrester)
Step 3: Identify Improvements
Suggest specific improvements:
Add missing sources
Update outdated information
Resolve inconsistencies
Strengthen weak claims
Enhance analysis depth
Improve presentation quality
Phase 8: Generate QA Report
Step 1: Structure Findings
Organize findings into clear report:
Executive Summary:
Overall assessment (Pass / Pass with Minor Issues / Needs Revision)
Critical issues requiring immediate attention
Number of claims verified vs. unverified
Key recommendations
Detailed Findings by Document:
For each reviewed document:
Document name and review date
Claims verified (with confidence level)
Source issues found
Consistency problems identified
Gaps or missing information
Quality assessment
Specific recommendations
Critical Issues List:
Claims that need correction
Broken or inaccessible sources
Major inconsistencies
Misleading or unsupported statements
Recommendations:
Priority 1 (Must Fix): Critical corrections
Priority 2 (Should Fix): Important improvements
Priority 3 (Nice to Have): Minor enhancements
Step 2: Provide Evidence
For each finding, provide:
What: Specific claim or issue
Where: Document location (file, section, line)
Issue: What's wrong or uncertain
Evidence: What verification found
Recommendation: How to fix
Example:
❌ ISSUE: Market Share Claim Unverified
WHERE: competitor-profile.md, Executive Summary, Line 12
CLAIM: "SCOIR has 12% market share"
ISSUE: Only one source cited, no date, percentage conflicts with other doc
EVIDENCE: Web search found conflicting numbers (10-15% range)
RECOMMENDATION: Add 2+ recent sources, use range if exact number unknown
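The finding structure above (What / Where / Issue / Evidence / Recommendation, plus a priority tier) lends itself to a simple record type. This is one possible representation, not a format the skill prescribes:

```python
from dataclasses import dataclass

# Priority labels follow the recommendation tiers defined above.
PRIORITIES = {1: "Must Fix", 2: "Should Fix", 3: "Nice to Have"}

@dataclass
class Finding:
    claim: str           # What: the specific claim or issue
    location: str        # Where: document, section, line
    issue: str           # what is wrong or uncertain
    evidence: str        # what verification found
    recommendation: str  # how to fix it
    priority: int = 2

    def render(self):
        """Format one finding in the issue layout shown in the example above."""
        return (
            f"ISSUE: {self.issue} [{PRIORITIES[self.priority]}]\n"
            f"WHERE: {self.location}\n"
            f"CLAIM: {self.claim}\n"
            f"EVIDENCE: {self.evidence}\n"
            f"RECOMMENDATION: {self.recommendation}"
        )
```

Collecting findings as records makes it easy to sort by priority and count verified vs. unverified claims for the executive summary.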
Step 3: Deliver Report
Create professional QA report:
Clear, actionable findings
Specific locations for all issues
Evidence for all claims of problems
Prioritized recommendations
Estimated time to address issues
See references/quality-standards.md for report format examples.
Review Types
Quick Validation (30-60 minutes)
Scope:
Critical claims only (exec summary, key findings)
Spot-check 5-10 major sources
Basic consistency check
No comprehensive audit
Deliverable:
Brief bullet-point findings
Critical issues flagged
Pass/Needs Work assessment
When to Use:
Time-constrained review
Pre-meeting sanity check
Initial quality assessment
Standard Review (2-4 hours)
Scope:
Verify key claims (20-30 claims)
Validate major sources (10-15 sources)
Cross-document consistency
Gap identification
Deliverable:
Structured QA report
Verified/unverified claims list
Prioritized recommendations
Quality score
When to Use:
Pre-client delivery
Important internal review
Peer review process
Comprehensive Audit (1-2 days)
Scope:
Verify ALL claims
Validate ALL sources
Full consistency analysis
Deep gap assessment
Detailed quality evaluation
Deliverable:
Comprehensive QA report
Claim-by-claim verification
Full source audit
Detailed improvement plan
Executive summary
When to Use:
High-stakes deliverables
Client-facing reports
Publication submissions
Formal audits
Targeted Fact-Check (Variable)
Scope:
Specific claims flagged by requester
Particular document section
Single competitor profile
Specific type of claim (e.g., all pricing data)
Deliverable:
Focused findings on requested scope
Verification results
Targeted recommendations
When to Use:
Specific concerns raised
Dispute resolution
Spot verification
Follow-up on previous issues
Best Practices
Verification Excellence
Multi-Source Validation: Never trust single source for major claims
Primary Source Priority: Company websites, official reports, and SEC filings are best
Date Awareness: Note when information was gathered, flag if outdated
Context Preservation: Don't take quotes or stats out of context
Bias Recognition: Note when sources may be biased or have conflicts of interest
Efficiency Practices
Batch Verification: Group similar claims, verify together
Use Original Tools: The Bright Data tools used in research are often fastest for verification
Document As You Go: Note findings immediately, don't rely on memory
Focus High-Risk: Verify critical claims thoroughly, spot-check less critical
Know When to Stop: Perfect is the enemy of good; time-box verification
Communication Practices
Be Specific: Always cite document, section, line for issues
Provide Evidence: Show what verification found
Be Constructive: Frame as opportunities to strengthen, not criticism
Prioritize: Critical vs. nice-to-have improvements
Acknowledge Uncertainty: Some claims may be unverifiable (that's okay)
Quality Mindset
Professional Skepticism: Verify but don't assume error
Evidence-Based: Every finding needs evidence
Fair Assessment: Compare to realistic standards, not perfection
Collaborative Spirit: Goal is better deliverable, not finding fault
Client Focus: Would client be satisfied with this quality?
Common Issues & Solutions
Issue: Broken Citation Links
Problem: URL in citation is inaccessible or returns 404
Solution:
Use web_search to find updated URL or archived version
Search for content using title/author
Note if content has changed or been removed
Recommend updating citation or finding alternative source
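The link-repair steps above could be sketched as a simple checker that tries the cited URL first and, on failure, points the reviewer at the Wayback Machine availability lookup for an archived copy. The status labels are my own; adapt them to your report format.

```python
import urllib.error
import urllib.request

def check_citation_url(url, timeout=10):
    """Return ('ok', url) if the URL resolves, ('moved', final_url) if it
    redirects, or ('broken', archive_lookup_url) if it is inaccessible."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            final = resp.geturl()
            return ("moved", final) if final != url else ("ok", url)
    except (urllib.error.URLError, ValueError):
        # Fall back to the Wayback Machine availability API for an archived copy.
        return ("broken", f"https://archive.org/wayback/available?url={url}")
```

A "moved" result suggests updating the citation to the final URL; a "broken" result triggers the search-by-title/author fallback described above.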
Issue: Claim Without Source
Problem: Important claim has no supporting citation
Solution:
Use web_search or Bright Data to find supporting sources
If found, recommend adding citation
If not found, flag claim for removal or qualification
Consider if claim is common knowledge (citation may not be needed)
Issue: Inconsistent Information
Problem: Different numbers or facts across documents
Solution:
Verify which version is correct using sources
If both correct (different timeframes), clarify dates
If one incorrect, recommend correction
If uncertainty, recommend noting range or caveat
Issue: Outdated Information
Problem: Source is old, newer information likely available
Solution:
Search for recent sources using web_search
If newer info found, recommend update
If no newer info, note last-updated date
Accept that some info (history, founding dates) won't have recent sources
If unresolved, recommend noting range or uncertainty
Consider if discrepancy is material (does it matter?)
Issue: Vague or Subjective Claim
Problem: Claim is opinion or subjective without qualification
Solution:
Identify as opinion, not fact
If supported by evidence, recommend clarifying ("according to X sources" or "reviews suggest")
If unsupported, recommend softening or removing
Distinguish between analysis (supported opinion) and speculation
References
verification-checklist.md - Comprehensive checklist of what to verify by category (company info, market data, features, pricing, claims, sources, consistency)
quality-standards.md - Consulting-grade quality standards, common quality issues, report format examples, benchmark comparisons
fact-checking-methodology.md - Detailed techniques for verifying different types of claims, source evaluation criteria, triangulation strategies, handling uncertainty
Load these references as needed based on review scope and specific verification challenges.
Integration with Competitive Research Skill
This skill works as an optional companion to the competitive-research-brightdata skill. You control when and whether QA happens by explicitly requesting review.
Optional Workflow (You Decide Each Step):
Step 1: Do Research (competitive-research-brightdata)
You conduct research using Bright Data tools
You create company profiles, analyses, reports
You cite sources, document findings
Step 2: Request QA Review (ONLY When You Want It) (research-quality-assurance)
You explicitly ask: "Review the Naviance profile" or "Fact-check the pricing analysis"
Then this skill activates to verify claims, check consistency, validate sources
Step 3: Address Findings
You decide: Address critical issues, make improvements
You update: Documents based on QA findings
Step 4: Request Final Check (ONLY If You Want It) (research-quality-assurance)
Optional: "Quick validation that I fixed the issues"
Or skip: Deliver without final QA if you're confident
Step 5: Deliver When YOU Decide It's Ready
You determine when deliverable meets your standards
You decide when to present to client
Key Point: Nothing happens automatically. You explicitly request each QA step only when you want quality review.
Success Metrics
Quality Indicators:
% of claims verified from multiple sources
% of citations validated as accurate
Critical issues identified and resolved
Document consistency score
Client satisfaction with deliverable quality
Efficiency Metrics:
Review time per document
Issues found per hour of review
False positive rate (non-issues flagged)
Turnaround time for QA reports
Impact Metrics:
Errors caught before client delivery
Client questions/disputes after delivery
Repeat business (client trust)
Reputation for quality research
When to Use This Skill:
✅ Review completed research before client delivery
✅ Fact-check specific claims that seem uncertain
✅ Validate sources and citations
✅ Check consistency across multiple documents
✅ Assess quality against consulting standards
✅ Pre-delivery QA audit
❌ Not for initial research (use competitive-research-brightdata)
❌ Not for generating new findings (use research skill)
❌ Not for strategic recommendations (that's analysis phase)