Scans a completed RFP response for internal contradictions, inconsistencies between answers, and conflicts with your source content library or technical documentation. Use this skill when a user has a draft or completed RFP response and wants to check it for contradictions before submission. Also trigger when users say "check my RFP for contradictions", "find inconsistencies in my response", "QA my bid", "does my proposal contradict itself", "review my RFP answers against our content library", "consistency check", or upload an RFP response asking for a quality review. This is a pre-submission quality gate.
Performs a systematic review of a completed RFP response to find internal contradictions between answers, inconsistencies with source documentation, and factual conflicts that could undermine credibility with evaluators.
Nothing kills an RFP score faster than contradicting yourself. When one answer says "we support on-premise deployment" and another says "our solution is cloud-only," evaluators lose confidence in every answer. This is especially common when multiple SMEs contribute sections independently, when content is pulled from a library without checking context, or when responses are reused from previous bids without full adaptation.
Evaluators are trained to cross-reference answers. This skill does the same thing before they do.
The completed or near-complete RFP response to check. Accepted formats:
- `.docx` - Word document with questions and answers
- `.xlsx` / `.csv` - Spreadsheet with question-answer pairs (common for compliance questionnaires)
- `.pdf` - PDF export of the response

One or more sources of truth to check the RFP response against.
The more source material provided, the more thorough the contradiction check.
Read the full RFP response. Build an index of every question-answer pair, noting the question number, the topic, and the key claims made in each answer.
If the response is very large, work through it section by section, but always cross-reference across sections at the end.
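The indexing step can be sketched for the spreadsheet case. A minimal sketch, assuming the response arrives as a CSV with `Question` and `Answer` columns (the function name and column names are assumptions; adjust them to the actual export):

```python
import csv

def index_response(path):
    """Build an index of question-answer pairs from a CSV export.

    Assumes columns named "Question" and "Answer"; adjust the column
    names to match the actual spreadsheet format.
    """
    pairs = []
    with open(path, newline="", encoding="utf-8") as f:
        for number, row in enumerate(csv.DictReader(f), start=1):
            pairs.append({
                "id": number,
                "question": row["Question"].strip(),
                "answer": row["Answer"].strip(),
            })
    return pairs
```

Each entry keeps a stable `id` so later findings can cite the exact answers in conflict.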
Compare every answer against every other answer, looking for:
- Direct contradictions: one answer asserts what another denies (e.g., "cloud-only" vs. "supports on-premise deployment")
- Numerical inconsistencies: figures such as pricing, uptime percentages, user counts, or retention periods that differ between answers
- Capability conflicts: a feature described as available in one answer and unavailable, partial, or roadmap-only in another
- Timeline mismatches: implementation, onboarding, or delivery timelines that vary across answers
- Scope inconsistencies: answers that draw different boundaries around what the proposed solution or services include
- Terminology drift: the same product, team, or process referred to by different names across answers
- Tone/confidence inconsistencies: firm commitments in one answer hedged or softened in another
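The numerical check above can be sketched as a pairwise pass. A crude sketch, assuming answers are indexed as dicts with `id` and `answer` keys; the metric keywords and regex window are illustrative assumptions, not a complete detector:

```python
import re
from itertools import combinations

# Illustrative metric keywords; extend to match the bid's terminology.
METRICS = ["uptime", "sla", "implementation", "retention"]

def extract_figures(text):
    """Map each metric keyword to the numbers quoted shortly after it."""
    found = {}
    for metric in METRICS:
        pattern = rf"{metric}\D{{0,40}}?(\d+(?:\.\d+)?)"
        for match in re.finditer(pattern, text, re.IGNORECASE):
            found.setdefault(metric, set()).add(match.group(1))
    return found

def numeric_conflicts(pairs):
    """Flag answer pairs that quote different numbers for the same metric."""
    figures = {p["id"]: extract_figures(p["answer"]) for p in pairs}
    conflicts = []
    for (a, fig_a), (b, fig_b) in combinations(figures.items(), 2):
        for metric in fig_a.keys() & fig_b.keys():
            if fig_a[metric] != fig_b[metric]:
                conflicts.append((a, b, metric, fig_a[metric], fig_b[metric]))
    return conflicts
```

A keyword pass like this only surfaces candidates; each flagged pair still needs a human (or careful model) read before it goes in the report.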
If the user provided source documentation, compare every claim in the RFP response against the source material, checking for:
- Factual accuracy: claims that misstate what the source documentation says
- Overpromises: commitments that go beyond what the documentation supports
- Outdated information: claims based on superseded specs, retired features, or old figures
- Missing caveats: limitations or conditions stated in the source but omitted from the answer
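A deliberately crude first screen for the source comparison, assuming the source material is available as plain text (the function name is an assumption): flag any number an answer quotes that appears nowhere in the source.

```python
import re

NUMBER = re.compile(r"\d+(?:\.\d+)?")

def unsupported_figures(answer, source_text):
    """Return numbers quoted in an answer that appear nowhere in the source.

    A figure absent from the source material is a candidate overpromise
    or outdated claim worth a manual look; absence is a signal, not proof.
    """
    source_numbers = set(NUMBER.findall(source_text))
    return [n for n in NUMBER.findall(answer) if n not in source_numbers]
```

This catches the common failure mode where a reused answer carries a stale SLA or retention figure that no current document backs up.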
Present findings organized by severity:
## RFP Contradiction Report
### Summary
- Total answers reviewed: [X]
- Contradictions found: [X]
  - Critical (must fix): [X]
  - Moderate (should fix): [X]
  - Minor (consider fixing): [X]
### Critical Contradictions
Issues that an evaluator would likely catch and that could result in disqualification or significant score reduction.
For each:
| # | Type | Answer A | Answer B | The Contradiction | Suggested Resolution |
|---|------|----------|----------|-------------------|----------------------|
With full quotes from both answers showing the conflict.
### Moderate Contradictions
Inconsistencies that undermine confidence but are unlikely to disqualify.
[Same format]
### Minor Inconsistencies
Terminology drift, tone mismatches, or minor numerical rounding differences.
[Same format]
### Source Content Conflicts (if applicable)
Claims in the RFP response that conflict with provided documentation.
For each:
| # | RFP Answer | Source Document | The Conflict | Suggested Resolution |
|---|------------|-----------------|--------------|----------------------|
### Consistency Recommendations
General observations about patterns -- e.g., "Implementation timelines are described differently in 4 separate answers; standardize on a single range and reference it consistently."
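Assembling the report tables is mechanical once findings are collected. A small sketch, assuming each finding is a 5-tuple matching the critical-contradiction table's columns (the function name is an assumption):

```python
def render_contradiction_table(findings):
    """Render findings as the Markdown table used in the report.

    Each finding is a 5-tuple: (type, answer_a, answer_b,
    contradiction, suggested_resolution).
    """
    lines = [
        "| # | Type | Answer A | Answer B | The Contradiction | Suggested Resolution |",
        "|---|------|----------|----------|-------------------|----------------------|",
    ]
    for number, finding in enumerate(findings, start=1):
        lines.append("| {} | {} | {} | {} | {} | {} |".format(number, *finding))
    return "\n".join(lines)
```

Numbering rows at render time keeps the finding IDs stable within each severity section.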
After presenting the report, offer to help resolve the contradictions.
Write the full report to a .md file and present it to the user.