Sales engineering domain skill for the harness framework. Provides demo-led, POC-driven, and solution design methodologies, technical validation and verification practices, and evaluation criteria for pre-sales engineering projects.

Domain-specific: the harness provides the WHEN (orchestration, sprint contracts, feature ledger); this skill provides the HOW for pre-sales engineering and technical validation projects (demo methodology, POC planning, solution design verification, evaluation criteria anchors).

Activated when `domain_profile: sales_engineering` is declared in `.harness/spec.md`.
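For illustration, a minimal activation declaration might look like the following. The exact `.harness/spec.md` schema is defined by the harness, not this skill, so treat the surrounding structure as an assumption; only the `domain_profile: sales_engineering` value (and the `data_sensitivity` field referenced under Security Considerations) come from this document.

```yaml
# .harness/spec.md front matter (illustrative shape -- only the
# domain_profile value is specified by this skill)
domain_profile: sales_engineering
data_sensitivity: confidential   # assumed value; the field is referenced under Security Considerations
```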
## Section 1: Sales Engineering Methodology
Select based on buyer engagement type and technical complexity during `/harness:start`:

| Methodology | When to Use | Harness Mapping |
|---|---|---|
| Demo-Led | Buyer needs to see the product in action; standard use cases; evaluation committee presentation | Each success criterion evaluated with evidence; performance metrics captured; gap analysis if criteria not fully met; recommendation (proceed/modify/stop) |
### Governance Deliverables

| Deliverable | Gate Check | Pass Criteria |
|---|---|---|
| Technical Win/Loss Analysis | Root cause + lessons + improvements | Technical factors in win/loss identified; product gaps documented for product team; competitive positioning lessons captured |
| SE Engagement Summary | Effort + outcomes + next steps | Total SE hours invested; outcomes achieved; handoff to implementation team (if won) or lessons learned (if lost) |
The 4 primary criteria for the sales_engineering domain profile. The evaluator MUST use these anchors for consistent scoring across sprints.
### demo_completeness (0-5)

| Score | Anchor Description |
|---|---|
| 0 | Absent -- no demo script or preparation; ad-hoc product walkthrough planned |
| 1 | Severely incomplete -- demo script exists but covers <50% of buyer use cases; no fallback scenarios |
| 2 | Partial -- most use cases covered but transitions are rough; no timing plan; demo environment not validated |
| 3 | Complete -- all buyer use cases mapped to demo steps; transitions scripted; timing fits allocated slot; demo environment tested and data seeded |
| 4 | Rehearsed -- demo rehearsed with internal team; common questions anticipated with prepared answers; fallback scenarios documented for top 3 failure modes |
| 5 | Compelling -- demo tells a story aligned with buyer's business outcomes; transitions highlight competitive differentiators; follow-up materials prepared; demo recording available for buyer's internal distribution |
### technical_validation (0-5)

| Score | Anchor Description |
|---|---|
| 0 | Absent -- no technical validation performed; capabilities claimed without evidence |
| 1 | Severely incomplete -- basic product demo shown but no hands-on validation; buyer's environment not considered |
| 2 | Partial -- some technical scenarios tested but not in buyer's environment; results anecdotal, not measured |
| 3 | Validated -- POC executed against defined success criteria; results documented with metrics; buyer's environment constraints addressed; gaps identified with mitigation plans |
| 4 | Comprehensive -- all success criteria met with measured results; performance tested under buyer's expected load; integration points validated end-to-end; security requirements verified |
| 5 | Decisive -- technical validation eliminates all buyer objections; results exceed success criteria; buyer's technical team endorses the solution; competitive advantage demonstrated through measured comparison |
### solution_documentation (0-5)

| Score | Anchor Description |
|---|---|
| 0 | Absent -- no solution documentation; architecture exists only in SE's head |
| 1 | Severely incomplete -- high-level diagram exists but no component details; integration points unnamed |
| 2 | Partial -- architecture diagram with component names but no data flows; integration points listed but protocols/formats undefined |
| 3 | Complete -- architecture diagram with data flows; all integration points specified (API, protocol, format, authentication); non-functional requirements documented (performance, security, scalability) |
| 4 | Implementation-ready -- solution design detailed enough for implementation team to build without SE involvement; deployment model specified; operational considerations documented |
| 5 | Exemplary -- solution design includes decision rationale (why this approach over alternatives); risk assessment; phased implementation plan; operational runbook; buyer's team can self-serve from documentation |
### integration_clarity (0-5)

| Score | Anchor Description |
|---|---|
| 0 | Absent -- no integration requirements documented; "we'll figure it out later" approach |
| 1 | Severely incomplete -- integration points named but no technical details; direction of data flow unknown |
| 2 | Partial -- some integration points specified with protocols but authentication, error handling, and data format details missing |
| 3 | Clear -- all integration points documented with protocol, authentication method, data format, error handling approach; sample payloads provided for key integrations |
| 4 | Validated -- integration specifications tested against buyer's actual systems (or realistic simulators); round-trip data flow verified; error scenarios tested |
| 5 | Production-ready -- integration specifications include monitoring, alerting, retry logic, and graceful degradation; buyer's operations team reviewed and approved; runbook for integration troubleshooting provided |
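The four rubrics above can be represented as data so the evaluator applies the same anchors on every sprint. A minimal sketch, assuming only that each criterion is scored once on the 0-5 anchor scale (the criterion names come from this document; `validate_scores` itself is illustrative, not a harness API):

```python
# Criterion names as defined by this skill's evaluation rubrics.
CRITERIA = (
    "demo_completeness",
    "technical_validation",
    "solution_documentation",
    "integration_clarity",
)

def validate_scores(scores: dict[str, int]) -> dict[str, int]:
    """Check that every criterion is scored exactly once on the 0-5 anchor scale."""
    missing = [c for c in CRITERIA if c not in scores]
    if missing:
        raise ValueError(f"unscored criteria: {missing}")
    for criterion, score in scores.items():
        if criterion not in CRITERIA:
            raise ValueError(f"unknown criterion: {criterion}")
        if not 0 <= score <= 5:
            raise ValueError(f"{criterion} score {score} outside 0-5 anchors")
    return scores

validate_scores({
    "demo_completeness": 4,
    "technical_validation": 3,
    "solution_documentation": 3,
    "integration_clarity": 2,
})
```

A score outside the 0-5 range, or a missing or unknown criterion, raises immediately rather than silently skewing the sprint evaluation.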
## Section 6: Sprint Contract Checklist Templates
Pre-built checklists for common sales engineering deliverable types. The generator includes the relevant checklist(s) in each sprint contract. The evaluator uses them as acceptance criteria.
### Standard Contract Checks
These are the minimum required checks for every sales engineering sprint. Check IDs map to the 4 sales engineering evaluation criteria.
| Check ID | Criterion | Level | Verification Method |
|---|---|---|---|
| DC-01 | demo_completeness | required | Verify all buyer use cases mapped to demo steps; transitions scripted; timing fits allocated slot; demo environment tested |
| DC-02 | demo_completeness | advisory | Verify fallback scenarios documented for top 3 failure modes; common questions anticipated with prepared answers |
| TV-01 | technical_validation | required | Verify POC success criteria defined, measurable, and agreed with buyer; results documented against each criterion with evidence |
| TV-02 | technical_validation | advisory | Verify performance tested under buyer's expected load; security requirements validated |
| SD-01 | solution_documentation | required | Verify architecture diagram includes all integration points with data flows; NFRs documented; deployment model specified |
| SD-02 | solution_documentation | advisory | Verify solution design includes decision rationale and risk assessment for key architectural choices |
| IC-01 | integration_clarity | required | Verify all integration points documented with protocol, authentication, data format, and error handling approach |
| IC-02 | integration_clarity | advisory | Verify sample API payloads provided for key integrations; round-trip data flow validated or validation plan defined |
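The required/advisory distinction above maps naturally to gating logic: assuming a sprint contract passes only when every required check passes, with advisory failures surfacing as non-blocking warnings, a minimal sketch looks like this (check IDs and levels come from the table; the `gate` function itself is illustrative, not a harness API):

```python
# Check levels taken from the Standard Contract Checks table above.
STANDARD_CHECKS = {
    "DC-01": "required", "DC-02": "advisory",
    "TV-01": "required", "TV-02": "advisory",
    "SD-01": "required", "SD-02": "advisory",
    "IC-01": "required", "IC-02": "advisory",
}

def gate(results: dict[str, bool]) -> tuple[bool, list[str]]:
    """Return (passed, warnings). Any failed or missing required check blocks."""
    blocking = [cid for cid, level in STANDARD_CHECKS.items()
                if level == "required" and not results.get(cid, False)]
    warnings = [cid for cid, level in STANDARD_CHECKS.items()
                if level == "advisory" and not results.get(cid, False)]
    return (not blocking, warnings)

ok, warns = gate({"DC-01": True, "DC-02": False, "TV-01": True, "TV-02": True,
                  "SD-01": True, "SD-02": True, "IC-01": True, "IC-02": True})
# ok is True (all required checks pass); warns lists the failed advisory check DC-02
```

Treating an unreported check as failed (`results.get(cid, False)`) is a deliberately conservative default: a required check that was never run still blocks the contract.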
### For Demo Deliverables

- All buyer-stated use cases mapped to demo steps
- Demo script has clear transitions between scenarios with talking points
- Timing plan fits within allocated presentation slot (with buffer)
- Demo environment validated and data seeded before delivery
- Fallback scenarios documented for top 3 likely failure modes (network, data, feature)
- Competitive differentiators highlighted at natural points in the flow
- Follow-up action items template prepared for post-demo capture
- Customer environment data (if used) handled per data classification requirements
- No customer production data in demo environment without explicit authorization
### For POC Deliverables

- Success criteria defined in writing and agreed with buyer before POC starts
- Success criteria are measurable (not subjective "it works well")
- Environment requirements documented and validated before execution
- POC timeline defined with milestones and evaluation checkpoints
- Results documented against each success criterion with measured evidence
- Gap analysis included if any criteria not fully met, with mitigation options
- Recommendation provided (proceed to purchase / modify scope / stop)
- Customer data used in POC classified and handled per agreement
- POC environment decommissioned or data purged after evaluation per customer requirements
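The "measurable, not subjective" requirement above becomes concrete when each criterion is recorded with a target and a measured result. A hypothetical example (the criterion names, targets, and numbers are invented for illustration; only the shape matters):

```python
# Hypothetical POC success criteria: each has a measurable target and a
# measured result, supporting the gap analysis and recommendation steps.
criteria = [
    {"name": "API p95 latency", "target": "<= 200 ms",
     "measured": "143 ms", "met": True},
    {"name": "Bulk import of 1M records", "target": "<= 30 min",
     "measured": "41 min", "met": False},
]

met = sum(c["met"] for c in criteria)
print(f"{met}/{len(criteria)} criteria met")
for c in criteria:
    if not c["met"]:
        # Unmet criteria feed the gap analysis with mitigation options.
        print(f"gap: {c['name']} (target {c['target']}, measured {c['measured']})")
```

Recording the measured value alongside the target keeps the recommendation (proceed / modify scope / stop) anchored to evidence rather than impressions.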
### For Solution Design Deliverables

- Architecture diagram shows all system components and their relationships
- All integration points specified with protocol, authentication, data format
- Data flows documented including direction, frequency, and volume estimates
- Deployment model specified (cloud, on-premises, hybrid) with justification
- Sample API payloads provided for key integration points
- Risk assessment included for technical implementation risks
- Customer environment data and architecture details classified as confidential
- No customer infrastructure details shared outside authorized team
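One way to satisfy the integration-point checks above (and checks SD-01/IC-01) is a structured record per integration covering protocol, authentication, data format, and error handling. A hypothetical example; the system name, field values, and payload are invented:

```python
# Hypothetical integration-point specification. The required_fields set
# mirrors the checklist: protocol, authentication, data format, error handling.
integration_point = {
    "name": "CRM contact sync",  # invented example system
    "direction": "outbound",
    "protocol": "HTTPS REST",
    "authentication": "OAuth 2.0 client credentials",
    "data_format": "JSON",
    "error_handling": "retry with exponential backoff; dead-letter after 5 attempts",
    "sample_payload": {"contact_id": "12345", "email": "jane@example.com"},
}

required_fields = {"protocol", "authentication", "data_format", "error_handling"}
missing = required_fields - integration_point.keys()
assert not missing, f"incomplete integration spec: {missing}"
```

A record like this gives the implementation team a written reference per integration, which is exactly what the Undocumented Architecture and Integration Handwave anti-patterns penalize the absence of.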
### For Technical Comparison Deliverables

- All buyer evaluation criteria listed with product response for each
- Evidence type specified per criterion (demo, documentation, reference, POC result)
- Honest gap assessment included (not just strengths)
- Competitive positioning documented without disparaging specific competitors
- Scoring methodology explained if self-assessment scores provided
- Competitive intelligence sourced from public information only
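The comparison checks above can be captured as a simple evaluation matrix: one row per buyer criterion, with the product response and the evidence type backing each claim. A hypothetical example (criteria and responses are invented for illustration):

```python
# Hypothetical technical comparison matrix. Every claim names its evidence
# type, and gaps are disclosed honestly rather than omitted.
matrix = [
    {"criterion": "SSO via SAML 2.0", "response": "supported",
     "evidence": "demo"},
    {"criterion": "Data residency in EU", "response": "supported",
     "evidence": "documentation"},
    {"criterion": "Real-time streaming ingest", "response": "gap -- batch only today",
     "evidence": "POC result"},
]

# Every row must carry an evidence type (demo, documentation, reference, POC result).
assert all(row["evidence"] for row in matrix)
gaps = [row["criterion"] for row in matrix if row["response"].startswith("gap")]
print(f"{len(gaps)} gap(s) disclosed: {gaps}")
```

Keeping gaps in the same matrix as strengths satisfies the "honest gap assessment" check and makes the scoring methodology auditable by the buyer.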
## Sales Engineering Anti-Patterns

These trigger automatic score penalties when detected by the evaluator:
| Anti-Pattern | Criterion Affected | Penalty | Detection Signal |
|---|---|---|---|
| Demo Without Discovery -- presenting product capabilities before understanding buyer's technical requirements and use cases | demo_completeness | Drop to max 2 | Demo script not mapped to buyer use cases; generic product walkthrough; no technical discovery notes on file |
| Scope Creep POC -- POC scope expanding beyond agreed success criteria; no clear end date or evaluation method | technical_validation | Drop to max 2 | Success criteria changed after POC started; no timeline; buyer adding requirements mid-POC without formal scope change |
| Vaporware Demo -- demonstrating features that do not exist or require significant custom development; roadmap items shown as current capabilities | demo_completeness | Drop to max 0 | Demo includes unreleased features without disclosure; product roadmap items presented as generally available |
| Undocumented Architecture -- solution design exists only in SE's verbal description; no written architecture or integration specifications | solution_documentation | Drop to max 1 | No solution design document; integration requirements verbal only; implementation team has no written reference |
| Integration Handwave -- claiming integration is "easy" or "standard" without specifying protocol, authentication, data format, and error handling | integration_clarity | Drop to max 2 | Integration points listed without technical details; "we support REST APIs" without specifying endpoints or contracts |
| Customer Data Leak -- using customer environment data, architecture details, or technical requirements in other customer engagements without authorization | solution_documentation | Drop to max 0 | Customer-specific data found in generic templates; architecture details from one customer referenced in another's POC |
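The "drop to max N" penalties above amount to capping the affected criterion's score whenever the anti-pattern is detected. A minimal sketch (penalty caps taken from the table; the key names and `apply_penalties` function are illustrative, not a harness API):

```python
# Score caps per anti-pattern, taken from the table above.
# Keys are illustrative identifiers for the named anti-patterns.
PENALTY_CAPS = {
    "demo_without_discovery":    ("demo_completeness", 2),
    "scope_creep_poc":           ("technical_validation", 2),
    "vaporware_demo":            ("demo_completeness", 0),
    "undocumented_architecture": ("solution_documentation", 1),
    "integration_handwave":      ("integration_clarity", 2),
    "customer_data_leak":        ("solution_documentation", 0),
}

def apply_penalties(scores: dict[str, int], detected: list[str]) -> dict[str, int]:
    """Cap the affected criterion's score for each detected anti-pattern."""
    capped = dict(scores)
    for pattern in detected:
        criterion, cap = PENALTY_CAPS[pattern]
        capped[criterion] = min(capped[criterion], cap)
    return capped

capped = apply_penalties(
    {"demo_completeness": 4, "technical_validation": 3,
     "solution_documentation": 3, "integration_clarity": 2},
    ["vaporware_demo"],
)
# demo_completeness is capped at 0; the other criteria are unchanged
```

Using `min` rather than subtraction matches the table's wording: a detected anti-pattern bounds the score from above regardless of how strong the rest of the evidence is.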
## Security Considerations

Domain-specific security guidance for sales engineering deliverables. Applies when `data_sensitivity` in spec.md is anything other than `none`.

**Data Sensitivity:**

- Customer environment data (architecture diagrams, network topologies, security configurations) obtained during technical discovery or POC must be classified as confidential
- POC data sets containing customer production data must be handled per the customer's data classification policy; synthetic data preferred over production data
- Performance test results revealing customer infrastructure capacity must not be shared outside the authorized engagement team

**Access Control:**

- Solution design documents containing customer-specific integration details must be restricted to the SE team and authorized project stakeholders
- POC environments with access to customer systems must use dedicated credentials with minimum necessary privileges; credentials must not be stored in code or shared documents
- Demo environments must be isolated from customer production systems; no persistent access retained after engagement concludes

**Confidentiality:**

- Customer architecture details and integration requirements must not be referenced in other customer engagements or marketing materials
- Competitive bake-off results, scoring, and evaluation criteria shared by the buyer must be treated as confidential per the engagement terms
- Technical discovery notes containing customer pain points, system limitations, or security vulnerabilities must be restricted to the account team