Post-merge validation. Checks that the merged PR satisfies the ACs and test plan for the story. Produces a DoD artefact recording AC coverage, any deviations, and metric signal status. Use when a PR is merged and someone says "mark as done", "definition of done", "validate the story", or "check what shipped". Requires merged PR, story artefact, test plan, and DoR artefact.
Before asking anything, verify the entry conditions: the PR is merged, and the story artefact, test plan, and DoR artefact all exist.

If any condition is not met:
❌ Entry condition not met: [specific issue - e.g. "PR is not yet merged. Run this after merge, not before."]
Run /workflow to see the current pipeline state.
State what was found:
Story: [story title]
PR: [ref] - merged [date]
ACs: [n] Planned tests: [n]
Running definition-of-done check. Ready? Reply: yes - or specify a different story/PR
For each AC in the story, verify the merged code satisfies it. Reference specific test results or observable behaviour where possible.
Use the AC coverage table format from templates/definition-of-done.md.
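The authoritative column layout lives in templates/definition-of-done.md (not reproduced here); as a sketch, assuming columns for the AC, its status, and the supporting evidence:

```markdown
| AC  | Criterion                 | Status | Evidence                          |
| --- | ------------------------- | ------ | --------------------------------- |
| AC1 | [what the story promised] | ✅     | [test name or observed behaviour] |
| AC2 | [what the story promised] | ❌     | [what is missing or deviates]     |
```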
A deviation is any difference between implemented behaviour and the AC - even if minor. Deviations are not failures, but they must be recorded.
If any AC is ❌:
❌ AC[n] not satisfied: [description]
What do you want to do?
- Create a follow-up story to address it
- Accept the gap and record it in /decisions as RISK-ACCEPT
- Reopen the PR - this should have been caught before merge
Reply: 1, 2, or 3
Verify the merged PR did not implement anything in the story's or epic's out-of-scope section.
If a violation is found:
⚠️ Scope deviation: [behaviour] was explicitly out of scope.
This is recorded for /trace and may need a follow-up story. Acknowledge and continue? Reply: yes - I'll note it / no - this needs to be reverted
Confirm the tests from the test plan were implemented and are passing in CI.
Coverage gap audit: If the test plan contains any CSS-layout-dependent gaps, confirm each one was risk-accepted in /decisions (decisions.md).

If any tests were not implemented:
⚠️ Test gap: [test name] was not implemented. Risk: [which AC is now less covered]
Accept this gap?
- Yes - log in /decisions as RISK-ACCEPT
- No - create a follow-up to implement it
Reply: 1 or 2
Check the feature-level NFR profile at artefacts/[feature]/nfr-profile.md (if it exists).
If it does not exist, fall back to individual story NFR fields.
For each NFR in the profile (or story), confirm it was addressed.
NFR categories to check:
If any NFR has no evidence:
⚠️ NFR not evidenced: [NFR description] What evidence exists that this was addressed?
Reply: describe evidence — or "not addressed, I'll log it"
Update the NFR profile's status if all NFRs in the profile are verified:
Status: Active → Verified at [date] in nfr-profile.md

If no NFR profile exists and no story-level NFRs:
✅ NFR check: No NFRs defined — confirmed not applicable at [date]
For each metric in the feature's metrics array whose contributingStories list includes this story's slug, record a signal: on-track / at-risk / off-track / not-yet-measured (see definitions below).

[Metric name]
Signal: [on-track / at-risk / off-track / not-yet-measured]
Evidence: [observed result, or "Measurement not yet possible - [reason]"]
Date measured: [date, or null]
Signal definitions:
- on-track - current result is within acceptable range of the target
- at-risk - partial progress but below minimum validation signal
- off-track - result is clearly not trending toward the target
- not-yet-measured - measurement is not yet possible (e.g. no real sessions run yet)

This section does not claim success - it records what is now observable.
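A filled-in example of this record (the metric name and values are hypothetical, for illustration only):

```text
Checkout conversion rate
Signal: not-yet-measured
Evidence: Measurement not yet possible - no real sessions run yet
Date measured: null
```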
State write for metrics: After capturing signals, update the feature's metrics array in pipeline-state.json:
- Set metrics[x].signal to the determined value
- Set metrics[x].evidence to the evidence string (or null if not yet measured)
- Set metrics[x].lastMeasured to today's date (ISO 8601) if a measurement was taken, else leave null

The DoD artefact conforms to .github/templates/definition-of-done.md.
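For illustration, a single metrics entry might look like this after the write - the field names (signal, evidence, lastMeasured, contributingStories) come from this skill, while the metric name, values, and surrounding structure are assumptions:

```json
{
  "name": "activation-rate",
  "contributingStories": ["story-slug"],
  "signal": "on-track",
  "evidence": "Observed behaviour matches target in staging sessions",
  "lastMeasured": "2025-01-15"
}
```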
Save to artefacts/[feature]/dod/[story-slug]-dod.md.
Definition of done: [COMPLETE / COMPLETE WITH DEVIATIONS / INCOMPLETE] ✅
ACs satisfied: [n/n]
Deviations: [None / n recorded]
Test gaps: [None / n gaps]
[If COMPLETE WITH DEVIATIONS or INCOMPLETE:] Follow-up actions: [list]
Ready to run /release when all stories in this feature are DoD-complete? Reply: yes - or there are more stories to process first
Mandatory. Do not close this skill or produce a closing summary without writing these fields. Confirm the write in your closing message: "Pipeline state updated ✅."
Update .github/pipeline-state.json in the project repository when the DoD artefact is saved:
- stage: "definition-of-done", dodStatus: "complete", prStatus: "merged", health: "green", updatedAt: [now]
- If COMPLETE: releaseReady: true
- If COMPLETE WITH DEVIATIONS or INCOMPLETE: releaseReady: false, health: "amber", note the deviation in blocker
- Epic status: if all stories in the epic are dodStatus: "complete", set epic status: "complete"
- Set layoutGapsAtMerge: true if the test plan had any CSS-layout-dependent gaps at merge time; set layoutGapsRiskAccepted: true if a RISK-ACCEPT was recorded in /decisions before coding started
- metrics array:
  - metrics[x].signal ← determined signal value ("on-track" / "at-risk" / "off-track" / "not-yet-measured")
  - metrics[x].evidence ← evidence string or null
  - metrics[x].lastMeasured ← ISO 8601 date string or null
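Putting these fields together, the relevant slice of .github/pipeline-state.json might look like this - only field names mentioned in this skill are used; the overall file shape and the placeholder values are assumptions:

```json
{
  "stage": "definition-of-done",
  "dodStatus": "complete",
  "prStatus": "merged",
  "health": "green",
  "releaseReady": true,
  "updatedAt": "2025-01-15T10:00:00Z",
  "layoutGapsAtMerge": false,
  "layoutGapsRiskAccepted": false,
  "metrics": [
    { "signal": "on-track", "evidence": "...", "lastMeasured": "2025-01-15" }
  ]
}
```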