Command Purpose
Generate a comprehensive Operational Readiness Pack that prepares a service for production operation. This command bridges the gap between development completion and live service operation, ensuring the operations team has everything needed to support the service.
When to Use This Command
Use $arckit-operationalize after completing:
Requirements ($arckit-requirements) - for SLA targets
Architecture diagrams ($arckit-diagram) - for component inventory
HLD/DLD review ($arckit-hld-review or $arckit-dld-review) - for technical details
Data model ($arckit-data-model) - for data dependencies
Run this command before go-live to ensure operational readiness. It complements $arckit-servicenow (which focuses on ITSM tooling); this command focuses on operational practices and documentation.
User Input
$ARGUMENTS
Parse the user input for:
Service/product name
Service tier (Critical/Important/Standard)
Support model preference (24/7, follow-the-sun, business hours)
Specific operational concerns
Target go-live date (if mentioned)
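The parsing step above can be sketched as follows. The field names and patterns are illustrative assumptions only, since the command normally interprets $ARGUMENTS as free text:

```python
import re

def parse_arguments(arguments: str) -> dict:
    """Extract operational parameters from free-text user input.

    A sketch: real input is interpreted as free text, not matched
    with regexes, and the service name is not reliably extractable
    this way, so it is left as None here.
    """
    parsed = {
        "service_name": None,
        "service_tier": None,
        "support_model": None,
        "go_live_date": None,
    }
    tier = re.search(r"\b(Critical|Important|Standard)\b", arguments, re.I)
    if tier:
        parsed["service_tier"] = tier.group(1).capitalize()
    model = re.search(r"\b(24/7|follow-the-sun|business hours)\b", arguments, re.I)
    if model:
        parsed["support_model"] = model.group(1).lower()
    date = re.search(r"\b(\d{4}-\d{2}-\d{2})\b", arguments)
    if date:
        parsed["go_live_date"] = date.group(1)
    return parsed
```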
Instructions
Phase 1: Read Available Documents
Note: Before generating, scan projects/ for existing project directories. For each project, list all ARC-*.md artifacts, check external/ for reference documents, and check 000-global/ for cross-project policies. If no external docs exist but they would improve output, ask the user.
TRAC (Traceability Matrix) — Extract: Requirements-to-component mapping for runbook coverage
DATA (Data Model) — Extract: Data dependencies, backup requirements, retention policies
STKE (Stakeholder Analysis) — Extract: Stakeholder expectations, SLA requirements, support model preferences
IMPORTANT: Do not proceed until you have read the requirements and architecture files.
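The scanning step can be sketched as a directory walk, assuming the projects/{project}/ARC-*.md and external/ layout described above (the actual command reads file contents, not just names):

```python
from pathlib import Path

def discover_artifacts(root: str = "projects") -> dict:
    """List per-project ARC-*.md artifacts and external reference docs.

    A sketch of the discovery pass only; 000-global/ policies would be
    scanned the same way.
    """
    found = {}
    for project in sorted(Path(root).iterdir()):
        if not project.is_dir():
            continue
        external = project / "external"
        found[project.name] = {
            "artifacts": sorted(p.name for p in project.glob("ARC-*.md")),
            "external": sorted(p.name for p in external.iterdir())
                        if external.is_dir() else [],
        }
    return found
```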
Phase 1b: Read external documents and policies
Read any external documents listed in the project context (external/ files) — extract SLA targets, support tier definitions, escalation procedures, DR/BCP plans, on-call rotas
Read any enterprise standards in projects/000-global/external/ — extract enterprise operational standards, SLA frameworks, cross-project support model benchmarks
If no external operational docs found but they would improve the readiness pack, ask: "Do you have any existing SLA documents, support procedures, or DR/BCP plans? I can read PDFs directly. Place them in projects/{project-dir}/external/ and re-run, or skip."
Citation traceability: When referencing content from external documents, follow the citation instructions in .arckit/references/citation-instructions.md. Place inline citation markers (e.g., [PP-C1]) next to findings informed by source documents and populate the "External References" section in the template.
Section 14: Handover Checklist
Comprehensive checklist for production handover:
All runbooks written and reviewed
Monitoring dashboards created and tested
Alerts configured and tested
On-call rotation staffed
DR tested within last 6 months
Backups verified and restore tested
Support team trained
Escalation contacts confirmed
Access provisioned for support team
Documentation in knowledge base
SLOs agreed with stakeholders
VMS enrolled and scanning active (UK Government)
Vulnerability remediation SLAs documented and agreed
Critical vulnerability remediation runbook tested
Section 15: Operational Metrics
MTTR (Mean Time to Recovery) target
MTBF (Mean Time Between Failures) target
Change failure rate target
Deployment frequency target
Toil percentage target (< 50%)
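MTTR and MTBF targets can be checked against incident history; a minimal sketch with hypothetical incident data:

```python
from datetime import datetime, timedelta

# Hypothetical incident log: (detected, recovered) pairs.
incidents = [
    (datetime(2025, 1, 3, 9, 0), datetime(2025, 1, 3, 9, 45)),
    (datetime(2025, 1, 17, 22, 10), datetime(2025, 1, 18, 0, 10)),
]
window = timedelta(days=30)  # observation period

total_downtime = sum((end - start for start, end in incidents), timedelta())

# MTTR: average time from detection to recovery.
mttr = total_downtime / len(incidents)

# MTBF: uptime in the window divided by the number of failures.
mtbf = (window - total_downtime) / len(incidents)
```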
Section 16: UK Government Considerations (if applicable)
GDS Service Standard Point 14 (operate a reliable service)
NCSC operational security guidance
NCSC Vulnerability Monitoring Service (VMS) enrollment and benchmark compliance
Cross-government service dependencies (GOV.UK Notify, GOV.UK Pay, GOV.UK One Login)
Cabinet Office Technology Code of Practice compliance
Section 17: Traceability
Map each operational element to source requirements
Link runbooks to architecture components
Connect SLOs to stakeholder expectations
Phase 4: Validation
Before saving, verify:
Completeness:
Every NFR has corresponding SLO/SLI
Every major component has a runbook
DR/BCP procedures documented
On-call rotation defined
Escalation paths clear
Training plan exists
Quality:
Runbooks have specific commands (not generic placeholders)
Contact details specified (even if placeholder format)
RTO/RPO align with NFRs
Support model matches service tier
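The completeness checks above amount to coverage queries over the traceability data; a sketch with hypothetical IDs:

```python
# Hypothetical traceability data: which NFRs each SLO covers, and
# which components have runbooks.
nfrs = {"NFR-001", "NFR-002", "NFR-003"}
slo_coverage = {"SLO-availability": {"NFR-001"}, "SLO-latency": {"NFR-002"}}
components = {"api-gateway", "payments-db"}
runbooks = {"api-gateway"}

# Anything left in these sets is a gap to fix before saving the pack.
nfrs_without_slo = sorted(nfrs - set().union(*slo_coverage.values()))
components_without_runbook = sorted(components - runbooks)
```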
Phase 5: Output
Before writing the file, read .arckit/references/quality-checklist.md and verify all Common Checks plus the OPS per-type checks pass. Fix any failures before proceeding.
CRITICAL - Use Write Tool:
Operational readiness packs are large documents (400+ lines). Use the Write tool to save the document to avoid token limits.
Save the file to projects/{project-name}/ARC-{PROJECT_ID}-OPS-v1.0.md
Provide summary to user:
✅ Operational Readiness Pack generated!
**Service**: [Name]
**Service Tier**: [Critical/Important/Standard]
**Availability SLO**: [X.XX%] (Error budget: [X] min/month)
**RTO**: [X hours] | **RPO**: [X hours]
**Support Model**:
- [24/7 / Business Hours]
- On-call: [Yes/No]
- L1 → L2 → L3 escalation defined
**Runbooks Created**: [N] runbooks
- Service Start/Stop
- Health Check Failures
- High Error Rate
- [etc.]
**DR Strategy**: [Active-Passive / etc.]
- Last DR test: [Date or "Not yet tested"]
**Handover Readiness**: [X/Y] checklist items complete
**File**: projects/{project-name}/ARC-{PROJECT_ID}-OPS-v1.0.md
**Next Steps**:
1. Review SLOs with service owner
2. Complete handover checklist items
3. Schedule DR test if not done recently
4. Train operations team
5. Conduct operational readiness review meeting
Flag gaps:
Missing NFRs (defaulted values used)
Untested DR procedures
Incomplete runbooks
Missing on-call coverage
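The error budget shown in the summary follows directly from the availability SLO:

```python
def error_budget_minutes(slo_percent: float, days: int = 30) -> float:
    """Allowed downtime, in minutes per period, for a given availability SLO."""
    return (1 - slo_percent / 100) * days * 24 * 60

# A 99.9% SLO allows roughly 43.2 minutes of downtime per 30-day month;
# 99.99% tightens that to about 4.3 minutes.
```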
Error Handling
If Requirements Not Found
"⚠️ Cannot find requirements document (ARC-{PROJECT_ID}-REQ-*.md). Please run $arckit-requirements first. Operational readiness requires NFRs for SLO definitions."
Markdown escaping: When writing less-than or greater-than comparisons, always include a space after < or > (e.g., < 3 seconds, > 99.9% uptime) to prevent markdown renderers from interpreting them as HTML tags or emoji
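A post-processing pass in this spirit can be sketched as a regex substitution. This assumes the generated prose contains no literal HTML (the pattern would mangle real tags), and it deliberately leaves <= and >= untouched:

```python
import re

def space_comparators(text: str) -> str:
    """Insert a space after a bare < or > followed by a non-space character.

    Assumption: the text has no intentional HTML tags. '<=' and '>='
    are excluded via the lookahead so operators are not split.
    """
    return re.sub(r"([<>])(?=[^\s=])", r"\1 ", text)
```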