Testing toolkit for the FHIR Writing Clinical Notes specification at connectathons. Use when the user needs to test FHIR DocumentReference write operations, validate conformance with the Writing Clinical Notes spec, or participate in a FHIR connectathon for clinical notes. Includes templates, OAuth helpers, and automated test scenarios for both provider-authored and patient-asserted notes.
Comprehensive toolkit for testing the FHIR Writing Clinical Notes specification at connectathons and in development environments.
DO:
- Use the `localize-document.ts` CLI tool OR import the `localizeDocumentReference()` function to generate test documents - do NOT write custom scripts
- Use the `request.ts` pattern with the `fhir-request.ts` library for all API calls
- Run scripts with `bun path/to/script.ts`
- Keep everything for a server in `debug/<server-name>/` - summaries, notes, observations, temporary files
- Write `SUMMARY.md` and `observations.md` files in each server's debug directory

DON'T:
- Call `fetch()` directly - always use the `fhir-request.ts` library
- Scatter server files outside that server's `debug/<server-name>/` directory

Before proceeding with any testing tasks, the agent MUST:
Check if project structure already exists:
```bash
# Check for existing setup
if [ -f ".gitignore" ] && [ -d ".fhir-configs" ]; then
  echo "✓ Project structure already exists"
else
  echo "Setting up project structure..."

  # Copy .gitignore to exclude sensitive configs and debug logs
  cp .claude/skills/write-clinical-notes/assets/project/.gitignore .

  # Create directory structure
  mkdir -p .fhir-configs debug localized

  # Copy fhir-request library to debug/
  cp .claude/skills/write-clinical-notes/assets/lib/fhir-request.ts debug/

  echo "✓ Project structure created"
fi
```
This creates:
- `.gitignore` - Excludes `.fhir-configs/`, `debug/`, and `localized/`
- `.fhir-configs/` - Will store server connection configs
- `debug/` - Will store request/response logs organized by server and timestamp
- `localized/` - Will store template-generated resources organized by server
- `FHIR-TESTING.md` - Documentation about the testing structure

Note: If files already exist (from previous testing), they will NOT be overwritten.
The agent has access to the complete specification and test scenarios embedded below.
@references/spec.md - read this in full
@references/scenarios.md - read this in full
CRITICAL: Keep ALL server-related work in server-specific debug directories
When testing a server, create a dedicated directory under debug/ for that server and keep EVERYTHING related to that server in its directory:
What goes in the server's debug directory:
- Timestamped request/response logs for each test run
- `SUMMARY.md` and `observations.md`
- Temporary files and scratch notes
- A copy of the `fhir-request.ts` library

Why this matters: keeping everything per-server makes each investigation self-contained, easy to review, and impossible to confuse with results from other servers.
Setup (one time per server):
```bash
# Example: Setting up for "epic" server
mkdir -p debug/epic
cp .claude/skills/write-clinical-notes/assets/lib/fhir-request.ts debug/epic/
```
Example directory structure:
```
debug/
├── epic/                                       # Epic server investigation
│   ├── fhir-request.ts                         # Shared library
│   ├── SUMMARY.md                              # Overall findings for Epic
│   ├── observations.md                         # Notes during testing
│   ├── 2025-11-06T10-15-00-scenario1-create/   # Test runs
│   │   ├── request.ts
│   │   ├── request-body.json
│   │   ├── response-metadata.json
│   │   └── response-body.json
│   └── 2025-11-06T10-20-00-scenario2-conditional/
│       ├── request.ts
│       ├── request-body.json
│       ├── response-metadata.json
│       └── response-body.json
└── smart-demo/                                 # SMART server investigation
    ├── fhir-request.ts
    ├── SUMMARY.md
    ├── observations.md
    └── [timestamped test runs...]
```
Writing summaries and notes:
```bash
# Write investigation notes in the server's directory
cat > debug/epic/observations.md << 'EOF'
# Epic Server Testing Observations

## Authentication
- Using Epic-Client-ID header required
- OAuth flow successful with PKCE

## DocumentReference Support
- Accepts text/plain: ✅
- Accepts application/pdf: ✅
- Conditional create: Testing...
EOF

# Write final summary when testing complete
cat > debug/epic/SUMMARY.md << 'EOF'
# Epic Server Test Results

## Overview
Tested: 2025-11-06
Server: https://connectathon.epic.com/...

## Results
[Summary of all tests...]
EOF
```
For each FHIR request:
Create debug directory with timestamp and purpose:
```bash
TIMESTAMP=$(date -u +"%Y-%m-%dT%H-%M-%S")
DEBUG_DIR="debug/smart-demo-open/${TIMESTAMP}-scenario1-create"
mkdir -p "$DEBUG_DIR"
```
Write request.ts in that directory:
```typescript
import { execute } from '../fhir-request.ts';

await execute({
  method: "POST",
  path: "/DocumentReference",
  bodyFile: "localized/smart-demo-open/consultation-note.json",
  headers: {
    // Optional: additional headers like "If-None-Exist": "identifier=system|value"
  },
  purpose: "Scenario 1: Basic document creation",
  configName: "smart-demo-open", // REQUIRED if multiple configs exist
  callerDir: import.meta.dir // CRITICAL: Tells library where to write output files
});
```
Config Selection Behavior:
- If only one config exists in `.fhir-configs/`, `configName` is optional (auto-selected)
- If multiple configs exist, `configName` is required (explicit selection)
- The config name is the filename without `.json` (e.g., `smart-demo-open.json` → `"smart-demo-open"`)

Run from PROJECT ROOT (not from the debug directory):

```bash
bun run debug/smart-demo-open/2025-11-03T18-17-25-scenario1-create/request.ts
```
Why from project root?
- The library loads configs from `.fhir-configs/` (at the project root)
- The `bodyFile` path is relative to the project root
- `import.meta.dir` in `request.ts` gives the library the debug directory to write to

Output files are automatically created as siblings to `request.ts`:
- `response-metadata.json` - HTTP status, headers, timing
- `response-body.json` (or `.txt`) - Response body

CRITICAL PATH RULES:
✅ DO THIS:

```bash
# Always run from project root
cd /path/to/project
bun run debug/server-name/timestamp-purpose/request.ts
```

❌ DON'T DO THIS:

```bash
# Don't cd into debug directory
cd debug/server-name/timestamp-purpose
bun request.ts  # Will fail to find .fhir-configs
```
Key Points:
- `fhir-request.ts` lives at `debug/{server-name}/fhir-request.ts` (one per server)
- `request.ts` files live at `debug/{server-name}/{timestamp}-{purpose}/request.ts`
- Always pass `callerDir: import.meta.dir` in your `execute()` call
- Always run with `bun run debug/path/to/request.ts` from the project root
- The library uses `callerDir` to write response files as siblings to `request.ts`
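After a run completes, the captured metadata can be checked programmatically. This is a minimal sketch; the `status` field name is an assumption about the `response-metadata.json` format written by `fhir-request.ts`, so verify against an actual captured file first.

```typescript
import { readFileSync } from "node:fs";

// Read a run's captured metadata and flag non-2xx results.
// NOTE: the `status` field name is an assumption about the
// response-metadata.json format written by fhir-request.ts.
function assertSuccess(runDir: string): number {
  const meta = JSON.parse(readFileSync(`${runDir}/response-metadata.json`, "utf8"));
  if (meta.status < 200 || meta.status >= 300) {
    throw new Error(`Request failed: HTTP ${meta.status}`);
  }
  return meta.status;
}
```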
Server configurations are stored in .fhir-configs/ in the project root (NOT in the skill directory). This makes them accessible across different skill invocations and preserves them between sessions.
The easiest way to get started is to use the bun setup script:
```bash
bun .claude/skills/write-clinical-notes/assets/config/setup.ts
```
The setup script will:
- Create the `.fhir-configs/` directory in the project root
- Collect server connection and authentication details
- Save the config and print `📋 SELECTED_CONFIG: {name}` on success

The setup wizard supports four authentication modes:
Mode 1: Manual Token (fastest for testing)
Mode 2: Open Server (no authentication)
Mode 3: OAuth Public Client
Mode 4: OAuth Confidential Client
After successful setup, the script outputs:
```
📋 SELECTED_CONFIG: {config-name}
```
The agent should:
- Capture the `📋 SELECTED_CONFIG: {config-name}` line from the setup output
- Read the saved configuration from `.fhir-configs/{config-name}.json`

Example configuration file structure:
```json
{
  "name": "SMART Open",
  "fhirBaseUrl": "https://fhir.example.org/r4",
  "accessToken": "eyJ...",
  "patientId": "patient-123",
  "encounterRef": "Encounter/enc-456",
  "mode": "public",
  "savedAt": "2025-11-03T10:00:00.000Z"
}
```
If you prefer not to use the bun script, you can manually create a file in .fhir-configs/ with the structure above.
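For example, a short Bun/Node script can write an equivalent config. This is a sketch only; all values shown are placeholders, not real credentials.

```typescript
import { mkdirSync, writeFileSync } from "node:fs";
import { join } from "node:path";

// Placeholder values -- substitute your server's actual details.
const config = {
  name: "My Open Server",
  fhirBaseUrl: "https://fhir.example.org/r4",
  accessToken: "",                 // leave empty for open servers
  patientId: "patient-123",
  encounterRef: "Encounter/enc-456",
  mode: "open",
  savedAt: new Date().toISOString(),
};

// The filename (minus .json) becomes the configName you pass to execute().
const slug = config.name.toLowerCase().replace(/\s+/g, "-"); // "my-open-server"
mkdirSync(".fhir-configs", { recursive: true });
writeFileSync(join(".fhir-configs", `${slug}.json`), JSON.stringify(config, null, 2));
console.log(`Wrote .fhir-configs/${slug}.json`);
```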
Once setup is complete, the agent should:
Extract the config name from setup output and read the configuration file:
```bash
# Extract config name from setup script output
CONFIG_NAME=$(grep "📋 SELECTED_CONFIG:" setup_output.txt | cut -d: -f2 | tr -d ' ')

# Read the selected configuration
cat ".fhir-configs/${CONFIG_NAME}.json"
```
Make FHIR requests using the request.ts pattern:
For every FHIR API call, use the request.ts template approach (see Debug Logging section above).
Example workflow:
```bash
# 1. Determine server name from config
SERVER_NAME=$(jq -r '.name' ".fhir-configs/SMART Demo Open.json" | tr '[:upper:] ' '[:lower:]-')
# Result: "smart-demo-open"

# 2. Create debug directory with timestamp
TIMESTAMP=$(date -u +"%Y-%m-%dT%H-%M-%S")
DEBUG_DIR="debug/${SERVER_NAME}/${TIMESTAMP}-scenario1-create"
mkdir -p "$DEBUG_DIR"

# 3. Write request.ts using the fhir-request library
cat > "$DEBUG_DIR/request.ts" <<'EOF'
import { execute } from '../fhir-request.ts';

await execute({
  method: "POST",
  path: "/DocumentReference",
  bodyFile: "localized/smart-demo-open/consultation-note.json",
  purpose: "Scenario 1: Basic document creation",
  configName: "smart-demo-open", // REQUIRED if multiple configs exist
  callerDir: import.meta.dir     // CRITICAL: pass the directory to write responses
});
EOF

bun run "$DEBUG_DIR/request.ts"
```
The script automatically:
- Loads your FHIR config from `.fhir-configs/`
- Makes the fetch request with proper auth
- Writes `response-metadata.json` and `response-body.json` as siblings to `request.ts`
- Prints summary with status, resource ID, location header
## Template Localization
**⚠️ CRITICAL: DO NOT write custom localization scripts! Use the provided `localize-document.ts` utility.**
Templates are located in `assets/templates/` with placeholder variables like `{{PATIENT_ID}}`, `{{BASE64_CONTENT}}`, etc. These templates are **NOT valid JSON** until placeholders are replaced.
### How to Localize Documents (REQUIRED APPROACH)
**ALWAYS use the provided localization utility:**
```bash
bun .claude/skills/write-clinical-notes/assets/lib/localize-document.ts \
--type=<TYPE> \
--patient-id=<PATIENT_ID> \
--server=<SERVER_NAME> \
[--patient-name="Patient Name"] \
[--author-reference="Practitioner/id"] \
[--author-display="Dr. Name"]
```

Required by Spec:
- `consultation` - Consultation note (`text/plain; charset=utf-8`)
- `progress` - Progress note (`text/plain; charset=utf-8`)
- `pdf` - Consultation note (`application/pdf`)

Additional Formats:
- `cda` - Discharge summary (`application/cda+xml`) - Proper C-CDA
- `xhtml` - Discharge summary (`application/xhtml+xml`) - XHTML rich text
- `html` - Progress note (`text/html; charset=utf-8`)
- `patient-asserted` - Patient-authored log with PATAST security tag
- `large` - Large file test (~5 MiB)

```bash
# Example 1: No encounter reference (encounter is optional!)
bun .claude/skills/write-clinical-notes/assets/lib/localize-document.ts \
  --type=pdf \
  --patient-id=eWhK7w46yHKVkBetDW2-Brg3 \
  --server=epic

# Example 2: With resolvable encounter reference
bun .claude/skills/write-clinical-notes/assets/lib/localize-document.ts \
  --type=pdf \
  --patient-id=eWhK7w46yHKVkBetDW2-Brg3 \
  --server=epic \
  --encounter-reference="Encounter/eR24JiO5hqr0yswk.k5BXlQ3" \
  --encounter-display="Office Visit - Jan 15"

# Example 3: With contained encounter (must provide both reference and contained JSON)
bun .claude/skills/write-clinical-notes/assets/lib/localize-document.ts \
  --type=pdf \
  --patient-id=eWhK7w46yHKVkBetDW2-Brg3 \
  --server=epic \
  --encounter-reference="#e1" \
  --encounter-contained='{"resourceType":"Encounter","id":"e1","status":"finished","class":{"system":"http://terminology.hl7.org/CodeSystem/v3-ActCode","code":"AMB"},"subject":{"reference":"Patient/eWhK7w46yHKVkBetDW2-Brg3"},"period":{"start":"2025-01-15T10:00:00Z","end":"2025-01-15T11:00:00Z"}}'

# Output in all cases: localized/epic/pdf-note.json (ready to POST)
```

Notes:
- When using a contained encounter, the reference must use the local `#e1` format
- Output is written to `localized/<server>/<type>-note.json`

Required:
- `-t, --type <type>` - Document type (see list above)
- `-p, --patient-id <id>` - Patient resource ID
- `-s, --server <name>` - Server name for organizing output

Optional:
- `--patient-name <name>` - Default: "Test Patient"
- `--author-reference <ref>` - Default: "Practitioner/example"
- `--author-display <name>` - Default: "Dr. Example Provider"
- `--encounter-reference <ref>` - Encounter reference (optional - omit to exclude encounter entirely)
- `--encounter-display <text>` - Encounter display text (only used with `--encounter-reference`)
- `--encounter-contained <json>` - JSON string of contained Encounter resource (use with `--encounter-reference="#id"`)
- `--identifier-system <system>` - Default: "https://example.com/fhir-test"
- `-o, --output-dir <dir>` - Default: `localized/<server>`

The localization functionality is also available as an importable library:
```typescript
import { localizeDocumentReference } from './.claude/skills/write-clinical-notes/assets/lib/localize-document.ts';
import { execute } from './fhir-request.ts';

// Generate and POST in memory
const docRef = await localizeDocumentReference({
  type: 'consultation',
  patientId: 'patient-123',
  server: 'smart',
  writeToFile: false
});

await execute({
  method: 'POST',
  path: '/DocumentReference',
  body: docRef,
  purpose: 'Create consultation note',
  callerDir: import.meta.dir
});

// With resolvable encounter
const docRef2 = await localizeDocumentReference({
  type: 'pdf',
  patientId: 'patient-123',
  server: 'epic',
  encounterReference: 'Encounter/visit-789',
  encounterDisplay: 'Office Visit',
  writeToFile: false
});

// With contained encounter
const docRef3 = await localizeDocumentReference({
  type: 'html',
  patientId: 'patient-123',
  server: 'smart',
  encounterReference: '#e1',
  encounterContained: {
    resourceType: 'Encounter',
    id: 'e1',
    status: 'finished',
    class: { system: 'http://terminology.hl7.org/CodeSystem/v3-ActCode', code: 'AMB' },
    subject: { reference: 'Patient/patient-123' },
    period: { start: '2025-01-15T10:00:00Z', end: '2025-01-15T11:00:00Z' }
  },
  writeToFile: false
});
```
Key Points:
- Use `writeToFile: false` for in-memory use (no file created)
- Pass `encounterContained` as an object when using the library (the CLI requires a JSON string)
- Use the `body` parameter in `execute()` for in-memory DocumentReference objects

❌ DON'T:
- Write custom localization scripts

✅ DO:
- Use the `localize-document.ts` command-line tool for simple cases
- Import `localizeDocumentReference()` when writing code

Templates contain placeholders that the script replaces:
Patient/Subject:
- `{{PATIENT_ID}}` - Patient resource ID
- `{{PATIENT_NAME}}` - Patient display name
- `{{PATIENT_GIVEN_NAME}}` - Given name (for CDA)
- `{{PATIENT_FAMILY_NAME}}` - Family name (for CDA)

Author:
- `{{AUTHOR_REFERENCE}}` - Practitioner reference
- `{{AUTHOR_DISPLAY}}` - Author display name
- `{{AUTHOR_NAME}}` - Full name in content
- `{{AUTHOR_GIVEN_NAME}}` - Given name (for CDA)
- `{{AUTHOR_FAMILY_NAME}}` - Family name (for CDA)
- `{{AUTHOR_TITLE}}` - Professional title (MD, etc.)

Encounter:
- `{{ENCOUNTER_REFERENCE}}` - Encounter reference (or `#e1`)
- `{{ENCOUNTER_DISPLAY}}` - Encounter display text
- `{{PERIOD_START}}` - Clinical period start
- `{{PERIOD_END}}` - Clinical period end

Content:
- `{{BASE64_CONTENT}}` - Base64-encoded note content
- `{{CONTENT_SIZE}}` - Size in bytes (as a number, not a string)
- `{{CONTENT_TYPE}}` - MIME type

Metadata:
- `{{IDENTIFIER_SYSTEM}}` - System for note identifier
- `{{IDENTIFIER_VALUE}}` - Unique identifier value
- `{{CURRENT_TIMESTAMP}}` - ISO 8601 timestamp
- `{{CURRENT_DATE}}` - Date (YYYY-MM-DD)
- `{{CURRENT_TIME}}` - Time (HH:MM)

If you absolutely must localize manually:
Determine server directory name from config (e.g., "smart-launcher", "epic-sandbox")
Create server-specific directory:

```bash
mkdir -p "localized/${SERVER_NAME}"
```
Read template file from assets/templates/
Read sample content from assets/sample-content/ (if using text)
Base64 encode content:

```bash
cat assets/sample-content/consultation-note.txt | base64 -w 0
```
Check for missing context and ask the user for guidance when needed.

IMPORTANT: If required context is missing or ambiguous, ASK THE USER how to proceed:
- No encounter ID in config? Ask: "I don't see an encounter ID. Should I omit the encounter, reference an existing Encounter, or use a contained Encounter?"
- No author/practitioner reference? Ask: "What should I use for the author reference?"
- Patient name not specified? Ask: "Should I fetch the patient name from the server, use a placeholder, or omit it?"
- Missing identifier system? Ask: "What identifier system should I use for note identifiers? (e.g., your app's URL)"

Don't guess or make assumptions about clinical data. User input ensures tests are meaningful and aligned with their testing goals.
Replace all placeholders with values from the configuration and user-provided context
Save to server-specific directory:

```bash
# Save localized file
echo "$localized_json" > "localized/${SERVER_NAME}/consultation-note.json"
```
Validate JSON before submission
This keeps each server's localized resources separate and organized.
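If you do go the manual route, the replace-and-validate steps can be sketched in a few lines. The inline template below is a stripped-down illustration, not one of the real templates in `assets/templates/` -- the provided `localize-document.ts` remains the required approach for actual testing.

```typescript
// Manual localization sketch: fill {{PLACEHOLDER}} tokens, then validate JSON.
// Illustrative only -- prefer the provided localize-document.ts utility.
function fillPlaceholders(template: string, values: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in values ? values[key] : match // leave unknown placeholders visible
  );
}

// Stripped-down stand-in for a real template (real ones live in assets/templates/).
const template =
  '{"resourceType":"DocumentReference","subject":{"reference":"Patient/{{PATIENT_ID}}"},' +
  '"content":[{"attachment":{"contentType":"{{CONTENT_TYPE}}","data":"{{BASE64_CONTENT}}","size":{{CONTENT_SIZE}}}}]}';

const content = "Consultation note body...";
const filled = fillPlaceholders(template, {
  PATIENT_ID: "patient-123",
  CONTENT_TYPE: "text/plain; charset=utf-8",
  BASE64_CONTENT: Buffer.from(content).toString("base64"),
  CONTENT_SIZE: String(Buffer.byteLength(content)), // unquoted in template -> JSON number
});

JSON.parse(filled); // always validate before submission
```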
The skill includes generator scripts for different content types. Run these from the skill's assets/sample-content/ directory:
Text Files (pre-created):
- `consultation-note.txt` - Plain text consultation note
- `progress-note.txt` - Plain text progress note
- `patient-log.txt` - Patient-authored health log

Generated Content (run these scripts as needed):
- `bun generate-pdf.ts` - Creates `consultation-note.pdf` (minimal valid PDF)
- `bun generate-cda-xml.ts` - Creates `discharge-summary.cda.xml` (proper HL7 C-CDA with structured body)
- `bun generate-xhtml.ts` - Creates `discharge-summary.xhtml` (XHTML rich text, NOT CDA)
- `bun generate-html.ts` - Creates `progress-note.html` (HTML with rich formatting)
- `bun generate-large-file.ts` - Creates `large-note.txt` (~5 MiB text file)

Why generate programmatically? On-demand generation keeps large and binary artifacts out of version control while guaranteeing valid, reproducible content.

Example usage:

```bash
cd .claude/skills/write-clinical-notes/assets/sample-content
bun generate-pdf.ts
# Creates consultation-note.pdf in current directory
```
Execute scenarios in order. Each builds on previous validations. The complete scenario details are embedded above in the "Connectathon Test Scenarios" section.
### Scenario 1: Basic Document Creation

Objective: Verify basic create and retrieval
```bash
# 1. Localize template for this server using the localization utility
SERVER_NAME="smart-launcher"   # from config
PATIENT_ID="patient-123"       # from config

bun .claude/skills/write-clinical-notes/assets/lib/localize-document.ts \
  --type=consultation \
  --patient-id="${PATIENT_ID}" \
  --server="${SERVER_NAME}"
# Output: localized/smart-launcher/consultation-note.json

# 2. POST to server using request.ts pattern
TIMESTAMP=$(date -u +"%Y-%m-%dT%H-%M-%S")
DEBUG_DIR="debug/${SERVER_NAME}/${TIMESTAMP}-scenario1-create"
mkdir -p "$DEBUG_DIR"

# Ensure the shared library exists for this server
[ -f "debug/${SERVER_NAME}/fhir-request.ts" ] || \
  cp .claude/skills/write-clinical-notes/assets/lib/fhir-request.ts "debug/${SERVER_NAME}/"

# Create request.ts for POST
cat > "$DEBUG_DIR/request.ts" << 'EOF'
import { execute } from '../fhir-request.ts';

await execute({
  method: "POST",
  path: "/DocumentReference",
  bodyFile: "localized/smart-launcher/consultation-note.json",
  purpose: "Scenario 1: Basic document creation",
  configName: "smart-launcher", // REQUIRED if multiple configs exist
  callerDir: import.meta.dir
});
EOF

# Run the request (from project root)
bun run "$DEBUG_DIR/request.ts"

# 3. Response is automatically captured in:
#    - response-metadata.json (status, headers, timing)
#    - response-body.json (returned DocumentReference)
#    Expect 201 Created or 202 Accepted

# 4. GET to verify (extract ID from response-body.json first)
DOC_ID=$(jq -r '.id' "$DEBUG_DIR/response-body.json")
GET_DIR="debug/${SERVER_NAME}/${TIMESTAMP}-scenario1-read"
mkdir -p "$GET_DIR"

cat > "$GET_DIR/request.ts" << EOF
import { execute } from '../fhir-request.ts';

await execute({
  method: "GET",
  path: "/DocumentReference/${DOC_ID}",
  purpose: "Scenario 1: Verify created document",
  configName: "${SERVER_NAME}", // REQUIRED if multiple configs exist
  callerDir: import.meta.dir
});
EOF

bun run "$GET_DIR/request.ts"

# 5. Validate response (check response-body.json)
#    - All Must Support elements present
#    - Content matches original
#    - Base64 decodes correctly
```
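The base64 round-trip in step 5 can be checked with a few lines. The strings below are placeholders standing in for the real note content and server response.

```typescript
// Scenario 1, step 5: confirm the returned attachment decodes to the original text.
function contentMatches(
  doc: { content: { attachment: { data: string } }[] },
  original: string
): boolean {
  const decoded = Buffer.from(doc.content[0].attachment.data, "base64").toString("utf8");
  return decoded === original;
}

// Placeholder data standing in for the real note and server response.
const original = "CONSULTATION NOTE\n\nPatient seen today for follow-up.";
const returned = {
  content: [{ attachment: { data: Buffer.from(original).toString("base64") } }],
};

console.log(contentMatches(returned, original)); // true
```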
Success criteria:
- Create returns `201 Created` (or `202 Accepted`)
- GET returns the document with all Must Support elements present
- Attachment content matches the original after base64 decoding
### Scenario 2: Conditional Create (Idempotency)

Objective: Test duplicate prevention
```bash
# 1. Use same localized note from Scenario 1

# 2. POST with If-None-Exist header using request.ts
TIMESTAMP=$(date -u +"%Y-%m-%dT%H-%M-%S")
DEBUG_DIR="debug/${SERVER_NAME}/${TIMESTAMP}-scenario2-conditional"
mkdir -p "$DEBUG_DIR"

# Extract identifier from the localized note
IDENTIFIER_SYSTEM=$(jq -r '.identifier[0].system' "localized/${SERVER_NAME}/consultation-note.json")
IDENTIFIER_VALUE=$(jq -r '.identifier[0].value' "localized/${SERVER_NAME}/consultation-note.json")

cat > "$DEBUG_DIR/request.ts" << EOF
import { execute } from '../fhir-request.ts';

await execute({
  method: "POST",
  path: "/DocumentReference",
  bodyFile: "localized/${SERVER_NAME}/consultation-note.json",
  headers: {
    "If-None-Exist": "identifier=${IDENTIFIER_SYSTEM}|${IDENTIFIER_VALUE}"
  },
  purpose: "Scenario 2: Conditional create (idempotency test)",
  configName: "${SERVER_NAME}", // REQUIRED if multiple configs exist
  callerDir: import.meta.dir
});
EOF

bun run "$DEBUG_DIR/request.ts"

# 3. Check response-metadata.json - Expect 200 OK or 304 Not Modified (not 201)

# 4. Search to confirm only one resource exists
SEARCH_DIR="debug/${SERVER_NAME}/${TIMESTAMP}-scenario2-search"
mkdir -p "$SEARCH_DIR"

cat > "$SEARCH_DIR/request.ts" << EOF
import { execute } from '../fhir-request.ts';

await execute({
  method: "GET",
  path: "/DocumentReference?identifier=${IDENTIFIER_SYSTEM}|${IDENTIFIER_VALUE}",
  purpose: "Scenario 2: Verify no duplicate created",
  configName: "${SERVER_NAME}", // REQUIRED if multiple configs exist
  callerDir: import.meta.dir
});
EOF

bun run "$SEARCH_DIR/request.ts"

# Check response-body.json - should show Bundle with single entry
```
Success criteria:
- The duplicate POST returns `200 OK` (or `304 Not Modified`), not `201 Created`
- Searching by identifier returns a Bundle with exactly one entry
### Scenario 3: Status Correction

Objective: Test entered-in-error workflow
```bash
# 1. Create a note (use Scenario 1)
# Assume DOC_ID is from Scenario 1's response

# 2. Submit partial update using request.ts
TIMESTAMP=$(date -u +"%Y-%m-%dT%H-%M-%S")
DEBUG_DIR="debug/${SERVER_NAME}/${TIMESTAMP}-scenario3-correction"
mkdir -p "$DEBUG_DIR"

# Create minimal update payload
PATIENT_ID=$(jq -r '.patientId' ".fhir-configs/${SERVER_NAME}.json")

cat > "$DEBUG_DIR/update-payload.json" << EOF
{
  "resourceType": "DocumentReference",
  "id": "${DOC_ID}",
  "status": "entered-in-error",
  "subject": {"reference": "Patient/${PATIENT_ID}"}
}
EOF

cat > "$DEBUG_DIR/request.ts" << EOF
import { execute } from '../fhir-request.ts';

await execute({
  method: "PUT",
  path: "/DocumentReference/${DOC_ID}",
  bodyFile: "debug/${SERVER_NAME}/${TIMESTAMP}-scenario3-correction/update-payload.json",
  purpose: "Scenario 3: Mark document as entered-in-error",
  configName: "${SERVER_NAME}", // REQUIRED if multiple configs exist
  callerDir: import.meta.dir
});
EOF

bun run "$DEBUG_DIR/request.ts"

# 3. GET to verify status change
GET_DIR="debug/${SERVER_NAME}/${TIMESTAMP}-scenario3-verify"
mkdir -p "$GET_DIR"

cat > "$GET_DIR/request.ts" << EOF
import { execute } from '../fhir-request.ts';

await execute({
  method: "GET",
  path: "/DocumentReference/${DOC_ID}",
  purpose: "Scenario 3: Verify status correction",
  configName: "${SERVER_NAME}", // REQUIRED if multiple configs exist
  callerDir: import.meta.dir
});
EOF

bun run "$GET_DIR/request.ts"

# Check response-body.json - status should be "entered-in-error"

# 4. Test search behavior (may be excluded from results)
```
Success criteria:
- The PUT is accepted and the status changes to `entered-in-error`
- A subsequent GET reflects the updated status
- Search behavior is documented (the server may exclude entered-in-error documents from default results)
### Scenario 4: Supersession

Objective: Test supersession with `relatesTo`
```text
# 1. Create initial DocumentReference
original_id = POST /DocumentReference -> capture ID

# 2. Create replacement with relatesTo
replacement = {
  ...original_template,
  "relatesTo": [{
    "code": "replaces",
    "target": {
      "reference": "DocumentReference/{{original_id}}"
    }
  }],
  "content": [{"attachment": {"data": "{{updated_content}}"}}]
}

# 3. POST replacement
POST /DocumentReference with replacement

# 4. Verify both exist
GET /DocumentReference/{{original_id}}
GET /DocumentReference/{{replacement_id}}

# 5. Confirm relatesTo preserved
```
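Step 2 of the pseudocode can also be done in memory. The merge below is a sketch: `originalTemplate` and `"doc-123"` are placeholders for the template object and the ID captured in step 1.

```typescript
// Sketch of step 2: derive a superseding DocumentReference from the original.
// `originalId` would come from step 1's response; "doc-123" is a placeholder.
function buildReplacement(
  originalTemplate: Record<string, unknown>,
  originalId: string,
  updatedBase64: string
) {
  return {
    ...originalTemplate,
    relatesTo: [
      {
        code: "replaces",
        target: { reference: `DocumentReference/${originalId}` },
      },
    ],
    content: [
      { attachment: { contentType: "text/plain; charset=utf-8", data: updatedBase64 } },
    ],
  };
}

const replacement = buildReplacement(
  { resourceType: "DocumentReference", status: "current" }, // placeholder template
  "doc-123",
  Buffer.from("Updated note content").toString("base64")
);

console.log(replacement.relatesTo[0].target.reference); // DocumentReference/doc-123
```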
Success criteria:
- Replacement is created and `relatesTo.code = "replaces"` points at the original
- Both the original and the replacement remain retrievable
- `relatesTo` is preserved on the stored replacement

Use this checklist to verify server conformance:

Must Support Elements:
- All Must Support elements from @references/spec.md are accepted and returned intact

Additional Elements:
- Optional elements supplied by the client (identifiers, encounter context) are preserved

Server Capabilities:
- Write access via `DocumentReference.c` (create) or `.u` (update)
- Contained resources referenced with the `#id` format
- Handles the `If-None-Exist` header correctly
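A scripted spot-check can complement the checklist. The element paths below are illustrative examples only; the authoritative Must Support list is in @references/spec.md.

```typescript
// Illustrative conformance spot-check on a returned DocumentReference.
// The checked paths are examples -- consult references/spec.md for the full
// Must Support element list.
type Check = { path: string; ok: (doc: any) => boolean };

const checks: Check[] = [
  { path: "status", ok: (d) => typeof d.status === "string" },
  { path: "subject.reference", ok: (d) => !!d.subject?.reference },
  { path: "content[0].attachment.contentType", ok: (d) => !!d.content?.[0]?.attachment?.contentType },
  { path: "content[0].attachment.data", ok: (d) => !!d.content?.[0]?.attachment?.data },
];

// Returns the paths of any failed checks.
function runChecks(doc: any): string[] {
  return checks.filter((c) => !c.ok(doc)).map((c) => c.path);
}

// Example against an inline sample (in practice, load a captured response-body.json).
const sample = {
  resourceType: "DocumentReference",
  status: "current",
  subject: { reference: "Patient/patient-123" },
  content: [{ attachment: { contentType: "text/plain; charset=utf-8", data: "SGVsbG8=" } }],
};

console.log(runChecks(sample)); // [] -> all example checks pass
```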
The complete specification and test scenarios are embedded in this skill document:
Writing Clinical Notes Specification: @references/spec.md
Connectathon Test Scenarios: @references/scenarios.md
These provide all the details about Must Support elements, conformance requirements, and step-by-step testing instructions.