Emit events to the open-wrapper LLM Watcher dashboard via webhook
Events are sent to the LLM Watcher dashboard via its open-wrapper webhook endpoint:
| Endpoint | Port | Path | Use case |
|---|---|---|---|
| LLM Watcher | 1420 | /api/open-wrapper | Dashboard event ingestion |
All examples below use this endpoint.
```json
{
  "id": "uuid-string",
  "timestamp": "RFC 3339 timestamp",
  "event_type": "request | streaming | completion | error | setup | status | system",
  "command": "string",
  "status": "start | running | completed | failed | info",
  "model": "optional model name",
  "metadata": {},
  "source": "ignored -- server overrides to \"webhook\""
}
```
The `id`, `timestamp`, and `source` fields are required in the JSON payload (the server deserializes the full `DashboardEvent` struct), but `source` is always overwritten to `"webhook"` by the server.

```bash
curl -s -X POST ${OPEN_WRAPPER_WEBHOOK:-http://localhost:1420}/api/open-wrapper \
  -H "Content-Type: application/json" \
  -d '{
    "id": "'$(uuidgen 2>/dev/null || cat /proc/sys/kernel/random/uuid 2>/dev/null || python3 -c "import uuid; print(uuid.uuid4())")'",
    "timestamp": "'$(date -u +%Y-%m-%dT%H:%M:%S.000Z)'",
    "event_type": "request",
    "command": "ask",
    "status": "start",
    "model": "qwen2.5-coder:latest",
    "metadata": {"prompt": "hello world"},
    "source": "webhook"
  }'
```
The server responds with `{"ok": true}` on success.
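The same request can be issued programmatically. Below is a minimal Python sketch using only the standard library, assuming nothing beyond the endpoint and payload shape shown above; the `make_event` and `post_event` names are illustrative, not part of the dashboard API.

```python
# Illustrative helper: builds a DashboardEvent payload and POSTs it to the
# open-wrapper webhook, mirroring the curl example above.
import json
import os
import urllib.request
import uuid
from datetime import datetime, timezone

WEBHOOK = os.environ.get("OPEN_WRAPPER_WEBHOOK", "http://localhost:1420")

def make_event(event_type, command, status, model=None, metadata=None):
    """Fill in the required id/timestamp/source fields around the caller's data."""
    return {
        "id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z"),
        "event_type": event_type,
        "command": command,
        "status": status,
        "model": model,
        "metadata": metadata or {},
        "source": "webhook",  # the server overwrites this regardless
    }

def post_event(event):
    """POST one event; the server answers {"ok": true} on success."""
    req = urllib.request.Request(
        f"{WEBHOOK}/api/open-wrapper",
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

With the dashboard running, `post_event(make_event("request", "ask", "start", model="qwen2.5-coder:latest"))` should return the `{"ok": True}` acknowledgment.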
Wrap multiple events in a batch payload:
```bash
curl -s -X POST ${OPEN_WRAPPER_WEBHOOK:-http://localhost:1420}/api/open-wrapper \
  -H "Content-Type: application/json" \
  -d '{
    "batch": true,
    "events": [
      {
        "id": "uuid-1",
        "timestamp": "2026-01-01T00:00:00.000Z",
        "event_type": "request",
        "command": "ask",
        "status": "start",
        "model": "qwen2.5-coder:latest",
        "metadata": {},
        "source": "webhook"
      },
      {
        "id": "uuid-2",
        "timestamp": "2026-01-01T00:00:01.000Z",
        "event_type": "completion",
        "command": "ask",
        "status": "completed",
        "model": "qwen2.5-coder:latest",
        "metadata": {"tokens": 128},
        "source": "webhook"
      }
    ]
  }'
```
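A batch can be assembled the same way in code. A short sketch assuming only the `{"batch": true, "events": [...]}` envelope shown above (`make_batch` and `_stamp` are illustrative names):

```python
# Illustrative batch builder for the {"batch": true, "events": [...]} shape.
# Posting works exactly like a single event: same endpoint, same Content-Type.
import json
import uuid
from datetime import datetime, timezone

def _stamp():
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")

def make_batch(events):
    """Wrap an iterable of event dicts in the batch envelope."""
    return {"batch": True, "events": list(events)}

start = {
    "id": str(uuid.uuid4()), "timestamp": _stamp(),
    "event_type": "request", "command": "ask", "status": "start",
    "model": "qwen2.5-coder:latest", "metadata": {}, "source": "webhook",
}
done = {
    "id": str(uuid.uuid4()), "timestamp": _stamp(),
    "event_type": "completion", "command": "ask", "status": "completed",
    "model": "qwen2.5-coder:latest", "metadata": {"tokens": 128},
    "source": "webhook",
}
payload = json.dumps(make_batch([start, done]))
```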
Emit when an LLM request begins.
```bash
curl -s -X POST ${OPEN_WRAPPER_WEBHOOK:-http://localhost:1420}/api/open-wrapper \
  -H "Content-Type: application/json" \
  -d '{
    "id": "'$(python3 -c "import uuid; print(uuid.uuid4())")'",
    "timestamp": "'$(date -u +%Y-%m-%dT%H:%M:%S.000Z)'",
    "event_type": "request",
    "command": "ask",
    "status": "start",
    "model": "qwen2.5-coder:latest",
    "metadata": {"prompt_length": 42},
    "source": "webhook"
  }'
```
Emit when the LLM response has been fully received.
```bash
curl -s -X POST ${OPEN_WRAPPER_WEBHOOK:-http://localhost:1420}/api/open-wrapper \
  -H "Content-Type: application/json" \
  -d '{
    "id": "'$(python3 -c "import uuid; print(uuid.uuid4())")'",
    "timestamp": "'$(date -u +%Y-%m-%dT%H:%M:%S.000Z)'",
    "event_type": "completion",
    "command": "ask",
    "status": "completed",
    "model": "qwen2.5-coder:latest",
    "metadata": {"tokens": 256, "duration_ms": 1200},
    "source": "webhook"
  }'
```
Emit when a request fails.
```bash
curl -s -X POST ${OPEN_WRAPPER_WEBHOOK:-http://localhost:1420}/api/open-wrapper \
  -H "Content-Type: application/json" \
  -d '{
    "id": "'$(python3 -c "import uuid; print(uuid.uuid4())")'",
    "timestamp": "'$(date -u +%Y-%m-%dT%H:%M:%S.000Z)'",
    "event_type": "error",
    "command": "ask",
    "status": "failed",
    "model": "qwen2.5-coder:latest",
    "metadata": {"error": "connection refused", "code": 502},
    "source": "webhook"
  }'
```
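The request/completion/error triple naturally wraps one LLM call. A Python sketch of that lifecycle as a context manager, assuming only the event shapes above; `tracked` is an illustrative name, and the `emit` callable is injected so the transport (e.g. a webhook POST like the curl examples) stays pluggable:

```python
# Illustrative lifecycle wrapper: emits "request" on entry, then either
# "completion" (with duration) or "error" (with the exception message).
import time
import uuid
from contextlib import contextmanager
from datetime import datetime, timezone

def _event(event_type, command, status, model=None, metadata=None):
    return {
        "id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z"),
        "event_type": event_type,
        "command": command,
        "status": status,
        "model": model,
        "metadata": metadata or {},
        "source": "webhook",
    }

@contextmanager
def tracked(command, model, emit):
    """Surround a block of work with request/completion/error events."""
    emit(_event("request", command, "start", model))
    start = time.monotonic()
    try:
        yield
    except Exception as exc:
        emit(_event("error", command, "failed", model, {"error": str(exc)}))
        raise
    else:
        duration_ms = int((time.monotonic() - start) * 1000)
        emit(_event("completion", command, "completed", model,
                    {"duration_ms": duration_ms}))
```

Usage would look like `with tracked("ask", "qwen2.5-coder:latest", post_event): run_llm_call()`, where `post_event` is any function that delivers a dict to the webhook.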
Emit informational or diagnostic messages.
```bash
curl -s -X POST ${OPEN_WRAPPER_WEBHOOK:-http://localhost:1420}/api/open-wrapper \
  -H "Content-Type: application/json" \
  -d '{
    "id": "'$(python3 -c "import uuid; print(uuid.uuid4())")'",
    "timestamp": "'$(date -u +%Y-%m-%dT%H:%M:%S.000Z)'",
    "event_type": "system",
    "command": "system",
    "status": "info",
    "metadata": {"detail": "Model cache cleared"},
    "source": "webhook"
  }'
```
Beyond the four standard patterns above, the schema supports:

- `event_type: "streaming"` with `status: "running"` -- for streaming chunk progress
- `event_type: "setup"` with `status: "info"` -- for initialization/configuration events
- `event_type: "status"` with `status: "info"` -- for health checks or status reports

Check that events are arriving via the LLM Watcher endpoints:
```bash
# Get full event history
curl -s ${OPEN_WRAPPER_WEBHOOK:-http://localhost:1420}/api/history | python3 -m json.tool

# Get aggregated stats
curl -s ${OPEN_WRAPPER_WEBHOOK:-http://localhost:1420}/api/stats | python3 -m json.tool

# Stream live events via SSE
curl -s -N ${OPEN_WRAPPER_WEBHOOK:-http://localhost:1420}/api/events
```
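To consume the `/api/events` stream in code rather than in a terminal, the `data:` fields need to be reassembled into JSON events. A minimal parser sketch, assuming the stream follows standard SSE framing (one JSON object per `data:` field, events separated by blank lines); `parse_sse` is an illustrative name:

```python
# Minimal SSE line parser: accumulates consecutive "data:" lines and decodes
# one JSON event at each blank-line boundary.
import json

def parse_sse(lines):
    """Yield decoded JSON payloads from an iterable of raw SSE lines."""
    buf = []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("data:"):
            buf.append(line[5:].lstrip())
        elif line == "" and buf:  # blank line terminates one event
            yield json.loads("\n".join(buf))
            buf = []
```

Feeding it the line iterator of a streaming HTTP response (for example `resp` from `urllib.request.urlopen(f"{WEBHOOK}/api/events")`, with each line decoded to text) yields live event dicts as the dashboard broadcasts them.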