Process pending cerebellum work orders from the task pool. Use when asked to handle work orders, process dev tickets, batch-implement cerebellums, or clear the task backlog. Do NOT use for single cerebellum development — use cerebellum-developer instead.
Batch-process cerebellum development tickets generated by CursorDevPipeline.
List all pending work orders:
ls -la data/cursor_task_pool/DEV_*.md
Read each .md ticket file's header to extract at least:
- capability_id
- priority (shown in the summary table below)

Sort work orders by priority.

Skip work orders where:
- capability_id already exists in the registry (check data/cerebellum_signatures.json)

Present a summary table to the user before proceeding:
| # | Ticket | capability_id | Priority | Status |
|---|--------|---------------|----------|--------|
| 1 | DEV_xxx_yyy | some_capability | 8/10 | ready |
| 2 | DEV_xxx_zzz | other_capability | 5/10 | blocked (parent) |
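The scan-and-filter step above can be sketched in Python. This is a minimal sketch, not the pipeline's actual code: it assumes ticket headers carry a `capability_id: <value>` line and that `data/cerebellum_signatures.json` is a dict keyed by capability_id — adjust both to the real formats.

```python
import glob
import json
import re

def pending_work_orders(pool="data/cursor_task_pool",
                        registry="data/cerebellum_signatures.json"):
    """Return (ticket_path, capability_id) pairs not yet in the registry."""
    with open(registry) as f:
        existing = set(json.load(f).keys())  # assumes dict keyed by capability_id
    orders = []
    for path in sorted(glob.glob(f"{pool}/DEV_*.md")):
        with open(path) as f:
            header = f.read(2048)  # header fields are assumed to sit near the top
        m = re.search(r"^capability_id:\s*(\S+)", header, re.MULTILINE)
        if not m or m.group(1) in existing:
            continue  # skip malformed tickets and already-registered capabilities
        orders.append((path, m.group(1)))
    return orders
```

Sorting by priority would then be a `sorted(orders, key=...)` over whatever priority field the real header uses.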
For each actionable work order (in priority order):
1. Read the full .md ticket file.
2. Apply the cerebellum-developer skill workflow (Steps 1-7).
3. Record the outcome: implemented / skipped (exists) / skipped (composable) / failed.

After processing all work orders, produce a summary:
## Batch Processing Results
- Total scanned: N
- Implemented: X
- Skipped (already exists): Y
- Skipped (composable from existing): Z
- Failed: W
### Details
| capability_id | Result | File | Notes |
|---------------|--------|------|-------|
| ... | implemented | agents/.../xxx.py | |
| ... | skipped | — | exists as analyze_style |
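The per-order loop and tally above can be sketched as follows. `implement_order` is a hypothetical stand-in for the cerebellum-developer workflow, not a real API; the outcome strings mirror the result categories listed earlier.

```python
from datetime import datetime, timezone

def process_orders(orders, implement_order):
    """Run each (ticket, capability_id) pair and tally outcomes.

    implement_order is assumed to return one of:
    "implemented" / "skipped (exists)" / "skipped (composable)",
    or raise on failure.
    """
    details = []
    for path, cid in orders:
        try:
            outcome = implement_order(path, cid)
        except Exception as exc:
            outcome = f"failed: {exc}"  # record the failure, keep going
        details.append({"capability_id": cid, "ticket": path, "result": outcome})
    tally = {}
    for row in details:
        key = row["result"].split(":")[0]  # collapse "failed: <reason>" to "failed"
        tally[key] = tally.get(key, 0) + 1
    return {
        "scanned": len(orders),
        "tally": tally,
        "details": details,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
```

A failed ticket does not abort the batch; it is recorded and the loop moves on, matching the per-order result tracking above.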
Write the summary to evidences/workorder_batch_{timestamp}.json.
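Writing the evidence file might look like this minimal sketch; the timestamp format is an assumption, only the `evidences/workorder_batch_{timestamp}.json` naming comes from this document.

```python
import json
import os
import time

def write_batch_evidence(summary, out_dir="evidences"):
    """Persist the batch summary dict as workorder_batch_{timestamp}.json."""
    os.makedirs(out_dir, exist_ok=True)
    ts = time.strftime("%Y%m%d_%H%M%S")  # assumed timestamp format
    path = os.path.join(out_dir, f"workorder_batch_{ts}.json")
    with open(path, "w") as f:
        json.dump(summary, f, indent=2)
    return path
```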
Run the smoke test to confirm nothing is broken:
PYTHONPATH=. python scripts/runtime_smoke.py
For each newly implemented capability, also run:
PYTHONPATH=. python scripts/codex_verify_cerebellum.py --capability_id {id}
If verification fails because external API credentials are not configured, mark that work order as blocked (needs API config) and move on to the next.
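The per-capability verification pass can be sketched as below. The assumption here (matching the guidance above) is that a non-zero exit from the verifier marks the capability blocked rather than aborting the batch; the script path and flag come from this document.

```python
import os
import subprocess
import sys

def verify_capabilities(capability_ids,
                        script="scripts/codex_verify_cerebellum.py"):
    """Run the verifier once per capability; never abort the batch."""
    statuses = {}
    for cid in capability_ids:
        proc = subprocess.run(
            [sys.executable, script, "--capability_id", cid],
            env={**os.environ, "PYTHONPATH": "."},  # matches PYTHONPATH=. above
            capture_output=True,
            text=True,
        )
        statuses[cid] = ("verified" if proc.returncode == 0
                         else "blocked (needs API config)")
    return statuses
```

The captured stdout/stderr on `proc` could also feed the Notes column of the details table.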