Optimize bulk API requests with batching, throttling, and parallel execution. Use when processing bulk API operations efficiently. Trigger with phrases like "process bulk requests", "batch API calls", or "handle batch operations".
Optimize bulk API operations with batch request endpoints, parallel execution with concurrency control, partial failure handling, and progress tracking. Implement batch processing patterns that accept arrays of operations in a single request, execute them efficiently with database bulk operations, and return per-item results with individual success/failure status.
- Accept an array of operations, each with a client-supplied id for result correlation, e.g., POST /batch with {operations: [{method: "POST", path: "/users", body: {...}, id: "op1"}]}.
- Return per-item results: {id, status, result|error} for each operation.
- For large batches, respond 202 with a batchId and a status polling URL, process in a background worker, and update progress in Redis.
- Bound concurrency with p-limit or asyncio.Semaphore to prevent database connection exhaustion.
- Include summary counts in the response (succeeded, failed, total).
- Expose GET /batch/:batchId/status returning {total, completed, failed, progress: 0.75, status: "processing|completed|failed"}.

See ${CLAUDE_SKILL_DIR}/references/implementation.md for the full implementation guide.
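The execution pattern above can be sketched as a concurrency-limited runner that returns per-item results plus summary counts. This is a minimal illustration: the names (`runBatch`, `handler`) are hypothetical, and a production service would typically use p-limit and route each operation to its real HTTP handler.

```javascript
// Sketch: execute batch operations with bounded concurrency, collecting a
// per-item {id, status, result|error} entry and summary counts.
async function runBatch(operations, handler, concurrency = 5) {
  const results = new Array(operations.length);
  let next = 0;

  // Each worker repeatedly claims the next unprocessed operation.
  // The claim (next++) is synchronous, so workers never collide.
  async function worker() {
    while (next < operations.length) {
      const index = next++;
      const op = operations[index];
      try {
        const result = await handler(op);
        results[index] = { id: op.id, status: "succeeded", result };
      } catch (err) {
        // A single failed item does not abort the batch.
        results[index] = { id: op.id, status: "failed", error: err.message };
      }
    }
  }

  const workers = Array.from(
    { length: Math.min(concurrency, operations.length) },
    () => worker()
  );
  await Promise.all(workers);

  const succeeded = results.filter((r) => r.status === "succeeded").length;
  return {
    results,
    summary: { succeeded, failed: results.length - succeeded, total: results.length },
  };
}
```

Keeping results indexed by input position preserves order even though operations complete out of order, which makes client-side correlation trivial.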
- ${CLAUDE_SKILL_DIR}/src/routes/batch.js - Batch request endpoint with sync/async routing
- ${CLAUDE_SKILL_DIR}/src/batch/processor.js - Batch execution engine with concurrency control
- ${CLAUDE_SKILL_DIR}/src/batch/validator.js - Batch request validation and size limit enforcement
- ${CLAUDE_SKILL_DIR}/src/batch/progress.js - Redis-backed progress tracking for async batches
- ${CLAUDE_SKILL_DIR}/src/batch/workers/ - Background worker for async batch processing
- ${CLAUDE_SKILL_DIR}/src/batch/results.js - Per-item result aggregation with summary statistics
- ${CLAUDE_SKILL_DIR}/tests/batch/ - Batch processing integration tests

| Error | Cause | Solution |
|---|---|---|
| 413 Payload Too Large | Batch exceeds maximum item count (1000) or body size limit | Return clear error with maximum allowed count; suggest splitting into multiple batch requests |
| 207 Multi-Status | Some batch items succeeded while others failed | Return per-item status array; include error details for failed items; provide summary counts |
| 408 Batch Timeout | Synchronous batch processing exceeded request timeout | Switch to async processing for large batches; return 202 with status polling URL |
| Partial transaction failure | Database transaction rolls back all items due to one failure | Use savepoints for per-item isolation; or process items individually outside a wrapping transaction |
| Progress tracking stale | Worker crashed mid-batch; progress stops updating | Implement heartbeat monitoring; mark batch as failed after heartbeat timeout; enable retry from last checkpoint |
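The stale-progress row above can be sketched as a periodic sweep that fails batches whose worker heartbeat has gone quiet. The function name, timeout value, and in-memory Map (standing in for the Redis-backed progress store) are all illustrative assumptions.

```javascript
// Sketch: mark batches as failed when their worker heartbeat is stale.
// In production the heartbeats would live in Redis and this sweep would
// run on a timer; stale batches become candidates for checkpoint retry.
const HEARTBEAT_TIMEOUT_MS = 30_000;

function markStaleBatches(heartbeats, now = Date.now()) {
  const stale = [];
  for (const [batchId, entry] of heartbeats) {
    if (entry.status === "processing" && now - entry.lastBeat > HEARTBEAT_TIMEOUT_MS) {
      entry.status = "failed"; // surfaced via the status polling endpoint
      entry.error = "worker heartbeat timeout";
      stale.push(batchId); // candidates for retry from last checkpoint
    }
  }
  return stale;
}
```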
Refer to ${CLAUDE_SKILL_DIR}/references/errors.md for comprehensive error patterns.
Bulk user import: Accept a CSV-uploaded batch of 5000 user records via POST /batch/users/import, return 202 with batchId, process asynchronously with progress updates, and provide a downloadable results file when complete.
Multi-resource batch: Accept mixed operations in a single batch: [{method:"POST",path:"/users",...}, {method:"PUT",path:"/orders/123",...}, {method:"DELETE",path:"/products/456"}], executing each against the appropriate handler.
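The mixed-operation scenario above needs a dispatcher that matches each item's method and path to a handler. A minimal sketch, where the route table, patterns, and handler bodies are hypothetical placeholders:

```javascript
// Sketch: route a batch item to a handler keyed by HTTP method + path pattern.
const routes = [
  { method: "POST", pattern: /^\/users$/, handler: async (op) => ({ created: op.body }) },
  { method: "PUT", pattern: /^\/orders\/(\d+)$/, handler: async (op, m) => ({ updated: m[1] }) },
  { method: "DELETE", pattern: /^\/products\/(\d+)$/, handler: async (op, m) => ({ deleted: m[1] }) },
];

async function dispatch(op) {
  for (const route of routes) {
    // Only run the regex when the method matches; exec returns null on miss.
    const match = route.method === op.method && route.pattern.exec(op.path);
    if (match) return route.handler(op, match);
  }
  // An unroutable item becomes a per-item failure, not a whole-batch error.
  throw new Error(`no handler for ${op.method} ${op.path}`);
}
```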
Idempotent batch retry: Client includes idempotencyKey per batch item; on retry, already-completed items return their cached result without re-execution, while failed items are re-attempted.
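The idempotent-retry behavior can be sketched with a result cache keyed by idempotencyKey: only successes are cached, so failed items are naturally re-attempted on retry. The function name and the in-memory Map (standing in for a Redis cache with a TTL) are illustrative assumptions.

```javascript
// Sketch: idempotent per-item execution. Completed items return their
// cached result; failures cache nothing, so a retry re-executes them.
const completed = new Map();

async function executeIdempotent(item, handler) {
  const cached = completed.get(item.idempotencyKey);
  if (cached) {
    return { ...cached, cached: true }; // no re-execution on retry
  }
  const result = await handler(item); // throws on failure: nothing is cached
  const record = { id: item.id, status: "succeeded", result };
  completed.set(item.idempotencyKey, record);
  return record;
}
```

Caching only on success is the key design choice: it keeps retries safe for side-effecting operations without ever replaying a failure as if it had succeeded.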
See ${CLAUDE_SKILL_DIR}/references/examples.md for additional examples.