Pattern for handling errors in long-running background tasks to prevent frontend infinite loading states.
This skill outlines a robust pattern for handling background tasks (such as scraping, data processing, or AI generation) whose results a frontend polls for.
In naively implemented async systems, background tasks often crash silently (timeouts, OOMs, unhandled exceptions). When this happens, the shared state (Database/Cache) never gets updated. The frontend, seeing no change, continues to poll indefinitely, leading to a "Zombie Loading" state.
"Persist All Terminal States"
Every background task must end in a database write, whether it succeeded or failed.
Ensure your status tracking table has fields to store error information.
```sql
CREATE TABLE task_results (
  id TEXT PRIMARY KEY,
  status TEXT DEFAULT 'pending', -- 'pending', 'completed', 'failed'
  result_data JSON,
  error_message TEXT             -- Store the failure reason here
);
```
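For the pattern to work, the `'pending'` row must exist before the task is dispatched, so the frontend always has something to poll against. A minimal sketch, using an in-memory `Map` as a stand-in for the `task_results` table (the `createTask` helper is an illustrative assumption, not part of the pattern above):

```typescript
// In-memory stand-in for the task_results table (illustration only).
type TaskRow = {
  id: string;
  status: 'pending' | 'completed' | 'failed';
  result_data?: unknown;
  error_message?: string;
};

const taskResults = new Map<string, TaskRow>();

// Insert the 'pending' row BEFORE dispatching the background task,
// so a poll that arrives immediately still finds a known state.
function createTask(taskId: string): void {
  taskResults.set(taskId, { id: taskId, status: 'pending' });
}

createTask('task-1');
console.log(taskResults.get('task-1')?.status); // 'pending'
```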
Wrap your background logic in a try/catch handler that guarantees a terminal DB write on every code path.
```typescript
async function safeBackgroundTask(taskId: string, taskLogic: () => Promise<any>) {
  try {
    // 1. Run the actual logic
    const result = await taskLogic();

    // 2. On success: write the result
    await db.update(taskId, {
      status: 'completed',
      result_data: result
    });
  } catch (error) {
    console.error(`Task ${taskId} failed:`, error);

    // 3. On failure: CRITICAL - write the error to the DB.
    // This is what allows the frontend to stop polling.
    // (In strict TS, a caught value is `unknown`, so narrow before reading .message.)
    await db.update(taskId, {
      status: 'failed',
      error_message: error instanceof Error ? error.message : 'Unknown system error'
    });
  }
}
```
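One remaining gap in the handler above: if the failure write itself throws (e.g. the DB is unreachable), the catch block re-raises and the error is lost. A minimal sketch of guarding that write, where `Db` and the stub client are assumptions introduced so the example runs standalone:

```typescript
// Hypothetical DB client shape, matching the db.update(...) calls above.
interface Db {
  update(id: string, fields: Record<string, unknown>): Promise<void>;
}

// Persist a failure, but never let the persistence attempt itself throw.
async function markFailed(db: Db, taskId: string, error: unknown): Promise<void> {
  const message = error instanceof Error ? error.message : 'Unknown system error';
  try {
    await db.update(taskId, { status: 'failed', error_message: message });
  } catch (writeError) {
    // Last resort: log loudly so monitoring can pick it up.
    console.error(`Could not persist failure for ${taskId}:`, writeError);
  }
}

// Stub db that records calls, for demonstration.
const calls: Array<[string, Record<string, unknown>]> = [];
const stubDb: Db = { update: async (id, fields) => { calls.push([id, fields]); } };

markFailed(stubDb, 'task-7', new Error('timeout')).then(() => {
  console.log(calls[0][1].error_message); // 'timeout'
});
```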
The polling logic should check for both success and failure states.
```typescript
const poll = setInterval(async () => {
  const status = await checkStatus(taskId);

  if (status.state === 'completed') {
    showResult(status.data);
    clearInterval(poll);
  } else if (status.state === 'failed') {
    // Handle the explicit error state
    showError(status.error_message);
    clearInterval(poll);
  }
  // If 'pending', continue polling...
}, 2000);
```
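As a defense-in-depth option, the client can also cap the number of polls so the UI cannot spin forever even if the backend never writes a terminal state. This is a sketch of that safeguard, not part of the core pattern; `Status`, `pollWithLimit`, and the fake status source are all hypothetical names:

```typescript
type Status =
  | { state: 'pending' }
  | { state: 'completed'; data: unknown }
  | { state: 'failed'; error_message: string };

// Poll until a terminal state is reached, or give up after maxAttempts.
async function pollWithLimit(
  check: () => Promise<Status>,
  maxAttempts = 30,
  intervalMs = 2000,
): Promise<Status | 'timed_out'> {
  for (let i = 0; i < maxAttempts; i++) {
    const status = await check();
    if (status.state !== 'pending') return status;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return 'timed_out';
}

// Demo: a fake status source that completes on the third check.
let checks = 0;
const fakeCheck = async (): Promise<Status> =>
  ++checks < 3 ? { state: 'pending' } : { state: 'completed', data: 42 };

pollWithLimit(fakeCheck, 10, 0).then((result) => console.log(result));
```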
For a real-world example, refer to the TrafficLens implementation, where a last_error column was added to the traffic_latest table to resolve infinite scraping loops.