# Background Tasks
Long-running operations (report generation, data exports, bulk imports, AI processing) should never block the UI thread. DBS implements a 202 Accepted + webhook callback pattern that lets the frontend stay responsive while the server works asynchronously.
## How It Works

```
Client                                  Server
  │                                       │
  ├── POST /api/tasks ──────────────────► │  1. Create task record (status: pending)
  │ ◄── 202 { taskId } ────────────────── │  2. Return immediately
  │                                       │
  │                                       │  3. Run heavy work asynchronously (void)
  │                                       │
  │         [10-30 seconds later]         │
  │          ── POST /api/webhooks/tasks ──►  4. Mark complete
  │ ◄── SSE: task-completed ───────────── │  5. Push event to client
  │                                       │
  ├── GET /api/tasks/[id] ──────────────► │  6. Fetch final result
```

## API Endpoints
| Method | Endpoint | Description |
|---|---|---|
| POST | /api/tasks | Create a new background task; returns `{ taskId }` with 202 |
| GET | /api/tasks/[id] | Poll task status: `pending`, `processing`, `completed`, `failed` |
| POST | /api/webhooks/tasks | Internal webhook to mark a task as completed/failed |
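Clients normally learn about completion via SSE (see below), but a polling fallback against `GET /api/tasks/[id]` is easy to sketch. The `pollTask` helper below is illustrative and not part of the package; the status-fetching function is injected so the logic stays testable without a network:

```typescript
type TaskStatus = 'pending' | 'processing' | 'completed' | 'failed';

// Hypothetical polling fallback. In the app, `fetchStatus` would wrap
// something like: fetch(`/api/tasks/${id}`).then(r => r.json()).then(t => t.status)
async function pollTask(
  taskId: string,
  fetchStatus: (id: string) => Promise<TaskStatus>,
  { intervalMs = 2000, maxAttempts = 30 } = {},
): Promise<TaskStatus> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await fetchStatus(taskId);
    // Stop as soon as the task reaches a terminal state
    if (status === 'completed' || status === 'failed') return status;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Task ${taskId} did not finish after ${maxAttempts} polls`);
}
```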
## Creating a Background Task (Server)
```ts
// src/app/api/tasks/route.ts
import { apiGuard } from '@/lib/api-guard';
import { prisma } from '@/lib/prisma';
import { NextResponse } from 'next/server';
import { v4 as uuid } from 'uuid';

export async function POST(req: Request) {
  const guard = await apiGuard('dashboard.access');
  if (guard.error) return guard.error;
  const { session } = guard;

  const body = await req.json();

  // 1. Create task record
  const task = await prisma.task.create({
    data: {
      id: uuid(),
      status: 'pending',
      type: body.type,
      userId: session.user.id,
      payload: body.payload,
    },
  });

  // 2. Start heavy work in background (non-blocking)
  void runHeavyProcess(task.id, body.payload);

  // 3. Return immediately
  return NextResponse.json({ taskId: task.id }, { status: 202 });
}

async function runHeavyProcess(taskId: string, payload: unknown) {
  // ... your expensive logic ...
  await new Promise(resolve => setTimeout(resolve, 15000)); // simulate

  // Notify via internal webhook
  await fetch(`${process.env.BETTER_AUTH_URL}/api/webhooks/tasks`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ taskId, status: 'completed', result: { /* ... */ } }),
  });
}
```

## Webhook Handler (Task Completion)
```ts
// src/app/api/webhooks/tasks/route.ts
import { prisma } from '@/lib/prisma';
import { eventBus } from '@/lib/events';
import { NextResponse } from 'next/server';

export async function POST(req: Request) {
  const body = await req.json();
  const { taskId, status, result } = body;

  // Update task in database
  const task = await prisma.task.update({
    where: { id: taskId },
    data: { status, result, completedAt: new Date() },
  });

  // Push real-time event to the task owner
  eventBus.emit('system-event', {
    type: 'task-completed',
    userId: task.userId,
    taskId: task.id,
    status,
  });

  return NextResponse.json({ ok: true });
}
```

## Frontend Task Monitoring
Use the `useNotificationSystem` hook from the notification package to watch task status in real time:
```tsx
'use client';
import { useNotificationSystem } from '@/lib/notification-package';

export function TaskMonitor() {
  const { tasks } = useNotificationSystem();
  const pendingTasks = tasks.filter(
    t => t.status === 'pending' || t.status === 'processing'
  );

  return (
    <div>
      {pendingTasks.map(task => (
        <TaskRow key={task.id} task={task} />
      ))}
    </div>
  );
}
```

Or trigger a task from the client:
```ts
import { tasksApi } from '@/services/tasks/api';

const handleGenerateReport = async () => {
  const { taskId } = await tasksApi.createTask({
    type: 'generate-report',
    payload: { dateRange: { from, to } },
  });

  toast.info('Report is being generated...', {
    description: `Task ID: ${taskId}`,
  });
  // The SSE stream will notify you when it completes
};
```

## Task States
| Status | Description |
|---|---|
| `pending` | Task created, not yet picked up by the worker |
| `processing` | Worker has started execution |
| `completed` | Work finished successfully |
| `failed` | Work threw an error; check `result.error` |
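As a sketch of how these states relate, a small transition guard could keep the webhook handler from, say, reviving a `completed` task. The `canTransition` helper and its transition table below are illustrative assumptions, not part of DBS:

```typescript
type TaskStatus = 'pending' | 'processing' | 'completed' | 'failed';

// Hypothetical allow-list of valid state transitions:
// pending → processing → completed | failed, with completed/failed terminal.
const allowedTransitions: Record<TaskStatus, TaskStatus[]> = {
  pending: ['processing', 'completed', 'failed'],
  processing: ['completed', 'failed'],
  completed: [],
  failed: [],
};

function canTransition(from: TaskStatus, to: TaskStatus): boolean {
  return allowedTransitions[from].includes(to);
}
```

A webhook handler could reject updates where `canTransition(task.status, body.status)` is false instead of blindly writing whatever the caller sends.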
In the UI Lab (/dashboard/lab), there is a live demo that lets you trigger simulated tasks, watch the pending state in the notification bell, and see them resolve in real time via SSE.
The native async pattern works well for Vercel Serverless Functions and short-lived tasks (under 60 seconds). For tasks that can take minutes or run on a schedule, consider replacing `runHeavyProcess` with a dedicated queue: Inngest, Upstash QStash, or AWS SQS. Only the function body needs to change; the API contract stays the same.
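As a hedged sketch of that swap, the route handler could publish the task to Upstash QStash instead of calling `runHeavyProcess` in-process; QStash would then deliver the payload to a worker URL with retries. The builder function, the `/api/workers/tasks` path, and the env variable names here are assumptions; verify the endpoint format against the QStash documentation:

```typescript
// Hypothetical: build the QStash publish request instead of running work in-process.
// Kept as a pure function so the request shape can be inspected without a network.
function buildQStashRequest(destinationUrl: string, payload: unknown, token: string) {
  return {
    // QStash v2 publishes by appending the destination URL to the publish endpoint
    url: `https://qstash.upstash.io/v2/publish/${destinationUrl}`,
    init: {
      method: 'POST' as const,
      headers: {
        Authorization: `Bearer ${token}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(payload),
    },
  };
}

// In the route handler, `void runHeavyProcess(task.id, body.payload)` would become:
//   const { url, init } = buildQStashRequest(
//     `${process.env.BETTER_AUTH_URL}/api/workers/tasks`,  // assumed worker route
//     { taskId: task.id, payload: body.payload },
//     process.env.QSTASH_TOKEN!,                            // assumed env var
//   );
//   await fetch(url, init);
```

The worker route then does the heavy lifting and posts to `/api/webhooks/tasks` exactly as before, so the 202 contract and the SSE flow are untouched.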