Let's walk through flashQ's features with practical examples. We'll simulate real scenarios showing exactly how each feature works, from basic job processing to advanced patterns.
🚀 1. Basic Job Processing
Let's start with the fundamentals: pushing a job and processing it.
$ bun run producer.ts
[10:00:00.000] Connecting to flashQ server...
[10:00:00.012] ✓ Connected to localhost:6789
[10:00:00.015] Pushing job to 'emails' queue...
[10:00:00.018] ✓ Job created: #1001
→ Queue: emails
→ Data: { to: "user@example.com", subject: "Welcome!" }
→ State: waiting
$ bun run worker.ts
[10:00:00.050] Worker started for 'emails' queue
[10:00:00.052] Pulled job #1001
[10:00:00.053] Processing: Send email to user@example.com
[10:00:00.250] ✓ Email sent successfully
[10:00:00.252] ✓ Job #1001 completed in 200ms
// producer.ts
import { FlashQ } from 'flashq'; // import path assumed — adjust to your install

const client = new FlashQ();
await client.connect();

const job = await client.push('emails', {
  to: 'user@example.com',
  subject: 'Welcome!',
  body: 'Thanks for signing up.'
});
console.log(`Job created: #${job.id}`);
⚡ 2. Priority Queues
High-priority jobs get processed first. Let's see how priorities work:
[10:00:00.000] Pushing 5 jobs with different priorities...
[10:00:00.010] Job #1: priority 1 (low)
[10:00:00.012] Job #2: priority 1 (low)
[10:00:00.014] Job #3: priority 50 (high)
[10:00:00.016] Job #4: priority 100 (critical)
[10:00:00.018] Job #5: priority 10 (normal)
[10:00:00.020] Processing order (highest priority first):
[10:00:00.025] 1st: Job #4 (priority 100) ⭐
[10:00:00.030] 2nd: Job #3 (priority 50)
[10:00:00.035] 3rd: Job #5 (priority 10)
[10:00:00.040] 4th: Job #1 (priority 1)
[10:00:00.045] 5th: Job #2 (priority 1)
// Critical job - processed first
await client.push('tasks', { type: 'urgent' }, { priority: 100 });
// Normal job
await client.push('tasks', { type: 'regular' }, { priority: 10 });
// Background job - processed last
await client.push('tasks', { type: 'cleanup' }, { priority: 1 });
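The log above implies two ordering rules: higher priority runs first, and jobs with equal priority (like #1 and #2) run in push order. As a sketch of those semantics (not flashQ's internal data structure), the dispatch order is what you'd get from this comparator:

```typescript
// Dispatch-order sketch: higher priority first, FIFO (lower id) on ties.
interface QueuedJob {
  id: number;
  priority: number;
}

function dispatchOrder(jobs: QueuedJob[]): number[] {
  return [...jobs]
    .sort((a, b) => b.priority - a.priority || a.id - b.id)
    .map((j) => j.id);
}

// The five jobs from the log:
const order = dispatchOrder([
  { id: 1, priority: 1 },
  { id: 2, priority: 1 },
  { id: 3, priority: 50 },
  { id: 4, priority: 100 },
  { id: 5, priority: 10 },
]);
console.log(order); // [4, 3, 5, 1, 2]
```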
⏱️ 3. Delayed Jobs
Schedule jobs to run in the future:
[10:00:00.000] Scheduling reminder for 5 seconds from now...
[10:00:00.015] ✓ Job #2001 created (delayed)
→ State: delayed
→ Scheduled for: 10:00:05.000
[10:00:01.000] ⏳ Waiting... (4s remaining)
[10:00:02.000] ⏳ Waiting... (3s remaining)
[10:00:03.000] ⏳ Waiting... (2s remaining)
[10:00:04.000] ⏳ Waiting... (1s remaining)
[10:00:05.000] Job moved to waiting queue
[10:00:05.010] ✓ Job #2001 processed: "Don't forget your meeting!"
// Send a reminder in 5 seconds
await client.push('reminders', {
  message: "Don't forget your meeting!"
}, {
  delay: 5000 // 5000ms = 5 seconds
});

// Schedule for the same time tomorrow
const tomorrow = new Date();
tomorrow.setDate(tomorrow.getDate() + 1);
await client.push('reports', { type: 'daily' }, {
  delay: tomorrow.getTime() - Date.now()
});
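Since `delay` is just milliseconds from now, scheduling "at 9 AM" boils down to a date subtraction. A small helper sketch — `delayUntilHour` is our own illustration, not part of the client API:

```typescript
// Milliseconds until the next occurrence of a given local hour.
// Pure function of `now`, which keeps it easy to test.
function delayUntilHour(hour: number, now: Date = new Date()): number {
  const target = new Date(now);
  target.setHours(hour, 0, 0, 0);
  if (target.getTime() <= now.getTime()) {
    target.setDate(target.getDate() + 1); // already past today → tomorrow
  }
  return target.getTime() - now.getTime();
}

// At 10:00 local time, the next 9:00 is 23 hours away:
const ms = delayUntilHour(9, new Date(2024, 0, 15, 10, 0, 0));
console.log(ms); // 82800000 (23 hours)
```

Pass the result as the `delay` option when pushing the job.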
🔄 4. Automatic Retries with Exponential Backoff
When jobs fail, flashQ automatically retries them with exponential backoff:
[10:00:00.000] Pushing job with max 3 attempts...
[10:00:00.020] ✓ Job #3001 created
[10:00:00.025] Attempt 1/3
[10:00:00.500] ✗ Failed: Connection timeout
[10:00:00.502] → Retry in 1000ms (backoff: 1s × 2^0)
[10:00:01.510] Attempt 2/3
[10:00:02.000] ✗ Failed: Connection timeout
[10:00:02.002] → Retry in 2000ms (backoff: 1s × 2^1)
[10:00:04.010] Attempt 3/3
[10:00:04.200] ✓ Success! API responded
[10:00:04.202] ✓ Job #3001 completed after 3 attempts
await client.push('api-calls', { url: 'https://api.example.com' }, {
  max_attempts: 3, // try up to 3 times
  backoff: 1000    // start with 1s, then 2s, then 4s...
});
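The retry schedule in the log follows delay = backoff × 2^(attempt − 1). As a worked check of that formula:

```typescript
// Exponential backoff as shown in the log: after attempt 1 the retry
// waits 1000ms (2^0), after attempt 2 it waits 2000ms (2^1), and so on.
function backoffDelay(backoffMs: number, attempt: number): number {
  return backoffMs * 2 ** (attempt - 1);
}

console.log([1, 2, 3].map((a) => backoffDelay(1000, a))); // [1000, 2000, 4000]
```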
💀 5. Dead Letter Queue (DLQ)
Jobs that fail all attempts go to the DLQ for inspection:
[10:00:00.000] Job #4001 failed all 3 attempts
[10:00:00.001] Moving to Dead Letter Queue...
[10:00:05.000] Inspecting DLQ...
┌─────────┬───────────┬──────────┬─────────────────┐
│ Job ID  │ Queue     │ Attempts │ Last Error      │
├─────────┼───────────┼──────────┼─────────────────┤
│ #4001   │ payments  │ 3        │ Card declined   │
│ #4002   │ emails    │ 3        │ Invalid address │
│ #4003   │ webhooks  │ 3        │ Timeout         │
└─────────┴───────────┴──────────┴─────────────────┘
[10:00:10.000] Retrying job #4003 from DLQ...
[10:00:10.200] ✓ Job #4003 completed on retry!
// Get failed jobs
const failedJobs = await client.getDlq('payments', 10);
for (const job of failedJobs) {
  console.log(`Job #${job.id} failed: ${job.failedReason}`);
  // Retry a specific job
  await client.retryDlq('payments', job.id);
}

// Or retry all DLQ jobs at once
await client.retryDlq('payments');
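When sweeping a DLQ it is worth separating transient failures (retryable) from permanent ones — which is why the log retries the timed-out webhook but not the declined card. A triage sketch; the pattern list and error strings are just the hypothetical examples from the table above:

```typescript
// Classify a DLQ job's last error as retryable or not.
// The patterns are illustrative; tune them to your own failure modes.
const TRANSIENT = [/timeout/i, /connection/i, /rate limit/i, /unavailable/i];

function isRetryable(failedReason: string): boolean {
  return TRANSIENT.some((p) => p.test(failedReason));
}

console.log(isRetryable('Timeout'));       // true  → retry, like job #4003
console.log(isRetryable('Card declined')); // false → needs manual review
```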
🚦 6. Rate Limiting
Control how fast jobs are processed to respect API limits:
[10:00:00.000] Setting rate limit: 10 jobs/second
[10:00:00.005] ✓ Rate limit configured
[10:00:00.010] Pushing 50 jobs at once...
[10:00:00.100] ✓ 50 jobs queued
[10:00:00.100] Processing: ████████░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 10/50 (rate: 10/s)
[10:00:01.100] Processing: ████████████████░░░░░░░░░░░░░░░░░░░░░░░░ 20/50 (rate: 10/s)
[10:00:02.100] Processing: ████████████████████████░░░░░░░░░░░░░░░░ 30/50 (rate: 10/s)
[10:00:03.100] Processing: ████████████████████████████████░░░░░░░░ 40/50 (rate: 10/s)
[10:00:04.100] Processing: ████████████████████████████████████████ 50/50
[10:00:04.100] ✓ All jobs completed in 4s (exactly 10/s)
// Limit to 10 jobs per second
await client.setRateLimit('openai', {
  max: 10,
  duration: 1000 // 1 second
});

// Limit to 100 jobs per minute
await client.setRateLimit('emails', {
  max: 100,
  duration: 60000 // 60 seconds
});
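The 4-second figure in the run above is easy to sanity-check: the first window of jobs runs immediately, and every subsequent `duration` window releases `max` more. This back-of-envelope helper is a simplification of whatever windowing the server actually uses:

```typescript
// Rough drain time for a burst of jobs at `max` jobs per `durationMs` window.
function drainTimeMs(jobs: number, max: number, durationMs: number): number {
  const windows = Math.ceil(jobs / max); // windows needed; the first is at t=0
  return Math.max(0, windows - 1) * durationMs;
}

console.log(drainTimeMs(50, 10, 1000)); // 4000 — the 4s from the log
```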
📊 7. Progress Tracking
Track progress for long-running jobs:
[10:00:00.000] Processing 1000 documents...
[10:00:01.000] Progress: ██░░░░░░░░░░░░░░░░░░ 10% - Processing batch 1/10
[10:00:02.000] Progress: ████░░░░░░░░░░░░░░░░ 20% - Processing batch 2/10
[10:00:03.000] Progress: ██████░░░░░░░░░░░░░░ 30% - Processing batch 3/10
[10:00:04.000] Progress: ████████░░░░░░░░░░░░ 40% - Processing batch 4/10
[10:00:05.000] Progress: ██████████░░░░░░░░░░ 50% - Processing batch 5/10
[10:00:06.000] Progress: ████████████░░░░░░░░ 60% - Processing batch 6/10
[10:00:07.000] Progress: ██████████████░░░░░░ 70% - Processing batch 7/10
[10:00:08.000] Progress: ████████████████░░░░ 80% - Processing batch 8/10
[10:00:09.000] Progress: ██████████████████░░ 90% - Processing batch 9/10
[10:00:10.000] Progress: ████████████████████ 100% - Complete!
// Worker updates progress
new Worker('documents', async (job) => {
  const docs = job.data.documents;
  const batchSize = 100;
  for (let i = 0; i < docs.length; i += batchSize) {
    await processBatch(docs.slice(i, i + batchSize));
    // Update progress (capped at 100% in case the last batch is partial)
    const percent = Math.min(100, Math.round((i + batchSize) / docs.length * 100));
    await job.updateProgress(percent, `Processing batch ${i / batchSize + 1}`);
  }
  return { processed: docs.length };
});
// Monitor progress from another process
const progress = await client.getProgress(jobId);
console.log(`${progress.percent}% - ${progress.message}`);
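The bars in the log are derivable from the percent value alone; if you want the same look in your own monitoring output, a small formatting sketch:

```typescript
// Render a progress bar like the log output, e.g. 40% → "████░░░░░░".
function renderBar(percent: number, width = 10): string {
  const filled = Math.round((percent / 100) * width);
  return '█'.repeat(filled) + '░'.repeat(width - filled);
}

console.log(`${renderBar(40)} 40%`); // ████░░░░░░ 40%
```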
⏰ 8. Cron Jobs
Schedule recurring jobs with cron expressions:
[10:00:00.000] Registering cron jobs...
[10:00:00.010] ✓ daily-report: "0 9 * * *" (every day at 9 AM)
[10:00:00.015] ✓ hourly-sync: "0 * * * *" (every hour)
[10:00:00.020] ✓ cleanup: "*/5 * * * *" (every 5 minutes)
[10:00:00.025] Cron scheduler running...
[10:05:00.000] ⏰ Triggered: cleanup
[10:05:00.500] ✓ Cleanup completed: removed 127 stale records
[10:10:00.000] ⏰ Triggered: cleanup
[10:10:00.300] ✓ Cleanup completed: removed 89 stale records
[11:00:00.000] ⏰ Triggered: hourly-sync
[11:00:05.000] ✓ Sync completed: 1,234 records updated
// Daily report at 9 AM
await client.addCron('daily-report', {
  queue: 'reports',
  schedule: '0 9 * * *',
  data: { type: 'daily' }
});

// Every 5 minutes
await client.addCron('cleanup', {
  queue: 'maintenance',
  schedule: '*/5 * * * *',
  data: { action: 'cleanup' }
});
// List all cron jobs
const crons = await client.listCrons();
crons.forEach(c => console.log(`${c.name}: ${c.schedule}`));
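To demystify the schedules above: a cron expression fires when each of its five fields matches the current time. This minimal single-field matcher covers `*`, `*/n` steps, and plain numbers — enough for the three schedules shown; real cron parsers also handle ranges and lists:

```typescript
// Match a single cron field ("*", "*/n", or a number) against a value.
function fieldMatches(field: string, value: number): boolean {
  if (field === '*') return true;
  const step = field.match(/^\*\/(\d+)$/);
  if (step) return value % Number(step[1]) === 0;
  return Number(field) === value;
}

// "*/5" in the minute field fires at :00, :05, :10, ... — as in the cleanup log:
console.log(fieldMatches('*/5', 10)); // true
console.log(fieldMatches('*/5', 7));  // false
console.log(fieldMatches('0', 0));    // true
```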
🔗 9. Job Dependencies (Workflows)
Create complex workflows where jobs depend on other jobs:
[10:00:00.000] Creating AI pipeline workflow...
Workflow Structure:
┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│   Extract   │ ──▶ │    Embed    │ ──▶ │    Store    │
│    #5001    │     │    #5002    │     │    #5003    │
└─────────────┘     └─────────────┘     └─────────────┘
[10:00:00.100] Job #5001 (extract): waiting
[10:00:00.101] Job #5002 (embed): waiting-children [#5001]
[10:00:00.102] Job #5003 (store): waiting-children [#5002]
[10:00:00.200] Processing #5001 (extract)...
[10:00:01.000] ✓ #5001 completed → #5002 now waiting
[10:00:01.100] Processing #5002 (embed)...
[10:00:03.000] ✓ #5002 completed → #5003 now waiting
[10:00:03.100] Processing #5003 (store)...
[10:00:03.500] ✓ #5003 completed
[10:00:03.501] ✓ Workflow complete!
// Create a workflow with dependencies
const extract = await client.push('pipeline', {
  step: 'extract',
  file: 'document.pdf'
});

const embed = await client.push('pipeline', {
  step: 'embed'
}, {
  depends_on: [extract.id] // wait for extract
});

const store = await client.push('pipeline', {
  step: 'store'
}, {
  depends_on: [embed.id] // wait for embed
});
// Wait for entire workflow
const result = await client.finished(store.id);
console.log('Workflow complete!', result);
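The waiting-children states above amount to a topological order: a job becomes runnable once everything in its `depends_on` list has completed. A sketch of that resolution over the three-step pipeline (illustrative only, not the server's scheduler):

```typescript
interface PipelineJob {
  id: number;
  dependsOn: number[];
}

// Repeatedly pick a job whose dependencies are all done (Kahn-style).
function executionOrder(jobs: PipelineJob[]): number[] {
  const done = new Set<number>();
  const order: number[] = [];
  while (order.length < jobs.length) {
    const ready = jobs.find(
      (j) => !done.has(j.id) && j.dependsOn.every((d) => done.has(d))
    );
    if (!ready) throw new Error('cycle or missing dependency');
    done.add(ready.id);
    order.push(ready.id);
  }
  return order;
}

const order = executionOrder([
  { id: 5003, dependsOn: [5002] }, // store
  { id: 5002, dependsOn: [5001] }, // embed
  { id: 5001, dependsOn: [] },     // extract
]);
console.log(order); // [5001, 5002, 5003]
```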
📈 10. Real-Time Monitoring
Monitor your queues with live metrics:
[10:00:00.000] Fetching queue stats...
┌─────────────────┬─────────┬────────┬───────────┬────────┐
│ Queue           │ Waiting │ Active │ Completed │ Failed │
├─────────────────┼─────────┼────────┼───────────┼────────┤
│ emails          │      23 │      5 │    12,847 │      2 │
│ embeddings      │   1,247 │     32 │   156,234 │      8 │
│ webhooks        │       0 │      0 │     8,912 │      0 │
│ reports         │       3 │      1 │        47 │      0 │
└─────────────────┴─────────┴────────┴───────────┴────────┘
Throughput: 2,847 jobs/min | Avg latency: 127ms
// Get queue statistics
const stats = await client.stats();
console.log(`Total queues: ${stats.queues}`);
console.log(`Jobs waiting: ${stats.waiting}`);
console.log(`Jobs active: ${stats.active}`);
// Get detailed metrics
const metrics = await client.metrics();
console.log(`Throughput: ${metrics.throughput}/min`);
console.log(`Avg latency: ${metrics.avgLatency}ms`);
// Get job counts by state
const counts = await client.getJobCounts('emails');
console.log(counts);
// { waiting: 23, active: 5, completed: 12847, failed: 2 }
🎯 Summary
flashQ provides a complete toolkit for background job processing:
| Feature | Use Case |
| --- | --- |
| Priority Queues | Process urgent jobs first |
| Delayed Jobs | Schedule for future execution |
| Retries + Backoff | Handle transient failures |
| Dead Letter Queue | Debug and retry failed jobs |
| Rate Limiting | Respect API limits |
| Progress Tracking | Monitor long-running jobs |
| Cron Jobs | Recurring scheduled tasks |
| Dependencies | Complex workflows |
| Real-Time Metrics | Monitor queue health |
Try It Yourself
Get started with flashQ in under 5 minutes.
Quick Start Guide →