
@vercel/queue
A TypeScript client library for interacting with the Vercel Queue Service API, designed for seamless integration with Vercel deployments.
```shell
npm install @vercel/queue
```
The package includes:

- `npx @vercel/queue/local-discover` for local development handler discovery

For local development, you'll need to pull your Vercel environment variables:
```shell
# Install Vercel CLI if you haven't already
npm i -g vercel

# Pull environment variables from your Vercel project
vc env pull
```
Queues just work locally. When you `send()` messages in development mode, they automatically trigger your handlers locally; no external queue infrastructure is needed.
For Next.js API routes (or others that are lazy-loaded), run this simple command to initialize handlers:
```shell
npx vercel-queue-local-init
```
That's it! The script reads your vercel.json, finds your queue handlers, and triggers Next.js to load them.
```shell
# Start your dev server
npm run dev

# Initialize handlers (only needed for Next.js lazy loading)
npx vercel-queue-local-init

# Send messages - they process locally automatically!
```

The script also accepts a few options:

```shell
# Custom port
npx vercel-queue-local-init --port 3001

# Different config file
npx vercel-queue-local-init --config ./my-vercel.json

# Skip vercel.json, use defaults
npx vercel-queue-local-init --no-vercel-config
```
Update your tsconfig.json to use "bundler" module resolution for proper package export resolution:
```json
{
  "compilerOptions": {
    "moduleResolution": "bundler"
  }
}
```
The send function can be used anywhere in your codebase to publish messages to a queue:
```typescript
import { send } from "@vercel/queue";

// Send a message to a topic
await send("my-topic", {
  message: "Hello world",
});

// With additional options
await send(
  "my-topic",
  {
    message: "Hello world",
  },
  {
    idempotencyKey: "unique-key", // Optional: prevent duplicate messages
    retentionSeconds: 3600, // Optional: override retention time (defaults to 24 hours)
  },
);
```
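Where does a good idempotency key come from? One option is to derive it deterministically from the event that triggered the send, so a retried request produces the same key and duplicate messages can be suppressed. A minimal sketch; the helper name and hashing scheme are illustrative, not part of the package:

```typescript
import { createHash } from "node:crypto";

// Hypothetical helper: derive a stable idempotency key from the topic and
// the ID of the event that caused this send, so retries of the same request
// map to the same key.
function idempotencyKeyFor(topic: string, eventId: string): string {
  return createHash("sha256").update(`${topic}:${eventId}`).digest("hex");
}

// Usage (sketch):
// await send("my-topic", payload, {
//   idempotencyKey: idempotencyKeyFor("my-topic", order.id),
// });
```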
Example usage in an API route:
```typescript
// app/api/send-message/route.ts
import { send } from "@vercel/queue";

export async function POST(request: Request) {
  const body = await request.json();

  const { messageId } = await send("my-topic", {
    message: body.message,
  });

  return Response.json({ messageId });
}
```
Messages are consumed using API routes that Vercel automatically triggers when messages are available.
The recommended approach is to handle multiple topics and consumers in a single API route to keep your vercel.json configuration simple:
```typescript
// app/api/queue/route.ts
import { handleCallback } from "@vercel/queue";

export const POST = handleCallback({
  // Single topic with one consumer
  "my-topic": {
    "my-consumer": async (message, metadata) => {
      // metadata includes: { messageId, deliveryCount, createdAt }
      console.log("Processing message:", message);

      // If this throws an error, the message will be automatically retried
      await processMessage(message);
    },
  },

  // Multiple consumers for different purposes
  "order-events": {
    fulfillment: async (order, metadata) => {
      // By default, errors will trigger automatic retries
      // But you can control retry timing if needed:
      if (!isSystemReady()) {
        // Override default retry with a 5 minute delay
        return { timeoutSeconds: 300 };
      }
      await processOrder(order);
    },
    analytics: async (order, metadata) => {
      try {
        await trackOrder(order);
      } catch (error) {
        // Optional: Custom exponential backoff instead of default retry timing
        const timeoutSeconds = Math.pow(2, metadata.deliveryCount) * 60;
        return { timeoutSeconds };
      }
    },
  },
});
```
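The analytics consumer computes its own backoff from `metadata.deliveryCount`. In isolation (and assuming `deliveryCount` starts at 1 on the first delivery), the formula produces the schedule below; the cap is an added safeguard for illustration, not package behavior:

```typescript
// Same formula as the analytics handler, with an illustrative upper bound so
// late retries don't wait unboundedly long: 120s, 240s, 480s, ... up to the cap.
function retryDelaySeconds(deliveryCount: number, capSeconds = 3600): number {
  return Math.min(Math.pow(2, deliveryCount) * 60, capSeconds);
}
```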
While you can split handlers into separate routes if needed (e.g., for code organization or deployment flexibility), consolidating them in one route is recommended for simpler configuration.
Configure which topics and consumers your API route handles:
```json
{
  "functions": {
    "app/api/queue/route.ts": {
      "experimentalTriggers": [
        {
          "type": "queue/v1beta",
          "topic": "my-topic",
          "consumer": "my-consumer",
          "maxAttempts": 3,
          "retryAfterSeconds": 60,
          "initialDelaySeconds": 0
        },
        {
          "type": "queue/v1beta",
          "topic": "order-events",
          "consumer": "fulfillment"
        },
        {
          "type": "queue/v1beta",
          "topic": "order-events",
          "consumer": "analytics",
          "maxAttempts": 5,
          "retryAfterSeconds": 300
        }
      ]
    }
  }
}
```
The vercel.json file tells Vercel which routes handle which topics and consumers.

The queue client supports customizable serialization through the Transport interface:
Example:
```typescript
import { send, JsonTransport } from "@vercel/queue";

// JsonTransport is the default
await send("json-topic", { data: "example" });

// Explicit transport configuration
await send(
  "json-topic",
  { data: "example" },
  { transport: new JsonTransport() },
);
```
| Use Case | Recommended Transport | Memory Usage | Performance |
|---|---|---|---|
| Small JSON objects | JsonTransport | Low | High |
| Binary files < 100MB | BufferTransport | Medium | High |
| Large files > 100MB | StreamTransport | Very Low | Medium |
| Real-time streams | StreamTransport | Very Low | High |
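The table's recommendations can be summarized as a small decision helper. This is only an illustration of the table above; the function, the 100MB threshold, and returning transport names as strings are assumptions for the sketch, not package API:

```typescript
// Transport names from the table above.
type TransportName = "JsonTransport" | "BufferTransport" | "StreamTransport";

// Pick a transport by payload shape: JSON stays on the default transport;
// binary payloads switch from Buffer to Stream past the 100MB mark.
function pickTransport(isBinary: boolean, sizeBytes: number): TransportName {
  if (!isBinary) return "JsonTransport";
  return sizeBytes < 100 * 1024 * 1024 ? "BufferTransport" : "StreamTransport";
}
```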
The queue client provides specific error types:
- `QueueEmptyError`: No messages available (204)
- `MessageLockedError`: Message temporarily locked (423)
- `MessageNotFoundError`: Message doesn't exist (404)
- `MessageNotAvailableError`: Message exists but unavailable (409)
- `MessageCorruptedError`: Message data corrupted
- `BadRequestError`: Invalid parameters (400)
- `UnauthorizedError`: Authentication failure (401)
- `ForbiddenError`: Access denied (403)
- `InternalServerError`: Server errors (500+)

Example error handling:
```typescript
import {
  BadRequestError,
  ForbiddenError,
  InternalServerError,
  UnauthorizedError,
} from "@vercel/queue";

try {
  await send("my-topic", payload);
} catch (error) {
  if (error instanceof UnauthorizedError) {
    console.log("Invalid token - refresh authentication");
  } else if (error instanceof ForbiddenError) {
    console.log("Environment mismatch - check configuration");
  } else if (error instanceof BadRequestError) {
    console.log("Invalid parameters:", error.message);
  } else if (error instanceof InternalServerError) {
    console.log("Server error - retry with backoff");
  }
}
```
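For transient server errors, the "retry with backoff" branch can be factored into a small wrapper. This is a sketch, not a package API; `sendFn` stands in for a call to `send` so the example stays self-contained, and the attempt count and delays are illustrative:

```typescript
// Retry a send-like operation with exponential backoff: 100ms, 200ms, 400ms, ...
// In real use you would only retry errors worth retrying (e.g. check
// `error instanceof InternalServerError`) before sleeping.
async function sendWithRetry<T>(
  sendFn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await sendFn();
    } catch (error) {
      lastError = error;
      if (attempt < maxAttempts - 1) {
        // Wait before the next attempt; skip the wait after the final failure
        await new Promise((resolve) =>
          setTimeout(resolve, baseDelayMs * 2 ** attempt),
        );
      }
    }
  }
  throw lastError;
}

// Usage (sketch):
// await sendWithRetry(() => send("my-topic", payload));
```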
Note: The `receive` function is not intended for use in Vercel deployments. It's designed for the Vercel Sandbox environment or alternative server setups where you need direct control over message processing.
```typescript
// Process next available message
await receive<T>(topicName, consumerGroup, handler);

// Process specific message by ID
await receive<T>(topicName, consumerGroup, handler, {
  messageId: "message-id",
});

// Process message with options
await receive<T>(topicName, consumerGroup, handler, {
  messageId?: string; // Process specific message by ID
  skipPayload?: boolean; // Skip payload download (requires messageId)
  transport?: Transport<T>; // Custom transport (defaults to JsonTransport)
  visibilityTimeoutSeconds?: number; // Message visibility timeout
  refreshInterval?: number; // Refresh interval for long-running operations
});
```
```typescript
// Handler function signature
type MessageHandler<T = unknown> = (
  message: T,
  metadata: MessageMetadata,
) => Promise<MessageHandlerResult> | MessageHandlerResult;

// Handler result types
type MessageHandlerResult = void | MessageTimeoutResult;

interface MessageTimeoutResult {
  timeoutSeconds: number; // seconds before message becomes available again
}
```
If you need more than 1,000 messages per second, you can create multiple topics (e.g., user-specific or shard-based topics) and handle them with a single consumer using wildcards in your vercel.json:
```json
{
  "functions": {
    "app/api/queue/route.ts": {
      "experimentalTriggers": [
        {
          "type": "queue/v1beta",
          "topic": "user-*",
          "consumer": "processor"
        }
      ]
    }
  }
}
```
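The wildcard trigger above matches any topic whose name starts with `user-`. One hypothetical way to generate such sharded topic names on the producer side is to hash an ID into a fixed number of topics, so messages for the same user always land on the same topic (the helper and the hash are illustrative, not part of the package):

```typescript
// Map a user ID to one of `shards` topics matching the "user-*" wildcard.
// Uses a simple 31-based string hash; any stable hash works.
function shardTopic(userId: string, shards = 16): string {
  let h = 0;
  for (const ch of userId) h = (h * 31 + ch.charCodeAt(0)) >>> 0;
  return `user-${h % shards}`;
}

// Usage (sketch):
// await send(shardTopic(order.userId), order);
```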
This allows you to spread load across many topics (`user-1`, `user-2`, etc.) while handling all of them with a single consumer.

License
MIT