amqplib — RabbitMQ Client for Node.js
You are an expert in amqplib, the Node.js client for RabbitMQ and the AMQP 0-9-1 protocol. You help developers implement reliable message queuing with work queues, pub/sub fanout, topic routing, RPC patterns, dead letter queues, and message acknowledgment — building decoupled microservices that communicate asynchronously through RabbitMQ.
Core Capabilities
Producer and Consumer
import amqp from "amqplib";
import { randomUUID } from "node:crypto";

// Producer — send messages to queue
async function sendToQueue(queue: string, message: any) {
  const connection = await amqp.connect(process.env.RABBITMQ_URL!);
  const channel = await connection.createChannel();
  await channel.assertQueue(queue, {
    durable: true, // Survive broker restart
    arguments: {
      "x-dead-letter-exchange": "dlx", // Failed messages go to DLX
      "x-message-ttl": 86400000, // 24h TTL
    },
  });
  channel.sendToQueue(queue, Buffer.from(JSON.stringify(message)), {
    persistent: true, // Survive broker restart
    contentType: "application/json",
    messageId: randomUUID(),
    timestamp: Date.now(),
  });
  await channel.close();
  await connection.close();
}

// Consumer — process messages reliably
async function startConsumer(queue: string, handler: (msg: any) => Promise<void>) {
  const connection = await amqp.connect(process.env.RABBITMQ_URL!);
  const channel = await connection.createChannel();
  // Arguments must match the producer's declaration exactly, or
  // RabbitMQ rejects the assertion with PRECONDITION_FAILED
  await channel.assertQueue(queue, {
    durable: true,
    arguments: {
      "x-dead-letter-exchange": "dlx",
      "x-message-ttl": 86400000,
    },
  });
  await channel.prefetch(10); // At most 10 unacked messages in flight
  channel.consume(queue, async (msg) => {
    if (!msg) return;
    try {
      const data = JSON.parse(msg.content.toString());
      await handler(data);
      channel.ack(msg); // Success — remove from queue
    } catch (error) {
      console.error("Processing failed:", error);
      channel.nack(msg, false, false); // Failed — send to DLX (no requeue)
    }
  });
}

// Usage
await sendToQueue("orders", { orderId: "ORD-123", total: 99.99 });
await startConsumer("orders", async (order) => {
  await processOrder(order);
  await sendEmail(order);
});
Pub/Sub with Exchanges
// Topic exchange — route messages by pattern
async function setupTopicExchange() {
  const connection = await amqp.connect(process.env.RABBITMQ_URL!);
  const channel = await connection.createChannel();
  await channel.assertExchange("events", "topic", { durable: true });
  // Publish events
  channel.publish("events", "order.created", Buffer.from(JSON.stringify({
    orderId: "ORD-456", items: 3,
  })));
  channel.publish("events", "order.shipped", Buffer.from(JSON.stringify({
    orderId: "ORD-456", trackingId: "TRACK-789",
  })));
  channel.publish("events", "user.signup", Buffer.from(JSON.stringify({
    userId: "usr-99", email: "new@user.com",
  })));
}

// Subscribe to patterns
async function subscribeToPattern(pattern: string, handler: (data: any, key: string) => void) {
  const connection = await amqp.connect(process.env.RABBITMQ_URL!);
  const channel = await connection.createChannel();
  await channel.assertExchange("events", "topic", { durable: true });
  const { queue } = await channel.assertQueue("", { exclusive: true });
  await channel.bindQueue(queue, "events", pattern);
  channel.consume(queue, (msg) => {
    if (!msg) return;
    handler(JSON.parse(msg.content.toString()), msg.fields.routingKey);
    channel.ack(msg);
  });
}

// Subscribe to all order events
await subscribeToPattern("order.*", (data, key) => {
  console.log(`Order event [${key}]:`, data);
});
// Subscribe to everything
await subscribeToPattern("#", (data, key) => {
  console.log(`[${key}]:`, data);
});
Installation
npm install amqplib
npm install -D @types/amqplib
# RabbitMQ server
docker run -d -p 5672:5672 -p 15672:15672 rabbitmq:management
Best Practices
- Durable queues + persistent messages — Both needed to survive broker restarts; always set both
- Manual ack — Never use noAck: true in production; explicitly ack after successful processing
- Dead letter exchanges — Configure a DLX for failed messages; analyze and retry later
- Prefetch — Set channel.prefetch(N) to limit concurrent processing; prevents consumer overload
- Connection pooling — Reuse connections, create channels per operation; connections are expensive
- Topic exchanges — Use patterns like order.* for flexible routing; decouple publishers from consumers
- Message TTL — Set x-message-ttl to prevent queue buildup; stale messages expire automatically
- Idempotent consumers — Use messageId to deduplicate; messages may be delivered more than once
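The last point can be sketched as a small in-memory deduplicator keyed on messageId. This is illustrative only (the class name and eviction policy are assumptions, not part of amqplib); a production system would back the seen-set with Redis or a database so it survives restarts and is shared across consumers:

```typescript
// In-memory deduplicator keyed on messageId (sketch, not the library API).
// Caps memory by evicting the oldest entry once maxEntries is reached.
class MessageDeduplicator {
  private seen = new Set<string>();
  constructor(private maxEntries = 10_000) {}

  // Returns true the first time an id is seen, false on redelivery
  firstDelivery(messageId: string): boolean {
    if (this.seen.has(messageId)) return false;
    if (this.seen.size >= this.maxEntries) {
      // Sets iterate in insertion order, so the first value is the oldest
      const oldest = this.seen.values().next().value;
      if (oldest !== undefined) this.seen.delete(oldest);
    }
    this.seen.add(messageId);
    return true;
  }
}

// Inside a consumer callback, a redelivered message is acked and skipped:
// const dedupe = new MessageDeduplicator();
// if (!dedupe.firstDelivery(msg.properties.messageId)) { channel.ack(msg); return; }
```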