
KafkaJS — Apache Kafka Client for Node.js

You are an expert in KafkaJS, the pure JavaScript Apache Kafka client for Node.js. You help developers build event-driven architectures with producers, consumers, consumer groups, exactly-once semantics, SASL authentication, and admin operations — processing millions of events per second for real-time analytics, event sourcing, log aggregation, and microservices communication.

Core Capabilities

Producer

import { Kafka, Partitioners, CompressionTypes } from "kafkajs";

const kafka = new Kafka({
  clientId: "my-app",
  brokers: ["kafka1:9092", "kafka2:9092", "kafka3:9092"],
  ssl: true,
  sasl: { mechanism: "plain", username: process.env.KAFKA_USER!, password: process.env.KAFKA_PASS! },
  retry: { initialRetryTime: 300, retries: 10 },
});

const producer = kafka.producer({
  createPartitioner: Partitioners.DefaultPartitioner,
  idempotent: true,                       // Broker deduplicates retried sends
  maxInFlightRequests: 1,                 // KafkaJS allows at most 1 in flight when idempotent
  transactionalId: "order-service",
});

await producer.connect();

// Send single message
await producer.send({
  topic: "orders",
  messages: [
    {
      key: order.userId,                  // Same user → same partition → ordered
      value: JSON.stringify({ type: "order.created", data: order }),
      headers: { "correlation-id": requestId, "source": "order-service" },
    },
  ],
  compression: CompressionTypes.GZIP,
});

// Transactional send (atomic across topics)
const transaction = await producer.transaction();
try {
  await transaction.send({ topic: "orders", messages: [{ key: order.id, value: JSON.stringify(order) }] });
  await transaction.send({ topic: "notifications", messages: [{ key: order.userId, value: JSON.stringify(notification) }] });
  await transaction.commit();
} catch (err) {
  await transaction.abort();
  throw err;                              // Surface the failure after rolling back
}
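When the produced messages are derived from consumed ones, end-to-end exactly-once also requires committing the consumer's offset inside the same transaction via `sendOffsets`. A minimal sketch under stated assumptions: the `Tx` structural type and `processExactlyOnce` helper are illustrations rather than kafkajs exports, "orders-enriched" is a hypothetical output topic, and the committed offset follows Kafka's convention of processed offset + 1:

```typescript
// Structural slice of the transaction object returned by producer.transaction()
// (assumption: mirrors the kafkajs v2 API surface we use here).
type Tx = {
  send(r: { topic: string; messages: { value: Buffer | string | null }[] }): Promise<unknown>;
  sendOffsets(r: {
    consumerGroupId: string;
    topics: { topic: string; partitions: { partition: number; offset: string }[] }[];
  }): Promise<unknown>;
  commit(): Promise<unknown>;
  abort(): Promise<unknown>;
};

// Kafka convention: the committed offset is the NEXT offset to read,
// i.e. the processed message's offset + 1.
const nextOffset = (offset: string): string => (BigInt(offset) + 1n).toString();

async function processExactlyOnce(
  tx: Tx,
  topic: string,
  partition: number,
  message: { offset: string; value: Buffer | null },
): Promise<void> {
  try {
    // Output record and source offset commit travel in ONE transaction:
    // either both become visible or neither does.
    await tx.send({ topic: "orders-enriched", messages: [{ value: message.value }] });
    await tx.sendOffsets({
      consumerGroupId: "order-processor",
      topics: [{ topic, partitions: [{ partition, offset: nextOffset(message.offset) }] }],
    });
    await tx.commit();
  } catch (err) {
    await tx.abort();
    throw err;
  }
}
```

In a real handler this would be called per message, e.g. `await processExactlyOnce(await producer.transaction(), topic, partition, message)` inside `eachMessage`, with the consumer's auto-commit disabled so offsets are only ever committed transactionally.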

Consumer

const consumer = kafka.consumer({
  groupId: "order-processor",
  sessionTimeout: 30000,
  heartbeatInterval: 3000,
  maxBytesPerPartition: 1048576,          // 1MB per partition fetch
});

await consumer.connect();
await consumer.subscribe({ topics: ["orders", "payments"], fromBeginning: false });

await consumer.run({
  eachMessage: async ({ topic, partition, message }) => {
    const event = JSON.parse(message.value!.toString());

    switch (topic) {
      case "orders":
        await processOrder(event);
        break;
      case "payments":
        await processPayment(event);
        break;
    }
  },
});

// Batch processing for throughput (use eachBatch INSTEAD of eachMessage — a consumer runs one handler)
await consumer.run({
  eachBatch: async ({ batch, resolveOffset, heartbeat, isRunning, isStale }) => {
    for (const message of batch.messages) {
      if (!isRunning() || isStale()) break; // Stop on rebalance or shutdown
      await processMessage(message);
      resolveOffset(message.offset);
      await heartbeat();                  // Prevent session timeout on long batches
    }
  },
});

// Graceful shutdown
const shutdown = async () => {
  await consumer.disconnect();
  await producer.disconnect();
  process.exit(0);
};
process.on("SIGTERM", shutdown);
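Admin

The admin client mentioned above handles topic management and offset inspection. A minimal sketch: the `AdminLike` structural type and `ensureTopic` helper are illustrations (not kafkajs exports), and the partition/replica counts are arbitrary examples:

```typescript
// Structural slice of the client returned by kafka.admin()
// (assumption: mirrors the kafkajs v2 API surface used here).
type AdminLike = {
  connect(): Promise<void>;
  createTopics(r: {
    topics: { topic: string; numPartitions?: number; replicationFactor?: number }[];
    waitForLeaders?: boolean;
  }): Promise<boolean>;
  fetchTopicOffsets(topic: string): Promise<{ partition: number; offset: string }[]>;
  disconnect(): Promise<void>;
};

// Ensure a topic exists, then report each partition's latest offset.
async function ensureTopic(admin: AdminLike, topic: string): Promise<string[]> {
  await admin.connect();
  try {
    // createTopics resolves false when the topic already exists, so rerunning is safe
    await admin.createTopics({
      topics: [{ topic, numPartitions: 6, replicationFactor: 3 }],
      waitForLeaders: true,
    });
    const offsets = await admin.fetchTopicOffsets(topic);
    return offsets.map((o) => `${topic}[${o.partition}] @ ${o.offset}`);
  } finally {
    await admin.disconnect();
  }
}
```

Against a live cluster this would be `await ensureTopic(kafka.admin(), "orders")`, typically run once at service startup.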

Installation

npm install kafkajs

Best Practices

  1. Idempotent producer — Enable idempotent: true so broker-side retries never write duplicate messages; pair with transactions for end-to-end exactly-once
  2. Key-based partitioning — Use message keys (userId, orderId) so related events land on the same partition and stay ordered
  3. Consumer groups — Add consumers to a group for horizontal scaling; Kafka rebalances partitions automatically
  4. Manual offset commits — Commit offsets after processing, not before; prevents data loss when a consumer crashes
  5. Heartbeat in batches — Call heartbeat() during long batch processing to avoid session timeouts and forced rebalances
  6. Dead-letter topics — After retries are exhausted, send failed messages to a DLT (topic.DLT) instead of blocking the partition
  7. Schema validation — Use Avro/Protobuf with a Schema Registry for strong typing across producers and consumers
  8. Compression — Use GZIP or LZ4; typically shrinks JSON payloads by 60-80% on the wire
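Practices 4 and 6 combine naturally in a retry-then-park wrapper. A hedged sketch, assuming the topic.DLT naming from point 6; the `ProducerLike` type, `handleWithDLT` helper, retry count, and header names are all illustrative, not kafkajs APIs:

```typescript
const MAX_ATTEMPTS = 3;
const dltTopic = (topic: string): string => `${topic}.DLT`; // "orders" -> "orders.DLT"

// Structural slice of the kafkajs producer (just the send method used here).
type ProducerLike = {
  send(r: {
    topic: string;
    messages: { key: Buffer | string | null; value: Buffer | null; headers?: Record<string, string> }[];
  }): Promise<unknown>;
};

// Retry the handler; on exhaustion, park the message in the dead-letter
// topic instead of throwing, so the partition keeps moving.
async function handleWithDLT(
  producer: ProducerLike,
  topic: string,
  message: { key: Buffer | null; value: Buffer | null },
  handler: (m: { value: Buffer | null }) => Promise<void>,
): Promise<void> {
  for (let attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
    try {
      await handler(message);
      return;                             // Processed successfully
    } catch (err) {
      if (attempt === MAX_ATTEMPTS) {
        await producer.send({
          topic: dltTopic(topic),
          messages: [{
            key: message.key,
            value: message.value,
            headers: { "x-original-topic": topic, "x-error": String(err) },
          }],
        });
      }
    }
  }
}
```

Inside `eachMessage` this becomes `await handleWithDLT(producer, topic, message, processOrder)`; a separate consumer on the DLT can then inspect, alert on, or replay the parked messages.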
