
Learning Kafka: A Step-by-Step Guide

Chapter 1 - Introduction


Hello! I'm Yuvraj, a computer science student at the University of Delhi, where I'm pursuing a Bachelor of Computer Science. I love to learn, create, and explore new things.

What is Kafka?

Think of Kafka as a super-powered message system. It's like a central hub where different parts of your application can send messages to each other without needing to know about each other directly.

Imagine it like a post office:

  • Some people (producers) drop off letters (messages)

  • The post office (Kafka) organizes these letters into different mailboxes (topics)

  • Other people (consumers) come and pick up the letters from their mailboxes

The cool thing is that:

  • The post office keeps the letters for a while, so people can pick them up when they're ready

  • Multiple people can read the same letter

  • The system can handle millions of letters per second without breaking a sweat
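The post-office analogy can be sketched as a toy in-memory model. This is NOT real Kafka (real Kafka persists messages to disk and runs as a distributed cluster), but it shows the three behaviors above: messages are retained after delivery, read positions (offsets) belong to the reader, and several readers can see the same message.

```javascript
// Toy in-memory "post office" -- a simplified model of Kafka's behavior,
// for illustration only. Real Kafka persists to disk and runs as a cluster.
class PostOffice {
  constructor() {
    this.mailboxes = new Map(); // topic name -> array of retained messages
  }

  // A producer drops off a letter (message) into a mailbox (topic).
  send(topic, message) {
    if (!this.mailboxes.has(topic)) this.mailboxes.set(topic, []);
    this.mailboxes.get(topic).push(message);
  }

  // A consumer reads from its own offset. Messages are NOT deleted on read,
  // so multiple consumers can read the same letters independently.
  read(topic, offset) {
    return (this.mailboxes.get(topic) || []).slice(offset);
  }
}

const office = new PostOffice();
office.send("orders", "order #1");
office.send("orders", "order #2");

// Two independent consumers each see every message.
const billing = office.read("orders", 0);
const shipping = office.read("orders", 0);
console.log(billing);  // [ 'order #1', 'order #2' ]
console.log(shipping); // [ 'order #1', 'order #2' ]
```

Notice that `read` never removes anything: the "letters" stay in the mailbox, and each consumer just tracks how far it has read. That offset-based model is the key difference between Kafka and a traditional queue.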

Why Use Kafka?

  • Decoupling: Your apps don't need to know about each other directly

  • Scalability: Can handle huge amounts of data

  • Reliability: Replicates messages across brokers, so data survives even if parts of the system fail

  • Flexibility: Can be used for many different purposes (logging, tracking, connecting systems)

How We'll Learn

We'll learn Kafka by building and understanding simple examples using Node.js with ES6 imports. Our learning path will be:

  1. Setup: Getting Kafka running on your computer using Docker

  2. Basic Concepts: Understanding the core ideas of Kafka

  3. Simple Producer: Creating an app that sends messages to Kafka

  4. Simple Consumer: Creating an app that reads messages from Kafka

  5. Real-world Example: Building a mini-project that shows Kafka in action

Let's get started with the setup!

Learning Path

Follow these guides in order:

  1. Setup Guide

  2. Basic Concepts

  3. Your First Producer

  4. Your First Consumer

  5. Real-world Example

Key Concepts

  1. Topics: A category or feed name to which records are published

  2. Partitions: Topics are split into partitions for scalability

  3. Producers: Applications that publish data to Kafka topics

  4. Consumers: Applications that subscribe to topics and process the feed of published records

  5. Consumer Groups: A set of consumers that share the work of reading a topic; each partition is read by only one consumer in the group

  6. Brokers: Kafka servers that store the data

  7. Zookeeper: Used for managing and coordinating Kafka brokers (newer Kafka versions can run without it using KRaft)
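One detail worth seeing in code is how topics and partitions interact: when a message has a key, Kafka's default partitioner hashes the key to pick a partition, so all messages with the same key land in the same partition (and therefore stay in order relative to each other). Real Kafka uses murmur2 hashing; the simple hash below is just an illustration of the idea.

```javascript
// Sketch of key-based partition assignment. Real Kafka's default
// partitioner uses murmur2; this simple hash is for illustration only.
function simpleHash(key) {
  let h = 0;
  for (const ch of key) {
    h = (h * 31 + ch.charCodeAt(0)) >>> 0; // keep it an unsigned 32-bit int
  }
  return h;
}

function pickPartition(key, numPartitions) {
  return simpleHash(key) % numPartitions;
}

// The same key always maps to the same partition, which is what
// preserves per-key ordering across millions of messages.
const p1 = pickPartition("user-42", 3);
const p2 = pickPartition("user-42", 3);
console.log(p1 === p2); // true
```

This is also why choosing a good message key matters: all traffic for one key funnels into one partition, so a "hot" key can create an unbalanced load.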

Project Structure

  • docker-compose.yml: Sets up Kafka and Zookeeper locally

  • python/: Python examples for Kafka producers and consumers

  • nodejs/: Node.js examples for Kafka producers and consumers

  • examples/: Various use case examples

Prerequisites

  • Docker and Docker Compose

  • Node.js 14+ (for Node.js examples)
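To give a feel for what the Setup chapter will involve, here is a minimal sketch of a local Kafka + Zookeeper docker-compose.yml. The image names and environment variables follow the commonly used Confluent images; treat it as illustrative, since the actual file in the repo (and the ports it exposes) may differ.

```yaml
# Minimal sketch of a local single-broker setup -- illustrative only;
# the repo's docker-compose.yml may differ.
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      # Single broker, so the internal offsets topic can't be replicated
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

With this running via `docker compose up`, your Node.js producers and consumers would connect to `localhost:9092`.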

