Learning Kafka: A Step-by-Step Guide
Chapter 1 - Introduction
Hello! I'm Yuvraj, a computer science student who loves to learn, create, and explore new things. I'm currently pursuing a Bachelor of Computer Science at the University of Delhi.
What is Kafka?
Think of Kafka as a super-powered message system. It's like a central hub where different parts of your application can send messages to each other without needing to know about each other directly.
Imagine it like a post office:
Some people (producers) drop off letters (messages)
The post office (Kafka) organizes these letters into different mailboxes (topics)
Other people (consumers) come and pick up the letters from their mailboxes
What makes this post office special:
The post office keeps the letters for a while, so people can pick them up when they're ready
Multiple people can read the same letter
The system can handle millions of letters per second without breaking a sweat
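To make the analogy concrete, here is a tiny in-memory "post office" sketched in plain Node.js. This is a toy for intuition only: real Kafka persists messages to disk, splits topics into partitions, and runs as a cluster of servers.

```javascript
// Toy in-memory sketch of Kafka's core flow — NOT real Kafka.
class PostOffice {
  constructor() {
    this.mailboxes = new Map(); // topic name -> array of messages
  }

  // Producer side: drop a letter into a mailbox (publish to a topic).
  send(topic, message) {
    if (!this.mailboxes.has(topic)) this.mailboxes.set(topic, []);
    this.mailboxes.get(topic).push(message);
  }

  // Consumer side: read letters starting from a given offset.
  // Messages are NOT deleted on read, so multiple consumers can
  // each read the same messages, at their own pace.
  read(topic, fromOffset = 0) {
    return (this.mailboxes.get(topic) ?? []).slice(fromOffset);
  }
}

const office = new PostOffice();
office.send('orders', 'order #1');
office.send('orders', 'order #2');

console.log(office.read('orders'));    // a consumer reading from the start
console.log(office.read('orders', 1)); // another consumer resuming from offset 1
```

Notice that reading does not remove messages, and each reader keeps its own position (offset). Those two ideas, retained messages plus per-consumer offsets, are what let Kafka serve the same data to many independent consumers.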
Why Use Kafka?
Decoupling: Producers and consumers don't need to know about each other directly
Scalability: Scales horizontally (by adding brokers and partitions) to handle huge amounts of data
Reliability: Messages are replicated across brokers, so data survives individual server failures
Flexibility: Fits many different purposes (logging, activity tracking, metrics, connecting systems)
How We'll Learn
We'll learn Kafka by building and understanding simple examples using Node.js with ES6 imports. Our learning path will be:
Setup: Getting Kafka running on your computer using Docker
Basic Concepts: Understanding the core ideas of Kafka
Simple Producer: Creating an app that sends messages to Kafka
Simple Consumer: Creating an app that reads messages from Kafka
Real-world Example: Building a mini-project that shows Kafka in action
Let's get started with the setup!
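As a preview of the setup step, a minimal docker-compose.yml for a single local broker often looks like the sketch below. The image names, versions, and ports here are common defaults for the Confluent images, not necessarily what this project uses, so adjust them to your environment.

```yaml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

With this running via `docker compose up -d`, your Node.js apps can connect to the broker at `localhost:9092`.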
Learning Path
Follow these guides in order:
Key Concepts
Topics: A category or feed name to which records are published
Partitions: Topics are split into partitions for scalability
Producers: Applications that publish data to Kafka topics
Consumers: Applications that subscribe to topics and process the feed of published records
Consumer Groups: A group of consumers that together consume a topic
Brokers: Kafka servers that store the data
ZooKeeper: Used for managing and coordinating Kafka brokers (newer Kafka versions can run without it using KRaft mode)
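To make partitions and consumer groups concrete, here is a toy sketch of two ideas: how a keyed message might be mapped to a partition (this is a simplified hash, not Kafka's actual murmur2 partitioner), and how partitions get divided among the consumers in a group (a simple round-robin, standing in for Kafka's real assignment strategies).

```javascript
// Toy partitioner — messages with the same key always land in the same
// partition, which is what gives Kafka per-key ordering.
function toyPartition(key, numPartitions) {
  let hash = 0;
  for (const ch of key) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple string hash
  }
  return hash % numPartitions;
}

// Within a consumer group, each partition is read by exactly one consumer.
// Here we mimic that with a round-robin assignment.
function assignPartitions(partitions, consumers) {
  const assignment = new Map(consumers.map((c) => [c, []]));
  partitions.forEach((p, i) => {
    assignment.get(consumers[i % consumers.length]).push(p);
  });
  return assignment;
}

console.log('key "user-42" -> partition', toyPartition('user-42', 3));
console.log(assignPartitions([0, 1, 2, 3], ['consumer-A', 'consumer-B']));
// consumer-A gets partitions [0, 2], consumer-B gets [1, 3]
```

Two takeaways: the same key always hashes to the same partition (so one user's events stay in order), and adding more consumers to a group than there are partitions leaves the extras idle, which is why partition count caps a group's parallelism.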
Project Structure
docker-compose.yml: Sets up Kafka and ZooKeeper locally
python/: Python examples for Kafka producers and consumers
nodejs/: Node.js examples for Kafka producers and consumers
examples/: Various use case examples
Prerequisites
Docker and Docker Compose
Node.js 14+ (for Node.js examples)