Thilan Dissanayaka | Interview Guides | Jan 28

Kafka - Interview preparation guide

What is Apache Kafka?

Apache Kafka is a distributed event streaming platform designed for high-throughput, fault-tolerant, real-time data streaming. It is used to build real-time data pipelines and streaming applications.

What are the core components of Kafka?

  • Producer: Sends messages to Kafka topics.
  • Consumer: Reads messages from topics.
  • Broker: Kafka server that stores and serves messages.
  • Topic: Logical channel to which messages are sent and from which consumers read.
  • Partition: A topic is divided into multiple partitions for parallelism and scalability.
  • ZooKeeper: Manages cluster metadata and coordination (deprecated in newer Kafka versions, which replace it with KRaft).
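
As a quick sketch of how these components fit together, here is a minimal Java producer that publishes one record to a topic on a broker. The broker address localhost:9092 and the topic name "orders" are placeholder assumptions, not values from this guide.

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    import java.util.Properties;

    public class MinimalProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Broker to bootstrap from -- placeholder address
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Publish one record to the (hypothetical) "orders" topic
                producer.send(new ProducerRecord<>("orders", "order-1", "created"));
                producer.flush();
            }
        }
    }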

What is a Kafka topic?

A Kafka topic is a logical channel to which producers publish data and from which consumers read it. A topic can be split into multiple partitions to scale horizontally.
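
Topics are usually created with an explicit partition count and replication factor. Below is a sketch using the Java AdminClient; the topic name, 6 partitions, and replication factor 3 are illustrative assumptions.

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    import java.util.List;
    import java.util.Properties;

    public class CreateTopic {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // 6 partitions, replication factor 3 -- illustrative values
                NewTopic orders = new NewTopic("orders", 6, (short) 3);
                admin.createTopics(List.of(orders)).all().get(); // block until the broker confirms
            }
        }
    }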

What is a Kafka partition, and why is it important?

A Kafka partition is an ordered, append-only subset of a topic. Partitions enable parallel processing and guarantee message ordering within a single partition, improving throughput and (through replication) fault tolerance.
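
Records that share a key are hashed to the same partition by the default partitioner, which is what preserves per-key ordering. A small sketch (hypothetical topic and key) that prints the partition and offset each record lands on:

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    import java.util.Properties;

    public class KeyedProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                for (int i = 0; i < 3; i++) {
                    // Same key "user-42" -> same partition -> ordering preserved for that key
                    ProducerRecord<String, String> record =
                            new ProducerRecord<>("clicks", "user-42", "click-" + i);
                    producer.send(record, (metadata, exception) -> {
                        if (exception == null) {
                            System.out.printf("partition=%d offset=%d%n",
                                    metadata.partition(), metadata.offset());
                        }
                    });
                }
            }
        }
    }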

How does Kafka ensure message durability?

Kafka writes data to disk and replicates it across multiple brokers using a replication factor. This ensures data is not lost even if a broker fails.
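
On the producer side this is usually combined with acks=all, so a write is only acknowledged once the in-sync replicas have it. A hedged sketch of those settings (topic name is a placeholder):

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    import java.util.Properties;

    public class DurableProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            // Wait until all in-sync replicas have acknowledged the write
            props.put(ProducerConfig.ACKS_CONFIG, "all");
            // Retry safely without introducing duplicates
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("payments", "tx-1", "completed"));
            }
        }
    }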

What is the difference between Kafka Consumer Group and Consumer?

  • Consumer: A client that reads messages from Kafka topics.
  • Consumer Group: A group of consumers where each message is consumed by only one consumer in the group, enabling parallel consumption.
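
A consumer joins a group simply by setting group.id. Running several copies of the sketch below (group and topic names are illustrative) makes Kafka split the topic's partitions among them:

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    public class GroupConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            // All instances sharing this group.id divide the partitions among themselves
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-processors");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("orders"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> r : records) {
                        System.out.printf("partition=%d key=%s value=%s%n",
                                r.partition(), r.key(), r.value());
                    }
                }
            }
        }
    }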

What is an offset in Kafka?

An offset is a unique identifier for each record within a Kafka partition. It allows consumers to track their position and resume reading from where they left off.
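
With auto-commit disabled, a consumer can commit offsets only after it has finished processing a batch, so a restart resumes from the last committed offset. A sketch under those assumptions (topic and group names are placeholders):

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    public class ManualCommitConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "audit-readers");
            props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // we commit ourselves
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("orders"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> r : records) {
                        System.out.printf("offset=%d value=%s%n", r.offset(), r.value());
                    }
                    if (!records.isEmpty()) {
                        consumer.commitSync(); // store the position so a restart resumes here
                    }
                }
            }
        }
    }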

What is Kafka's retention policy?

Kafka’s retention policy determines how long messages are stored. It can be configured by:

  • Time-based: e.g., retain messages for 7 days (retention.ms).
  • Size-based: e.g., retain up to 100 GB of data per partition (retention.bytes).
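
Retention is applied per topic through the retention.ms and retention.bytes configs (or broker-wide defaults). A sketch that applies the two example values above with the Java AdminClient; the topic name is a placeholder:

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.AlterConfigOp;
    import org.apache.kafka.clients.admin.ConfigEntry;
    import org.apache.kafka.common.config.ConfigResource;

    import java.util.Collection;
    import java.util.List;
    import java.util.Map;
    import java.util.Properties;

    public class SetRetention {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                ConfigResource topic = new ConfigResource(ConfigResource.Type.TOPIC, "orders");
                Collection<AlterConfigOp> ops = List.of(
                        // 7 days, in milliseconds
                        new AlterConfigOp(new ConfigEntry("retention.ms", "604800000"),
                                AlterConfigOp.OpType.SET),
                        // ~100 GB per partition
                        new AlterConfigOp(new ConfigEntry("retention.bytes", "107374182400"),
                                AlterConfigOp.OpType.SET));
                admin.incrementalAlterConfigs(Map.of(topic, ops)).all().get();
            }
        }
    }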

How does Kafka handle failure?

  • Replication: Messages are replicated across multiple brokers.
  • Leader Election: If a broker fails, Kafka promotes a replica to leader.
  • Consumer Offset Management: Allows consumers to resume processing after failures.
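
The replication and leadership layout can be inspected with the AdminClient: each partition reports its leader, replicas, and in-sync replicas (ISR). The sketch below assumes a recent Java client (it uses allTopicNames()) and a placeholder topic name:

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.TopicDescription;
    import org.apache.kafka.common.TopicPartitionInfo;

    import java.util.List;
    import java.util.Properties;

    public class InspectReplication {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                TopicDescription description =
                        admin.describeTopics(List.of("orders")).allTopicNames().get().get("orders");
                for (TopicPartitionInfo p : description.partitions()) {
                    // If the leader broker fails, one of the in-sync replicas is elected leader
                    System.out.printf("partition=%d leader=%s replicas=%s isr=%s%n",
                            p.partition(), p.leader(), p.replicas(), p.isr());
                }
            }
        }
    }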

What are the main APIs in Kafka?

  1. Producer API: For publishing records.
  2. Consumer API: For subscribing to topics.
  3. Streams API: For processing and transforming data in real-time.
  4. Admin API: For managing Kafka topics, brokers, and configurations.
  5. Connect API: For streaming data between Kafka and external systems via source and sink connectors.
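
As a small illustration of the Streams API (in contrast to the plain Consumer API), the sketch below reads a hypothetical input topic, transforms each value, and writes the result to an output topic:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    import java.util.Properties;

    public class UppercaseStream {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-demo");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> input = builder.stream("orders");   // placeholder input topic
            input.mapValues(value -> value.toUpperCase())                // simple per-record transform
                 .to("orders-upper");                                    // placeholder output topic

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }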