Welcome to this deep dive into Kafka security. In this guide, we’ll explore how to protect sensitive data in transit, in use, and at rest using Kafka’s built-in mechanisms.
Use Case Overview
Imagine a banking application that publishes two types of events to a Kafka cluster:
Login Events
When a customer logs in, the app writes a message to the login-events topic.
Card Payment Events
When a payment is processed, the app writes a message to the card-payment-events topic.
Downstream microservices consume these events for auditing, notifications, analytics, and more.
Key Security Layers
Both login and payment events contain sensitive customer data. To safeguard this information, address these three layers:
Data in Transit: encrypt and authenticate communication between clients and brokers. Kafka feature: TLS/SSL.
Data in Use: authenticate clients and authorize topic-level operations. Kafka feature: SASL authentication plus ACLs.
Data at Rest: encrypt log segments and snapshots on disk. Kafka feature: volume encryption or a third-party disk-encryption solution.
Always rotate certificates and keys periodically to minimize risk in case of credential leakage.
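As a sketch of the data-in-transit layer, a broker's server.properties might enable a TLS listener along these lines. All hostnames, paths, and passwords here are placeholders, not values from this guide:

```properties
# server.properties: TLS listener for client traffic (placeholder values)
listeners=SSL://broker1.example.com:9093
ssl.keystore.location=/etc/kafka/ssl/broker1.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
ssl.truststore.location=/etc/kafka/ssl/broker1.truststore.jks
ssl.truststore.password=changeit
# Require clients to present certificates (mutual TLS)
ssl.client.auth=required
```

Setting ssl.client.auth=required gives mutual TLS, so brokers also verify client identity rather than only encrypting the connection.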
Kafka Security Features
Kafka provides several built-in mechanisms to meet these requirements:
TLS Encryption: encrypts data in transit between producers, brokers, and consumers. Reference: TLS Setup.
SASL Authentication: supports the SCRAM, GSSAPI (Kerberos), OAUTHBEARER, and PLAIN mechanisms. Reference: SASL Mechanisms.
ACL Authorization: fine-grained control over which principals can read or write specific topics and consumer groups. Reference: ACLs.
Disk Encryption: use OS-level volume encryption or integrate third-party solutions for encrypting log directories.
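To illustrate how the TLS and SASL features combine on the client side, a producer or consumer configuration using SCRAM-SHA-512 over TLS might look like the following sketch. The service name and credentials are placeholder assumptions for the banking scenario:

```properties
# client.properties: authenticate with SCRAM-SHA-512 over a TLS connection
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="payments-service" \
  password="changeit";
ssl.truststore.location=/etc/kafka/ssl/client.truststore.jks
ssl.truststore.password=changeit
```

The same properties file can be passed to producers, consumers, and admin tools, so one credential set per service is enough.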
Next Steps
Configure TLS on brokers and clients.
Enable SASL for authenticating producers and consumers.
Define ACLs to restrict topic access.
Implement disk encryption for log segments and snapshots.
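As an example of defining ACLs, Kafka's kafka-acls.sh admin tool can grant a principal access to a single topic. The principal names and broker address below are hypothetical placeholders; the topic names come from the banking scenario above:

```shell
# Allow the payments service to write card payment events
bin/kafka-acls.sh --bootstrap-server broker1.example.com:9093 \
  --command-config admin.properties \
  --add --allow-principal User:payments-service \
  --operation Write --topic card-payment-events

# Allow an audit service to read them within its consumer group
bin/kafka-acls.sh --bootstrap-server broker1.example.com:9093 \
  --command-config admin.properties \
  --add --allow-principal User:audit-service \
  --operation Read --topic card-payment-events --group audit-consumers
```

Because ACLs deny by default once an authorizer is enabled, each service needs explicit grants for every topic and consumer group it touches.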
By following these steps, you’ll achieve end-to-end protection of sensitive Kafka data.