In this lesson, we’ll build the backend dashboard for our warehouse—completing the event-driven flow using Apache Kafka and Flask. We’ll begin by reviewing how the toy shop frontend publishes order events, then implement a Kafka consumer in the warehouse UI.
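As a refresher, the frontend serializes each order as JSON before publishing it. Here is a minimal sketch of building such a payload; the field names (`order_id`, `product`, `quantity`, `ordered_at`) are illustrative assumptions, not the exact schema from the frontend lesson:

```python
import json
import uuid
from datetime import datetime, timezone

def build_order_event(product, quantity):
    """Build the JSON payload the frontend publishes for each order.
    (Field names are illustrative; match them to your actual producer.)"""
    event = {
        "order_id": str(uuid.uuid4()),
        "product": product,
        "quantity": quantity,
        "ordered_at": datetime.now(timezone.utc).isoformat(),
    }
    # Kafka messages are bytes, so encode the JSON string
    return json.dumps(event).encode("utf-8")
```

The producer side would then hand these bytes to Kafka, e.g. with confluent-kafka's `Producer.produce(topic, value=payload)` followed by a `flush()`.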
```python
import json
import logging
import uuid

from confluent_kafka import Consumer

logger = logging.getLogger(__name__)
KAFKA_TOPIC = "orders"  # adjust to match the topic the frontend publishes to

def get_kafka_messages():
    """Retrieve all available order events from Kafka."""
    messages = []
    consumer_config = {
        'bootstrap.servers': '3.68.92.91:9092',
        # Fresh group.id per call, so each read starts from the earliest offset
        'group.id': f'warehouse_reader_{uuid.uuid4()}',
        'auto.offset.reset': 'earliest',
        'enable.auto.commit': False
    }
    consumer = Consumer(consumer_config)
    logger.info("Created new Kafka consumer")
    try:
        consumer.subscribe([KAFKA_TOPIC])
        logger.info(f"Subscribed to topic: {KAFKA_TOPIC}")
        # Poll the broker up to 10 times for new messages
        for _ in range(10):
            msg = consumer.poll(1.0)
            if msg is None:
                continue
            if msg.error():
                logger.error(f"Kafka error: {msg.error()}")
                continue
            messages.append(json.loads(msg.value().decode('utf-8')))
    finally:
        consumer.close()
    return messages
```
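With the consumer in place, the warehouse UI can serve the events to the browser over HTTP. Here is a hedged sketch of what that Flask route might look like; the `/orders` path is an assumption, and `get_kafka_messages` is stubbed with sample data so the sketch runs standalone:

```python
from flask import Flask, jsonify

app = Flask(__name__)

def get_kafka_messages():
    # Stand-in for the Kafka consumer defined above; returns sample events.
    return [{"order_id": "1", "product": "teddy bear", "quantity": 2}]

@app.route("/orders")
def orders():
    # Serve the latest order events to the dashboard frontend as JSON.
    return jsonify(get_kafka_messages())
```

In the real app you would import the Kafka-backed `get_kafka_messages` instead of the stub.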
Below is a breakdown of the Kafka consumer configuration:
| Configuration Key | Description | Example |
| --- | --- | --- |
| `bootstrap.servers` | Address of your Kafka broker | `3.68.92.91:9092` |
| `group.id` | Unique consumer group identifier | `warehouse_reader_<UUID>` |
| `auto.offset.reset` | Where to start reading if no offset exists | `earliest` |
| `enable.auto.commit` | Disable automatic offset commits for safety | `False` |
Ensure `bootstrap.servers` matches your Kafka broker’s public IP. You can verify it in the AWS EC2 dashboard.
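If the consumer returns nothing, a quick way to rule out networking problems is to check whether the broker port is reachable at all. This stdlib-only sketch (no Kafka client required) simply attempts a TCP connection:

```python
import socket

def broker_reachable(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unreachable
        return False
```

For example, `broker_reachable('3.68.92.91', 9092)` should return `True` once the EC2 security group allows inbound traffic on port 9092.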
From here, you could extend the dashboard in a few directions:

- Persist orders to a database and map products to rack locations
- Add packing instructions or real-time inventory lookups
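As an illustration of the first extension, here is a sketch that persists order events to SQLite and tags each one with a rack location; the `RACK_LOCATIONS` mapping and the table schema are hypothetical stand-ins for real inventory data:

```python
import sqlite3

# Hypothetical product-to-rack mapping; real data would come from inventory.
RACK_LOCATIONS = {"teddy bear": "A3", "toy train": "B1"}

def persist_orders(orders, conn):
    """Store order events with their rack locations in SQLite."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS orders (
               order_id TEXT PRIMARY KEY,
               product TEXT,
               quantity INTEGER,
               rack TEXT
           )"""
    )
    for order in orders:
        conn.execute(
            "INSERT OR REPLACE INTO orders VALUES (?, ?, ?, ?)",
            (order["order_id"], order["product"], order["quantity"],
             RACK_LOCATIONS.get(order["product"], "unassigned")),
        )
    conn.commit()
```

Calling `persist_orders(get_kafka_messages(), conn)` after each poll would keep the warehouse database in sync with the event stream.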
Congratulations! You’ve built an end-to-end event-driven warehouse dashboard using Apache Kafka. Next, we’ll recap the full architecture and explore scaling strategies.