EFK Stack: Enterprise-Grade Logging and Monitoring

Instrumenting a Simple Python App for Logging

Building Kibana Dashboards to Visualize Our Application Part 2

Hello and welcome back! In the previous lesson, we successfully deployed our Login App and visualized its logs in Kibana. However, we noticed that the logs generated by our application were not very informative. Today, we will improve our logging mechanism to produce more structured and insightful logs.

Reviewing the Original Logging Implementation

Let's start by examining how logs are generated in the existing application. In the Python web application's repository, open the Dockerfile and note that it copies the entire working directory into the image. Then open app.py, the application's entry point.

In app.py, we import Python's built-in logging module and emit plain-text logs. Below is the original logging configuration:

import logging
import re
from flask import Flask, render_template, request, redirect, url_for, flash

app = Flask(__name__)
app.secret_key = 'your_secret_key'  # Replace with a real secret key in production

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Default credentials
USERNAME = 'admin'
PASSWORD = 'password'

def is_weak_password(password):
    if len(password) < 8:
        return True
    if not re.search("[a-zA-Z]", password) or not re.search("[0-9]", password):
        return True
    return False

@app.before_request
def log_request_info():
    logger.info("Request method: %s", request.method)
    logger.info("User Agent: %s", request.user_agent)
    logger.info("Client IP: %s", request.remote_addr)

@app.after_request
def log_response_info(response):
    logger.info("Response status: %s", response.status)
    return response

@app.route('/', methods=['GET', 'POST'])  # GET is needed so the login page can be rendered in the browser
def login():
    return render_template('login.html')

In this implementation, logger.info is used to send logs, which then appear in the Kubernetes pod logs.
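To see why these logs are hard to analyze, here is a small, self-contained sketch of the output format that `logging.basicConfig(level=logging.INFO)` produces by default (a `"LEVEL:logger_name:message"` line). The `demo` logger name and the `io.StringIO` capture are only for illustration; in the app the output goes to the pod's stderr:

```python
import io
import logging

# Capture log output in a string buffer instead of stderr so we can inspect it.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
# logging.BASIC_FORMAT is the default format used by basicConfig:
# "%(levelname)s:%(name)s:%(message)s"
handler.setFormatter(logging.Formatter(logging.BASIC_FORMAT))

logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("Request method: %s", "GET")
print(stream.getvalue().strip())  # INFO:demo:Request method: GET
```

Each log line is a single unstructured string, so Elasticsearch has nothing to index beyond the raw message text.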

Improving the Logging Structure with a Custom JSON Formatter

To deliver more structured and insightful logs, we updated our application in a file named update_app.py. In this updated version, we continue to use the logging module and introduce a custom JSON formatter. This formatter creates log entries in JSON format with detailed key-value pairs, making it easier for tools like Elasticsearch and Kibana to parse and analyze the logs.

Below is the updated code snippet from update_app.py:

import logging
import re
import json
from flask import Flask, render_template, request, redirect, url_for, flash, has_request_context

app = Flask(__name__)
app.secret_key = 'your_secret_key'  # Replace with a real secret key in production

# Define a custom JSON formatter
class JSONFormatter(logging.Formatter):
    def format(self, record):
        log_record = {
            "level": record.levelname,
            "message": record.getMessage(),  # message with any % args applied
            "time": self.formatTime(record, self.datefmt),
            "logger": record.name,
            "pathname": record.pathname,
            "lineno": record.lineno,
            "funcname": record.funcName,
        }

        # Only attach request details when logging inside a request context;
        # otherwise accessing `request` would raise a RuntimeError.
        if has_request_context():
            log_record["request"] = {
                "method": request.method,
                "url": request.url,
                "remote_addr": request.remote_addr,
                "user_agent": str(request.user_agent)
            }

        return json.dumps(log_record)

# Configure logging
logger = logging.getLogger(__name__)
handler = logging.StreamHandler()
handler.setFormatter(JSONFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Default credentials
USERNAME = 'admin'
PASSWORD = 'password'

def is_weak_password(password):
    if len(password) < 8:
        return True
    if not re.search("[a-zA-Z]", password) or not re.search("[0-9]", password):
        return True
    return False

Note

Structured logging with JSON format streamlines the process of querying logs in Elasticsearch and building detailed dashboards in Kibana.
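To illustrate what the formatter produces, here is a minimal, runnable sketch of a JSON formatter like the one above, with the Flask request fields omitted so it runs outside a request context. The `json-demo` logger name is only for illustration:

```python
import io
import json
import logging

# A trimmed-down JSON formatter: same idea as the app's JSONFormatter,
# minus the Flask request fields.
class JSONFormatter(logging.Formatter):
    def format(self, record):
        log_record = {
            "level": record.levelname,
            "message": record.getMessage(),
            "logger": record.name,
            "funcname": record.funcName,
        }
        return json.dumps(log_record)

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(JSONFormatter())
logger = logging.getLogger("json-demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("Login attempt for user %s", "admin")

# Each log line is now a parseable JSON document.
entry = json.loads(stream.getvalue())
print(entry["message"])  # Login attempt for user admin
```

Because every line is valid JSON, Fluentd can forward the fields to Elasticsearch individually instead of shipping one opaque string.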

Benefits of Structured Logging

  • Enhanced Debugging: With more detailed log entries, you can pinpoint issues faster.
  • Improved Queryability: JSON-formatted logs allow for precise queries and filtering in log management tools.
  • Dashboard Readiness: The structured format integrates seamlessly with tools like Kibana, making dashboard creation more efficient.
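For instance, once the JSON fields are indexed, a KQL filter in Kibana's Discover view could narrow the logs down to specific traffic (the field names below assume the JSON keys from our formatter are mapped directly):

```
level : "INFO" and request.method : "POST"
```

A comparable filter against the original plain-text logs would require fragile free-text matching on the message string.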

Next Steps: Deploying the Updated Application

With the improved logging mechanism in place, the next step is to redeploy your application to the Kubernetes cluster and observe the new logs. This updated structure should provide better insights and more actionable data compared to the previous setup.

Deployment Tip

Be sure to update your Kubernetes manifests accordingly and tail the logs in real time after redeployment to confirm that the new format is in effect.
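A typical redeploy-and-verify sequence might look like the following. The Deployment name `login-app` and the manifest path are assumptions; substitute the names from your own manifests:

```shell
# Apply the updated manifest (or restart the existing Deployment)
kubectl apply -f k8s/deployment.yaml
# kubectl rollout restart deployment/login-app   # alternative if only the image changed

# Wait until the new pods are ready
kubectl rollout status deployment/login-app

# Tail the logs and confirm each line is now a JSON document
kubectl logs -f deployment/login-app
```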

That concludes this lesson. In the next session, we will dive deeper into analyzing these structured logs and building effective Kibana dashboards. Thank you for joining, and see you in the next lesson!
