Welcome! In this article, we will walk through the key components of Fluent Bit—input plugins, filter plugins, and output plugins. Fluent Bit efficiently collects, processes, and publishes log data, making it an essential tool for modern log management and analysis.
By understanding each component, you can configure Fluent Bit to best meet your system requirements. Let’s dive into the details.
Input plugins are the starting point for data collection in Fluent Bit. They interface with various data sources, ensuring that logs are correctly captured for further processing. Here are some common input plugins:
Tail Plugin
The Tail plugin reads data from log files as new entries are written. It is particularly useful for monitoring application logs, such as those from Nginx.
[INPUT]
    Name   tail
    Path   /var/log/nginx/access.log
    Tag    nginx.access
    Parser nginx
Systemd Plugin
Use the Systemd plugin to collect logs from the system journal. This is ideal for systems managed by Systemd, capturing both application and service logs.
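As a rough sketch, a Systemd input that captures entries for a single unit might look like the following (the unit name below is only an illustrative placeholder):

[INPUT]
    Name           systemd
    Tag            host.systemd
    Systemd_Filter _SYSTEMD_UNIT=nginx.service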
TCP Plugin
The TCP plugin listens for logs sent over TCP connections, making it versatile for capturing network-based logging data.
[INPUT]
    Name   tcp
    Listen 0.0.0.0
    Port   5170
    Tag    tcp.input
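Assuming Fluent Bit is running with the listener above, you can send a quick test record from the same host with a tool such as netcat, for example: echo '{"message": "hello"}' | nc 127.0.0.1 5170. The TCP input treats incoming data as JSON by default, and the record enters the pipeline under the tcp.input tag.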
In most deployments you will work primarily with the Tail or Systemd plugins, as they cover the majority of log collection scenarios.
Once the logs are collected, they often need processing before reaching their final destination. Fluent Bit’s filter plugins facilitate this by transforming, enriching, or screening the raw log data.
Grep Plugin
The Grep plugin filters log records by using regular expressions. For instance, to capture only logs that contain error messages:
[FILTER]
    Name  grep
    Match nginx.access
    Regex message error
Modify Plugin
With the Modify plugin, you can add, remove, or alter fields in your log entries. This is useful for including extra context, such as the service name.
[FILTER]
    Name  modify
    Match *
    Add   service nginx
Parser Plugin
The Parser plugin takes raw log text stored in a single field (for example, a JSON-encoded string) and expands it into structured fields for easier analysis.
[FILTER]
    Name     parser
    Match    nginx.access
    Key_Name message
    Parser   json
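Note that the json parser referenced above is not defined in the main configuration file; parsers live in a separate parsers file that Fluent Bit loads through the Parsers_File setting in the [SERVICE] section. Fluent Bit ships with a default parsers file that already includes a json parser; as a rough sketch, such a definition looks like this (assuming a file named parsers.conf):

[PARSER]
    Name   json
    Format json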
Filter plugins are essential for cleaning and enriching log data before it reaches the storage or analysis stage.
After processing, log data is transmitted to designated destinations through output plugins. These plugins ensure that logs are stored in systems where they can be queried and analyzed.
Elasticsearch (ES) Plugin
The Elasticsearch output plugin sends logs directly to an Elasticsearch instance, offering a centralized solution for log management.
[OUTPUT]
    Name  es
    Match *
    Host  127.0.0.1
    Port  9200
    Index fluentbit
    Type  _doc
HTTP Plugin
For more flexibility, the HTTP plugin forwards log data to any HTTP endpoint. This is useful if you have custom endpoints or need to integrate with other systems.
[OUTPUT]
    Name   http
    Match  *
    Host   example.com
    Port   80
    URI    /data
    Format json
Consider a scenario where you need to capture Nginx logs, keep only the entries that contain error messages, and send the results to Elasticsearch. The configuration below demonstrates how to set up this logging pipeline:
# Input section: Collect Nginx logs from the access log file
[INPUT]
    Name   tail
    Path   /var/log/nginx/access.log
    Tag    nginx.access
    Parser nginx

# Filter section: Filter only the logs that include error messages
[FILTER]
    Name  grep
    Match nginx.access
    Regex message error

# Output section: Send the filtered logs to Elasticsearch for analysis
[OUTPUT]
    Name  es
    Match *
    Host  127.0.0.1
    Port  9200
    Index fluentbit
    Type  _doc
This configuration instructs Fluent Bit to continuously collect logs with the Tail plugin, process them with the Grep plugin so that only records containing error messages are kept, and finally send the output to Elasticsearch using the ES plugin.
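To try the pipeline, save the configuration to a file (for example, fluent-bit.conf) and start Fluent Bit with fluent-bit -c fluent-bit.conf; the filename is only an example, and packaged installations may expect the configuration in a different location.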
Fluent Bit’s powerful architecture based on input, filter, and output plugins provides a scalable and flexible solution for log collection and processing. Mastering these components will enable you to build efficient logging pipelines tailored to your environment. For more details on log management solutions and best practices, explore the resources below: