GELF Kafka Input
The GELF Kafka input collects logs from Kafka topics with the help of Filebeat. Once logs are generated by the system and pushed to a Kafka topic, they are automatically ingested by this input.
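Messages on the topic are expected to be GELF-formatted. A minimal sketch of such a payload follows; the field names come from the GELF 1.1 specification, while the host, message, and the additional "_user_id" field are illustrative values:

```python
import json

# A minimal GELF 1.1 payload of the kind this input reads from the Kafka topic.
# Field names follow the GELF spec; the values here are illustrative.
message = {
    "version": "1.1",
    "host": "web-01.example.org",        # source host (illustrative)
    "short_message": "User login succeeded",
    "level": 6,                          # syslog severity: informational
    "_user_id": 42,                      # additional fields carry a leading underscore
}

payload = json.dumps(message)
print(payload)
```

Any producer that writes JSON strings of this shape to the topic (Filebeat, a custom service, etc.) can feed this input.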
Prerequisites
- Install Beats, Kafka, and ZooKeeper.
- Provide full access permissions to all Kafka and Filebeat folders.
- Configure the filebeat.yml file to ship logs to your Kafka topic. Hint: remember to replace localhost with your unique IP address.
- Configure the advertised.listeners=PLAINTEXT://localhost:9092 setting in the Kafka server.properties file.
- Create a Kafka topic: go to the bin folder of the Kafka directory and execute the following command:
  ./kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic <Topic name>
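The filebeat.yml configuration referenced above can be sketched as follows. This is a minimal example, assuming Filebeat's Kafka output, an example log path, and a hypothetical topic name "graylog"; replace localhost with your Kafka broker's IP address:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/*.log          # example path; point this at your log files

output.kafka:
  hosts: ["localhost:9092"]     # replace localhost with your Kafka broker's IP
  topic: "graylog"              # example topic name; use the topic you created
```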
Create GELF Kafka Input
To launch a new GELF Kafka input:
1. Navigate to System > Inputs.
2. Select GELF Kafka from the input options and click the Launch new input button.
3. Enter your configuration parameters in the pop-up configuration form.
Configuration Parameters
- Global: Select this check box to enable the input on all Graylog nodes, or keep it unchecked to enable the input on a specific node.
- Title: Assign a title to the input. Example: "GELF Kafka Input for XYZ Source".
- Legacy mode: Enable legacy mode if you are running an older version of Graylog (prior to 3.3) or need to maintain compatibility with older setups that use ZooKeeper.
- Bootstrap Servers (optional): Enter the IP address and port on which the Kafka server is running.
- ZooKeeper address (legacy mode only) (optional): Enter the IP address and port on which the ZooKeeper server is running.
- Topic filter regex: Enter the topic name filter that is configured in the filebeat.yml file.
- Fetch minimum bytes: Enter the minimum byte size a message batch should reach before fetching.
- Fetch maximum wait time (ms): Enter the maximum time (in milliseconds) to wait before fetching.
- Processor threads: Enter the number of processor threads to spawn, based on the number of partitions available for the topic.
- Allow throttling this input: If enabled, no new messages are read from this input until Graylog catches up with its message load. This parameter is typically useful for inputs reading from files or message queue systems like AMQP or Kafka. If you regularly poll an external system, e.g. via HTTP, you should leave this option disabled.
- Auto offset reset (optional): Choose the appropriate option from the drop-down menu to determine what happens when there is no initial offset in Kafka or an offset is out of range.
- Consumer group id (optional): Enter the name of the consumer group the Kafka input belongs to.
- Override source (optional): By default, the source is the hostname derived from the received packet. Set this field only if you want to override it with a custom string.
- Encoding (optional): The default encoding is UTF-8. Set this to a standard charset name if you want to override the default. All messages must support the encoding configured for the input; for example, UTF-8 encoded messages should not be sent to an input configured for UTF-16.
- Decompressed size limit: The maximum size of the message after decompression.
- Custom Kafka properties (optional): Provide additional Kafka properties, one per line.
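Entries in the custom Kafka properties field take the form of standard Kafka consumer settings, one key=value pair per line. For example (illustrative values, assuming you want larger fetches and a longer session timeout):

```properties
fetch.max.bytes=52428800
session.timeout.ms=30000
```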