Create Data Lake Backend on Azure Blob Storage

The following article exclusively pertains to a Graylog Enterprise feature or functionality. To learn more about obtaining an Enterprise license, please contact the Graylog Sales team.

Before you can start routing Graylog data to an internal Data Lake, you must first set up your backend storage. This article shows you how to set up a Data Lake storage backend in Graylog using Azure Blob Storage.

Prerequisites

Before proceeding, ensure that the following prerequisites are met:

Warning: We strongly recommend that you use object storage (Amazon S3, Google Cloud Storage, or Azure Blob Storage) as your backend storage for logs routed to a Data Lake. If you store logs on a local file store and reach your storage capacity, you must switch backends to gain more capacity, and changing your storage backend requires that you delete all of the data housed in your current backend storage solution!

  • You must be a Graylog administrator to set up and manage a Data Lake.

  • To use Azure Blob Storage, you must have an existing Azure Blob container and appropriate credentials.

Create an Azure Blob Storage Backend

To create an Azure Blob storage backend for your Data Lake:

  1. Navigate to Data Lake > Internal Lake Setup. If you have existing backends, select the Backend tab. Any existing storage backends are displayed here.

  2. Select Create Data Lake Backend.

  3. Select Azure Blob from the dropdown as the Backend Type.

  4. Enter configuration details for your Azure Blob backend:

    Title

    Enter a unique and descriptive name for the backend.

    Description

    Enter a description of the backend.

    Azure Blob Endpoint URL

    (optional)

    Enter the URL for the Azure Blob endpoint. This value is required only if you want to override the default endpoint.

    Azure Blob Container Name

    Enter the name of the Azure Blob container in which logs will be stored.

    Azure account

    Enter the name of your Azure storage account.

    Azure account key

    Enter the account key for your Azure storage account.

    Azure Blob Output Base Path

    Enter the base path where the archives should be stored within the Azure Blob container.

    You can use a single container for multiple purposes. For instance, you could use the same container for a Data Lake backend and a warm tier snapshot backend. If you do, however, use a different subfolder for each use; the base path you set here determines the subfolder structure for this backend.

    Warning: This value can only be set on backend creation and cannot be changed at a later date!

  5. Click Create to complete configuration of the storage backend.

  6. Click Activate to make this the active storage backend. You must activate the backend before it can be used for storage. You can have multiple backends defined, but only one can be active. See the warning below about data loss if you are switching from an existing storage backend.

If you need to update settings for the Data Lake, such as changing access credentials, click Edit. You are presented with the same options as on initial creation. As noted, you cannot change the output base path after your initial save, but you can update the other settings.
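As a point of reference, when the optional endpoint URL is left blank, Azure clients conventionally derive the default Blob endpoint from the storage account name, and the account name and key together form a standard Azure connection string. The sketch below illustrates that convention only; the account name and key are placeholders, and Graylog's internal handling of these settings may differ.

```python
# Sketch: the standard Azure conventions behind the backend settings above.
# "mygrayloglake" is a placeholder account name, not a real account.

def default_blob_endpoint(account_name: str) -> str:
    """Default Blob endpoint used when no override URL is configured."""
    return f"https://{account_name}.blob.core.windows.net"

def connection_string(account_name: str, account_key: str) -> str:
    """Standard Azure storage connection string format."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account_name};"
        f"AccountKey={account_key};"
        "EndpointSuffix=core.windows.net"
    )

print(default_blob_endpoint("mygrayloglake"))
# https://mygrayloglake.blob.core.windows.net
```

If you supply a value in the Azure Blob Endpoint URL field, it overrides this derived default, which is useful for sovereign clouds or Azure-compatible emulators.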

Change Your Storage Backend

Warning: When you change your storage backend, you are required to delete all the data stored in your current backend. At this time, we recommend that you do NOT change your storage backend unless absolutely necessary because this data will be lost!

To change your active storage backend:

  1. Create a new storage backend or select one you have previously created.

  2. Click Activate.

    Graylog prompts you to confirm that you want to change your storage backend. Remember that all data written to the previous storage backend must be deleted before you can switch, so do not proceed unless the change is absolutely necessary.

    Warning: Deleting Data Lake data requires you to first stop routing data to the Data Lake. Note that if the affected streams are routing only to the Data Lake, you risk losing new data until you complete the process and start routing again with the new storage backend.

  3. Click Confirm to proceed.

The storage backend has now been switched. As new logs arrive, they are routed to the newly activated Data Lake storage backend.

Delete Backend Data

Before you can switch to a new storage backend, you must delete any data in the old one. To delete this data:

  1. Navigate to the Overview tab of Data Lake > Internal Lake Setup.

  2. Disable the Data Lake for each stream that is routing data to this backend. Click Data Routing, then toggle the Data Lake to Disabled.

  3. Delete the data from each stream.

    1. Select More > Delete.

    2. Select the Full Delete check box.

    3. Click Delete.

  4. Verify that the message count for all streams reaches 0.
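Because a single container can host multiple backends under different base paths, any manual cleanup or verification you perform directly against the container should be scoped to this backend's base path so that other data (for example, warm tier snapshots) is left untouched. A minimal, hypothetical sketch of that prefix scoping, using made-up object keys:

```python
# Sketch: scoping object keys to one backend's base path within a shared
# container. All keys and paths here are hypothetical examples.

def keys_under_base_path(keys, base_path):
    """Return only the object keys that belong to the given base path."""
    prefix = base_path.rstrip("/") + "/"
    return [k for k in keys if k.startswith(prefix)]

keys = [
    "datalake/stream-a/0001.avro",
    "datalake/stream-b/0002.avro",
    "warm-tier/snapshot-0001",
]

# Only the Data Lake backend's objects are selected; the warm tier
# snapshot stored in the same container is not included.
print(keys_under_base_path(keys, "datalake"))
# ['datalake/stream-a/0001.avro', 'datalake/stream-b/0002.avro']
```

This is why the output base path is fixed at creation time: it is the boundary that separates this backend's data from anything else in the container.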

Further Reading

Explore the following additional resources and recommended readings to expand your knowledge on related topics: