
I tried reading Azure diagnostic logs from storage accounts via Grafana Loki and Promtail, and it works with old logs that are already archived. But as soon as I try to scrape diagnostic logs as they are being written, only a couple of them get read and the rest are ignored.

After searching, I understood this could be because object stores are immutable, so new blobs (or JSON files) get written with every change. This means the file I am scraping is being replaced during the process, rather than appended to as would be the case on a normal file system.

Is there a reasonable way to scrape log files while they are being written to Azure storage accounts?


asked Nov 21, 2024 at 12:51 by kloudkid
  • For real-time scraping of logs from Azure Blob Storage into Grafana Loki with Promtail, you can use Azure Event Grid to trigger an Azure Function that reads new blobs and forwards them to Grafana Loki via the Loki push API. Alternatively, you can use Azure Monitor Logs to stream logs directly into Log Analytics and query them using Grafana. – Venkat V, Nov 27, 2024 at 7:57

1 Answer


For real-time scraping of logs from Azure Blob Storage into Grafana Loki, you can use **Azure Event Grid** to trigger an Azure Function that detects whenever new blobs are created in the container and forwards their contents to Grafana Loki via the Loki Push API.

Set up an **Event Grid subscription** to listen for blob-creation events in the storage account. This will notify you whenever new blobs (logs) are created in the container.

Navigate to **Azure Storage account > Events > + Event Subscription** > select **Blob Created** as the event type > set the event destination to an Azure Function (an existing function).

  1. Set up an Azure Function to handle blob events by creating a function that triggers on an Event Grid event whenever a new log blob is created.

  2. Forward the blob content to **Grafana Loki** via its HTTP API using the Loki Push API; a minimal sketch of such a function follows this list. For more details, refer to the Stack Overflow answer by Michał Jaroń.
    Once all settings are configured, any new log file is immediately pushed to Grafana Loki for real-time visualization.
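Here is a minimal sketch of such a function, assuming the Azure Functions Python v1 programming model with an `eventGridTrigger` binding in `function.json`, plus the `azure-storage-blob` and `requests` packages. The environment variable names, the Loki URL, and the `job` label are illustrative placeholders, not anything prescribed by Azure or Loki:

```python
import os
import time

import azure.functions as func
import requests
from azure.storage.blob import BlobClient

# Hypothetical app settings: configure these in the Function App settings.
LOKI_PUSH_URL = os.environ.get("LOKI_PUSH_URL", "http://loki:3100/loki/api/v1/push")
STORAGE_ACCOUNT_KEY = os.environ["STORAGE_ACCOUNT_KEY"]


def main(event: func.EventGridEvent) -> None:
    # The "Blob Created" event payload carries the full blob URL in data.url.
    blob_url = event.get_json()["url"]

    # Download the newly created blob. Blobs are immutable once written, so
    # reading the whole object here is safe (unlike tailing a growing file).
    blob = BlobClient.from_blob_url(blob_url, credential=STORAGE_ACCOUNT_KEY)
    content = blob.download_blob().readall().decode("utf-8")

    # Build a Loki push payload: one stream, one entry per log line.
    # Timestamps are in nanoseconds; bump each line by 1 ns to keep entries
    # within the stream strictly ordered.
    base_ts = time.time_ns()
    values = [
        [str(base_ts + i), line]
        for i, line in enumerate(content.splitlines())
        if line.strip()
    ]
    payload = {
        "streams": [
            {"stream": {"job": "azure-diagnostics"}, "values": values}
        ]
    }

    resp = requests.post(LOKI_PUSH_URL, json=payload, timeout=10)
    resp.raise_for_status()
```

Azure diagnostic logs are written as JSON lines, so each blob line maps cleanly to one Loki entry; for production use you would add retries and batching around the push call.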

Alternatively, you can use **Azure Monitor Logs** to stream logs directly into **Log Analytics** and query them using Grafana.

  1. Use the Grafana **Azure Monitor** plugin to query and visualize the logs from **Log Analytics** in near real-time; a quick way to sanity-check that the logs are arriving is sketched below. Follow the MS Doc for more details.
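Before wiring up the Grafana plugin, you can confirm that logs are actually landing in the Log Analytics workspace. A small sketch using the `azure-monitor-query` and `azure-identity` packages; the workspace ID is a placeholder, and `StorageBlobLogs` is the standard resource-specific table for storage diagnostics, so adjust it to your schema:

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Authenticate with whatever DefaultAzureCredential resolves
# (Azure CLI login, managed identity, environment variables, ...).
client = LogsQueryClient(DefaultAzureCredential())

# Placeholder workspace ID; StorageBlobLogs is populated when the storage
# account's diagnostic settings send logs to this workspace.
response = client.query_workspace(
    workspace_id="<your-workspace-id>",
    query="StorageBlobLogs | where TimeGenerated > ago(15m) | take 20",
    timespan=timedelta(minutes=15),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```

If this returns recent rows, the same query will work from the Grafana Azure Monitor data source.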

Reference: Azure Monitor data source
