I tried reading Azure diagnostic logs from storage accounts via Grafana Loki and Promtail, and it works with old logs that are already archived. As soon as I try to scrape diagnostic logs in real time while they are being written, only a couple of them are read and the rest are ignored.
After searching, I understood it could be because object stores are immutable, so a new blob (or JSON file) gets written with every change. This means the file I am scraping is being replaced during the process, rather than appended to, as would be the case on a normal file system.
Is there a reasonable way to scrape log files while they are being written to Azure storage accounts?
1 Answer
For real-time scraping of logs from Azure Blob Storage into Grafana Loki, you can use **Azure Event Grid** to trigger an Azure Function that detects when new blobs are created in the container and forwards their contents to Grafana Loki via the Loki Push API.
Set up an **Event Grid subscription** to listen for blob-creation events on the storage account. This notifies the function whenever new blobs (logs) are created in the container.
Navigate to Azure Storage account > Events > Create Event Subscription > select **Blob Created** as the event type > set the event destination to an Azure Function (existing function).
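If you prefer to script this step instead of clicking through the portal, here is a minimal sketch using the `azure-mgmt-eventgrid` and `azure-identity` Python packages. The subscription name `loki-log-forwarder` and all resource IDs are placeholders, not values from the original answer:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventgrid import EventGridManagementClient
from azure.mgmt.eventgrid.models import (
    AzureFunctionEventSubscriptionDestination,
    EventSubscription,
    EventSubscriptionFilter,
)

# Placeholder IDs - replace with your own subscription, storage account, and function.
SUBSCRIPTION_ID = "<azure-subscription-id>"
STORAGE_ACCOUNT_ID = (
    "/subscriptions/<azure-subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
)
FUNCTION_ID = (
    "/subscriptions/<azure-subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Web/sites/<function-app>/functions/<function>"
)

client = EventGridManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Subscribe the function to Blob Created events on the storage account.
client.event_subscriptions.begin_create_or_update(
    scope=STORAGE_ACCOUNT_ID,
    event_subscription_name="loki-log-forwarder",
    event_subscription_info=EventSubscription(
        destination=AzureFunctionEventSubscriptionDestination(resource_id=FUNCTION_ID),
        filter=EventSubscriptionFilter(
            included_event_types=["Microsoft.Storage.BlobCreated"],
        ),
    ),
).result()
```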
Set up an Azure Function with an Event Grid trigger so it runs whenever a new log blob is created.
Forward the blob content to **Grafana Loki** via its HTTP API using the Loki Push API. For more details, refer to the Stack Overflow answer by Michał Jaroń.
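A minimal sketch of such a function, assuming the Python programming model for Azure Functions with the `azure-storage-blob`, `azure-identity`, and `requests` packages. The `LOKI_PUSH_URL` setting and the `azure-diagnostics` label are illustrative names, not part of the original answer:

```python
import os
import time

import azure.functions as func
import requests
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

# Illustrative app setting - configure it on the Function App,
# e.g. http://<loki-host>:3100/loki/api/v1/push
LOKI_PUSH_URL = os.environ["LOKI_PUSH_URL"]


def main(event: func.EventGridEvent) -> None:
    # A Blob Created event carries the blob's URL in its data payload.
    blob_url = event.get_json()["url"]

    # Read the new blob; assumes the function's managed identity
    # has read access to the container.
    blob = BlobClient.from_blob_url(blob_url, credential=DefaultAzureCredential())
    content = blob.download_blob().readall().decode("utf-8")

    # Azure diagnostic logs are newline-delimited JSON; send each
    # line as one Loki entry, timestamped in nanoseconds as the
    # Loki Push API expects.
    ts = str(time.time_ns())
    values = [[ts, line] for line in content.splitlines() if line.strip()]
    payload = {
        "streams": [
            {"stream": {"job": "azure-diagnostics"}, "values": values}
        ]
    }
    resp = requests.post(LOKI_PUSH_URL, json=payload, timeout=10)
    resp.raise_for_status()
```

Note that Azure writes diagnostic logs by appending to the same hourly blob, so the same blob URL can fire multiple Blob Created events; a production version would need to track how much of each blob has already been shipped to avoid pushing duplicate lines.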
Once all settings are configured, any new log blob is pushed to Grafana Loki almost immediately for near-real-time visualization.
Alternatively, you can use **Azure Monitor Logs** to stream logs directly into **Log Analytics** and query them using Grafana.
- Use the Grafana Azure Monitor plugin to query and visualize the logs from **Log Analytics** in near real time. Follow the Microsoft documentation for more details.
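To sanity-check that the logs are arriving in Log Analytics before wiring up Grafana, a minimal sketch with the `azure-monitor-query` and `azure-identity` Python packages might look like this. The workspace ID is a placeholder, and the `AzureDiagnostics` table name depends on which resource's diagnostic settings you stream (resource-specific tables are also common):

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Pull the ten most recent diagnostic log rows from the last hour.
result = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query="AzureDiagnostics | sort by TimeGenerated desc | take 10",
    timespan=timedelta(hours=1),
)
for table in result.tables:
    for row in table.rows:
        print(row)
```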
Reference: Azure Monitor data source