All the audit logs from a running cluster are stored in “insights-logs-*” containers in an Azure storage account. This was already configured under “Diagnostic settings” on the Azure Kubernetes Service resource.
I would like to introduce Loki for cost-effective logging and have managed to configure storage in the Helm loki-distributed setup so that logs are stored in a storage account back end, after granting the node pool the “Storage Blob Data Contributor” role.
How can I configure Promtail in loki-distributed so that it can process the old audit logs stored by Azure diagnostic settings? Unlike other logs, these are not generated by a container; rather, they have been archived for several months.
I checked whether anyone else had a similar problem reading log files from Azure storage accounts, but so far I have only found answers about storing logs in Azure.
I don’t think you can configure Promtail to pull directly from Azure Storage. It would probably be easier to either script it, or have Promtail running on an instance and feed the files to it manually.
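For the scripted route, one rough sketch (untested against your setup; the container name, download path, and job label are all assumptions) is to pull the archived blobs down with the az CLI and point a standalone Promtail at the download directory:

```yaml
# Download the archived audit logs locally first, e.g.:
#   az storage blob download-batch \
#     --account-name <storage-account> \
#     --source insights-logs-kube-audit \
#     --destination ./archived-audit-logs
#
# Then tail them with a one-off Promtail scrape config:
scrape_configs:
  - job_name: archived-azure-audit
    static_configs:
      - targets: [localhost]
        labels:
          job: azure-audit-archive            # label name is an assumption
          __path__: /archived-audit-logs/**/*.json
```

Azure diagnostic settings write one JSON file per hour under a deeply nested `resourceId=.../y=.../m=...` path, which is why the recursive glob is used here.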
One thing to note: when sending old logs into Loki, make sure you start from the oldest, because Loki (at least older versions, by default) rejects out-of-order entries within a stream.
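Ordering oldest-first can be sketched like this; the `time` field name matches what Azure diagnostic log records typically carry, but that schema detail is an assumption worth verifying against your own files:

```python
import json
from datetime import datetime

def sort_oldest_first(lines):
    """Parse JSON log lines and return the records ordered by their 'time'
    field, oldest first, so Loki receives each stream's entries in order."""
    records = [json.loads(line) for line in lines if line.strip()]
    # Azure diagnostic records carry an ISO-8601 'time' field (assumed here);
    # normalize the trailing 'Z' so fromisoformat() accepts it on older Pythons.
    records.sort(key=lambda r: datetime.fromisoformat(r["time"].replace("Z", "+00:00")))
    return records

# Example with two out-of-order entries:
lines = [
    '{"time": "2023-05-02T10:00:00Z", "operationName": "get"}',
    '{"time": "2023-05-01T09:00:00Z", "operationName": "list"}',
]
ordered = sort_oldest_first(lines)
```

A wrapper script could apply this per downloaded file (the files themselves sorted by their hourly path) before forwarding to Promtail or the Loki push API.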
Thanks a lot for the answer. Yes, that is what I think too. Using a script or other tools, I can feed the files from the storage account to Promtail in the right order, and it will then forward them to Loki.
It seems I will have to figure out a way to do this periodically, because managed Azure clusters only seem to support a few patterns: storing logs in storage accounts or streaming them to an event hub. The third option, Azure Log Analytics Workspace, is quite costly.
Following your advice, I was also able to collect the logs from Azure storage accounts using persistent volumes mounted into another Promtail instance’s /var/log directory.
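For reference, the persistent-volume approach can look roughly like this in the Promtail Helm chart values (the volume and claim names are assumptions specific to my setup, and the PVC is assumed to be bound to the storage-account share):

```yaml
# Extra volume for the Promtail pod, mounted under the /var/log
# tree that Promtail already tails by default.
extraVolumes:
  - name: archived-audit-logs
    persistentVolumeClaim:
      claimName: audit-logs-pvc        # assumed PVC name
extraVolumeMounts:
  - name: archived-audit-logs
    mountPath: /var/log/archived-audit
```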