Loki ingestion of Azure PaaS and FaaS logs

I am the new Observability Engineer for a company with Azure-hosted SaaS solutions using mostly Azure PaaS and FaaS (App Services, Functions, Cosmos DB, Redis Cache, Azure SQL, API Management, App Gateway, etc.).

I would like to pull all of our PaaS and FaaS logs, along with application logs, into Loki, but cannot seem to find any references to the best way to do that.

Grafana’s integration with Log Analytics and Application Insights is obvious, but I am not sure whether that information is then available to Loki, or whether we need to set up ingestion to pull logs from, say, Blob Storage.

Where should I look for Azure details for Loki integration?


You may be venturing into uncharted territory here a little. We have done a little work on ingesting Amazon CloudWatch logs and are working on GCP logs, but I don’t think I’ve seen anyone capturing Azure logs yet.

Typically it’s done through a Lambda-type process (Amazon) or a messaging-bus type solution (Google Pub/Sub).

The tricky part in most cases is Loki’s ordering constraint, which requires all logs for a stream to be sent in order. We have been working around this by using a Promtail client in the middle which accepts logs from many Lambdas but re-stamps each log with a timestamp based on when it is received. This isn’t a great solution, however.
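The re-stamping workaround can be sketched roughly like this (a minimal illustration, not the actual Promtail code; the function name is made up, and the payload shape follows Loki's push API):

```python
import json
import time

def restamp_to_loki_payload(lines, labels):
    """Build a Loki push-API payload, re-stamping every line with the
    time it was received so the stream stays strictly ordered.
    `labels` is a dict of Loki stream labels, e.g. {"source": "lambda"}."""
    now_ns = time.time_ns()
    return {
        "streams": [{
            "stream": labels,
            # Bump the timestamp by 1 ns per line so ordering stays strict
            # even when many lines arrive in the same instant.
            "values": [[str(now_ns + i), line] for i, line in enumerate(lines)],
        }]
    }

payload = restamp_to_loki_payload(["first line", "second line"], {"source": "lambda"})
body = json.dumps(payload)  # POST this to http://<loki>/loki/api/v1/push
```

The trade-off is visible in the sketch: the original event timestamps are discarded in favor of the receive time, which keeps Loki happy but loses timing precision.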

We have started work on removing the ordering constraint from Loki, though it’s going to take a little work to do this.

This blog post talks about the Lambda and Promtail solution a little more.

This is the Lambda code.

These may be helpful if you start exploring!



Azure seems to be focusing more effort on AKS and Kubernetes than on serverless these days, which is likely due to core differences between the AWS Lambda and Azure Function Apps approaches to deployment containers (mentioned below), as well as Azure’s large enterprise customer base. As a result, I’ve been spending a fair amount of time both wishing we were in AWS and looking at AKS over Azure serverless solutions.

My first thought is that Kubernetes should have a similar issue, unless cluster or namespace labels can be used to group streams more easily. Although I’ve not looked at Loki on Kubernetes yet.

That said, any move from Azure Function Apps to AKS is both a large effort and a big shift, and our monitoring focus is necessarily on the present. Like AWS, what is lacking in Azure is connectivity between metrics, logs, and traces.

Cardinality of Azure Serverless components may be less of an issue for two reasons:

  1. Azure Serverless (Function Apps) containers are at the function-app level, rather than the function level. While AWS Lambda deploys each function separately to the infrastructure, Azure Function Apps deploy grouped functions in each container.
  2. Application Insights itself provides trace capabilities across multiple instances of each Function App; whether this capability solves the cardinality and ordering issues for Loki is not clear.

I will take some time to digest the information in the blog article. Thank you for the follow up.

@ehuggins I am sort of facing the same challenge. In my case, I am running Grafana/Loki in my AKS cluster, which makes it easy to collect logs from my cluster-deployed apps. But I would also like to bring some legacy stuff (and logs from Functions, gateways, Front Doors, etc.) into my in-cluster Loki/Promtail. I am thinking of trying to push my app logs to an Event Hub and triggering Functions from there to push them to Promtail. Let me know if your endeavors since February have gotten you anywhere.
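For what it’s worth, the Event Hub idea could be sketched roughly like this (the endpoint URL, function names, and labels are all assumptions for illustration, not code from anyone’s actual setup):

```python
# Hypothetical sketch of the body of an Event Hub-triggered Azure Function
# that forwards events straight to Loki's push API (skipping Promtail).
import json
import time
import urllib.request

LOKI_URL = "http://loki.example.internal:3100/loki/api/v1/push"  # assumed endpoint

def events_to_loki_payload(event_bodies, app_label):
    """Wrap raw Event Hub message bodies in a Loki push payload,
    stamping each with the current receive time."""
    ts = str(time.time_ns())
    return {
        "streams": [{
            "stream": {"job": "eventhub", "app": app_label},
            "values": [[ts, body] for body in event_bodies],
        }]
    }

def forward(event_bodies, app_label="legacy-app"):
    """POST a batch of event bodies to Loki."""
    payload = events_to_loki_payload(event_bodies, app_label)
    req = urllib.request.Request(
        LOKI_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # raises on non-2xx responses

# In the real Azure Function, main(events: List[func.EventHubEvent]) would
# call forward([e.get_body().decode() for e in events]).
```

Pushing to Promtail instead would look much the same, just with Promtail’s push endpoint as the target so it can re-stamp timestamps before Loki sees them.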

@ewelch if you have had any news on this as well, I would be glad to know.

If that’s not the case, I will try to leverage this solution and share the results/code.

@paulogtabarro I’ve not moved forward with Loki yet - my focus has been on leveraging Grafana to view product statistics based on traces from App Insights, which is going well for now. Your direction for Azure log data, using Log Analytics’ ability to push data to Event Hubs and then on to Promtail, makes sense and should be fairly straightforward. Loki integration with custom application logs in blob stores and Cosmos DB will be my first focus when it comes to Loki. Event Hubs or Event Grid plus Function Apps are the most likely method.

I have started this project now. The first source I am tackling is logs stored in Azure Cosmos DB. The plan is to use the Cosmos DB change feed to trigger an Azure Function which will forward relevant log lines to Azure Event Hubs. A second Azure Function will process the Event Hub data and forward it to the Grafana Agent, which will then push the data to the online Loki datastore.

The components are fairly simple. I will provide an update when it is working.

Were you successful, @ehuggins? Looking into doing the same but for Azure Data Factory.



We wrote a Function using the Azure Cosmos DB change feed trigger which parses incoming data and pushes the results to the Loki push API for use by Grafana.
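Not the actual code from this thread, but the parsing step might look something like this sketch (the document field names "level" and "message" are assumptions about the stored log documents; "_ts" is Cosmos DB’s built-in epoch-seconds timestamp):

```python
# Hypothetical sketch: turn Cosmos DB change-feed documents into Loki
# streams, grouped by log level so each level becomes its own stream.
import json

def docs_to_loki_streams(docs):
    """Convert a batch of change-feed documents (dicts) into a Loki
    push-API payload. Assumed fields: "level", "message", "_ts"."""
    streams = {}
    for doc in docs:
        level = doc.get("level", "info")
        # Cosmos DB's _ts is epoch seconds; Loki wants nanoseconds.
        ts_ns = str(int(doc.get("_ts", 0)) * 1_000_000_000)
        # Fall back to the whole document as the log line.
        line = doc.get("message", json.dumps(doc))
        streams.setdefault(level, []).append([ts_ns, line])
    return {
        "streams": [
            {"stream": {"job": "cosmosdb", "level": lvl}, "values": vals}
            for lvl, vals in streams.items()
        ]
    }
```

In the real Function, the Cosmos DB trigger binding would hand this a `DocumentList`, and the resulting payload would be POSTed to `/loki/api/v1/push`. Note that reusing `_ts` keeps the original timestamps but reintroduces the ordering concern discussed earlier in this thread.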


@ehuggins,

Much appreciated if you could share the setup details and the necessary code with us?