Makes sense, I will keep that in mind.
Quick question while I have you.
I'm having some trouble understanding the concept of "log lines", particularly when setting up Grafana Alert Manager while keeping cardinality low.
My case is a bit different from most: I'm parsing JSON documents that can be extremely large (over 64 KB in most cases), so I'll be selecting specific fields from each JSON log to push to Loki.
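To make that concrete, this is roughly the shape I have in mind for the field selection; the field names and labels are placeholders, not our real schema. The idea is to keep only a couple of low-cardinality labels and put everything else inside the log line itself:

```python
import json
from datetime import datetime, timezone

def to_log_line(doc: dict) -> tuple[dict, str]:
    """Reduce one large JSON document to a (labels, log line) pair.

    Only a couple of low-cardinality values become labels; the rest of
    the selected fields stay inside the log line so they can be pulled
    out at query time with LogQL's json parser.
    """
    labels = {
        "app": "orders-api",                # static placeholder label
        "level": doc.get("level", "info"),  # small, bounded set of values
    }
    line = json.dumps({
        "ts": doc.get("timestamp", datetime.now(timezone.utc).isoformat()),
        "operation": doc.get("operation"),
        "message": doc.get("message"),
        "durationMs": doc.get("durationMs"),
    })
    return labels, line
```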
If you have any good worked-example references that cover logging of any type (flat files would be fine) end to end, from source to Grafana Agent through to querying the data, that would be terrific.
There are not many Azure-based Grafana examples out in the wild, but I believe I have a good pattern for our CosmosDB-stored application logs*:
- Azure Function triggered by the Azure Cosmos DB change feed processor, parsing and pushing selected "log lines" to Azure Event Hubs
- Azure Function triggered by Event Hubs to push the log lines on to the Grafana Agent (a rough sketch of this step follows the list)
- Grafana Agent to Loki Online
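Here is a minimal sketch of what I expect the Event Hub-triggered side to do, assuming I can POST the batched lines to Loki's push API (or to whatever push-compatible endpoint the Grafana Agent ends up exposing). The URL and labels are placeholders:

```python
import json
import time
import urllib.request

LOKI_PUSH_URL = "http://localhost:3100/loki/api/v1/push"  # placeholder endpoint

def push_log_lines(labels: dict, lines: list[str]) -> None:
    """Push a batch of log lines that share one label set.

    Each line becomes its own entry in the stream's "values" array,
    paired with a nanosecond timestamp.
    """
    now_ns = str(time.time_ns())
    payload = {
        "streams": [
            {"stream": labels, "values": [[now_ns, line] for line in lines]}
        ]
    }
    req = urllib.request.Request(
        LOKI_PUSH_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()  # Loki returns 204 No Content on success
```

In the real Function the timestamp would come from each event rather than time.time_ns(); the payload shape is the part I mainly wanted to sanity-check.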
What I'm not quite clear about is what can/should constitute a "log line".
I've successfully tested pushing some older flat-file logs to Loki, but I'm not sure where a "log line" stops and full file content starts.
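For reference, my flat-file test treated every physical line of the file as its own entry rather than pushing the whole file as one entry, roughly like this (the path and labels are made up); please correct me if that granularity is wrong:

```python
import json
import time

# Placeholder path and labels; each non-empty line of the file becomes
# one timestamped entry, i.e. one "log line".
with open("app-2023-01-15.log", encoding="utf-8") as f:
    entries = [
        [str(time.time_ns()), line.rstrip("\n")] for line in f if line.strip()
    ]

payload = {
    "streams": [
        {"stream": {"app": "legacy-app", "source": "flatfile"}, "values": entries}
    ]
}
# This payload is then POSTed to /loki/api/v1/push as in the sketch above.
```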
*While I could push the Functions' output straight to the Grafana Agent, Event Hubs is more scalable:
- Event Hubs is designed for high-volume log input and output
- Event Hubs delivers events in time-created order, which is (or at least was, since this apparently will soon no longer be required) a requirement for Loki
- Azure Log Analytics and Application Insights can use Event Hubs as a sink target, making the Event Hub-triggered Azure Function a repeatable pattern for pushing logs and metrics from almost every Azure resource to Prometheus and Loki
I expect to be configuring the Grafana Agent by the end of the week.
Any detailed configuration examples you are aware of and can share would be appreciated.
Thank you again,