How to determine disk usage per log stream in Loki

I am using Grafana Loki to store logs from multiple microservices. Recently Loki has accumulated a huge amount of log data (approximately 40 GB) in a short period of time, and I would like to find out which application is producing so much data. What is the best way to calculate the amount of storage used per log stream?

I am using three labels: service, stack, and replica.

First I checked the exposed metrics, but there is no information about per-stream storage. Then I tried logcli, but I found no option for this use case. Finally I tried to analyze the chunk directory, which might lead to some results.
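The closest I got was a LogQL metric query with `bytes_over_time`, which reports uncompressed log volume per label value rather than compressed on-disk size, so it only approximates what I want. Here is a minimal sketch of that approach against Loki's query HTTP API, assuming Loki listens on `localhost:3100` with the default tenant; the label name `service` matches my setup, everything else is illustrative:

```python
import json
import urllib.parse
import urllib.request


def build_volume_query(label: str, selector: str, lookback: str) -> str:
    """LogQL instant query: uncompressed bytes ingested per value of `label`."""
    return f"sum by ({label}) (bytes_over_time({selector}[{lookback}]))"


def query_loki(base_url: str, logql: str) -> dict:
    """Run an instant query against Loki's /loki/api/v1/query endpoint."""
    params = urllib.parse.urlencode({"query": logql})
    with urllib.request.urlopen(f"{base_url}/loki/api/v1/query?{params}") as resp:
        return json.load(resp)


if __name__ == "__main__":
    q = build_volume_query("service", '{service=~".+"}', "24h")
    result = query_loki("http://localhost:3100", q)
    for sample in result["data"]["result"]:
        labels = sample["metric"]
        value = float(sample["value"][1])
        print(f'{labels.get("service", "?")}: {value / 1e9:.2f} GB ingested')
```

This tells me which service generates the most log bytes over the window, but because chunks are compressed it does not directly answer how much disk each stream occupies.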

I know all data is stored in chunks, but when I list the chunk directory I just get a list of base64-encoded file names:


When I decode one of them with base64:


From the output I can determine only one thing: fake/ is the default tenant ID, but I do not know what the other numbers mean.
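Based on the decoded names, my working assumption is that the key layout is `<tenant>/<fingerprint>:<from>:<through>:<checksum>`, where the fingerprint identifies the stream (mapping it back to its label set would still require the index). Under that assumption, a small script can walk the chunk directory, decode each filename, and sum on-disk bytes per fingerprint. This is a sketch, not a verified decoder; the path `/loki/chunks` and the key format are my guesses:

```python
import base64
import binascii
import os
from collections import defaultdict


def decode_chunk_name(name):
    """Decode a base64 chunk filename; try URL-safe first, then standard."""
    for decoder in (base64.urlsafe_b64decode, base64.b64decode):
        try:
            return decoder(name).decode("utf-8")
        except (binascii.Error, UnicodeDecodeError, ValueError):
            continue
    return None


def disk_usage_per_stream(chunk_dir):
    """Sum on-disk bytes per stream fingerprint.

    Assumes decoded keys look like 'fake/<fingerprint>:<from>:<through>:<checksum>'
    (an assumption based on what I see after decoding the filenames).
    """
    usage = defaultdict(int)
    for root, _dirs, files in os.walk(chunk_dir):
        for fname in files:
            key = decode_chunk_name(fname)
            if key is None or "/" not in key:
                continue
            # Drop the tenant prefix, keep the token before the first colon.
            fingerprint = key.split("/", 1)[1].split(":", 1)[0]
            usage[fingerprint] += os.path.getsize(os.path.join(root, fname))
    return dict(usage)


if __name__ == "__main__":
    ranked = sorted(disk_usage_per_stream("/loki/chunks").items(),
                    key=lambda kv: kv[1], reverse=True)
    for fp, size in ranked[:20]:
        print(f"{fp}: {size / 1e6:.1f} MB")
```

This at least ranks stream fingerprints by disk usage cheaply (it only stats files, it never reads chunk contents), but it still leaves the fingerprint-to-labels lookup unsolved.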

I am using Promtail to collect logs from Docker Swarm services with docker_sd_configs.

I have also considered chunks-inspect, but it is quite a brute-force approach that I would rather avoid, since it takes a huge amount of computation time and energy as the volume of logs grows.

What is the best way to analyze how much data is stored per stream?