Python script to export CSV files for a specific panel/time range

Hello all,

I currently have a few Raspberry Pis with sensors set up, with Prometheus as the database and Grafana as an awesome visualization tool. My Grafana dashboard is served on a local IP on my local network, i.e. 'localip:3000'. The dashboard has 7 panels, and I am wondering if there is a way to interact with the panels through a Python script and retrieve CSV files for specific panels/time frames. I am able to download CSV files manually in the Grafana UI: panel title ==> Inspect ==> Data ==> Download CSV.
My main goal here is to have Grafana on one computer on a specific server, e.g. 'example_server.com', and to be able to access the sensor data from other devices. For example, say I have sensor 3 in room 1; then I'd like to make a Python call like:

graf = Grafana.client('example_server.com')
graf.get_reading(room='1', sensor='3')
graf.get_csv(room='1', sensor='3', time_frame='1hr')

And be able to see the last available reading, or access data for 1 hr, 6 hr, 1 day, etc. as a CSV file. I am hoping that something like this is out there and I just cannot find it. I personally don't think I have the necessary programming background to set this up properly on my own. Any help would be great, thank you!

Dana

Hi Dana,
You're definitely not the first person to have this use case, but unfortunately I don't think there is a good solution. The CSV you download through the browser is generated by JavaScript code that runs in the browser itself (after querying the data), so there's no back-end API endpoint you can query for a CSV like that.

The only real way I can think of is to use a package like puppeteer/pyppeteer to run a headless browser and get it to execute the required JS that then generates the CSV. Definitely far from trivial!
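For anyone who wants to attempt the headless-browser route, here is a rough sketch. The inspect-tab URL parameters match what the Grafana UI puts in the address bar when you open a panel's Inspect > Data view, but the download-button selector in the commented driver is purely an assumption you would need to verify against your Grafana version's DOM:

```python
def panel_inspect_url(base, dashboard_uid, panel_id,
                      time_from="now-1h", time_to="now"):
    """Build the URL that opens a panel's Inspect > Data tab directly.

    dashboard_uid and panel_id can be read from the dashboard's JSON
    definition (Dashboard settings > JSON Model).
    """
    return (f"{base}/d/{dashboard_uid}?inspect={panel_id}"
            f"&inspectTab=data&from={time_from}&to={time_to}")

# Hypothetical pyppeteer driver (untested sketch; the CSS selector for
# the "Download CSV" button is an assumption -- inspect your own UI):
#
# from pyppeteer import launch
#
# async def download_panel_csv(url, out_dir):
#     browser = await launch(headless=True)
#     page = await browser.newPage()
#     await page._client.send("Page.setDownloadBehavior",
#                             {"behavior": "allow", "downloadPath": out_dir})
#     await page.goto(url, waitUntil="networkidle0")
#     await page.click("button[aria-label='Download CSV']")
#     await browser.close()
```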

Another hacky approach I took at some point to do something similar was to (semi-automatically) extract the queries from the dashboard's JSON definition and run them directly from a Python script, which then assembled the results into a CSV. So basically replicating what the browser does in the front-end, but in Python. I can't say I'd recommend it "at scale".
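To make the query-extraction step a bit more concrete, here's a minimal sketch. It assumes a Prometheus data source, where each target stores its query string in an "expr" field; the dashboard definition itself can be fetched from Grafana's HTTP API via GET <base>/api/dashboards/uid/<uid> with an API-key header, and sits under the "dashboard" key of the response:

```python
def extract_panel_queries(dashboard):
    """Map each panel title to its raw query expressions.

    'dashboard' is the parsed dashboard definition (a dict). For
    Prometheus data sources each target keeps its query in 'expr';
    other data sources use different fields, so adapt as needed.
    """
    queries = {}
    for panel in dashboard.get("panels", []):
        exprs = [t["expr"] for t in panel.get("targets", []) if "expr" in t]
        if exprs:
            queries[panel.get("title", f"panel-{panel.get('id')}")] = exprs
    return queries
```

From there you can run each expression against Prometheus's own HTTP API and write the results out yourself.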

Would love to hear about any other ideas to do this!

1 Like

Thanks for the reply svetb. It is unfortunate that there does not seem to be a well-documented way to achieve this. Hopefully that will change, since plenty of people could use a package/extension to easily pull data from dashboards without doing it manually. In the meantime I found this blog post, which could be useful here.

Thanks for sharing that @danarampini. I’d be interested to hear about experiences from anyone who’s used Skedler.

1 Like

@svetb not sure if you use Prometheus at all, but I was able to find a great package for reading values and building dataframes, and it works great: prometheus-api-client
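As a rough sketch of how that can feed a CSV: the commented query below uses prometheus-api-client's custom_query_range (the server URL and metric name are made-up placeholders), which returns the standard Prometheus range-result structure that the helper then flattens:

```python
import csv
import io

# Querying is short with prometheus-api-client; URL and metric name
# here are hypothetical examples for the room/sensor setup:
#
# from datetime import datetime, timedelta
# from prometheus_api_client import PrometheusConnect
#
# prom = PrometheusConnect(url="http://localip:9090", disable_ssl=True)
# result = prom.custom_query_range(
#     query='sensor_celsius{room="1", sensor="3"}',
#     start_time=datetime.now() - timedelta(hours=1),
#     end_time=datetime.now(),
#     step="60s",
# )

def range_result_to_csv(result):
    """Flatten a Prometheus range-query result (a list of series, each
    with 'metric' labels and 'values' [timestamp, value] pairs) into
    CSV text with one row per sample."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["metric", "timestamp", "value"])
    for series in result:
        name = series["metric"].get("__name__", "")
        for ts, val in series["values"]:
            writer.writerow([name, ts, val])
    return buf.getvalue()
```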

Hi @danarampini yes that can certainly work - especially for simpler use cases where you literally just need to run a query and save the data as a CSV. Indeed, Grafana doesn’t need to play a role at all in this case. [FWIW I mostly use Influx; an equivalent approach is possible there]

It gets tricky if, for example, you need to combine data from multiple queries into a single panel (and then potentially apply formatting, value mappings, transformations, etc.). It's hard to reliably reproduce these kinds of Grafana manipulations in a Python script so that you end up with a CSV that is an exact copy of what the user would see in Grafana.
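As one concrete illustration of why that's fiddly: even just aligning the samples from two queries into one table means dealing with timestamps that appear in only one of the series. A minimal stdlib sketch of that alignment step alone (ignoring everything else Grafana does, like value mappings and transformations):

```python
def merge_series_on_timestamp(*series_list):
    """Align several [timestamp, value] series into rows of
    (timestamp, v1, v2, ...), filling None where a series has no
    sample at that timestamp."""
    timestamps = sorted({ts for series in series_list for ts, _ in series})
    lookups = [dict(series) for series in series_list]
    return [(ts, *(lk.get(ts) for lk in lookups)) for ts in timestamps]
```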

Finally, I do think that Grafana Enterprise these days includes CSV reporting/exporting capabilities. So that may work too, though it’s not something I personally have access to :slight_smile:

1 Like