Large query takes forever to load
I am using Grafana 8.0 (the upgrade to the latest version is coming in a few weeks) and I am trying to visualize data from the JSON API data source, but each API call returns 1.2 million lines of JSON, which of course takes a long time to load. Almost every load triggers the “Grafana stopped responding” popup.

My questions are:

  • Can you store the result in Grafana, and replace the result once there is new data available?

  • Can you somehow stream the data so the website does not crash while waiting for a response?

  • Is there some other solution to this problem?

This is my first post, so please tell me if the subcategory is wrong.

Hello, I think 1.2 million lines means it is time to switch to a database technology.
That would let you store the results of your recurring calls and review them over a long period,
and solve your performance issue by using aggregation functions.
You can search for a stack like Telegraf + InfluxDB + Grafana.

First question: do you really need all 1.2 million lines to draw a graph? Does the data need to be processed first?

Hi, sadly I am in no position to switch to a database technology, and there is no way for me to reduce the query. I agree that this solution is not optimal, but now I have to make the best of a bad situation. Any ideas?

I think it’s useless to have so many points; you can’t even see them.
For example, when I want to display 6 months of data gathered every minute, I also have millions of points, maybe hundreds of millions, but since I can’t even see a variation below a thousand points, what is the point of drawing them all?
Instead I can use an aggregation function, like a shifting (sliding) window that applies a mean, to reduce the number of points while keeping most of the information.
Does that help?
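The shifting-window mean can be sketched like this (a minimal, hypothetical example; the window size and sample values are made up, not from the thread):

```python
def downsample(values, window):
    """Reduce a series by averaging each non-overlapping window of points."""
    return [
        sum(values[i:i + window]) / len(values[i:i + window])
        for i in range(0, len(values), window)
    ]

# With window=1000, 1.2 million raw points collapse to 1,200 averaged points.
raw = list(range(10))
print(downsample(raw, 5))  # [2.0, 7.0]
```

The visual shape of the graph is preserved because each drawn point is the mean of a window the eye could not resolve anyway.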

Yeah, but then you merge the data by month, for example. I get nice panels, so that is not the problem.

But it takes 30s to load this one panel.

What do you mean by:

Instead I can use an aggregation function, like a shifting (sliding) window that applies a mean, to reduce the number of points while keeping most of the information.

Can you somehow stream the data and load it as new data appears?

Ah yes, are you using a Grafana transformation to group by month?
I think you can’t do more with a single API endpoint. Can’t you create a new endpoint that serves data already processed by your database? A database can do this faster.
About streaming with Flux, there is a feature for that:


What is this data source and why is it returning so much data? Can you not trim it down via filters?

It’s an internal data source, so sadly not. I am just trying to make the best of a bad situation.

What kind of data source is it? I am not sure that it’s impossible.

You could write a proxy service to reduce the data before it reaches Grafana.
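Such a proxy could be a tiny HTTP service that fetches the full payload, aggregates it, and returns the smaller result to Grafana. A minimal sketch using only the Python standard library (the `UPSTREAM` URL and the `[timestamp, value]` payload shape are assumptions for illustration, not details from the thread):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

UPSTREAM = "http://internal-api.example.com/metrics"  # hypothetical internal endpoint

def aggregate(points, window=1000):
    """Average every `window` consecutive [timestamp, value] pairs."""
    out = []
    for i in range(0, len(points), window):
        chunk = points[i:i + window]
        ts = chunk[0][0]                               # keep the first timestamp of the window
        avg = sum(p[1] for p in chunk) / len(chunk)
        out.append([ts, avg])
    return out

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        with urlopen(UPSTREAM) as resp:                # fetch the full 1.2M-line payload once
            points = json.load(resp)
        body = json.dumps(aggregate(points)).encode()  # shrink it before Grafana sees it
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run the proxy (blocks forever):
# HTTPServer(("", 8080), ProxyHandler).serve_forever()
```

Point the JSON API data source at the proxy instead of the internal API, and the browser only ever receives the aggregated payload.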


@ollisco the JSON API plugin (JSON API plugin for Grafana | Grafana Labs) is pretty cool in allowing queries directly against a JSON API… but that plugin is getting a bit old now, and I am not sure it is compatible with Grafana 9, so that might be another reason to look into a different way to meet your needs, if possible.

One other thing that might be worth looking into is this setting, that can compress all HTTP traffic between the Grafana server and your browser:
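Presumably this refers to the `enable_gzip` option in `grafana.ini` (my assumption; the original post’s embedded link is missing):

```ini
[server]
# Compress HTTP responses from the Grafana server to the browser
enable_gzip = true
```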

Unfortunately, I just realized that this probably won’t help, either, because this is a frontend plugin (meaning the communication happens directly between the browser and the datasource), not a backend datasource plugin. Backend plugins have many advantages, as explained in Backend plugins | Grafana documentation. But the JSON API plugin may never be converted to a backend plugin, as the developer explains here:

And finally, as explained here:

The JSON API doesn’t store historical data from previous queries. It can only visualize the data from the last query that was run. If you want to store metrics over time, you’re likely better off switching to a proper time series database, such as Prometheus.

In short: if there is no way for you to rewrite your query to return less data, there is probably no way to speed up the response, because the query returns such a large amount of JSON data.