Grafana performance with ~144,000 data points

I’m new to Grafana and I’d like to potentially use it to visualize some geospatial data (using the Geomap visualization) that comes from a proprietary system with a C++ library interface.

I’ve written a prototype backend plugin in C++ to retrieve this data. It works and I can see my data points displayed on the map; however, performance is quite sluggish.

For testing purposes I am retrieving one “medium” day’s worth of data points consisting of 144,095 records. The source records contain 15-20 fields but for now I am only parsing 4 of them:

  • type - string
  • ID - string
  • latitude - double
  • longitude - double
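Roughly what each parsed record looks like inside the plugin — a minimal sketch (the struct and function names are illustrative, not the actual library interface):

```cpp
#include <string>
#include <vector>

// Illustrative only: the real records come from a proprietary C++ library
// with 15-20 fields; this sketch keeps just the 4 currently parsed.
struct GeoRecord {
    std::string type;
    std::string id;
    double latitude  = 0.0;
    double longitude = 0.0;
};

// Grafana data frames are columnar, so the plugin converts the parsed
// records into one vector per field before building the query response.
struct GeoColumns {
    std::vector<std::string> type;
    std::vector<std::string> id;
    std::vector<double> latitude;
    std::vector<double> longitude;
};

GeoColumns toColumns(const std::vector<GeoRecord>& records) {
    GeoColumns cols;
    cols.type.reserve(records.size());
    cols.id.reserve(records.size());
    cols.latitude.reserve(records.size());
    cols.longitude.reserve(records.size());
    for (const auto& r : records) {
        cols.type.push_back(r.type);
        cols.id.push_back(r.id);
        cols.latitude.push_back(r.latitude);
        cols.longitude.push_back(r.longitude);
    }
    return cols;
}
```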

In an attempt to understand the performance issue I added some debug output to my plugin. With it I measured that the QueryData call for the 144,095 data points takes ~5.8 seconds inside the plugin. However, the query inspector reports the “total request time” as 91.2 seconds.
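The in-plugin measurement is just a steady-clock pair around the response-building work, along these lines (a sketch, not the actual plugin code):

```cpp
#include <chrono>

// Illustrative timing helper: how the ~5.8 s inside QueryData was measured.
// Returns the wall-clock milliseconds taken by the callable.
template <typename Fn>
long long measureMillis(Fn&& fn) {
    auto t0 = std::chrono::steady_clock::now();
    fn();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
}
```

Anything the query inspector reports beyond that number is being spent outside the plugin: serializing the frames, transferring them to the browser, and rendering.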

I guess my question is: am I approaching this wrong or can Grafana not handle this number of data points? Also, what is happening during the intervening ~85 seconds?

I am somewhat disappointed with this result, as I was hoping to visualize tens of millions of data points.

144,000 data points is quite a lot for a browser to render, and you’re planning to render 10 million - does that even make sense for a human to look at?

I believe it might be the browser struggling to render this. You can enable router logging (see Configure Grafana | Grafana documentation) and you’ll see the duration logged for each HTTP request hitting the Grafana API, which should help you figure out where the time is spent.
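Router logging is a flag in the `[server]` section of `grafana.ini` (it defaults to false):

```ini
[server]
router_logging = true
```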

I guess I see your point but at the same time I have no control over how many data points there are. I thought Grafana would handle this situation of a large number of data points more elegantly than just sending them all to the client side for display.

Is it possible to get the zoom level and display window coordinates (e.g. bottom-left lat and lon, and top-right lat and lon) of the map UI? If I had that additional information my plugin could cluster and filter the data before returning it.
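If that information were available, the plugin-side reduction could be a bounding-box filter plus one-point-per-grid-cell clustering. A sketch, under the assumption that the viewport corners and a zoom-derived cell size could somehow be passed to the backend (as far as I know the Geomap panel does not currently send these):

```cpp
#include <cmath>
#include <unordered_set>
#include <vector>

struct Point { double lat, lon; };
struct BBox  { double minLat, minLon, maxLat, maxLon; };

bool contains(const BBox& b, const Point& p) {
    return p.lat >= b.minLat && p.lat <= b.maxLat &&
           p.lon >= b.minLon && p.lon <= b.maxLon;
}

// Keep at most one point per grid cell of `cellDeg` degrees inside the
// viewport. At low zoom (large cells) this collapses 144k points to a
// few thousand; at high zoom the bounding box itself drops most of them.
std::vector<Point> reduceForViewport(const std::vector<Point>& pts,
                                     const BBox& view, double cellDeg) {
    std::unordered_set<long long> seen;
    std::vector<Point> out;
    for (const auto& p : pts) {
        if (!contains(view, p)) continue;
        long long cx = static_cast<long long>(std::floor(p.lon / cellDeg));
        long long cy = static_cast<long long>(std::floor(p.lat / cellDeg));
        long long key = cx * 1000003LL + cy;  // simple cell key (fine for a sketch)
        if (seen.insert(key).second) out.push_back(p);
    }
    return out;
}
```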

Throttle it in this prototype you have written. The other thing: if you look at the Grafana source code, it uses this kind of loop a lot:

for (let index = 0; index < theArray.length; ++index) {
    const element = theArray[index];
    // ...use `element`...
}

which is notorious for performance issues. So you are getting a double whammy: too much data on your side, and Grafana’s weak approach to handling large datasets, using forEach instead of the modern and faster .map.