Newb trying to get to grips with heatmaps. I’m using Grafana 9.1, as that’s what’s baked into Rancher’s monitoring solution right now.
What I want is a basic heatmap of counts in the request duration buckets over time: x requests in this interval took <200ms, y requests took >=200ms and <1s, z requests took >=1s, and so on.
The query I’m using is just
sum(increase(request_durations_bucket[$__interval])) by (le)
I’m using a heatmap format in the query options, and the underlying metric is a histogram type.
My actual buckets - the ‘le’ label on my metric - are [5, 10, 25, 50, 75, 100, 250, 500, 1000, 2500, 5000, 10000, +Inf], in ms.
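For context, this is roughly how a histogram with those buckets would be declared using the Python client. A hypothetical sketch: the metric name and help text are assumptions, and the real service may well use a different client library.

```python
from prometheus_client import Histogram

# Hypothetical declaration of the histogram behind request_durations_bucket.
# Bucket upper bounds are in ms; +Inf is appended automatically by the client,
# which is why it appears as an "le" label value alongside the explicit bounds.
REQUEST_DURATIONS = Histogram(
    "request_durations",
    "Request duration in milliseconds (name/help assumed)",
    buckets=[5, 10, 25, 50, 75, 100, 250, 500, 1000, 2500, 5000, 10000],
)

# Each observation increments every bucket whose upper bound it fits under,
# so the series are cumulative: a 180ms request lands in le="250.0" and up.
REQUEST_DURATIONS.observe(180)
```

Note that the `le` label values are exposed as strings (`"250.0"`, `"+Inf"`), which is presumably why a naive sort of them comes out alphabetical rather than numerical.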
I don’t really understand what “calculate from data” is doing. With it off, Grafana does show my actual bucket values, but they’re sorted alphabetically (presumably because the le label values are strings), which is nonsense. With it on, the buckets are ordered, but they’re shown as ridiculous values that aren’t the buckets I’m using; not sure where those are coming from.
I’ve been fiddling with the interface for hours and haven’t yet stumbled on a way to display my actual metric buckets, ordered numerically. This feels like the most obvious requirement for a heatmap, so I must be missing something substantial. Anyone able to clue me in?