I’m trying to use heatmaps to show how the performance of a service varies over time.
The problem I’m running into is that the colour of each cell seems to be mapped to a single absolute scale for the entire heatmap, so time periods with little activity simply show “not much happened here” (i.e. the heatmap is mostly dark), instead of showing me the detail of whatever activity there was.
To explain more concretely:
Suppose I have a heatmap with one column per day over 28 days, measuring the duration of something which is busy on workdays but relatively (not entirely) quiet at the weekends.
Suppose that on each workday I have around 1000 events, of which 250 take between 10 and 12 seconds to complete. On each weekend day I only have 100 events, but 25 of these still take between 10 and 12 seconds.
What I want to get as a result is an identical column of colours for every day throughout the month, telling me that the peak activity for this event is between 10 and 12 seconds, independent of how much activity there is on each day in total.
What I actually get is bright cells (high values) for 10 to 12 seconds on workdays, but nothing of any significance shown at the weekends - it’s impossible to tell how long anything took to complete then, because 100 events per day pales (literally) into insignificance compared to 1000 events per day on the workdays.
So, what I think I’m looking for is a way to show the percentage (or proportion) of events which took a certain time to complete within each time bucket, unaffected by how many events there are in total in that bucket compared to all the other buckets.
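In case it helps, here’s a minimal sketch of the transformation I mean, using numpy and made-up bucket counts matching the example above (the bucket boundaries and numbers are hypothetical, just to illustrate the per-column normalization):

```python
import numpy as np

# Hypothetical duration-bucket counts (rows: <10s, 10-12s, >12s buckets;
# one column per day). A workday has 1000 events, a weekend day has 100,
# but the proportions per bucket are the same.
workday = np.array([600.0, 250.0, 150.0])  # 1000 events, 250 in 10-12s
weekend = np.array([60.0, 25.0, 15.0])     # 100 events, 25 in 10-12s
counts = np.column_stack([workday, weekend])

# On an absolute scale, the weekend column is an order of magnitude
# dimmer. Dividing each column by its own total gives the proportion of
# that day's events falling in each bucket instead.
proportions = counts / counts.sum(axis=0, keepdims=True)

print(proportions)
# Both columns come out identical: 0.6, 0.25, 0.15
```

Plotting `proportions` instead of `counts` would give every day the same column of colours, which is exactly the result I described wanting.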
Has anyone got any ideas?