I am using Grafana to calculate the P90 for a histogram metric. The buckets are: `[0.1, 0.2, 0.5, 0.75, 1, 5, 10, 20, 30, 60, 120, 300, 600]`

For the time range in question (though I believe the exact range is immaterial), all observations are less than 300s.

In fact, the underlying data is inconsistent as well: we have more observations in the cumulative bucket with upper bound 300s than in the 600s and +Inf buckets, even though cumulative counts should be non-decreasing.
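To make the inconsistency concrete, here is a minimal Python sketch (the bucket counts are hypothetical, not my real data) of the invariant that a cumulative histogram should satisfy, checked the way I'd sanity-check the series:

```python
# Hypothetical cumulative bucket counts keyed by upper bound `le`.
# In a well-formed cumulative histogram these counts must be
# non-decreasing as `le` grows; here the 600s bucket is smaller
# than the 300s bucket, which is the kind of inconsistency I'm seeing.
buckets = {
    0.1: 3, 0.2: 7, 0.5: 12, 0.75: 20, 1: 25, 5: 31, 10: 34,
    20: 36, 30: 38, 60: 39, 120: 40, 300: 42, 600: 41,  # 41 < 42: broken
}

counts = [buckets[le] for le in sorted(buckets)]
monotonic = all(a <= b for a, b in zip(counts, counts[1:]))
print(monotonic)  # False for this data: the histogram is inconsistent
```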

For the query:

`sum(increase(compilation_latency_seconds_bucket{le=~"300"}[5m])) - sum(increase(compilation_latency_seconds_bucket{le=~"600"}[5m]))`

the graph looks like this:

Similarly for the query:

`sum(increase(compilation_latency_seconds_bucket{le=~"300"}[5m])) - sum(increase(compilation_latency_seconds_bucket{le=~".+Inf"}[5m]))`

the graph looks like this:

However when I calculate the P90 using the query:

`histogram_quantile(0.9, sum(rate(compilation_latency_seconds_bucket[5m])) by (le))`

I get:

Is this expected, or is it a bug?
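For reference, my understanding of how `histogram_quantile` picks a bucket and linearly interpolates within it is roughly the following simplified Python sketch (this is my own approximation, not Prometheus's actual implementation, and the bucket counts are hypothetical):

```python
# Simplified sketch of histogram_quantile-style linear interpolation.
# Not the real Prometheus implementation; edge cases (empty histograms,
# the lowest and highest buckets) are handled differently there.
def histogram_quantile(q, buckets):
    """buckets: list of (upper_bound, cumulative_count), sorted by bound,
    ending with the +Inf bucket."""
    total = buckets[-1][1]   # cumulative count in the +Inf bucket
    rank = q * total         # target cumulative count for quantile q
    lower_bound, lower_count = 0.0, 0.0
    for upper_bound, count in buckets:
        if count >= rank:
            # Linear interpolation inside the bucket containing `rank`.
            return lower_bound + (upper_bound - lower_bound) * (
                (rank - lower_count) / (count - lower_count)
            )
        lower_bound, lower_count = upper_bound, count
    return buckets[-1][0]

# Hypothetical counts: 90% of observations fall at or below 5s.
b = [(0.5, 10), (1, 50), (5, 90), (float("inf"), 100)]
print(histogram_quantile(0.9, b))  # prints 5.0
```

If the cumulative counts are not monotonic, the bucket this interpolation lands in (and hence the reported P90) can be skewed, which is why I'm asking whether the result I see is expected.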