Calculating the time in hours that a value remains within certain bounds

I use the Grafana OSS variant together with the time-series database InfluxDB 2.x.

I have time series of temperature values from a number of sensors, and a plot of the same. Everything works fine, but I'd like to automate the calculation of how long a sensor's data stays between two values.

For example, I need to know how long each sensor provided values above 60 and below 70 °C in the selected time window. I'd like to have it in the same panel, preferably on the legend side.

Is it possible?

For example, here is a plot driven by two sensors:

The time window is 24 h, and I would like to see how long the sensors were between 60 and 70 °C.

The query is very simple in my case:

from(bucket: "Prod")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["iServer"] == "Prdn")
  |> filter(fn: (r) => r["iProbe"] == "Hell_2_1" or r["iProbe"] == "Hell_2_2")
  |> aggregateWindow(every: v.windowPeriod, fn: mean, createEmpty: false)
  |> yield(name: "mean")

Hi @superboss9

I think the elapsed() function will do what you need.

I worked up an example on my test data to filter out the records between 60 and 70, calculate the elapsed time in fractional hours, and then sum to get the total. You could create a Stat panel next to your time series with this query; it would use the same 24 h time window.

from(bucket: "Bucket1")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "WeatherData")
  |> filter(fn: (r) => r["_field"] == "OutdoorTemp")
  |> filter(fn: (r) => r["_value"] >= 60.0 and r["_value"] <= 70.0)
  |> elapsed(unit: 1s)
  |> map(fn: (r) => ({ r with elapsedFloat: float(v: r.elapsed)/3600.0 })) // convert seconds to hours
  |> sum(column: "elapsedFloat")
  |> yield(name: "hours-between-60-and-70")


As far as I understand, you suggest getting the result in a separate panel. Am I correct?
Do you think it's possible to incorporate the summed data into the legend, per sensor?

Correct. I do not know of any way to feed a custom value or query result into the legend. It seems you can only choose First, Last, Min, Max, etc.


Thank you very much.
In general it works well. But what if I need to calculate two values in one query? In the first example we calculated the hours between 60 and 70, but what if I also need the hours from 70 to 80?
I'm a little bit lost with the Flux query language.
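One way to get both ranges from a single query (a sketch, reusing the bucket, measurement, and field names from the earlier example) is to assign the base stream to a variable and then yield two separately filtered sums; each yield() produces its own named result in the same panel:

```flux
base = from(bucket: "Bucket1")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "WeatherData")
  |> filter(fn: (r) => r["_field"] == "OutdoorTemp")

// hours spent between 60 and 70
base
  |> filter(fn: (r) => r["_value"] >= 60.0 and r["_value"] <= 70.0)
  |> elapsed(unit: 1s)
  |> map(fn: (r) => ({ r with elapsedFloat: float(v: r.elapsed) / 3600.0 }))
  |> sum(column: "elapsedFloat")
  |> yield(name: "hours-60-70")

// hours spent between 70 and 80
base
  |> filter(fn: (r) => r["_value"] > 70.0 and r["_value"] <= 80.0)
  |> elapsed(unit: 1s)
  |> map(fn: (r) => ({ r with elapsedFloat: float(v: r.elapsed) / 3600.0 }))
  |> sum(column: "elapsedFloat")
  |> yield(name: "hours-70-80")
```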

The second problem is in data naming:

Here I receive a set of calculated hours, but I don't know which sensor produced which result, because the information about the sensor source is dropped by the query. Again, I probably just don't understand how to do this in Flux.
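A sketch of how the sensor name could be preserved (assuming iProbe is a tag, as in the queries above): group by the tag before summing, so each sensor lands in its own result table, and keep the tag column in the output:

```flux
from(bucket: "Prod")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["iProbe"] == "Hell_2_1" or r["iProbe"] == "Hell_2_2")
  |> filter(fn: (r) => r["_value"] >= 60.0 and r["_value"] <= 70.0)
  |> group(columns: ["iProbe"])  // one output table per sensor
  |> elapsed(unit: 1s)
  |> map(fn: (r) => ({ r with elapsedFloat: float(v: r.elapsed) / 3600.0 }))
  |> sum(column: "elapsedFloat")
  |> keep(columns: ["iProbe", "elapsedFloat"])
```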

I've added two Stat panels, and they show the correct results except for one thing:

It correctly calculates the total time a sensor spends between 60 and 70. But if that run is interrupted, for example by another process that goes above 70, followed by a cooling phase that brings the value back into the 60–70 window, then the whole span, including the excursion, is counted as time between 60 and 70.
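The overcount happens because elapsed() measures the time between consecutive rows that survive the filter, so the entire gap spent above 70 is attributed to the first row after the value returns to range. One possible workaround (a sketch; the 300 s threshold is an illustrative guess and should exceed the sensors' actual reporting interval) is to discard implausibly long gaps before summing:

```flux
  |> filter(fn: (r) => r["_value"] >= 60.0 and r["_value"] <= 70.0)
  |> elapsed(unit: 1s)
  |> filter(fn: (r) => r.elapsed < 300)  // drop gaps that span out-of-range excursions
  |> map(fn: (r) => ({ r with elapsedFloat: float(v: r.elapsed) / 3600.0 }))
  |> sum(column: "elapsedFloat")
```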

Reading the InfluxDB manual, stateDuration() should help, but I couldn't get results with it at first.

I managed to produce the following Flux query:

from(bucket: "Production")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["iServer"] == "Production")
  |> filter(fn: (r) => r["iProbe"] == "Hell_1_1" or r["iProbe"] == "Hell_1_2" or r["iProbe"] == "Hell_1_3" or r["iProbe"] == "Hell_1_4" or r["iProbe"] == "Hell_1_5" or r["iProbe"] == "Hell_1_6")
  |> stateDuration(fn: (r) => r["_value"] >= 60.0 and r["_value"] <= 65.0, column: "keep_60")
  |> keep(columns: ["iProbe", "keep_60", "keep_60h"])
  |> highestMax(n: 6, column: "keep_60", groupColumns: ["iProbe"])
  |> sort(columns: ["iProbe"])
  |> map(fn: (r) => ({ r with keep_60h: float(v: r["keep_60"]) / 3600.0 })) // stateDuration counts seconds by default; convert to hours
  |> yield(name: "Hell_1")

It returns a table that contains the correct values. Now I'm just not able to push it into a Stat panel :rofl:

PS. Finally fixed the panel, and everything works as it should:

PPS. I had to add spike filtering with this:
|> aggregateWindow(every: 5m, fn: median, column: "_value")
That removes the spikes, so they no longer divide a region into two (or more) regions.
