I use the Grafana OSS variant together with a time-series database, InfluxDB 2.x.
I have time series of temperature values from a number of sensors, all on the same plot. Everything works fine, but I'd like to automate calculating how long a sensor's data stays between two values.
For example, I need to know how long each sensor reports values above 60 and below 70 °C within the selected time window. I'd like to see it in the same panel, preferably on the legend side.
Is it possible?
For example, here is a plot driven by two sensors:
I think the elapsed() function will do what you need.
I worked up an example on my test data that filters out the records between 60 and 70, calculates the elapsed time in fractional hours, and then sums to get the total. You could create a Stat panel next to your time series with this query; it would follow the same 24 h time window.
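As a sketch of that idea (bucket and measurement names here are illustrative, not from your setup): elapsed() adds an `elapsed` column holding the time since the previous record, which you can convert to hours and sum.

```flux
from(bucket: "test")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r._measurement == "temperature")
  |> filter(fn: (r) => r._value >= 60.0 and r._value <= 70.0)
  |> elapsed(unit: 1s)  // adds column "elapsed": seconds since the previous row
  |> map(fn: (r) => ({r with elapsed_h: float(v: r.elapsed) / 3600.0}))
  |> sum(column: "elapsed_h")
```

The Stat panel then just displays the single summed value per series.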
As far as I understand, you suggest getting the result in a separate panel (dashboard). Am I correct?
Do you think it is possible to incorporate the summed value into the legend for each of the sensors?
Thank you very much.
In general it works well. But what if I need to calculate two values in one query? In the first example we calculated the hours between 60 and 70, but what if I also need the hours from 70 to 80?
I'm a little bit lost with the Influx query language.
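One way to get both bands from a single query (a sketch, with illustrative names) is to assign the base stream to a variable and attach one yield() per band:

```flux
data = from(bucket: "test")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r._measurement == "temperature")

// hours spent between 60 and 70
data
  |> filter(fn: (r) => r._value >= 60.0 and r._value < 70.0)
  |> elapsed(unit: 1s)
  |> map(fn: (r) => ({r with elapsed_h: float(v: r.elapsed) / 3600.0}))
  |> sum(column: "elapsed_h")
  |> yield(name: "60_to_70")

// hours spent between 70 and 80
data
  |> filter(fn: (r) => r._value >= 70.0 and r._value < 80.0)
  |> elapsed(unit: 1s)
  |> map(fn: (r) => ({r with elapsed_h: float(v: r.elapsed) / 3600.0}))
  |> sum(column: "elapsed_h")
  |> yield(name: "70_to_80")
```

Each yield() produces a separately named result set, so both totals arrive from one query.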
Here I receive a set of calculated hours, but I don't know which sensor produced which result, because the information about the source sensor is dropped along the way. The same lack of understanding of the Influx query language on my part, probably.
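If the sensor label is being lost, it is usually because the stream was re-grouped before the aggregation. A sketch of the fix: group by the sensor tag (assuming it is `iProbe`, as in your data) before summing, since aggregate functions like sum() preserve the group key columns.

```flux
from(bucket: "Production")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r._value >= 60.0 and r._value <= 70.0)
  |> group(columns: ["iProbe"])  // one table per sensor, so results stay labelled
  |> elapsed(unit: 1s)
  |> map(fn: (r) => ({r with elapsed_h: float(v: r.elapsed) / 3600.0}))
  |> sum(column: "elapsed_h")    // sum() keeps the "iProbe" group key in the output
```

The output is then one row per iProbe value, each carrying its own total.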
It correctly calculates the total time a sensor stays in bounds between 60 and 70. But if that run is interrupted, for example another process takes the value above 70 and a subsequent cooling phase brings it back into the 60–70 window, then the interruption gets counted into the total as well…
Reading the Influx manual, stateDuration() should help, but I can't get any results with it.
from(bucket: "Production")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["iServer"] == "Production")
  |> filter(fn: (r) => r["iProbe"] == "Hell_1_1" or r["iProbe"] == "Hell_1_2" or r["iProbe"] == "Hell_1_3" or r["iProbe"] == "Hell_1_4" or r["iProbe"] == "Hell_1_5" or r["iProbe"] == "Hell_1_6")
  |> stateDuration(fn: (r) => r["_value"] >= 60.0 and r["_value"] <= 65.0, column: "keep_60") // seconds spent in state, -1 outside it
  |> keep(columns: ["iProbe", "keep_60", "keep_60h"])
  |> highestMax(n: 6, column: "keep_60", groupColumns: ["iProbe"]) // longest run per probe
  |> sort(columns: ["iProbe"])
  |> map(fn: (r) => ({ r with keep_60h: float(v: r["keep_60"]) / 3600.0 })) // convert seconds to hours
  |> yield(name: "Hell_1")
It returns a table that contains the correct values, but now I'm not able to push it into a Stat panel.
PS. Finally fixed the panel and everything works as it should.
PPS. I had to add spike filtering with |> aggregateWindow(every: 5m, fn: median, column: "_value"). That removes the spikes so they don't divide a region into two (or more) regions.
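Putting the pieces together, the final query presumably looks something like the sketch below: the median aggregateWindow() goes before stateDuration() so a momentary spike can no longer split one in-bounds run into two. (The regex filter is just a shorthand for the six explicit iProbe comparisons above.)

```flux
from(bucket: "Production")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["iServer"] == "Production")
  |> filter(fn: (r) => r["iProbe"] =~ /^Hell_1_[1-6]$/)
  |> aggregateWindow(every: 5m, fn: median, column: "_value")  // smooth spikes first
  |> stateDuration(fn: (r) => r["_value"] >= 60.0 and r["_value"] <= 65.0, column: "keep_60")
  |> highestMax(n: 6, column: "keep_60", groupColumns: ["iProbe"])
  |> sort(columns: ["iProbe"])
  |> map(fn: (r) => ({ r with keep_60h: float(v: r["keep_60"]) / 3600.0 })) // seconds to hours
  |> yield(name: "Hell_1")
```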