Transform - Reduce Row is filling "0" for rows with missing values - how to prevent?

I have a simple problem (hopefully):
I have four signals (a1, a2, a3, b1) that I am visualizing on Grafana in a single Timeseries panel. Signals a1, a2, a3 are sampled at one rate (say, once per minute), while signal b1 is sampled faster (once per second). I am using Transform → Reduce Row —> Total(a1,a2,a3) to sum together the first three signals (call this a_total). However, the transformation gets evaluated at all timepoints, including those of b1. At the timepoints of b1, the signals a1,a2,a3 have no value, they are missing. The transformation Total result at these “b1” timepoints is 0, which is not correct. This breaks the visualization as now most of the points are 0.

Is there a way to replace the incorrect “0” values with NaN? Or alternatively, a way to filter the resulting transformation to remove 0 values?


  • What Grafana version and what operating system are you using?
    Grafana v8.5.24 on Windows 10 Enterprise
  • What are you trying to achieve?
    I would like to plot signals with different sampling rates, together with transforms of those signals, on one panel, without the transform outputs containing 0 values where the input signals are missing.
  • How are you trying to achieve it?
    Using transforms
  • What happened?
    There are 0 values in the output of the transforms where the input values are missing
  • What did you expect to happen?
    That the output value of the transforms should be missing/NaN where the input values are missing
  • Can you copy/paste the configuration(s) that you are having problems with?
    No
  • Did you receive any errors in the Grafana UI or in related logs? If so, please tell us exactly what they were.
    No
  • Did you follow any online instructions? If so, what is the URL?
    Have not found instructions

Welcome @soren707

What is your datasource?
Can you share the query and output of the query for b1 that shows the “no value” points?

My feeling is you could apply Value Mappings to remove the 0 values, and then your transformation should work.

Hi grant2, thanks for the reply:

My datasource is Azure Data Explorer.
The query for b1 (the higher frequency signal) is:
"
tableName
| where $__timeFilter(Timestamp)
| where signal_name == ‘signalName’
| where var1== $var
| where todouble(signal_value)>0
| summarize [‘b1’] = avg(todouble(signal_value)) by bin(Timestamp, time($__interval))
| order by Timestamp asc
"
The output I showed in the table above is correct for the a1, a2, a3, and b1 signals. Signal b1 has a faster sampling rate than a1, a2, a3; this is expected and intended. The problem is that the transform on a1, a2, a3 fills in 0s for rows where a1, a2, a3 are missing, rather than leaving those rows missing/NaN.

The result I want is for the a_total timeseries to only have points where the inputs a1, a2, a3 have non-missing entries, instead of showing 0s for all the rows where the inputs are empty. I'd also like the visualization to show both a_total and b1 on the same plot.

I tried experimenting with Value Mappings, but I'm not sure they solve the problem. I can map the "0" value to some text, like "test1", but this doesn't change the timeseries visualization (it just replaces 0 on the y-axis with the string "test1"). Can Value Mappings actually remove the 0 values? They seem to only replace them with something else, with no real change to the underlying output.
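One workaround I'm considering (untested, and assuming a1, a2, a3 are stored in the same table as separate signal_name values, mirroring the b1 query above) is to compute a_total directly in the ADX query instead of with the Reduce row transform, roughly like this:

"
// sketch only: assumes a1, a2, a3 are rows of tableName keyed by signal_name,
// with the same signal_value / var1 columns used in the b1 query
tableName
| where $__timeFilter(Timestamp)
| where signal_name in ('a1', 'a2', 'a3')
| where var1 == $var
| where todouble(signal_value) > 0
// average each signal within the bin, as in the b1 query...
| summarize signal_avg = avg(todouble(signal_value)) by signal_name, bin(Timestamp, time($__interval))
// ...then sum the three per-signal averages into a_total
| summarize ['a_total'] = sum(signal_avg) by Timestamp
| order by Timestamp asc
"

That way a_total only has points at the a-signal timestamps, and the panel can plot it alongside b1 without any row-filling by the transform.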

How about adding a second transform, such as Filter data by values, to drop the rows where a_total is 0?

Thanks grant2 - I find that using "Filter data by values" to remove a_total == 0 entries removes the entire row where a1, a2, a3 are missing, including the b1 entries. I don't want to remove the b1 entries, just the a_total == 0 entries.

Did you ever find a solution to this? I'm having the exact same issue and it's really annoying… I just want to turn the zeroes into nulls and, like you, I find that the filter removes the whole row, which is not what I want.