Why must I select "mean" to view data in a graph?

I’m working with a simple InfluxDB data source that logs a single measurement, which I’d like to display on a graph.

When I select the measurement and then the field, the query editor defaults to “mean”:

I understand the definition of “mean” (as in, “average”), but can someone explain what this interface field actually does here?

1/ My understanding is that each point plotted on the graph is a single measurement, so I don’t understand what the relevance of “mean” is.

2/ Why does the line on the graph disappear if I remove this?

Just found this question because I was wondering about the same thing.

So for everyone googling this in the future:

I found this article explaining the whole thing a bit:

Essentially, you have to tell Grafana how to display the data when the resolution of the timeline changes, because it cannot show every raw point at every zoom level and still keep the graph readable. Grafana splits the time range into buckets (the GROUP BY time(...) part of the query), and the aggregation function decides which single value represents each bucket.
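As a rough sketch of what the Grafana query editor builds for an InfluxDB data source (the measurement “home” and field “temperature” here are hypothetical), the default query looks something like this:

    -- Default editor-generated query: $timeFilter and $__interval are
    -- Grafana variables for the dashboard time range and the auto interval
    SELECT mean("temperature")
    FROM "home"
    WHERE $timeFilter
    GROUP BY time($__interval) fill(null)

If you remove mean() but keep the GROUP BY, InfluxDB rejects the query, because a GROUP BY time() query needs an aggregate or selector function to collapse each bucket into one value. That is why the line disappears.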

Two examples:

  • when showing temperature, you want to use mean to smooth the graph when ‘zooming out’
  • when showing absolute counts, like the case numbers of a global pandemic, you wouldn’t want to average the values out, because 10 cases a day is not 10 cases a month. It’d be around 300 cases a month, so in that case you would use the sum() function (see the query sketches just after this list).
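A minimal sketch of those two cases in InfluxQL, assuming a hypothetical field “value” in measurements “temperature” and “cases”:

    -- Temperature: average the points in each bucket, so the line
    -- gets smoother as the buckets get wider when you zoom out
    SELECT mean("value") FROM "temperature"
    WHERE $timeFilter GROUP BY time($__interval) fill(null)

    -- Case counts: add up the points in each bucket, so the totals
    -- stay correct no matter how far out you zoom
    SELECT sum("value") FROM "cases"
    WHERE $timeFilter GROUP BY time($__interval) fill(null)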

I noticed there is a way to show every single data point, if you’re sure you want to do that:
Delete all the fields in the “GROUP BY” line. That way Grafana won’t group the data, so you also won’t need an aggregation function.
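For reference, the resulting raw query would look roughly like this (measurement and field names again hypothetical):

    -- No GROUP BY and no aggregation: every stored point is returned
    -- as-is, which can be slow over wide time ranges
    SELECT "value" FROM "temperature" WHERE $timeFilter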

I know it’s a somewhat old post, but in case someone is still looking for an answer to this question:

  1. By default, Grafana applies an aggregation function when querying InfluxDB. If you want the raw values from your data source, simply use the “first” function and change the GROUP BY time interval to “1s” (see the sketch after this list).

  2. The other option is to delete the “mean” function and delete the “GROUP BY” clause; then you definitely get the raw data (the same raw query sketched in the previous answer).
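For option 1, the query would be along these lines (names hypothetical):

    -- first() is a selector that returns the earliest point in each
    -- 1-second bucket, which for typical write rates is effectively raw data
    SELECT first("value") FROM "temperature"
    WHERE $timeFilter GROUP BY time(1s)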
