Alerting when count is 0 or no data

Hi,

I’m trying to add alerting to a graph and I’m facing what seems to be rather odd behavior.
I’m running Grafana 5.4.3 on Windows, using Elasticsearch 6.4.2 as a data source.

The graph is showing the count of results of a specific search query:

For now, no documents correspond to this search, so the graph is basically displaying empty data, which is what I want.

Now I need Grafana to trigger an alert when no value is retrieved, so I configured it as follows:

Continued in the next post due to image limitations.
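In text form, the condition on the Alert tab is along these lines (just a hand-written sketch of the equivalent panel JSON; the query letter, time range and threshold are simply what I happen to use):

```json
{
  "evaluator": { "params": [1], "type": "lt" },
  "operator": { "type": "and" },
  "query": { "params": ["A", "5m", "now"] },
  "reducer": { "params": [], "type": "count" },
  "type": "query"
}
```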

Even though the graph shows the A metric has been 0 for more than 5 minutes (it’s been days), when I test the rule it doesn’t fire the alert:

What seems really strange to me is that if I invert my alerting rule and make it fire when the count is above 1, then it fires an alert!

Why is that?

Self-answering: I finally found why it wasn’t working as expected.

Elasticsearch was returning results to Grafana: for each interval it was returning a count of 0 documents, which is a perfectly valid value as far as Grafana is concerned.
Since my alerting rule relied on the count of datapoints in the Elasticsearch response, it was seeing hundreds of results, so the alert never fired.

To make it work as desired, I had to set “Min Doc Count” to “1” in each of the queries under the metrics tab:
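For reference, that option maps to min_doc_count on the date histogram in the query Grafana sends to Elasticsearch; with it set to 1, empty buckets are simply not returned, so the rule really does see no data. Roughly like this (the field name, interval and aggregation id here are just placeholders from my setup):

```json
{
  "size": 0,
  "aggs": {
    "2": {
      "date_histogram": {
        "field": "@timestamp",
        "interval": "1m",
        "min_doc_count": 1
      }
    }
  }
}
```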

I think I’m hitting a similar problem, but your fix isn’t helping.

I’m trying to send an alert when the count() IS ABOVE 0.
The count is 0, but when I test the rule it says it’s firing. Why is that?

My graph states No Data, but the alert rule is firing anyway…


Same problem with mine… have you fixed your issue?

Yeah, I switched from using count() to total() and that worked.


Thanks for your quick reply, but I can’t find total(); I can see sum(). Is it the same?


Oh yeah! I guess it was sum(), then.
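So in the alert condition the only thing that changes is the reducer; with my “is above 0” threshold it ends up roughly like this (again just a sketch of the condition in the panel JSON, not copied from my dashboard):

```json
{
  "evaluator": { "params": [0], "type": "gt" },
  "operator": { "type": "and" },
  "query": { "params": ["A", "5m", "now"] },
  "reducer": { "params": [], "type": "sum" },
  "type": "query"
}
```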


Hello all, I am facing the same issue of not getting an alert when there is no data or all values are null. Any pointers would be really helpful.