Alerting when count is 0 or no data

I’m trying to add alerting to a graph and I’m facing what seems to be a rather odd behavior.
I’m running Grafana 5.4.3 on Windows, using Elasticsearch 6.4.2 as a data source.

The graph is showing the count of results of a specific search query:

For now, no document corresponds to this search, so the graph is basically displaying empty data, which is what I want.

Now I need Grafana to trigger an alert when no value is retrieved, so I configured it as follows:

Continued in the next post due to image limitations.


Even though the graph shows the A metric has been 0 for more than 5 minutes (it’s been days), when I test the rule it doesn’t fire the alert:

What seems really strange to me is that if I invert my alerting rule and make it fire when the count is above 1, then it fires an alert!

Why is that?


Self-answering: I finally found out why it wasn’t working as expected.

Elasticsearch was in fact returning results to Grafana: for each interval it returned a count of 0 documents, which is a perfectly valid value as far as Grafana is concerned.
Since my alerting rule relied on the count of datapoints in the Elasticsearch response, it saw hundreds of results, hence the alert never fired.
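The behavior can be illustrated with a minimal sketch (hypothetical values, not Grafana’s actual implementation): a count()-style reducer counts the datapoints in the series, not the document totals inside them, so a series full of zeros still yields a large count.

```python
# One datapoint per interval over the alert window; every bucket
# reports 0 matching documents (assumed 300 intervals for illustration).
series = [0] * 300

# A count()-style reducer counts datapoints, so it never drops below 1...
count_reduced = len(series)
print(count_reduced)  # 300

# ...even though the values themselves sum to nothing.
sum_reduced = sum(series)
print(sum_reduced)  # 0
```

This is why the inverted rule ("count above 1") fired immediately: the reducer always saw hundreds of datapoints.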

To make it work as desired, I had to set “Min Doc Count” to “1” in each of the queries under the metrics tab:
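In the underlying Elasticsearch request, that setting corresponds to `min_doc_count` on the `date_histogram` aggregation, which tells Elasticsearch to omit empty buckets instead of returning them with a count of 0. A sketch of the relevant part of the query body (the field name, interval, and aggregation key are assumptions for illustration):

```json
{
  "aggs": {
    "2": {
      "date_histogram": {
        "field": "@timestamp",
        "interval": "1m",
        "min_doc_count": 1
      }
    }
  }
}
```

With empty buckets dropped, Grafana receives no datapoints when nothing matches, and the "no value" alert condition can fire as expected.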