Grafana Alerting with CloudWatch as a data source


We are using CloudWatch as a data source in Grafana and are planning to use Grafana Alerting.

Let's take the example below:

Assume we have an AWS CloudWatch account added as a data source. The query I am setting the alert on is essentially the number of requests coming to a load balancer, so the metric in this case is the request count.

Now suppose we set an alert like this: if the average number of requests is more than 100 for the last 5 minutes, trigger an alert.

Since the data store for the metrics is CloudWatch, could you please help me understand how Grafana will work in the background?

Will Grafana's alerting engine poll CloudWatch every 5 minutes, fetch all the values, compute the average, and generate the alert if it is greater than 100?
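Broadly, yes: each alert rule is evaluated on its own schedule, with Grafana running the rule's CloudWatch query for the configured window and then applying the reduce/threshold condition to the returned datapoints. A minimal Python sketch of that evaluation step (the datapoint values and the `evaluate_rule` helper are hypothetical, not Grafana's actual implementation):

```python
# Sketch of what one alert-rule evaluation effectively does
# (simplified illustration, not Grafana's real code).

def evaluate_rule(datapoints, threshold=100):
    """Reduce the datapoints from the query window to an average
    and compare it against the alert threshold."""
    if not datapoints:
        return False  # in real Grafana, "no data" handling is configurable
    avg = sum(datapoints) / len(datapoints)
    return avg > threshold

# Hypothetical request counts returned by CloudWatch for the
# last 5 minutes, one datapoint per minute:
datapoints = [90, 120, 130, 110, 105]
print(evaluate_rule(datapoints))  # average is 111 -> True, alert fires
```

In Grafana's UI this corresponds to the query, the reduce expression (avg), and the threshold condition configured on the alert rule.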



I have the same question.
This is important to know because it determines the number of API calls made, which normally happens only when a dashboard is loaded. But in this case, does Grafana have to keep polling? Approximately how many API calls per alert metric are we looking at per day?
I believe Amazon charges 1 cent per 1,000 API calls?

Please let us know.


Did anyone manage to get an answer for this?