Grafana version: 6.5.0
Kubernetes version: Client Version: v1.14.7-eks-1861c5, Server Version: v1.14.9-eks-502bfb
I set up the PagerDuty integration as a notification channel in Grafana and marked it as the default for all alerts. When I test the integration from the notification channel page, I receive the test notification in PagerDuty. However, when alerts fire from a dashboard, I do not receive all of them in PagerDuty.
For instance, I have 5 alerts firing in Grafana but only receive a notification for one of them in PagerDuty. When I open the graph for any of the alerts and click Test alert, Grafana reports that they are all firing.
The logs show similar entries for the firing alerts, even though they all use the default notification channel:
t=2020-05-11T14:57:04+0000 lvl=info msg="Alert Rule returned no data" logger=alerting.evalContext ruleId=13 name="Kubernetes Deployment Mismatch Alert" changing state to=ok
When I open the alerts that are not notifying, PagerDuty is listed under Send to:
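For reference, here is how the channel is set up. This is a sketch of a provisioned notification channel equivalent to my UI configuration, assuming the standard Grafana provisioning directory; the file name, uid, and integration key are placeholders:

```yaml
# conf/provisioning/notifiers/pagerduty.yaml (path is an assumption)
notifiers:
  - name: PagerDuty
    type: pagerduty
    uid: pagerduty-default   # placeholder uid
    org_id: 1
    is_default: true         # should apply the channel to all alerts
    settings:
      integrationKey: YOUR_PD_INTEGRATION_KEY  # placeholder
```

With is_default set to true, every alert rule should send to this channel, which matches what I see under Send to in the UI.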