Error with Alert Rule intermittently firing with PagerDuty

Grafana version: 6.5.0
Kubernetes version: Client Version: v1.14.7-eks-1861c5, Server Version: v1.14.9-eks-502bfb

I set up the PagerDuty integration as a Grafana notification channel and marked it as the default for all alerts. When I test the integration from the notification channel page, I receive the test notification in PagerDuty. However, when I set up alerts on a dashboard, I no longer receive all of them in PagerDuty.
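For reference, a channel like this can be created and marked as default through the HTTP API with something along these lines (a sketch, not my exact setup; $GRAFANA_API_KEY and $PAGERDUTY_KEY are placeholders for an admin API key and the PagerDuty integration key):

```bash
# Sketch: create a PagerDuty channel and mark it as the default for all alerts.
# $GRAFANA_API_KEY and $PAGERDUTY_KEY are placeholders, not values from my setup.
curl -s -X POST \
  -H "Authorization: Bearer $GRAFANA_API_KEY" \
  -H "Content-Type: application/json" \
  http://localhost:3000/api/alert-notifications \
  -d '{
        "name": "PagerDuty",
        "type": "pagerduty",
        "isDefault": true,
        "settings": { "integrationKey": "'"$PAGERDUTY_KEY"'" }
      }'
```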

For instance, I have 5 alerts firing in Grafana but I only receive a notification for one of them in PagerDuty. When I open the graph for each alert and click Test alert, Grafana reports that they are all firing.
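To compare what Grafana thinks is firing with what actually reaches PagerDuty, the alert states and notification channels can also be listed from the HTTP API, roughly like this (assuming Grafana on localhost:3000 and jq installed):

```bash
# List all alert rules with their current state (ok, pending, alerting, no_data, ...)
curl -s -H "Authorization: Bearer $GRAFANA_API_KEY" \
  http://localhost:3000/api/alerts | jq '.[] | {id, name, state}'

# List notification channels and confirm the PagerDuty one is marked as default
curl -s -H "Authorization: Bearer $GRAFANA_API_KEY" \
  http://localhost:3000/api/alert-notifications | jq '.[] | {name, type, isDefault}'
```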

The logs show similar entries for the firing alerts, even though they all use the default notification channel:

t=2020-05-11T14:57:04+0000 lvl=info msg="Alert Rule returned no data" logger=alerting.evalContext ruleId=13 name="Kubernetes Deployment Mismatch Alert" changing state to=ok
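These lines come from the Grafana pod logs; something like the following pulls the alerting-related entries, assuming Grafana runs as a deployment named grafana in a monitoring namespace (adjust the names for your cluster):

```bash
# Pull recent Grafana logs and keep only alerting/notification related lines.
# The namespace and deployment name are assumptions; adjust them for your cluster.
kubectl logs -n monitoring deploy/grafana --since=1h | grep -Ei 'alerting|notif|pagerduty'
```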

When I open the alerts that did not send a notification, they show PagerDuty under Send to.

I think the alerts depend on the evaluation interval you set, and also on how the data behaves.
If the data only occurs for 1s, you need to change Evaluate every to 1s and set For to, let's say, 2m, and adjust the conditions as well. For data that only appears in very small time windows, be careful with the avg() or sum() reducers.
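For example, you can check what Evaluate every (frequency), For, and the reducers are on each panel alert by pulling the dashboard JSON; a rough sketch, where $DASHBOARD_UID is a placeholder for your dashboard UID:

```bash
# Sketch: show "Evaluate every" (frequency), "For", and the reducers for every panel alert.
# $DASHBOARD_UID is a placeholder; panels nested inside rows are not handled here.
curl -s -H "Authorization: Bearer $GRAFANA_API_KEY" \
  "http://localhost:3000/api/dashboards/uid/$DASHBOARD_UID" \
  | jq '.dashboard.panels[]? | select(.alert != null)
        | {title, frequency: .alert.frequency, "for": .alert["for"],
           reducers: [.alert.conditions[].reducer.type]}'
```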

Regards,
Fadjar Tandabawana