I’m encountering an issue with alert notifications in Grafana. I’ve set up a notification policy to group alerts by alertname so that multiple clusters under the same alert rule should trigger only one notification when the alert state changes (from firing to resolved or vice versa).
However, the issue arises when:
One cluster (e.g., Cluster A) fires an alert. I receive a single notification, as expected.
Another cluster (e.g., Cluster B) fires under the same alert rule while Cluster A is still firing. At this point, I receive a second notification stating that there are now two firing alerts.
This behavior continues, and I end up receiving additional notifications for each new cluster that triggers the alert, even though all these clusters are part of the same alert rule.
What I want is to receive only a single notification per state change:
When the first cluster fires, I get notified.
If additional clusters fire while the alert is already active, I don’t want a new notification.
When all clusters resolve, I get a single resolved notification.
Each alert instance is defined by its complete label set.
For instance, suppose your alert query returns:
{cluster="a"} value1
{cluster="b"} value2
This query generates two separate alert instances, one for cluster="a" and another for cluster="b". Even though these instances are part of the same alert rule, they are treated as distinct alerts because of their differing labels.
To avoid receiving separate notifications for each cluster, modify the alert query so it doesn't include the cluster label in its results. For instance, instead of returning a value per cluster, return the max value across all clusters.
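As a sketch, assuming the underlying metric is called your_metric (a placeholder name) and the rule fires when the value exceeds some threshold, the per-cluster query and its aggregated replacement might look like this in PromQL:

```promql
# Per-cluster query: produces one series (and one alert instance) per cluster
your_metric > 100

# Aggregated query: max() drops the cluster label, so only a single
# series remains and the rule creates a single alert instance that
# fires as long as ANY cluster exceeds the threshold
max(your_metric) > 100
```

With the aggregated form, the alert fires once when the first cluster crosses the threshold and resolves only when all clusters are back below it, which matches the desired single-notification-per-state-change behavior. The trade-off is that the notification no longer tells you which cluster triggered it.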