Prometheus labels are missing from notification emails sent by alert rules with Classic condition

I noticed that when I use “Alert rules” with “Classic condition”, only a few labels are displayed in the notification email. Notably, none of the labels that exist in the Prometheus query appear in the email.

When I use “Classic condition” in an “alert rule”, the labels I see in the email are just:

“alertname”
“grafana_folder”
“rule_uid”

If I modify the “Alert rule” to use “Reduce” and “Threshold” instead of “Classic condition”, then in addition to the above labels I see all the labels that are defined in my Prometheus query. In my case I see labels like:

“host”
“port”
“instance”
“job”
“city”
“vm_name”
etc

I tried to troubleshoot this and had my Go template print everything that’s in the “Alert” structure, and the Prometheus labels were absent from the notification emails that came from “alert rules” with “Classic condition”.

(I use Grafana 10.0.0, but the same thing was happening in 9.5.1.)
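
For reference, a minimal sketch of such a debug notification template (the “debug” template name is just a placeholder; the fields are the ones exposed on each entry of .Alerts):

```
{{ define "debug" }}
{{ range .Alerts }}
Status:      {{ .Status }}
Labels:      {{ range .Labels.SortedPairs }}{{ .Name }}={{ .Value }} {{ end }}
Annotations: {{ range .Annotations.SortedPairs }}{{ .Name }}={{ .Value }} {{ end }}
ValueString: {{ .ValueString }}
{{ end }}
{{ end }}
```

With a “Classic condition”, the Labels line in my case only showed alertname, grafana_folder and rule_uid, while the query labels only showed up inside ValueString.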

Hi! :wave: The use of Classic Conditions disables multi-dimensional alerts. I’ve created a pull request here to update the in-app documentation.

I guess all my alerts are multi-dimensional, as I lost Prometheus labels everywhere when I upgraded to 9.5.1 :slight_smile:

I was hoping that I could get the labels back with the “Classic Condition”, as I can’t figure out how to easily replicate functions like “percent_diff_abs()” (which I use a lot) using “Reduce”, “Math” and “Threshold”.

Did you upgrade from the old dashboard alerts? If so, those wouldn’t have been multi-dimensional, but the labels would have been concatenated together, comma separated.

I see them now. They are all inside the “ValueString”. Hopefully I will figure out how to get them out of there using a template…

Hello, I have the same problem. Whenever I add any expression to my alert rules, I lose the labels from my actual query. And yes, the labels can be found as strings in a kind of CSV format under the key “valueString,” but having to parse them out from there again is cumbersome and unnecessary. Does anyone know why this is happening and how to change it so that the labels remain where they are?

Do you use “Classic Condition” in your alert rules?

I see all labels from the query in the “Labels” map unless I use “Classic Condition”; then they are available only in the “ValueString”.

I was able to extract what I need from the “ValueString” using the reReplaceAll function.
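
In case it helps others, here is a rough sketch of that kind of extraction. The template name and the regular expression are mine, and the assumed shape of “ValueString” is a guess, so the pattern will need adjusting to whatever your ValueString actually looks like:

```
{{ define "labels.from.valuestring" }}
{{ range .Alerts }}
{{/* Assumed shape: [ var='B0' metric='up{host="web1", port="443"}' labels={...} value=1 ] */}}
{{/* Keep only the metric='...' part, which carries the series name and labels. */}}
{{ reReplaceAll `.*metric='([^']*)'.*` "${1}" .ValueString }}
{{ end }}
{{ end }}
```

reReplaceAll takes a regular expression, a replacement and the input string, so the same approach works for pulling out individual labels as well.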

Hi! :wave:

In Grafana there are two kinds of alert rule. These are known as multi-dimensional and uni-dimensional (also sometimes referred to as single-dimensional).

A multi-dimensional alert rule can have more than one firing alert at the same time. A uni-dimensional alert rule, on the other hand, can have just one firing alert at a time (hence “uni”).

In Grafana, all alert rules are multi-dimensional unless the condition of the rule is a Classic Condition. Classic Conditions make alert rules uni-dimensional by removing labels from queries such that the only labels remaining are the inherited labels (such as alertname) and any custom labels that are in the rule definition.

Classic Conditions do this to ensure the rule cannot have more than one firing alert at a time. You can still see which labels exceeded the conditions in the Classic Condition using $values or the value string, but the labels are not added to the alert to enforce uni-dimensional behavior.
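
To make that concrete, here is a small sketch of an annotation (for example the summary) that surfaces the triggering series when the Classic Condition lives on refID B. “B0” refers to the first condition of that expression, and “host” is just an example label from the query:

```
{{ $v := $values.B0 }}
Condition B0 fired for {{ $v.Labels }} (host: {{ index $v.Labels "host" }}) with value {{ $v.Value }}
```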

Okay, I can confirm this behavior. Thanks for clarifying the feature!

volker.