Grafana Unified Alerting - frame cannot uniquely be identified by its labels: has duplicate results with labels {}

TL;DR: I can’t access field keys or aliases when working with multidimensional rules in the new unified Grafana alerting system (with InfluxDB).

I ran into a problem with the new unified Grafana alerting system. I created a new alert rule and entered a basic query against my InfluxDB database (refID=A). Because I want to send multiple alerts based on that query, I replaced the standard classic condition expression with a reduce max(A) expression (B) and a math $B > 0 expression (C).
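For reference, the setup looks roughly like this (the measurement and field names are made up, only the structure matters):

```sql
-- Query A (InfluxQL): returns several field keys per row
SELECT "sensor_1", "sensor_2", "sensor_3"
FROM "measurements"
WHERE time > now() - 5m

-- Expression B: Reduce, function = Max, input = A
-- Expression C: Math, $B > 0
```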

After trying to execute this alert rule, I got the error: invalid format of evaluation results for the alert definition: frame cannot uniquely be identified by its labels: has duplicate results with labels {}

I figured out that the error occurs because the alerting system is not able to differentiate between the individual signals. On further inspection I also noticed that the query results are displayed with the normal InfluxDB field keys, but the results of the reduce and math expressions are all just named B (reduce) and C (math). The original field keys (or any alias) simply get removed (see example image).

When using the classic condition expression, the field keys were at least visible in the generated ValueString (metric='fieldKey', ...). Even that is no longer the case with reduce and math: the ValueString no longer has a metric field, only var='B' and var='C' fields that hold the results of the reduce and math expressions.

I don’t really understand why the field keys get removed by the reduce and math expressions, or why the unified alerting system cannot uniquely identify the signals by their names (and instead needs labels to do so). My only solution so far is to add a GROUP BY tagkey clause to my InfluxQL query, which creates labels from the existing tags in the database.
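As a rough sketch of that workaround (the tag key host is just an example here, any existing tag would do):

```sql
-- GROUP BY produces one series per tag value, so each resulting frame
-- carries a distinct label set and the "duplicate results" error goes away
SELECT "sensor_1"
FROM "measurements"
WHERE time > now() - 5m
GROUP BY "host"
```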

Even though this kind of solves the problem, in the sense that I no longer get an error message when executing the alert rule, I still have the issue that the original names of the queried signals are gone. Since I’m trying to create a multidimensional alert rule (one that can produce multiple alerts from a single query), I want to include the field keys in the alert messages so that I can see directly which signal caused the alert.

I find it strange that it seems to be such a hard (or even impossible) task to get the field keys (the names of the queried signals) into the labels / alert message (message templating) when dealing with multidimensional alert rules. In my opinion this should be a basic feature.

Am I overlooking something, or does anybody have a hint on how to overcome this problem? The only solution I see is to store the field keys as an extra tag (fieldKeyNames), since then I could use GROUP BY fieldKeyNames to create a label that holds the field keys. That way I could uniquely identify the signals (no more error message) and also access this label (fieldKeyNames) directly in the message templating system. If possible I would like to avoid this, though, since it would mean rewriting multiple (large) databases and adding an otherwise useless tag.
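For completeness, that workaround would look roughly like the following. It assumes the data is reshaped into a single generic field plus the made-up fieldKeyNames tag:

```sql
-- Hypothetical reshaped schema: one generic "value" field,
-- tagged with the original field key name
SELECT "value"
FROM "measurements"
WHERE time > now() - 5m
GROUP BY "fieldKeyNames"
```

The resulting label could then be referenced in the alert message / annotation template with Grafana’s standard label syntax, i.e. {{ $labels.fieldKeyNames }}.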

Welcome to the :grafana: forum, @tben!

Grafana’s transition from legacy alerting to the Unified Alerting platform represents a big step forward, but there are many factors that can influence behavior, and it is often hard for the community to troubleshoot issues without a thorough understanding of your unique setup. Please try to include the following info:

  • What is your Grafana version?
  • Are you using Grafana Cloud or self-hosted Grafana?
  • Are you using legacy alerting or Unified Alerting?
  • Was the alert in question migrated from the legacy platform into Unified Alerting, or did you first create it inside the new platform?
  • Please list ALL configuration options related to alerting. You can find these in the Alerting and Unified Alerting sections of Grafana’s config file. If you are now using or have previously used the beta version of ngalert (released with Grafana 8), please note that too.
    • You can use this table to better understand how configuration options can interact with each other
  • If this is a templating issue on Unified Alerting, check if your alert is using a multi-dimensional rule or not.
  • List the datasource associated with the alert
  • Increase the verbosity of the Grafana server logs to debug and note any errors. For printing to console, set the console logs to debug as well (a minimal config sketch follows after this list).
  • Search for open issues on GitHub that sound similar to your problem
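For the logging point above, a minimal sketch of the relevant grafana.ini settings for a self-hosted install (adjust to your own setup):

```ini
# grafana.ini: raise log verbosity while troubleshooting
[log]
mode = console file
level = debug

[log.console]
level = debug
```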