Hello everyone,
I have a problem with the Grafana Agent for Logs. Every time I make a single request to our internal Grafana dashboard, I get two log entries from our ingress-nginx. They contain the same data, but the "job" label is set differently: one is "scraper/grafana-agent" and the other is "scraper/grafana-agent-logs".
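For reference, this is roughly the Explore query I use to see the two duplicate streams side by side (the namespace value is an assumption based on our ingress-nginx deployment; it may differ in other setups):

```logql
{namespace="ingress-nginx", job=~"scraper/grafana-agent(-logs)?"}
```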
It worked properly before I integrated Grafana Tempo with a second Grafana Agent running in the Kubernetes cluster. The Grafana Agent for Logs is managed by the Grafana Operator.
To try to fix it, I upgraded the Grafana Agent from version 0.19.0 to 0.20.0. I see that the agent's generated config file has duplicated entries and even entirely new scrape_configs entries:
/var/lib/grafana-agent/config-in/agent.yml:
logs:
  configs:
  - clients:
    - external_labels:
        cluster: scraper/grafana-agent-logs
      url: http://loki-distributed-gateway.loki.svc.cluster.local/loki/api/v1/push
    name: scraper/grafana-agent
    scrape_configs:
    - job_name: podLogs/scraper/grafana-agent-logs
      kubernetes_sd_configs:
      - role: pod
      pipeline_stages:
      - docker: {}
      relabel_configs:
      - source_labels:
        - job
        target_label: __tmp_prometheus_job_name
      - source_labels:
        - __meta_kubernetes_namespace
        target_label: namespace
      - source_labels:
        - __meta_kubernetes_service_name
        target_label: service
      - source_labels:
        - __meta_kubernetes_pod_name
        target_label: pod
      - source_labels:
        - __meta_kubernetes_pod_container_name
        target_label: container
      - replacement: scraper/grafana-agent-logs
        target_label: job
      - replacement: /var/log/pods/*$1/*.log
        separator: /
        source_labels:
        - __meta_kubernetes_pod_uid
        - __meta_kubernetes_pod_container_name
        target_label: __path__
    - job_name: podLogs/scraper/grafana-agent
      kubernetes_sd_configs:
      - role: pod
      pipeline_stages:
      - docker: {}
      relabel_configs:
      - source_labels:
        - job
        target_label: __tmp_prometheus_job_name
      - source_labels:
        - __meta_kubernetes_namespace
        target_label: namespace
      - source_labels:
        - __meta_kubernetes_service_name
        target_label: service
      - source_labels:
        - __meta_kubernetes_pod_name
        target_label: pod
      - source_labels:
        - __meta_kubernetes_pod_container_name
        target_label: container
      - replacement: scraper/grafana-agent
        target_label: job
      - replacement: /var/log/pods/*$1/*.log
        separator: /
        source_labels:
        - __meta_kubernetes_pod_uid
        - __meta_kubernetes_pod_container_name
        target_label: __path__
  - clients:
    - external_labels:
        cluster: scraper/grafana-agent-logs
      url: http://loki-distributed-gateway.loki.svc.cluster.local/loki/api/v1/push
    name: scraper/grafana-agent-logs
    scrape_configs:
    - job_name: podLogs/scraper/grafana-agent-logs
      kubernetes_sd_configs:
      - role: pod
      pipeline_stages:
      - docker: {}
      relabel_configs:
      - source_labels:
        - job
        target_label: __tmp_prometheus_job_name
      - source_labels:
        - __meta_kubernetes_namespace
        target_label: namespace
      - source_labels:
        - __meta_kubernetes_service_name
        target_label: service
      - source_labels:
        - __meta_kubernetes_pod_name
        target_label: pod
      - source_labels:
        - __meta_kubernetes_pod_container_name
        target_label: container
      - replacement: scraper/grafana-agent-logs
        target_label: job
      - replacement: /var/log/pods/*$1/*.log
        separator: /
        source_labels:
        - __meta_kubernetes_pod_uid
        - __meta_kubernetes_pod_container_name
        target_label: __path__
    - job_name: podLogs/scraper/grafana-agent
      kubernetes_sd_configs:
      - role: pod
      pipeline_stages:
      - docker: {}
      relabel_configs:
      - source_labels:
        - job
        target_label: __tmp_prometheus_job_name
      - source_labels:
        - __meta_kubernetes_namespace
        target_label: namespace
      - source_labels:
        - __meta_kubernetes_service_name
        target_label: service
      - source_labels:
        - __meta_kubernetes_pod_name
        target_label: pod
      - source_labels:
        - __meta_kubernetes_pod_container_name
        target_label: container
      - replacement: scraper/grafana-agent
        target_label: job
      - replacement: /var/log/pods/*$1/*.log
        separator: /
        source_labels:
        - __meta_kubernetes_pod_uid
        - __meta_kubernetes_pod_container_name
        target_label: __path__
  positions_directory: /var/lib/grafana-agent/data
server:
  http_listen_port: 8080
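To double-check that I wasn't misreading the file, I counted the job_name entries with something like this (CFG defaults to the in-pod path above; adjust it if you copy the file out of the pod first):

```shell
# Count how often each scrape job appears in the generated agent config;
# with the duplicated instances above, every job_name shows up twice.
CFG="${CFG:-/var/lib/grafana-agent/config-in/agent.yml}"
grep -o 'job_name: .*' "$CFG" | sort | uniq -c | sort -rn
```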
Before the upgrade, it looked like this:
logs:
  configs:
  - clients:
    - external_labels:
        cluster: scraper/grafana-agent-logs
      url: http://loki-distributed-gateway.loki.svc.cluster.local/loki/api/v1/push
    name: scraper/grafana-agent
    scrape_configs:
    - job_name: podLogs/scraper/grafana-agent
      kubernetes_sd_configs:
      - role: pod
      pipeline_stages:
      - docker: {}
      relabel_configs:
      - source_labels:
        - job
        target_label: __tmp_prometheus_job_name
      - source_labels:
        - __meta_kubernetes_namespace
        target_label: namespace
      - source_labels:
        - __meta_kubernetes_service_name
        target_label: service
      - source_labels:
        - __meta_kubernetes_pod_name
        target_label: pod
      - source_labels:
        - __meta_kubernetes_pod_container_name
        target_label: container
      - replacement: scraper/grafana-agent
        target_label: job
      - replacement: /var/log/pods/*$1/*.log
        separator: /
        source_labels:
        - __meta_kubernetes_pod_uid
        - __meta_kubernetes_pod_container_name
        target_label: __path__
  positions_directory: /var/lib/grafana-agent/data
server:
  http_listen_port: 8080
I'm not sure how to fix this. I rolled back to the old versions, but that didn't resolve the problem.
Any advice in this matter would be very much appreciated.
Thanks
Tom