I have taken the default Loki configuration from here.
I simply wanted to add a job that will extract the log level from my log and add it to a new label. I never see the new label in Loki. What am I missing?
Here is an example log line:
[00:02:50 DBG] Something happened in your application.
After the last job that came from the default config I added:
- job_name: detect-level
  kubernetes_sd_configs:
    - role: pod
  pipeline_stages:
    - match:
        selector: '{namespace=~"my-namespace.+"}'
        stages:
          - regex:
              expression: '\[\d{2}:\d{2}:\d{2} (?P<level>[A-Z]{3})'
          - labels:
              level:
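One thing worth checking (an assumption on my part, since it depends on your container runtime): the regex stage runs against the log line as Promtail reads it from disk, and under Docker or CRI that line is still wrapped in the runtime's envelope (JSON or the CRI prefix), so `\[\d{2}:\d{2}:\d{2} ...` never matches the raw line. A sketch with an unwrapping stage added first:

```yaml
pipeline_stages:
  # Unwrap the container runtime envelope first so the regex
  # sees the application's raw line. Use `cri: {}` instead if
  # your cluster runs containerd/CRI-O rather than Docker.
  - docker: {}
  - match:
      selector: '{namespace=~"my-namespace.+"}'
      stages:
        - regex:
            expression: '\[\d{2}:\d{2}:\d{2} (?P<level>[A-Z]{3})'
        - labels:
            # Promote the extracted `level` capture group to a label.
            level:
```

If the regex never matches, the `labels` stage has nothing to promote, which would explain the label never appearing in Loki.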
b0b (February 4, 2022, 2:22pm) #2
Hello @gabecalabro,
Have a look at my config. I can't remember where I found it, but it works for me:
- name: kubernetes_pods
  positions:
    filename: /tmp/positions_pods.yaml
  scrape_configs:
    - job_name: kubernetes_pods
      kubernetes_sd_configs:
        - role: pod
      pipeline_stages:
        - docker: {}
      relabel_configs:
        - source_labels:
            - __meta_kubernetes_pod_controller_name
          target_label: __service__
        - source_labels:
            - __meta_kubernetes_pod_node_name
          target_label: __host__
        - action: labelmap
          regex: __meta_kubernetes_pod_label_(app|project|service)
        - action: replace
          replacement: $1
          source_labels:
            - name
          target_label: job
        - action: replace
          source_labels:
            - __meta_kubernetes_namespace
          target_label: namespace
        - action: replace
          source_labels:
            - __meta_kubernetes_pod_name
          target_label: pod
        - action: replace
          source_labels:
            - __meta_kubernetes_pod_container_name
          target_label: container
        - replacement: /var/log/pods/*$1/*.log
          separator: /
          source_labels:
            - __meta_kubernetes_pod_uid
            - __meta_kubernetes_pod_container_name
          target_label: __path__
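To get the level label on top of this config, one approach (a sketch, assuming your application lines look like the `[00:02:50 DBG] ...` example above; the `docker` stage here has already unwrapped the runtime JSON, so the regex sees the raw line) is to extend `pipeline_stages`:

```yaml
pipeline_stages:
  - docker: {}
  - match:
      selector: '{namespace=~"my-namespace.+"}'
      stages:
        - regex:
            # Capture the three-letter level (DBG, INF, ...) after the timestamp.
            expression: '\[\d{2}:\d{2}:\d{2} (?P<level>[A-Z]{3})'
        - labels:
            # Promote the extracted value to a `level` label.
            level:
```

If that works, a query like `{namespace=~"my-namespace.+", level="DBG"}` should return results, and `level` should show up in the label browser.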
system (Closed, February 4, 2023, 2:22pm) #3
This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.