Hi all,
I have a file configuration like the one below…
Is there any way to make this property (path) dynamic?
Because if I recreate my container or rebuild from my docker-compose I get another ID…
I would like something tied to the service name, which is constant.
Thanks, Alen
....
- job_name: exporter
  pipeline_stages:
    - docker: {}
  static_configs:
    - targets:
        - localhost
      labels:
        job: exporter
        __path__: /var/lib/docker/containers/c22c0933c266275467ecf28a7ed259d8b9e9572131b55*/*-json.log
....
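For example, the folder name is just the full container ID, so it changes on every recreate; you can check it with something like this (assuming a container named exporter, which is just a placeholder here):

docker inspect --format '{{.Id}}' exporter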
Yeah, you can of course make it dynamic; you are already doing it with a wildcard in the path. Just add another one:
- job_name: exporter
  pipeline_stages:
    - docker: {}
  static_configs:
    - targets:
        - localhost
      labels:
        job: exporter
        __path__: /var/lib/docker/containers/*/*.log
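If you want to sanity-check what that glob picks up on your host, something like this works (needs root, since /var/lib/docker is root-owned; by default the json-file driver writes each container's log to /var/lib/docker/containers/&lt;id&gt;/&lt;id&gt;-json.log):

sudo ls /var/lib/docker/containers/*/*-json.log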
Thanks Tony… but this way I catch ALL the folders, all my Docker apps!
As you can see, I use just a pattern…
containers/c22c0933c266275467ecf28a7ed259d8b9e9572131b55*/*-json.log
I need this to be dynamic… but tied to MY app; maybe it’s not possible…
c22c0933c266275467ecf28a7ed259d8b9e9572131b55
It’s not a solution…
Thanks anyway.
ALEN
: (
It is a solution; you just need to filter it. When you configure the json-file driver, you can provide a label like so:
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "1g",
    "max-file": "10",
    "labels": "logs_i_care_about"
  }
}
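Note this daemon.json change needs a Docker daemon restart and only applies to containers created afterwards. Also, the labels option names the container label key(s) to record into each log line's attrs, so the container itself still has to carry the label. Since the pipeline below extracts a tag key from attrs, one way to line the two up is to record a label key named tag, i.e. "labels": "tag" in daemon.json, plus something like this in your compose file (untested sketch; service name and image are made up):

services:
  my-app:
    image: your/app-image
    labels:
      tag: "logs_i_care_about"   # recorded into attrs.tag by the json-file driver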
And then you filter it in your promtail configuration:
scrape_configs:
  - job_name: docker-logs
    static_configs:
      - targets:
          - localhost
        labels:
          job: docker-logs
          __path__: /var/lib/docker/containers/*/*log
    pipeline_stages:
      - json:
          expressions:
            attrs:
      - json:
          expressions:
            tag:
          source: attrs
      - labels:
          tag:
      - match:
          selector: '{tag="logs_i_care_about"}'
          stages:
            ### Do something ###
      - match: # Drop logs that we don't care about.
          selector: '{tag!="logs_i_care_about"}'
          action: drop
          drop_counter_reason: log_i_do_not_care_about
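For reference, once a tag label is recorded, a line in the *-json.log file looks roughly like this (a sketch; timestamp and message are made up):

{"log":"some app output\n","stream":"stdout","time":"2021-01-01T00:00:00.000000000Z","attrs":{"tag":"logs_i_care_about"}}

The first json stage extracts attrs, the second parses tag out of it, the labels stage promotes tag to a label, and the two match stages then keep or drop lines based on it.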
You’ll want to verify the promtail configuration yourself; I just typed it out and it’s not tested.
Also, just to be clear, your original configuration will break as soon as you recreate your container, because the ID will have changed. So it’s already not a working configuration.