Hello, I’m trying to configure Promtail and Loki with Caddyserver. Caddy writes JSON log files with nested objects, something like:
```json
{
  "level": "info",
  "ts": 1613986597.8528846,
  "logger": "http.log.access.log1",
  "msg": "handled request",
  "request": {
    "remote_addr": "xxx.xxx.xxx.xxx:54754",
    "proto": "HTTP/2.0",
    "method": "GET",
    "host": "dash.xxxxxxx.com",
    "uri": "/grafana/api/datasources/pro...",
    "headers": {
    },
    "tls": {
    }
  },
  "common_log": "xxx.xxx.xxx.xxx- - [22/Feb/2021:09:36:37 +0000] \"GET /grafana/api/datasou...",
  "duration": 0.24274698,
  "size": 625,
  "status": 200,
  "resp_headers": {
  }
}
```
My job configuration is:
```yaml
- job_name: caddy
  pipeline_stages:
  - json:
      expressions:
        level: level
        ts: ts
        logger: logger
        request:
  - json:
      expressions:
        remote_addr:
      source: request
  static_configs:
  - targets:
    - localhost
    labels:
      job: caddy
      host: dash.tinygs.com
      __path__: /var/log/caddy/*log
```
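In case it helps, this is the shape I thought the chained parse should take, based on my reading of Promtail's `json` stage with `source` plus a `labels` stage to promote the extracted values (the field names are just the ones I'd like to end up with, and I may be misreading the docs):

```yaml
pipeline_stages:
  # First pass: parse the top-level JSON. "request" is a nested object,
  # so it should be extracted as a JSON string.
  - json:
      expressions:
        level: level
        request: request
  # Second pass: parse the extracted "request" string instead of the
  # whole log line.
  - json:
      expressions:
        remote_addr: remote_addr
        method: method
      source: request
  # Promote extracted values to Loki labels; without this stage they
  # stay in the extracted map and never appear as labels.
  - labels:
      level:
      method:
```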
Grafana shows the parsed JSON, but only the top-level fields, even when I leave out the pipeline_stages section entirely.
My intention is to parse the request fields and use them as labels. Do you see any mistake in the job definition?
Thanks