Keep journal fields and stay compatible with file logs

I’m a bit lost on how best to use journald and log files with Loki in practice.

  • I started, as shown in the documentation, by putting the systemd unit and the message level into labels. But then I read the “best practices”, which strongly warn against doing this in order to keep label cardinality low.
  • So I removed them – but now this essential information is missing entirely.
  • So I enabled json: true for the journal scraper, so that the information is embedded in the log message instead.
  • But now I need to use the json parser in LogQL to extract the relevant information.
  • But since my regular log files are not JSON, the json parser will not work on lines from those, meaning I cannot use journald lines and file-based lines in the same query.
  • To fix this, I tried to use the pack stage on the log files, so that those entries also become JSON.
  • But while the json feature of the journal scraper puts the log message into a field named MESSAGE, the pack stage uses _entry instead. So I still cannot properly use both in the same query (see the query sketch after this list).
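To make the mismatch concrete, this is roughly what the two queries end up looking like for me (the job values systemd-journal and varlogs are just placeholders for my scrape jobs):

```logql
# journald entries ingested with json: true; the message field is MESSAGE
{job="systemd-journal"} | json | MESSAGE =~ "(?i)error"

# file-based entries wrapped by the pack stage; the message field is _entry
{job="varlogs"} | json | _entry =~ "(?i)error"
```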

This all feels like a long chain of dirty hacks for a pretty basic task: searching logs from the journal and from log files. Am I missing something? How are you handling these issues?

I found a solution. The trick is not to use json: true for the journal scraper, but to create labels for everything up front (via relabel_configs) and then, as the last step of the pipeline, use the pack stage to pack them into the log line. You can then also use pack on the file-based logs and consistently use unpack on the query side. A config sketch follows below.
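Roughly, a minimal sketch of what this looks like in my Promtail config. The job names, paths, journal meta-label names and the level regex for the file logs are assumptions from my setup and from the Promtail journal docs as I remember them; adapt them to yours:

```yaml
scrape_configs:
  # journald: turn the interesting journal fields into labels first,
  # then pack them back into the log line as the last pipeline stage,
  # so they do not end up as stream labels
  - job_name: journal
    journal:
      max_age: 12h
      labels:
        job: systemd-journal
    relabel_configs:
      - source_labels: ['__journal__systemd_unit']
        target_label: 'unit'
      - source_labels: ['__journal_priority_keyword']
        target_label: 'level'
    pipeline_stages:
      - pack:
          labels:
            - unit
            - level

  # plain log files: produce the same packed JSON shape, with the
  # original line in _entry (the level regex is just an example)
  - job_name: varlogs
    static_configs:
      - targets: [localhost]
        labels:
          job: varlogs
          __path__: /var/log/*.log
    pipeline_stages:
      - regex:
          expression: '(?P<level>DEBUG|INFO|WARN|ERROR)'
      - pack:
          labels:
            - level
```

On the query side, both sources then behave the same:

```logql
# unpack restores the original line from _entry and turns the packed
# fields (e.g. unit, level) back into labels for filtering
{job=~"systemd-journal|varlogs"} | unpack | level=~"(?i)err.*"
```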
