Parsing logs with Promtail and Loki

Hi, I am completely new to Grafana. I am trying to ship logs to Grafana using Promtail. Here is my sample log line:

[13/Sep/2023:10:29:20 +0530] | 404 | 1 ms | 780 B | 127.0.0.1 |          - | - | - | "GET /server-status?auto HTTP/1.1"

I would like to parse this as follows:

short_date: 2023-09-13T04:59:20.000Z
response: 404
duration: 1ms
bytes: 780B
client ip: 127.0.0.1
remoteip: -
token: -
method: GET
url: /server-status?auto
http version: 1.1

I had achieved this using grok patterns in Logstash, but I've no idea how this can be done with Promtail or Loki.
I tried the following Promtail config (the label names are slightly different), but with this config the Loki data source does not generate the labels from the regex.

server:
  http_listen_port: 9080
  grpc_listen_port: 0

positions:
  filename: /tmp/positions.yaml

clients:
  - url: 'http://lokiip:3100/loki/api/v1/push'
scrape_configs:
  - job_name: apache-logs
    pipeline_stages:
      - regex:
          expression: "\\[(.*?)\\] \\| (\\d+) \\| (\\d+) ms \\| (\\d+) B \\| (.*?) \\| - \\| - \\| - \\| \"(.*?)\""
          labels:
            timestamp: "${1}"
            status_code: "${2}"
            duration_ms: "${3}"
            bytes: "${4}"
            ip: "${5}"
            method: "${6}"
            url: "${7}"
    static_configs:
      - targets:
          - localhost
        labels:
          job: apache-logs
          __path__: /opt/httpd-2.4.57/logs/3dx_access.log
          host: 3dx


How can I parse it to get fields like in the image below, which was parsed by Logstash and sent to Grafana from the ES data source?

Hi @neelam,

I would not try to treat Loki like a direct replacement for ES. Loki is not a fully indexed log store. You should try to keep the number of labels and label values as low as possible for best performance.

You parse the logs at query time. You might be able to use a pattern parser. I have not tried that. I try to get everyone to output JSON or logfmt logs for Promtail, so the parsing is easier in Grafana.
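
For your format, a query-time parse might look roughly like this (untested, and assuming your streams carry the job="apache-logs" label from your config and that the separators really are " | "):

{job="apache-logs"} | pattern `[<timestamp>] | <status> | <duration> ms | <bytes> B | <client_ip> | <_> | <_> | <_> | "<method> <url> <protocol>"`

The <_> placeholders just skip the three "-" fields.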


Well, I would not say a replacement for ES. I want to replicate a dashboard I made with Kibana in Grafana. For example, I have a log file with log lines in this format:

[15/Sep/2023:11:26:45 +0530] | 200 | 0 ms | 636 B | 3.108.45.125 | - | - | - | "GET /3dcomment/test HTTP/1.1"

I want to parse the logs to store:
timestamp: [15/Sep/2023:11:26:45 +0530]
status: 200
duration: 0 ms
bytes: 636 B
remoteip: 3.108.45.125
(ignore the three "-" fields)
method: GET
url: /3dcomment/test
service name: 3dcomment

I used grok patterns to parse it in ELK. I basically want to build a dashboard with, for example, a stacked bar graph
with timestamp on the x-axis, bytes on the y-axis, broken down by service name. I am okay with parsing it in any way; I just want to build such graphs. How is this possible?
This is the graph I made in Kibana:

b0b is correct in that you don’t want to use Loki like ES. What you want to do is:

  1. First, get your logs into Loki. To start, I would recommend parsing only the timestamp so your logs are written with the correct time. Ignore everything else.

  2. Look up LogQL and how to use the pattern filter to parse your logs. There are various metric functions as well, such as count_over_time and sum by. This is how you parse and create dashboards (see the rough example after this list).

  3. After you are more familiar with the way of things, you can start to consider what other labels you want to parse for in Promtail. But as b0b suggested, you want to keep labels to a minimum, both in terms of how many there are and how many values they can have.
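
As a rough example for point 2 (untested, and assuming the job="apache-logs" label from your config), a panel query counting requests per status over time could look like this:

sum by (response) (
  count_over_time(
    {job="apache-logs"}
      | pattern `<date> | <response> | <duration> ms | <bytes> B | <remoteIP> | <_> | <_> | <_> | "<method> <url> <protocol>"`
    [5m]
  )
)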


Can I get a video tutorial for it? How can I parse only the timestamp in the Loki config?
I used the LogQL pattern filter, which is generating fields.

Now how can I build a dashboard using these fields? For example, I want to extract only the minutes from the date and show them on the x-axis of a bar chart, with the bytes on the y-axis, broken down by service name.

As mentioned above, I'd recommend parsing the timestamp when sending logs to Loki, not afterwards.

What agent are you using to send logs to Loki? If you are using Promtail, it's pretty easy to parse the timestamp.
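
For your log format it would be something roughly like this in the pipeline (a sketch from memory, so double-check the stage options against the Promtail docs; the capture name ts is just an example):

pipeline_stages:
  # pull the bracketed timestamp out of the line into a named capture
  - regex:
      expression: '^\[(?P<ts>[^\]]+)\]'
  # use that capture as the entry's timestamp (Go reference-time layout)
  - timestamp:
      source: ts
      format: '02/Jan/2006:15:04:05 -0700'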


Yes, I am using Promtail. I have the following in config-promtail.yml:

server:
  http_listen_port: 9080
  grpc_listen_port: 0

positions:
  filename: /tmp/positions.yaml

clients:
  - url: 'http://15.206.20.110:3100/loki/api/v1/push'
scrape_configs:
  - job_name: apache-logs
    static_configs:
      - targets:
          - localhost
        labels:
          job: apache-logs
          __path__: /opt/httpd-2.4.57/logs/3dx_access.log
          host: 3dx
      - targets:
          - localhost
        labels:
          job: test
          __path__: /opt/httpd-2.4.57/logs/sample.log
          host: test

    pipeline_stages:
      - regex:
          expression: '\\[(.*?)\\] \\| (\\d+) \\| (\\d+) ms \\| (\\d+) B \\| (.*?) \\| - \\| - \\| - \\| \"(.*?)\"'
          labels:
            timestamp: "${1}"
            status_code: "${2}"
            duration_ms: "${3}"
            bytes: "${4}"
            ip: "${5}"
            method: "${6}"
            url: "${7}"

But this is not parsing the timestamp or any other field. The only labels I am getting in Grafana Explore are job, host and filename. Am I going wrong with the query or the Promtail config?


Hi neelam,
normally the next step is to extract all the label fields into a table.
To do that, you can use transformations.
The first step is "Extract fields"; use Labels as the source.
Change the visualisation to Table and go on…
Jo

Is this what you are saying? I am sorry, but I am completely new to Grafana, so I'm finding it difficult to understand.

The entries in the table only appear after I use the pattern operation as shown below.

Hi, yes, that's the way to go…

Now you can do all the things you want with the metrics.
As an example, use "Convert field type" to convert the date to time with the format you want,
so you can create graphs, whatever…

Can I get any tutorial link for the same? I tried the field conversion thing, but I am not getting it.

Hey, yes, I somewhat got it working following the field conversion. I didn't get the entire graph yet, though. Thank you so much. Is it possible to just extract the first part from my log label into another label?
For example, this is the log sample:

[21/Sep/2023:11:39:35 +0530] | 200 | 3 ms | 4 B | 172.31.0.88 | - | - | - | "GET /3dswym/monitoring/healthcheck HTTP/1.1"

Using the pattern

pattern `<date> | <response> | <duration> ms | <bytes> B | <remoteIP> | <clientIP> | <token> | - | "<method> <url> <protocol>"` | duration = `0`

I am extracting a field called url,
in this case url: /3dswym/monitoring/healthcheck.
Is it possible for me to create another field which holds just the first value after the /? In this case I want
url: /3dswym/monitoring/healthcheck
service_name: 3dswym
Is it possible?

You can add a regexp parser. As an example, for the line

"GET /3dswym/monitoring/healthcheck HTTP/1.1"

| regexp `/(?P<service_name>\w+)/`

so you create the new label "service_name".


The above regexp is returning Sep instead of 3dnotification.
Here is the sample log line:
[22/Sep/2023:10:21:21 +0530] | 200 | 1 ms | 2 B | 172.31.0.88 | - | - | - | "GET /3dnotification/healthcheck HTTP/1.1"

I think the regexp parser matches spaces literally, so use ` /(?P<service_name>\w+)/` with a space at the front,
or use `T /(?P<service_name>\w+)/`, because the HTTP method can be GET or POST.

be a little creative :wink:
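
Putting it together, the whole query would be roughly this (untested, reusing your apache-logs selector and your pattern):

{job="apache-logs"}
  | pattern `<date> | <response> | <duration> ms | <bytes> B | <remoteIP> | <clientIP> | <token> | - | "<method> <url> <protocol>"`
  | regexp ` /(?P<service_name>\w+)/`

Keep in mind that lines whose url has no second "/" will simply not get a service_name.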


thank you :slight_smile:
