I am struggling to create a visualization on a Geomap panel in Grafana from container logs stored in Loki.
Here are the logs from the container (from Explore with the Loki data source), showing the data we want to use for the dashboard:
{“time_local”: “23/Sep/2024:14:08:48 +0000”, “remote_addr”: “4300:1200:2050:6050:7098:e236:5997:e734”, “request”: “GET /test HTTP/1.1”, “status”: “200”, “body_bytes_sent”: “840”, “http_referer”: “https://example.com ”, “http_user_agent”: “Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36”, “country_code”: “US”, “country”: “United States”, “city”: “Chicago”, “postal_code”: “60602”, “timezone”: “America/Chicago”, “lat”: “19.64200”, “lon”: “-57.34300”}
When I run the following query on the basic logs, I see the following:
query
{container="trading-app"} | json | line_format "{{.lat}},{{.lon}},{{.city}},{{.country}}"
result
19.64200, -57.34300, Chicago, United States
So now, when I try to run this query on a Geomap panel, nothing shows up on the map, and I also get this error in the panel and have no idea what to do.
query
{container="trading-app"} | json | line_format "{{.lat}},{{.lon}}"
So please help: what am I missing to graph Loki data properly on a Geomap panel?
Two problems I see:
The lon/lat fields need to be made into labels.
Your query is not a metrics query.
Try this (I am using a count because I assume you want to see the number of connections per coordinate pair):
sum by (lat,lon) (
count_over_time({container="trading-app"} | json [$__auto])
)
Edit: originally I did | json | label_format lat="{{ .lat }}",lon="{{ .lon }}", but now that I think about it, I don't think you need that. The json parser should take care of it for you.
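For reference, if you do end up needing label_format with a template, LogQL expects the template value to be quoted with backticks (or double quotes). A minimal sketch, not tested against your data:
{container="trading-app"} | json | label_format lat=`{{ .lat }}`,lon=`{{ .lon }}`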
Here is the error I got when I used your query:
sum by (lat,lon) (
  count_over_time({container="trading-app"} | json | label_format lat=`{{ .lat }}`,lon=`{{ .lon }}` [$__auto])
)
pipeline error: 'JSONParserErr' for series: '{__error__="JSONParserErr", __error_details__="Value looks like object, but can't find closing '}' symbol", container="trading-app", job="containerlogs", lat="", logstream="stderr", lon=""}'. Use a label filter to intentionally skip this error. (e.g | __error__!="JSONParserErr"). To skip all potential errors you can match empty errors.(e.g __error__="") The label filter can also be specified after unwrap. (e.g | unwrap latency | __error__="" )
sum by (lat,lon) (
count_over_time({container="trading-app"} | json [$__auto])
)
pipeline error: 'JSONParserErr' for series: '{__error__="JSONParserErr", __error_details__="Value looks like object, but can't find closing '}' symbol", container="trading-app", job="containerlogs", logstream="stderr"}'. Use a label filter to intentionally skip this error. (e.g | __error__!="JSONParserErr"). To skip all potential errors you can match empty errors.(e.g __error__="") The label filter can also be specified after unwrap. (e.g | unwrap latency | __error__="" )
The error is quite clear: your logs aren't in JSON format. Can you post an example line of your logs, please?
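As the error message itself suggests, one way to keep the query running despite bad lines is to skip anything that fails JSON parsing. A minimal sketch, assuming you only want lines that parsed cleanly:
sum by (lat,lon) (
  count_over_time({container="trading-app"} | json | __error__="" [$__auto])
)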
Logs from the Grafana container:
logger=tsdb.loki endpoint=queryData pluginId=loki dsName=Loki dsUID=c90f4191-21f6-4a3b-8357-313ee5c15ae3 uname=admin fromAlert=false t=2024-09-23T19:59:59.538163391Z level=error msg="Error received from Loki" duration=6.899217ms stage=databaseRequest statusCode=400 contentLength=494 start=2024-09-23T07:59:59.295Z end=2024-09-23T19:59:59.295Z step=1m0s query="sum by (lat,lon) (\n count_over_time({container=\"trading-app\"} | json | label_format lat=`{{ .lat }}`,lon=`{{ .lon }}` [1m])\n)" queryType=range direction=backward maxLines=1000 supportingQueryType=none lokiHost=loki:3100 lokiPath=/loki/api/v1/query_range status=error error="pipeline error: 'JSONParserErr' for series: '{__error__=\"JSONParserErr\", __error_details__=\"Value looks like object, but can't find closing '}' symbol\", container=\"trading-app\", job=\"containerlogs\", lat=\"\", logstream=\"stderr\", lon=\"\"}'.\nUse a label filter to intentionally skip this error. (e.g | __error__!=\"JSONParserErr\").\nTo skip all potential errors you can match empty errors.(e.g __error__=\"\")\nThe label filter can also be specified after unwrap. (e.g | unwrap latency | __error__=\"\" )\n\n"
logger=tsdb.loki endpoint=queryData pluginId=loki dsName=Loki dsUID=c90f4191-21f6-4a3b-8357-313ee5c15ae3 uname=admin fromAlert=false t=2024-09-23T19:59:59.538245383Z level=error msg="Error querying loki" error="pipeline error: 'JSONParserErr' for series: '{__error__=\"JSONParserErr\", __error_details__=\"Value looks like object, but can't find closing '}' symbol\", container=\"trading-app\", job=\"containerlogs\", lat=\"\", logstream=\"stderr\", lon=\"\"}'.\nUse a label filter to intentionally skip this error. (e.g | __error__!=\"JSONParserErr\").\nTo skip all potential errors you can match empty errors.(e.g __error__=\"\")\nThe label filter can also be specified after unwrap. (e.g | unwrap latency | __error__=\"\" )\n\n"
logger=tsdb.loki endpoint=queryData pluginId=loki dsName=Loki dsUID=c90f4191-21f6-4a3b-8357-313ee5c15ae3 uname=admin fromAlert=false t=2024-09-23T20:26:52.144643097Z level=error msg="Error received from Loki" duration=8.110675ms stage=databaseRequest statusCode=400 contentLength=494 start=2024-09-23T08:26:51.943Z end=2024-09-23T20:26:51.944Z step=1m0s query="sum by (lat,lon) (\n count_over_time({container=\"trading-app\"} | json | label_format lat=`{{ .lat }}`,lon=`{{ .lon }}` [1m])\n)" queryType=range direction=backward maxLines=1000 supportingQueryType=none lokiHost=loki:3100 lokiPath=/loki/api/v1/query_range status=error error="pipeline error: 'JSONParserErr' for series: '{__error__=\"JSONParserErr\", __error_details__=\"Value looks like object, but can't find closing '}' symbol\", container=\"trading-app\", job=\"containerlogs\", lat=\"\", logstream=\"stderr\", lon=\"\"}'.\nUse a label filter to intentionally skip this error. (e.g | __error__!=\"JSONParserErr\").\nTo skip all potential errors you can match empty errors.(e.g __error__=\"\")\nThe label filter can also be specified after unwrap. (e.g | unwrap latency | __error__=\"\" )\n\n"
logger=tsdb.loki endpoint=queryData pluginId=loki dsName=Loki dsUID=c90f4191-21f6-4a3b-8357-313ee5c15ae3 uname=admin fromAlert=false t=2024-09-23T20:26:52.144746455Z level=error msg="Error querying loki" error="pipeline error: 'JSONParserErr' for series: '{__error__=\"JSONParserErr\", __error_details__=\"Value looks like object, but can't find closing '}' symbol\", container=\"trading-app\", job=\"containerlogs\", lat=\"\", logstream=\"stderr\", lon=\"\"}'.\nUse a label filter to intentionally skip this error. (e.g | __error__!=\"JSONParserErr\").\nTo skip all potential errors you can match empty errors.(e.g __error__=\"\")\nThe label filter can also be specified after unwrap. (e.g | unwrap latency | __error__=\"\" )\n\n"
No, I meant a sample of your actual logs, so we can see if it’s JSON or not.
You mean the app logs we want to visualize on the Geomap? I have it in the very first post of the thread.
Here is the log format, and it is in JSON format:
{“time_local”: “23/Sep/2024:14:08:48 +0000”, “remote_addr”: “4300:1200:2050:6050:7098:e236:5997:e734”, “request”: “GET /test HTTP/1.1”, “status”: “200”, “body_bytes_sent”: “840”, “http_referer”: “https://example.com ”, “http_user_agent”: “Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36”, “country_code”: “US”, “country”: “United States”, “city”: “Chicago”, “postal_code”: “60602”, “timezone”: “America/Chicago”, “lat”: “19.64200”, “lon”: “-57.34300”}
Remember, these are logs from a container, which I am scraping using labels on the container apps.
The above log is pulled from the Loki Explore output for the container; I explained this in the very first post above.
Does your log actually pass the json filter? For example, if you just do {container="trading-app"} | json, does it work?
Your example appears to be proper JSON, except that the quotation marks aren't ASCII, which could happen from copy-pasting. But if they are actually that way in your logs, then you won't be able to parse them with the json filter, and you'd have to look at using a regex instead.
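If it did come to that, LogQL's regexp parser can extract the fields with named capture groups. A rough sketch, assuming the stored log lines use plain ASCII quotes and that lat appears immediately before lon, as in your example (if the quotes really are curly in the logs, the pattern would have to match those characters instead):
{container="trading-app"} | regexp `"lat": "(?P<lat>[^"]+)", "lon": "(?P<lon>[^"]+)"`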
Did you read my very first post? I have a query with that in it, and the result.
Actually, I think I know what the issue is: some of the log lines are not in JSON format, but it is still able to detect the lines that are in JSON format.
So it seems I need to hide or remove the log lines that should not be included in the query.
Below is a screenshot showing the lines that are not in the JSON format we want.
query
{container="trading-app"} | json | line_format "{{.lat}},{{.lon}},{{.city}},{{.country}}"
result
19.64200, -57.34300, Chicago, United States
19.64200, -57.34300, Chicago, United States
19.64200, -57.34300, Chicago, United States
I figured out the issue with the query errors: I removed the log lines that are not in the same format, and now I am getting closer.
Here is the updated query now:
sum by (lat, lon) (
  count_over_time({container="trading-app"}
    | json
    | lat != "" and lon != "" [$__auto])
)
Now I still don't see anything on the Geomap, but when I switch to a table I see the query output.
So what is still going on with the Geomap?
Here is the table view showing the query result:
What do you see when you click on the Table View?
I put the screenshot in my last reply above. Also posting it here again:
yosiasz (September 26, 2024):
So your data needs to be in this format:
time, lat,lon,value
2024-09-23,1,6.44,3.39030,1
Right now it is:
time,{lat="6.44",lon="3.39030"}
2024-09-23,1
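One way to get from the second shape to the first, without changing the query, is with panel transformations: "Labels to fields" to turn the lat/lon labels into their own columns, then "Convert field type" to make them numeric so the Geomap's coords location mode can read them. A rough sketch of how that might look in the panel JSON; the exact option keys can differ between Grafana versions, so treat the structure below as an assumption to verify in the UI:
"transformations": [
  { "id": "labelsToFields", "options": {} },
  {
    "id": "convertFieldType",
    "options": {
      "conversions": [
        { "targetField": "lat", "destinationType": "number" },
        { "targetField": "lon", "destinationType": "number" }
      ]
    }
  }
]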
So what query do I need to get to the right format?
That is why I am requesting help. Thanks.
Also, your last response seems to contain a typo.
Is anyone willing to help with this Geomap query? I would appreciate your help.
yosiasz (September 29, 2024):
Which post are you referring to?
I said your last response, which means the response you made last, before that comment.
I find that I repeat myself many times; I'm not sure you guys read the posts I make.
I have posted all the screenshots and queries I've tried, but I keep getting questions about things I already posted above.
Do you see the typo here?
time, lat,lon,value
2024-09-23,1,6.44,3.39030,1
Ah, got it now. Extra data there.
But the typo is not an issue; not sure why you are fixated on that.