I am attempting to configure Promtail to send historical logs to Loki. The logs are currently in JSON, but I can also produce them as CSV or XML; I can change pretty much anything about the log format. Each entry is a code coverage report, so it is just key-value pairs, including a timestamp of when the report was run.
I’ve tried supplying the timestamp in both UNIX and RFC3339 formats. For the UNIX timestamp I supplied it as both a string and an integer. I’ve attempted to use the following formats:
- unix
- "unix"
- Unix
- "Unix"
- 1562708916
- "1562708916"
I thought maybe there was some problem parsing Unix timestamps, so I switched to RFC3339 format, but I'm still not getting anywhere.
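For reference, the Unix-epoch variant of the stage looked roughly like this (a sketch of what I tried; same source field as the RFC3339 config below, with the "time" field in the log set to 1562708916 either as an integer or as a string):

  - timestamp:
      source: timestamp
      format: "Unix"    # also tried unix / "unix" / Unix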
My current job config looks like this:
- job_name: loc
  pipeline_stages:
  - json:
      expressions:
        timestamp: "time"
        loc: "loc"
        cloc: "cloc"
  - labels:
      timestamp:
      loc:
      cloc:
  - timestamp:
      source: timestamp
      format: "DATE_RFC3339"
  static_configs:
  - targets:
    - localhost
    labels:
      job: hosted
      __path__: /logs/loc.log
A (simplified) log entry looks something like this:
{
  "loc": 1099687,
  "cloc": 119713,
  "time": "2019-02-14T00:00:00+00:00"
}
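In case it matters: my understanding is that the json stage parses each log line as its own JSON document, so the raw input Promtail reads from /logs/loc.log would be one object per line, along the lines of:

  {"loc": 1099687, "cloc": 119713, "time": "2019-02-14T00:00:00+00:00"}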
At first I thought the problem was that the timestamp was being loaded under "detected fields" instead of "log labels" in the Grafana interface. After messing with it for several hours, I finally got timestamp to show up under "log labels" by adding the - labels pipeline stage shown above, but the timestamp stage still does not seem to do anything.
Googling this problem turns up a lot of odd edge cases, including advice to double-quote certain fields. Is there any example of a working configuration that uses a timestamp from the JSON payload instead of the timestamp that gets attached automatically at ingestion (the one I currently see in Grafana)?