Epoch & Unix Timestamp Conversion

Hi Team,

I am collecting “x509_cert” data with Telegraf and writing it to InfluxDB. Below is the query I am using in my dashboard.

from(bucket: "monitoring_metrics")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "x509_cert")
  |> filter(fn: (r) => r["_field"] == "enddate" or r["_field"] == "startdate" or r["_field"] == "expiry" or r["_field"] == "age")
  |> aggregateWindow(every: v.windowPeriod, fn: mean, createEmpty: false)
  |> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
  |> sort(columns: ["_time"], desc: true)
  |> last(column: "_time")
  |> group()

[Screenshot: 2024-10-16 at 9.46.02 am]

The fields “startdate”, “enddate”, and “expiry” returned by the above query are all Unix timestamps. I would like to show them in a human-readable date format.

Can someone please provide insight on how to handle this?

Regards,
Pradeep

Just set the correct unit for each column, using a custom field override per column, e.g. Time / Duration (s) for age, etc.
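
If the overrides alone don't help, another option (a sketch only, assuming the three fields hold Unix epoch seconds) is to convert them to real time values in Flux, appended after the pivot() step of the original query, so Grafana renders them as dates without any unit override:

  // Sketch, untested here: time(v:) takes nanoseconds since the epoch,
  // so epoch seconds are scaled by 1e9. Adjust the factor if the source
  // precision differs.
  |> map(fn: (r) => ({ r with
      startdate: time(v: int(v: r.startdate) * 1000000000),
      enddate: time(v: int(v: r.enddate) * 1000000000),
      expiry: time(v: int(v: r.expiry) * 1000000000)
  }))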

Hello @jangaraj,

It's not working.

[Screenshot: 2024-10-16 at 3.35.53 pm]

Looks like there are some posts with a similar issue.

Thank you.

Great, what a terrible problem description: no details about what you configured or tried, no exact input or result, … just “It's just not working.”

These docs might be of help.

Have you tried playing around with different units? It seems like your timestamp is in a different unit than seconds.
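
For what it's worth, Grafana's Datetime units interpret numeric values as milliseconds since the Unix epoch, so epoch seconds fed to them land near the start of the 1970s. A quick Flux illustration with a made-up value:

  // Hypothetical value: 1729036800 as epoch seconds is 2024-10-16,
  // but read as epoch milliseconds it is barely three weeks into 1970.
  // time(v:) takes nanoseconds since the epoch.
  asSeconds = time(v: 1729036800 * 1000000000)  // 2024-10-16T00:00:00Z
  asMillis = time(v: 1729036800 * 1000000)      // 1970-01-21T00:17:16.8Z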

Hello @jangaraj, apologies for the earlier response… somehow it got accidentally submitted without my complete message and attachment. When I apply the unit type, somehow it displays the year as 1971.

I was referring to some other posts and tried the below… which looks to be working.

Thank you.

IMHO you can multiply by 1k at the InfluxDB level directly - that should be faster.
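
A minimal sketch of that suggestion, assuming the pivoted columns hold epoch seconds as floats after the mean aggregation; appended after the pivot() step, it yields epoch milliseconds, which the Datetime units render directly:

  // Sketch: scale epoch seconds to epoch milliseconds in the query itself,
  // so no per-column transformation is needed on the Grafana side.
  |> map(fn: (r) => ({ r with
      startdate: r.startdate * 1000.0,
      enddate: r.enddate * 1000.0,
      expiry: r.expiry * 1000.0
  }))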

Good thought, @jangaraj. I will give it a try.

Thank you.