Transformation from Timestamp to Epochtime fails

Good evening,
I’ve had a ton of fun getting into Grafana over the last few weeks, but I’ve also had my first touchpoints with the epoch time unit. I’m currently using the SQL builder within Grafana, and the resulting query always converts the timestamp (date & time, with the time being 00:00:00) to an epoch integer.

Here comes the issue, which I’ve encountered a few times and which seems to be systematic in a way: the conversion fails in some cases, in the sense that the value for the current month is equal to the prior month’s value (please see the pictures below). I’d understand this for edge cases like the last or first day of a month, but 2022-05-31 and 2022-06-09 mapping to the same value doesn’t make sense to me.

The picture is the result of a slightly modified SQL query which also includes the actual “datum”/“time” column from the source. Please also find my generated SQL below.

Some more additional details:
I’m using a RasPi 4 / Ubuntu 21.10 with Grafana 8.5.3. The data source is a PostgreSQL database which is filled by a Python script. All of this runs in separate Docker containers on the RasPi.

The initial goal was to visualize the metric “kategorie” per month, with the values in column “betrag” and the timestamps in column “datum”.
This has to be filtered to a specific “kategorie” and “av_identifier”.

Generated SQL:

SELECT
  floor(extract(epoch from datum)/2.6784e+06)*2.6784e+06 AS "time",
  kategorie AS metric,
  sum(-betrag) AS "betrag"
FROM haushaltsbuch
WHERE
  datum BETWEEN '2018-07-04T17:02:14.82Z' AND '2022-07-04T17:02:14.82Z' AND
  kategorie = 'Versicherungen' AND
  av_identifier = 'pension'
GROUP BY 1,2
ORDER BY 1,2
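
For reference, the collision can be reproduced directly against PostgreSQL, outside of Grafana (a minimal sketch using the two dates mentioned above):

SELECT floor(extract(epoch from d)/2.6784e+06)*2.6784e+06 AS "time"
FROM (VALUES (timestamp '2022-05-31 00:00:00'),
             (timestamp '2022-06-09 00:00:00')) AS t(d);
-- both rows yield 1652572800, i.e. 2022-05-15 00:00:00 UTC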

Adjusted SQL used to create the upper image, just FYI:

SELECT
  datum AS "time",
  floor(extract(epoch from datum)/2.6784e+06)*2.6784e+06 as "epochtime",
  kategorie AS metric,
  sum(0) AS "betrag"
FROM haushaltsbuch
WHERE
  datum BETWEEN '2018-07-04T16:39:16.655Z' AND '2022-07-04T16:39:16.655Z' AND
  kategorie = 'Versicherungen' AND
  av_identifier = 'pension'
GROUP BY 1,2,3
ORDER BY 1,2

I’m looking forward to any advice on how to fix this. :slight_smile:

Christian

Welcome

Please provide the DDL and DML for that table, e.g.:

CREATE TABLE sample (
  datum timestamp,
  kategorie text
);

INSERT INTO sample
VALUES ('2022-05-31 00:00:00', 'Versicherungen');

etc.

Also, why does the epoch look like an IP address (digits separated by dots)?

Hello Yosiasz,
thanks for the welcome. :slight_smile:

The formatting of the epoch is a data-type issue: I’ve formatted it as an integer, while there’s probably a better type for it. The dot at each thousands place is due to the German locale, but the numbers in the picture are only from the adjusted SQL anyway.

The DDL is as follows:

-- public.haushaltsbuch definition

-- Drop table

-- DROP TABLE public.haushaltsbuch;

CREATE TABLE public.haushaltsbuch (
	"index" int8 NULL,
	id int8 NULL,
	datum timestamp NULL,
	typ text NULL,
	betrag float8 NULL,
	waehrung text NULL,
	kategorie text NULL,
	person text NULL,
	konto text NULL,
	gegenkonto text NULL,
	gruppe text NULL,
	notiz text NULL,
	checker text NULL,
	betrag_neu float8 NULL,
	betrag_cum float8 NULL,
	month_year text NULL,
	betrag_sparkonto float8 NULL,
	betrag_sparkonto_cum float8 NULL,
	av_identifier text NULL,
	ldom_id bool NULL,
	cat_focus bool NULL,
	year_calc text NULL
);
CREATE INDEX ix_haushaltsbuch_index ON public.haushaltsbuch USING btree (index);

The DML for the selected timeframe is as follows:

INSERT INTO public.haushaltsbuch
("index", id, datum, typ, betrag, waehrung, kategorie, person, konto, gegenkonto, gruppe, notiz, checker, betrag_neu, betrag_cum, month_year, betrag_sparkonto, betrag_sparkonto_cum, av_identifier, ldom_id, cat_focus, year_calc)
VALUES(3988, 3988, '2022-05-09 00:00:00.000', 'Ausgaben', -1.0, 'EUR', 'Versicherungen', NULL, 'C] Sparen', '0', 'Fix', 'empty', 'False', -1.0, 1.0, '2022-05', 0.0, 1.0, 'pension', false, false, '2022');
INSERT INTO public.haushaltsbuch
("index", id, datum, typ, betrag, waehrung, kategorie, person, konto, gegenkonto, gruppe, notiz, checker, betrag_neu, betrag_cum, month_year, betrag_sparkonto, betrag_sparkonto_cum, av_identifier, ldom_id, cat_focus, year_calc)
VALUES(4055, 4055, '2022-05-31 00:00:00.000', 'Ausgaben', -1.0, 'EUR', 'Versicherungen', 'Lastschrift', 'C] Sparen', '0', 'Fix', 'empty', 'False', -1.0, 1.0, '2022-05', 0.0, 1.0, 'pension', false, false, '2022');
INSERT INTO public.haushaltsbuch
("index", id, datum, typ, betrag, waehrung, kategorie, person, konto, gegenkonto, gruppe, notiz, checker, betrag_neu, betrag_cum, month_year, betrag_sparkonto, betrag_sparkonto_cum, av_identifier, ldom_id, cat_focus, year_calc)
VALUES(4096, 4096, '2022-06-09 00:00:00.000', 'Ausgaben', -1.0, 'EUR', 'Versicherungen', NULL, 'C] Sparen', '0', 'Fix', 'empty', 'False', -1.0, 1.15, '2022-06', 0.0, 1.0, 'pension', false, false, '2022');
INSERT INTO public.haushaltsbuch
("index", id, datum, typ, betrag, waehrung, kategorie, person, konto, gegenkonto, gruppe, notiz, checker, betrag_neu, betrag_cum, month_year, betrag_sparkonto, betrag_sparkonto_cum, av_identifier, ldom_id, cat_focus, year_calc)
VALUES(4147, 4147, '2022-06-30 00:00:00.000', 'Ausgaben', -1.0, 'EUR', 'Versicherungen', 'Lastschrift', 'C] Sparen', '0', 'Fix', 'empty', 'False', -1.0, 1.0, '2022-06', 0.0, 1.1, 'pension', false, false, '2022');

I’m aware that there are quite a few redundancies in there, but this is the first database that I’ve set up (more or less) by myself, and I’ll address that topic afterwards :smiley:


The same result happens in pgAdmin, so it seems more of a Postgres “issue” than a Grafana one. But first off: what is an epoch?

The number of seconds since 1970-01-01 00:00:00 UTC.
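
For example, one day after that zero point:

SELECT extract(epoch from timestamp '1970-01-02 00:00:00');
-- 86400, i.e. one day's worth of seconds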

But notice this

[screenshot: pgAdmin output showing the same colliding values]

So maybe there is something going on in Postgres itself that you will need to account for in Grafana.

There is definitely something odd going on here. Maybe things are getting rounded?


Ah, this sounds very reasonable when taking into account that FLOOR always rounds down to the nearest integer, and the value has previously been divided by a big number (2.6784 × 10^6). I bet we are “losing” a few decimals when rounding down.

I’ll check the math later today.

And thanks for the reference to the epoch explanation - a somewhat odd way to represent time, but there’s always a point zero, I guess :man_shrugging:t2:

try this

extract(epoch from datum) as epochtime,
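
With the two dates from your inserts, that yields distinct values again (a quick check against PostgreSQL):

SELECT extract(epoch from d) AS epochtime
FROM (VALUES (timestamp '2022-05-31 00:00:00'),
             (timestamp '2022-06-09 00:00:00')) AS t(d);
-- 1653955200 and 1654732800: no collision without the floor/multiply step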

confirmed the numbers in MS SQL Server

[screenshot: the same numbers reproduced in MS SQL Server]

But after all is said and done, what are you trying to visualize?

Yes, I did, and I also went a step further to evaluate the degree of deviation. I totally don’t get why Grafana does this for me, but your hint is the solution.

The following picture is the result of an adjusted query, where:

  • “epochTime_currentcalc” is the current calculation
  • “epochTime_dividedOnly” is the value which is later rounded down (losing a lot of decimal precision here)
  • “epochTime” is the query as suggested by you
  • “diff” is the difference between epochTime_currentcalc and epochTime, and it’s big!

Remark: once again German locale here, with a dot at each thousands place.

2.160.000 seconds is 25 days :smiley: This FLOOR merges 25 days into one bucket. Wow.
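
For completeness, the factor itself makes the bucket width explicit (assuming 86400 seconds per day):

SELECT 2.6784e+06 / 86400 AS days_per_bucket;
-- 31: each bucket spans a fixed 31 days, while calendar months have 28 to 31,
-- so the bucket boundaries drift away from the month boundaries over time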

All I wanted was to simply visualize costs over time on a graph while using the query builder in Grafana with its suggested functions and macros. I’ll have to dig deeper into the different macros for aggregating time values or write my own SQL, though.

What does cost per time mean? Could you show us a sample? Every 5 minutes, every hour?

I’ve found an easy solution and figured I should share it in case someone else needs it.
The issue lies within the FLOOR function. EPOCH is the number of seconds since 1970. The division by the factor above tries to find the number of months since 1970 by dividing by a fixed number of seconds per month (the 2.6784e+06 factor, which is exactly 31 days’ worth of seconds). It’s a bit like dividing 180 days by 30 to get 6 months. Rounding the result down doesn’t work for certain dates. While I couldn’t really identify a pattern, I’ve solved it by converting all dates within a given month to a single date, namely the 28th of that month: 01.01 and 15.01 → 28.01.
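
In SQL terms, the workaround looks roughly like this (a sketch based on the generated query above; date_trunc('month', datum) plus 27 days yields the 28th):

SELECT
  date_trunc('month', datum) + interval '27 days' AS "time",  -- pin every row to the 28th of its month
  kategorie AS metric,
  sum(-betrag) AS "betrag"
FROM haushaltsbuch
WHERE
  datum BETWEEN '2018-07-04' AND '2022-07-04' AND
  kategorie = 'Versicherungen' AND
  av_identifier = 'pension'
GROUP BY 1, 2
ORDER BY 1, 2;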

This has solved my issues and my viz does what it’s supposed to. Thanks for your support!
