Dear Grafana team/users,
What I Want to Achieve:
I want to create a Grafana query where:
- Time is grouped into 15-second intervals.
- For each of these 15-second intervals, I want to calculate the count of BoxID records for the last hour preceding each 15-second interval.
In simpler terms, the goal is to:
- Group data by 15-second intervals.
- For each time interval, calculate how many BoxID records were logged in the previous hour up to that specific time interval.
Detailed Explanation of the Query:
- Group Data into 15-second Intervals: We need to take the timestamp of each record and group it into intervals of 15 seconds. This will give us discrete blocks of time on the x-axis of the graph.
- Count the BoxID Records for Each 15-second Interval: For each 15-second interval, we need to calculate how many BoxID entries were recorded in the last hour. This means for every 15-second timestamp, we look back one hour and count the total number of BoxID records in that time period.
Currently, I have the following query, which counts BoxID records for 15-second intervals:
SELECT
    $__timeGroup(Timestamp, '15s') AS "time",  -- Time interval grouped by 15 seconds
    count(BoxID)                                -- Counting BoxID for each 15-second interval
FROM
    ACTIN_DP.dbo.PLC                            -- Data source (PLC table)
WHERE
    BR = 'BR08'                                 -- Filter for BR08 branch
    AND $__timeFilter(Timestamp)                -- Time filter applied for the selected time range in Grafana
GROUP BY
    $__timeGroup(Timestamp, '15s')              -- Grouping by 15-second intervals
ORDER BY
    "time"                                      -- Ordering the results by time
Thank you very much for your help! I appreciate any suggestions or improvements you can offer to modify this query for better clarity and accuracy!
I don't know what your data looks like, but see if this works:
WITH TimeGrouped AS (
    SELECT
        $__timeGroup(Timestamp, '15s') AS "time_group",  -- Grouped into 15-second intervals
        Timestamp,
        BoxID
    FROM
        ACTIN_DP.dbo.PLC
    WHERE
        BR = 'BR08'                    -- Filter for BR08 branch
        AND $__timeFilter(Timestamp)   -- Time filter for Grafana range
)
SELECT
    tg.time_group AS "time",
    COUNT(t1.BoxID) AS "count_boxid"
FROM
    (SELECT DISTINCT time_group FROM TimeGrouped) tg      -- One row per 15-second interval, so records are not counted more than once per bucket
JOIN
    TimeGrouped t1
    ON t1.Timestamp BETWEEN DATEADD(SECOND, -3600, tg.time_group) AND tg.time_group  -- Previous hour up to the current time group
GROUP BY
    tg.time_group
ORDER BY
    tg.time_group;
Thank you so much for your suggestion and guidance! I really appreciate your time and effort in helping me refine my query. Based on your input and a bit of additional experimentation, I was able to achieve exactly what I wanted. Here’s what I did:
WITH MinuteData AS (
    SELECT
        $__timeGroup(Timestamp, '15s') AS "time",  -- Group timestamps into 15-second intervals
        COUNT(BoxID) AS "boxCount"                 -- Count BoxID for each interval
    FROM
        ACTIN_DP.dbo.PLC
    WHERE
        BR = 'BR08'                                          -- Filter for branch BR08
        AND Timestamp >= DATEADD(HOUR, -1, $__timeFrom())    -- Load data starting 1 hour before the Grafana time range
        AND Timestamp <= $__timeTo()                         -- Limit data to the Grafana time range
    GROUP BY
        $__timeGroup(Timestamp, '15s')                       -- Group data by 15-second intervals
)
SELECT
    "time",                                                  -- Each 15-second time interval
    SUM(boxCount) OVER (
        ORDER BY "time"
        ROWS BETWEEN 239 PRECEDING AND CURRENT ROW           -- Rolling sum over the last 240 intervals (15 s * 240 = 1 hour)
    ) AS "boxCountLastHour"                                  -- Rolling count of BoxID records for the last hour
FROM
    MinuteData
ORDER BY
    "time";                                                  -- Sort results by time
Key Adjustments I Made:
- Expanded Time Range for Accuracy:
- To calculate the rolling count for the last hour, I adjusted the WHERE clause to load data from 1 hour before the selected Grafana range:
AND Timestamp >= DATEADD(HOUR, -1, $__timeFrom())
- This ensures that even if the Grafana range is limited (e.g., 5 minutes), we still have enough data for accurate rolling calculations.
- Used a Common Table Expression (CTE):
- I grouped data by 15-second intervals in the CTE (MinuteData), which simplifies further calculations.
- Rolling Sum with OVER Clause:
- Using the SUM window function, I calculated the rolling count over the last 240 intervals (15 seconds * 240 = 1 hour); see the sketch after this list for two caveats about that window.
Very nice! Now take the generated query from the query inspector, run it in SQL Server Management Studio, and look at its query plan and performance.
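If the plan shows the time filter or the hour-long range lookup scanning the whole PLC table, a covering index along these lines can help; this is only a sketch (the index name is made up, and whether it is worthwhile depends on what indexes the table already has):
-- Hypothetical covering index: supports seeks on BR plus the Timestamp range,
-- and INCLUDE (BoxID) lets COUNT(BoxID) be answered without key lookups.
CREATE NONCLUSTERED INDEX IX_PLC_BR_Timestamp
    ON ACTIN_DP.dbo.PLC (BR, Timestamp)
    INCLUDE (BoxID);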