I just recently started using TimescaleDB with Postgres to handle most requests for data. However, I'm running into an issue where I have a horribly inefficient query for a time series of data.
It's a data series that can span any length of time, with specific integer values. Most of the time the value will be the same unless there's an anomaly, so rather than fetching 10,000+ rows of data, I would like to aggregate this into "time blocks".
Let's say there are 97 items in a row where the value is 100 (a new item every 5 minutes), then at item #98 the value drops to 48 for 5 items in a row, and then it goes back up to 100 for another 2,900 rows.
I don't want to fetch 3,002 items to display this data; I should only need to fetch 3 items.
But I'm having some trouble figuring out how to do this with TimescaleDB.
Basically, if the value is the same as the last value, aggregate it. That's all I need it to do.
Does anyone know how to construct a VIEW for this kind of situation in TimescaleDB using continuous aggregates (or a faster way) to fetch this?
LAG() is a window function (available in PostgreSQL, and therefore TimescaleDB, as well as SQL Server) that provides access to a row at a specified physical offset before the current row. In other words, by using the LAG() function, from the current row you can access the data of the previous row, the row before that, and so on.
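As a quick illustration of LAG() on its own (using a hypothetical readings table with time and value columns, not anything from the question):

```sql
-- Hypothetical table: readings(time TIMESTAMPTZ, value INTEGER)
SELECT
    time,
    value,
    LAG(value) OVER (ORDER BY time) AS prev_value
FROM readings
ORDER BY time;
```

The first row in the window has no predecessor, so its prev_value is NULL; every other row carries the value of the row immediately before it in time order.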
From a user's perspective, TimescaleDB exposes what look like singular tables, called hypertables. A hypertable is the primary point of interaction with your data, as it provides the standard table abstraction that you can query via standard SQL.
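For completeness, here is a minimal sketch of how a hypertable matching the query below might be set up; the table name "hypertable" and the columns "time" and "value" are assumptions taken from the answer's query:

```sql
-- Plain Postgres table: one integer reading per timestamp
CREATE TABLE hypertable (
    time  TIMESTAMPTZ NOT NULL,
    value INTEGER     NOT NULL
);

-- Convert it into a TimescaleDB hypertable, partitioned on the "time" column
SELECT create_hypertable('hypertable', 'time');
```

After this, you insert into and query "hypertable" with ordinary SQL; TimescaleDB handles the time-based chunking underneath.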
You can achieve the desired result with window functions and a subselect:
SELECT time, value FROM (
    SELECT
        time,
        value,
        value - LAG(value) OVER (ORDER BY time) AS diff
    FROM hypertable
) ht
WHERE diff IS NULL OR diff != 0;
You use a window function to calculate the diff from the previous row, and then in the outer query you filter out all rows where the diff is 0, i.e. where the value did not change. The diff IS NULL condition keeps the very first row, which has no predecessor.
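If you also want each block's start time, end time, and length (rather than just the row where each block begins), a common "gap-and-islands" extension of the same idea is to mark the change points and take a running sum of them as a block id. A sketch, reusing the hypertable/time/value names from the query above:

```sql
WITH marked AS (
    SELECT
        time,
        value,
        -- 1 whenever the value differs from the previous row (or on the first row)
        CASE WHEN value IS DISTINCT FROM LAG(value) OVER (ORDER BY time)
             THEN 1 ELSE 0 END AS is_change
    FROM hypertable
),
grouped AS (
    SELECT
        time,
        value,
        -- running count of change points = id of the block this row belongs to
        SUM(is_change) OVER (ORDER BY time) AS block
    FROM marked
)
SELECT
    value,
    MIN(time) AS block_start,
    MAX(time) AS block_end,
    COUNT(*)  AS rows_in_block
FROM grouped
GROUP BY block, value
ORDER BY block_start;
```

For the example in the question this would return exactly 3 rows: one block of 97 rows at value 100, one of 5 rows at value 48, and one of 2,900 rows at value 100.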