How Axiom Trends Data (version 24)
This article describes how Axiom trends data based on the width of the chart and the time interval being used.
Axiom doesn't trend raw data; it trends aggregated data.
If a TrendGraph is 400 pixels wide, Axiom will receive 400 values to plot. If a TrendGraph is 800 pixels wide, it will receive 800 values to plot. It receives one data value per pixel. It was designed to work this way for performance reasons.
If a tag is logging once a second, that is 86,400 values per day. If a user plots 40 of these tags and wants to view the data over the last month, that equates to 86,400*40*30 = 103,680,000 data values to be plotted. Without aggregation, Axiom could not send, process, and draw that volume of data quickly.
Because of the fixed number of values Axiom receives, the minimum aggregate duration is based on this calculation:
(TrendGraph Interval / TrendGraph Width)
For a 24 hour interval and a 400 pixel wide chart, each pixel represents 216 seconds of data. For a 24 hour interval and an 800 pixel wide chart, each pixel represents 108 seconds of data. For a 1 hour interval and an 800 pixel wide chart, each pixel represents 4.5 seconds. Changing either the TrendGraph interval or the TrendGraph width therefore changes the interval that a pixel represents. The data drawn is still accurate for the aggregate interval specified and the underlying raw data, but it may be presented with more or less resolution. Because the aggregate calculations are performed on different slices of data, the values drawn can fluctuate depending on how volatile the data is.
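For illustration, this calculation can be expressed in a few lines of Python (the seconds_per_pixel function is an example name for this article only, not part of Axiom):

def seconds_per_pixel(interval_seconds, chart_width_px):
    # Minimum aggregate duration represented by a single pixel.
    return interval_seconds / chart_width_px

print(seconds_per_pixel(24 * 60 * 60, 400))  # 216.0 (24 hour interval, 400 px chart)
print(seconds_per_pixel(24 * 60 * 60, 800))  # 108.0 (24 hour interval, 800 px chart)
print(seconds_per_pixel(60 * 60, 800))       # 4.5   (1 hour interval, 800 px chart)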
HOW AGGREGATES ARE CALCULATED
EXAMPLE RAW DATA:
DateTime/Value
00:00: 1
00:01: 2
00:02: 3
00:03: 6
00:04: -4
00:05: 2
00:06: 2
00:07: 12
EXAMPLE 1: (4 second aggregate interval)
We split the raw data into 4 second slices and perform the aggregate calculation on the data. This reduces 8 raw values down to 2 aggregate values.
Interval 1 uses the following raw data in its aggregate calculation:
00:00: 1
00:01: 2
00:02: 3
00:03: 6
Interval 2 uses the following raw data in its aggregate calculation:
00:04: -4
00:05: 2
00:06: 2
00:07: 12
Aggregate: Average
Description: Calculates average value
Calculations:
Interval 1 = (1 + 2 + 3 + 6) / 4 = 3
Interval 2 = (-4 + 2 + 2 + 12) / 4 = 3
Aggregated Data Results:
00:00: 3
00:04: 3
Aggregate: Minimum
Description: Calculates min value
Calculations:
Interval 1 = Min(1, 2, 3, 6) = 1
Interval 2 = Min(-4, 2, 2, 12) = -4
Aggregated Data Results:
00:00: 1
00:04: -4
Aggregate: Maximum
Description: Calculates max value
Calculations:
Interval 1 = Max(1, 2, 3, 6) = 6
Interval 2 = Max(-4, 2, 2, 12) = 12
Aggregated Data Results:
00:00: 6
00:04: 12
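For illustration, a minimal Python sketch of this slice-and-aggregate step is shown below. The slice_aggregate helper and the (second offset, value) representation of the raw data are example names for this article only, not Axiom's internal implementation.

def slice_aggregate(samples, interval_seconds, func):
    # Group (second_offset, value) samples into fixed-width time slices,
    # then apply an aggregate function to each slice.
    # Returns (slice_start_second, aggregated_value) pairs in time order.
    slices = {}
    for second, value in samples:
        start = (second // interval_seconds) * interval_seconds
        slices.setdefault(start, []).append(value)
    return [(start, func(values)) for start, values in sorted(slices.items())]

# The example raw data above, expressed as (second offset, value) pairs
raw = [(0, 1), (1, 2), (2, 3), (3, 6), (4, -4), (5, 2), (6, 2), (7, 12)]

def average(values):
    return sum(values) / len(values)

print(slice_aggregate(raw, 4, average))  # [(0, 3.0), (4, 3.0)]
print(slice_aggregate(raw, 4, min))      # [(0, 1), (4, -4)]
print(slice_aggregate(raw, 4, max))      # [(0, 6), (4, 12)]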
EXAMPLE 2: (2 second aggregate interval)
We split the raw data into 2 second slices and perform the aggregate calculation on the data. This reduces 8 raw values down to 4 aggregate values.
Interval 1 uses the following raw data in its aggregate calculation:
00:00: 1
00:01: 2
Interval 2 uses the following raw data in its aggregate calculation:
00:02: 3
00:03: 6
Interval 3 uses the following raw data in its aggregate calculation:
00:04: -4
00:05: 2
Interval 4 uses the following raw data in its aggregate calculation:
00:06: 2
00:07: 12
Aggregate: Average
Description: Calculates average value
Calculations:
Interval 1 = (1 + 2) / 2 = 1.5
Interval 2 = (3 + 6) / 2 = 4.5
Interval 3 = (-4 + 2) / 2 = -1
Interval 4 = (2 + 12) / 2 = 7
Aggregated Data Results:
00:00: 1.5
00:02: 4.5
00:04: -1
00:06: 7
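Running the same illustrative slice_aggregate sketch from Example 1 with a 2 second interval reproduces these results:

# Same raw data, 2 second aggregate interval -> 4 aggregated values
print(slice_aggregate(raw, 2, average))  # [(0, 1.5), (2, 4.5), (4, -1.0), (6, 7.0)]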
CONCLUSION
The same raw data returns different results based on the aggregate interval and the aggregate calculations that are used. Aggregates effectively change the resolution of the data. The result is still accurate, but it will draw differently depending on the aggregate interval used to reduce the raw data down to fewer values. If users need the full resolution of the data without peaks or valleys averaged out, we suggest using Min or Max, or a combination of Min/Max or Min/Max/Avg aggregates.
If a user wants to perform calculations on raw data, they should use the Calculations service. Axiom's ad-hoc calc engine only performs calculations against aggregated data, which may present differently depending on the data and the aggregate interval that a pixel represents.