For the purposes of this document, the terms and definitions given in OPC 10000-1, OPC 10000-3, OPC 10000-4, and OPC 10000-11 as well as the following apply.

3.1.1 ProcessingInterval
timespan for which derived values are produced based on a specified Aggregate

Note 1 to entry: The total time domain specified for ReadProcessed is divided by the ProcessingInterval. For example, performing a 10-minute Average over the time range 12:00 to 12:30 would result in a set of three intervals of ProcessingInterval length, with start times of 12:00, 12:10 and 12:20 respectively. The rules used to determine the interval Bounds are discussed in 5.4.2.2.
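
The following non-normative sketch (Python; the function name and the example date are illustrative, not defined by this standard) shows one way this division of the time domain can be computed:

```python
from datetime import datetime, timedelta

def interval_start_times(start_time, end_time, processing_interval):
    """Divide [start_time, end_time) into consecutive intervals of
    ProcessingInterval length and return the start time of each interval."""
    starts = []
    t = start_time
    while t < end_time:
        starts.append(t)
        t += processing_interval
    return starts

# Example from Note 1: a 10-minute Average over the range 12:00 to 12:30
print(interval_start_times(datetime(2024, 1, 1, 12, 0),
                           datetime(2024, 1, 1, 12, 30),
                           timedelta(minutes=10)))
# -> three intervals starting at 12:00, 12:10 and 12:20
```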

3.1.2 Interpolated data
data that is calculated from data samples

Note 1 to entry: Data samples may be historical data or buffered real time data. An interpolated value is calculated from the data points on either side of the requested timestamp.

3.1.3 EffectiveEndTime
time immediately before endTime

Note 1 to entry: All Aggregate calculations include the startTime but exclude the endTime. However, it is sometimes necessary to return an Interpolated End Bound as the value for an Interval with a timestamp that lies within the interval. Servers are expected to use the time immediately before endTime, where the time resolution of the Server determines the exact value (do not confuse this with hardware or operating system time resolution). For example, if the endTime is 12:01:00 and the time resolution is 1 second, then the EffectiveEndTime is 12:00:59. See 5.4.2.4.

If time is flowing backwards, Servers are expected to use the time immediately after endTime where the time resolution of the Server determines the exact value.
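
A minimal non-normative sketch of this rule, assuming the Server's time resolution is available as a timedelta (all names are illustrative):

```python
from datetime import datetime, timedelta

def effective_end_time(end_time, server_resolution, time_flows_backwards=False):
    """Time immediately before (or, for a reversed time range, immediately after)
    endTime, where 'immediately' is governed by the Server's time resolution."""
    if time_flows_backwards:
        return end_time + server_resolution
    return end_time - server_resolution

# Example from Note 1: endTime 12:01:00 with a 1-second resolution -> 12:00:59
print(effective_end_time(datetime(2024, 1, 1, 12, 1, 0), timedelta(seconds=1)))
```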

3.1.4 Extrapolated data
data constructed from a discrete data set but lying outside of that data set

Note 1 to entry: It is similar to the process of interpolation, which constructs new points between known points, but its result is subject to greater uncertainty. Extrapolated data is used in cases where the requested time period falls farther into the future than the data available in the underlying system. See example in Table 1.

3.1.5 SlopedInterpolation
simple linear interpolation

Note 1 to entry: Compare to curve fitting using linear polynomials. See example in Table 1.

3.1.6 SteppedInterpolation
holding the last data point constant or interpolating the value based on a horizontal line fit

Note 1 to entry: Consider the following Table 1 of raw and Interpolated/Extrapolated values:

Table 1 – Interpolation examples

Timestamp   Raw Value   Sloped Interpolation   Stepped Interpolation
12:00:00    10          –                      –
12:00:05    –           15                     10
12:00:08    –           18                     10
12:00:10    20          –                      –
12:00:15    –           25                     20
12:00:20    30          –                      –
                        SlopedExtrapolation    SteppedExtrapolation
12:00:25    –           35                     30
12:00:27    –           37                     30
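
The Sloped and Stepped columns of Table 1 can be reproduced from the three Raw values with the following non-normative sketch (timestamps are expressed as seconds past 12:00:00; all names are illustrative):

```python
# Raw values from Table 1, as (seconds past 12:00:00, value)
raw = [(0, 10), (10, 20), (20, 30)]

def sloped(t, points):
    """Sloped interpolation: line through the surrounding raw points;
    beyond the data, sloped extrapolation through the last two raw points."""
    before = [p for p in points if p[0] <= t]
    after = [p for p in points if p[0] > t]
    (t0, v0), (t1, v1) = (before[-1], after[0]) if after else (points[-2], points[-1])
    return v0 + (t - t0) * (v1 - v0) / (t1 - t0)

def stepped(t, points):
    """Stepped interpolation/extrapolation: hold the last raw value constant."""
    return [v for (ts, v) in points if ts <= t][-1]

for t in (5, 8, 15, 25, 27):
    print(t, sloped(t, raw), stepped(t, raw))
# 5 -> 15/10, 8 -> 18/10, 15 -> 25/20, 25 -> 35/30 (extrapolated), 27 -> 37/30 (extrapolated)
```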

3.1.7 Bounding Values
values at the startTime and endTime needed for Aggregates to compute the result

Note 1 to entry: If Raw data does not exist at the startTime and endTime a value shall be estimated. There are two ways to determine Bounding Values for an interval. One way (called Interpolated Bounding Values) uses the first non-Bad data points found before and after the timestamp to estimate the bound. The other (called Simple Bounding Values) uses the data points immediately before and after the boundary timestamps to estimate the bound even if these points are Bad. Subclauses 3.1.8 and 3.1.9 describe the two different approaches in more detail.

In all cases the TreatUncertainAsBad (see 4.2.1.2) flag is used to determine whether Uncertain values are Bad or non-Bad.

If a Raw value was not found and a non-Bad bounding value exists, the Aggregate Bits (see 5.3.3) are set to ‘Interpolated’.

When calculating bounding values, the value portion of Raw data that has Bad status is set to null. This means the value portion is not used in any calculation, and a null is returned as the value portion if the Raw value itself is returned. The status portion is determined by the rules specified by the bound or Aggregate.

The Interpolated Bounding Values approach (see 3.1.8) is the same as the one used in Classic OPC Historical Data Access (HDA) and is important for applications, such as advanced process control, where having useful values at all times is important. The Simple Bounding Values approach (see 3.1.9) is new in this standard and is important for applications that are required to produce regulatory reports and cannot use estimated values in place of Bad data.

3.1.8 Interpolated Bounding Values
bounding values determined by a calculation using the nearest Good value

Note 1 to entry: Interpolated Bounding Values using SlopedInterpolation are calculated as follows:

  • if a non-Bad Raw value exists at the timestamp then it is the bounding value;
  • find the first non-Bad Raw value before the timestamp;
  • find the first non-Bad Raw value after the timestamp;
  • draw a line between the before value and the after value;
  • use the point where the line crosses the timestamp as an estimate of the bounding value.

The calculation can be expressed with the following formula:

Vbound = (Tbound – Tbefore) × (Vafter – Vbefore) / (Tafter – Tbefore) + Vbefore

where Vx is a value at ‘x’ and Tx is the timestamp associated with Vx.

If no non-Bad values exist before the timestamp, the StatusCode is Bad_NoData. The StatusCode is Uncertain_DataSubNormal if any Bad values exist between the before value and the after value. If either the before value or the after value is Uncertain, the StatusCode is Uncertain_DataSubNormal. If the after value does not exist, the before value shall be extrapolated using SlopedExtrapolation or SteppedExtrapolation.

The period of time that is searched to discover the Good values before and after the timestamp is Server-dependent, but if a Good value is not found within some reasonable time range, the Server will assume it does not exist. As a minimum, the Server should search a time range that is at least the size of the ProcessingInterval.
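
These rules can be summarized in the following non-normative sketch. History values are modelled as (time, value, status) tuples with status simplified to the strings "Good", "Uncertain" and "Bad"; StatusCodes are likewise simplified to strings, and the extrapolation helper that is called here is sketched after the SlopedExtrapolation rules below. All identifiers are illustrative.

```python
def interpolated_bound_sloped(t, history):
    """Interpolated Bounding Value using SlopedInterpolation (simplified, non-normative).
    'history' is a time-ordered list of (time, value, status) tuples; a real Server
    would also limit the search range as described above."""
    at = next((s for s in history if s[0] == t and s[2] != "Bad"), None)
    if at is not None:
        return at[1], at[2]                       # non-Bad Raw value at the timestamp
    before = [s for s in history if s[0] < t and s[2] != "Bad"]
    after = [s for s in history if s[0] > t and s[2] != "Bad"]
    if not before:
        return None, "Bad_NoData"                 # no non-Bad value before the timestamp
    if not after:
        # no non-Bad value after the timestamp: extrapolate instead (see the sketch below)
        return sloped_extrapolated_bound(t, before)
    (tb, vb, sb), (ta, va, sa) = before[-1], after[0]
    v = vb + (t - tb) * (va - vb) / (ta - tb)     # the Vbound formula above
    degraded = (sb == "Uncertain" or sa == "Uncertain" or
                any(s[2] == "Bad" and tb < s[0] < ta for s in history))
    # a real Server would return Good with the 'Interpolated' Aggregate Bits set (see 5.3.3)
    return v, "Uncertain_DataSubNormal" if degraded else "Good"
```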

Interpolated Bounding Values using SlopedExtrapolation are calculated as follows:

  • find the first non-Bad Raw value before the timestamp;
  • find the second non-Bad Raw value before the timestamp;
  • draw a line between these two values;
  • extend the line to where it crosses the timestamp;
  • use the point where the line crosses the timestamp as an estimate of the bounding value.

The formula is the same as the one used for SlopedInterpolation.

The StatusCode is always Uncertain_DataSubNormal. If only one non-Bad Raw value can be found before the timestamp, then SteppedExtrapolation is used to estimate the bounding value.
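
Continuing the same non-normative sketch and the same (time, value, status) tuple convention:

```python
def sloped_extrapolated_bound(t, before):
    """SlopedExtrapolation of a bounding value from the last two non-Bad values before
    the timestamp; falls back to SteppedExtrapolation when only one such value exists.
    The StatusCode is always Uncertain_DataSubNormal."""
    if len(before) >= 2:
        (t0, v0, _), (t1, v1, _) = before[-2], before[-1]
        v = v0 + (t - t0) * (v1 - v0) / (t1 - t0)  # same formula as SlopedInterpolation
    else:
        v = before[-1][1]                          # SteppedExtrapolation: hold the last value
    return v, "Uncertain_DataSubNormal"
```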

Interpolated Bounding Values using SteppedInterpolation are calculated as follows:

  • if a non-Bad Raw value exists at the timestamp then it is the bounding value;
  • find the first non-Bad Raw value before the timestamp;
  • use that value as an estimate of the bounding value.

The StatusCode is Uncertain_DataSubNormal if any Bad values exist between the before value and the timestamp. If no non-Bad Raw data exists before the timestamp then the StatusCode is Bad_NoData. If the value before the timestamp is Uncertain the StatusCode is Uncertain_DataSubNormal. The value after the timestamp is not needed when using SteppedInterpolation; however, if the timestamp is after the end of the data then the bounding value is treated as extrapolated and the StatusCode is Uncertain_DataSubNormal.

SteppedExtrapolation is a term that describes SteppedInterpolation when a timestamp is after the last value in the history collection.
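
For comparison, a non-normative sketch of an Interpolated Bounding Value using SteppedInterpolation, with the same simplified conventions as above:

```python
def interpolated_bound_stepped(t, history):
    """Interpolated Bounding Value using SteppedInterpolation (simplified, non-normative)."""
    at = next((s for s in history if s[0] == t and s[2] != "Bad"), None)
    if at is not None:
        return at[1], at[2]                       # non-Bad Raw value at the timestamp
    before = [s for s in history if s[0] < t and s[2] != "Bad"]
    if not before:
        return None, "Bad_NoData"                 # no non-Bad value before the timestamp
    tb, vb, sb = before[-1]
    degraded = (sb == "Uncertain" or
                any(s[2] == "Bad" and tb < s[0] < t for s in history) or
                t > history[-1][0])               # beyond the data: SteppedExtrapolation
    return vb, "Uncertain_DataSubNormal" if degraded else "Good"
```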

3.1.9 Simple Bounding Values
bounding values determined by a calculation using the nearest value

Note 1 to entry: Simple Bounding Values using SlopedInterpolation are calculated as follows:

  • if any Raw value exists at the timestamp then it is the bounding value;
  • find the first Raw value before the timestamp;
  • find the first Raw value after the timestamp;
  • if the value after the timestamp is Bad, then the before value is the bounding value;
  • otherwise, draw a line between the before value and the after value;
  • use the point where the line crosses the timestamp as an estimate of the bounding value.

The formula is the same as the one used for SlopedInterpolation in Clause 3.1.5.

If a Raw value at the timestamp is Bad the StatusCode is Bad_NoData. If the value before the timestamp is Bad the StatusCode is Bad_NoData. If the value before the timestamp is Uncertain the StatusCode is Uncertain_DataSubNormal. If the value after the timestamp is Bad or Uncertain the StatusCode is Uncertain_DataSubNormal.
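
A non-normative sketch of these rules, using the same simplified (time, value, status) conventions as the earlier sketches; unlike the Interpolated Bounding Values sketches, the Raw values immediately surrounding the timestamp are used even when they are Bad:

```python
def simple_bound_sloped(t, history):
    """Simple Bounding Value using SlopedInterpolation (simplified, non-normative)."""
    at = next((s for s in history if s[0] == t), None)
    if at is not None:
        # the value portion of Bad Raw data is not used (see above)
        return (None, "Bad_NoData") if at[2] == "Bad" else (at[1], at[2])
    before = [s for s in history if s[0] < t]
    after = [s for s in history if s[0] > t]
    if not before or before[-1][2] == "Bad":
        return None, "Bad_NoData"                 # value before the timestamp is Bad or missing
    tb, vb, sb = before[-1]
    if not after:
        # beyond the last data point a Server may extrapolate or return an error (see below)
        return vb, "Uncertain_DataSubNormal"
    ta, va, sa = after[0]
    if sa == "Bad":
        return vb, "Uncertain_DataSubNormal"      # Bad successor: the before value is the bound
    v = vb + (t - tb) * (va - vb) / (ta - tb)
    degraded = sb == "Uncertain" or sa == "Uncertain"
    return v, "Uncertain_DataSubNormal" if degraded else "Good"
```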

Simple Bounding Values using SteppedInterpolation are calculated as follows:

  • if any Raw value exists at the timestamp then it is the bounding value;
  • find the first Raw value before the timestamp;
  • if the value before the timestamp is non-Bad, then it is the bounding value.

If a Raw value at the timestamp is Bad the StatusCode is Bad_NoData. If the value before the timestamp is Bad the StatusCode is Bad_NoData. If the value before the timestamp is Uncertain the StatusCode is Uncertain_DataSubNormal.
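
The corresponding non-normative sketch for Simple Bounding Values using SteppedInterpolation:

```python
def simple_bound_stepped(t, history):
    """Simple Bounding Value using SteppedInterpolation (simplified, non-normative)."""
    at = next((s for s in history if s[0] == t), None)
    if at is not None:
        return (None, "Bad_NoData") if at[2] == "Bad" else (at[1], at[2])
    before = [s for s in history if s[0] < t]
    if not before or before[-1][2] == "Bad":
        return None, "Bad_NoData"                 # value before the timestamp is Bad or missing
    tb, vb, sb = before[-1]
    return vb, "Uncertain_DataSubNormal" if sb == "Uncertain" else "Good"
```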

If either bounding time of an interval is beyond the last data point, then the Server may use extrapolation or return an error. If extrapolation is used by the Server, the type of extrapolation (SteppedExtrapolation or SlopedExtrapolation) is Server-specific.

In some Historians, the last Raw value does not necessarily indicate the end of the data. Based on the Historian's knowledge of the data collection mechanism, i.e. frequency of data updates and latency, the Historian may extend the last value to a time known by the Historian to be covered. When calculating Simple Bounding Values the Historian will act as if there is another Raw value at this timestamp.

In the same way, if the earliest time of an interval starts before the first data point in history and the latest time is after the first data point in history, then the interval will be treated as if it extends from the first data point in history to the latest time of the interval, and the StatusCode of the interval will have the Partial bit set (see 5.3.3.2).

The period of time that is searched to discover the values before and after the timestamp is Server-dependent, but if a value is not found within some reasonable time range, the Server will assume it does not exist. As a minimum, the Server should search a time range that is at least the size of the ProcessingInterval.