Description
The ADI Prediction server is a standard ADI server offering a service that allows:
- A client application to load models that are visible either to all applications or only to selected ones.
- Many applications to concurrently request different predictions from the same model or from several models.
- Many models to reside simultaneously in the ADI Prediction server cache, speeding up prediction evaluation.
- The pre-loaded model definitions to be discovered by browsing the Address Space of the ADI Prediction server, as sketched below.
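A minimal discovery sketch follows, using the open-source Python asyncua library. The endpoint URL is an assumption, and placing the model definitions directly below the Objects folder is illustrative; the actual layout of the Address Space is defined by the server's ADI information model.

```python
import asyncio
from asyncua import Client

async def discover_models():
    # Hypothetical endpoint of an ADI Prediction server.
    url = "opc.tcp://prediction-server:4840"
    async with Client(url=url) as client:           # OPC UA CreateSession / ActivateSession
        objects = client.nodes.objects               # Objects folder of the Address Space
        # Browse the children of the Objects folder; pre-loaded model definitions
        # appear as nodes somewhere below this folder.
        for node in await objects.get_children():    # OPC UA Browse service
            browse_name = await node.read_browse_name()
            print(browse_name.Name)

asyncio.run(discover_models())
```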
Main success scenario
| Scenario steps | Interface used |
| --- | --- |
| 1. | OPC UA CreateSession |
| 2. | OPC UA AddNode and write services |
| 3. | OPC UA address space + ADI data types |
| 4. | OPC UA CloseSession |
| 5. | OPC UA CreateSession |
| 6. | OPC UA Browse address space + ADI data types |
| 7. | OPC UA/ADI method call service + ADI data types |
| 8. | Vendor specific native libraries |
| 9. | OPC UA/ADI method call service + ADI data types |
| 10. | |
| 11. | OPC UA CloseSession |
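Steps 1 to 4 are the model builder flow: open a session, publish a model definition with the AddNodes and Write services, verify it in the Address Space, and close the session. The sketch below is a hedged illustration with the Python asyncua library, assuming the server accepts client-side AddNodes requests; the namespace index, browse names and placeholder definition string are assumptions, not the ADI-defined model representation.

```python
import asyncio
from asyncua import Client

async def load_model():
    url = "opc.tcp://prediction-server:4840"        # hypothetical endpoint
    async with Client(url=url) as client:           # Step 1: OPC UA CreateSession
        objects = client.nodes.objects
        # Step 2: create a node for the model and write its definition
        # (OPC UA AddNodes + Write services); namespace index 2 and the
        # browse names are assumptions made for this sketch.
        model = await objects.add_object(2, "SpectralModelA")
        definition = await model.add_variable(2, "Definition", "placeholder model payload")
        # Step 3: read the node back to verify that the model definition is
        # exposed in the Address Space.
        print(await definition.read_value())
    # Step 4: leaving the context manager closes the session (OPC UA CloseSession).

asyncio.run(load_model())
```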
Many applications may execute Steps 5 to 11 concurrently.
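Steps 5 to 11 form the client prediction flow: open a session, locate the model, call a prediction method with the input values, receive the predicted values, and close the session. The sketch below, again based on the Python asyncua library, is only an illustration under stated assumptions: the NodeId of the model object and the method name GetPrediction are hypothetical, and the real method signature and its ADI data types are defined by the server. Running several clients with asyncio.gather mirrors the concurrent execution noted above.

```python
import asyncio
from asyncua import Client

async def request_prediction(app_name, inputs):
    url = "opc.tcp://prediction-server:4840"        # hypothetical endpoint
    async with Client(url=url) as client:           # Steps 5 and 11: CreateSession / CloseSession
        # Step 6: locate the model definition; the NodeId is an assumption and
        # would normally be obtained by browsing the Address Space.
        model = client.get_node("ns=2;s=Models.SpectralModelA")
        # Steps 7 to 9: call a prediction method, passing the input values and
        # receiving the predicted values ("GetPrediction" is a hypothetical name).
        # Step 8, the evaluation through vendor specific native libraries,
        # happens inside the server.
        predicted = await model.call_method("2:GetPrediction", inputs)
        print(app_name, "->", predicted)

async def main():
    # Many applications executing Steps 5 to 11 concurrently.
    await asyncio.gather(
        request_prediction("app-1", [0.1, 0.2, 0.3]),
        request_prediction("app-2", [1.5, 1.6, 1.7]),
        request_prediction("app-3", [9.0, 9.1, 9.2]),
    )

asyncio.run(main())
```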
- The client application may also pass the input values and receive the predicted values through the OPC UA/ADI Address Space (see the sketch after this list). However, this approach does not scale well in a multi-user environment, because a prediction server is expected to serve many users concurrently, which makes synchronizing access to the shared variables almost impossible.
- In custom or dedicated configurations, the ADI Prediction server loads a predefined set of models, so client applications can use them immediately.
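For comparison, a minimal sketch of the Address-Space-based exchange mentioned in the first note: the client writes the input values to variables of the model object and reads the predicted values back. The node identifiers are assumptions; the pattern is shown only to make the scalability concern concrete, since many concurrent users would have to coordinate access to these shared variables.

```python
import asyncio
from asyncua import Client

async def predict_via_address_space():
    url = "opc.tcp://prediction-server:4840"        # hypothetical endpoint
    async with Client(url=url) as client:
        # Hypothetical input and output variables exposed under the model object.
        inputs_node = client.get_node("ns=2;s=Models.SpectralModelA.Inputs")
        outputs_node = client.get_node("ns=2;s=Models.SpectralModelA.PredictedValues")
        # Write the input values into the shared Address Space ...
        await inputs_node.write_value([0.1, 0.2, 0.3])
        # ... and read the predicted values back after the server has evaluated them.
        # With many concurrent clients these shared variables require external
        # synchronization, which is why the method call interface scales better.
        print(await outputs_node.read_value())

asyncio.run(predict_via_address_space())
```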