EnOS Edge provides an edge computing service that performs the pre-processing and calculations required for measurement points.
EnOS Edge provides built-in, frequently used IoT stream analytics algorithms, with which developers can develop and maintain stream analytics tasks by configuring simple templates. EnOS Edge also provides a series of encapsulated StreamSets for developing custom stream analytics tasks that meet complicated business requirements.
In general, data is a series of distributed events. If you plot the events on a coordinate axis, they form an event/data flow. Unlike traditional offline data, a data flow is generated by continuous data sources, and its size is usually smaller than that of offline data. Common examples of data flows include data from devices connected to a data center, device telemetry data, and log data generated by a mobile app or web app.
The edge computing service can be used for the following scenarios.
- Aggregating raw asset data, where you need to filter the raw device data with a specific algorithm and save the aggregated result for further analysis.
- Calculating the device state, where you need to obtain the state parameters of a device by stream analytics for maintenance purposes.
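As a concrete illustration of the second scenario, the sketch below derives a device state from a stream of measurement points. This is a minimal, hypothetical example; the class name, window size, and threshold are assumptions for illustration, not EnOS Edge APIs.

```python
from collections import deque

class DeviceStateTask:
    """Hypothetical stream analytics task: keeps a rolling window of a
    device's temperature readings and derives a simple health state."""

    def __init__(self, window_size=5, overheat_threshold=80.0):
        self.window = deque(maxlen=window_size)
        self.overheat_threshold = overheat_threshold

    def on_point(self, value):
        # Each incoming measurement point updates the rolling window,
        # and the state is recomputed from the window average.
        self.window.append(value)
        avg = sum(self.window) / len(self.window)
        return "OVERHEAT" if avg > self.overheat_threshold else "NORMAL"

task = DeviceStateTask(window_size=3, overheat_threshold=80.0)
states = [task.on_point(v) for v in [70.0, 75.0, 90.0, 95.0, 99.0]]
print(states)  # ['NORMAL', 'NORMAL', 'NORMAL', 'OVERHEAT', 'OVERHEAT']
```

A maintenance workflow could subscribe to the derived state instead of the raw telemetry, which is the point of computing device state at the edge.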
Continuous Real-time Data Flow
The data that the stream analytics engine processes is real-time and continuous: the data flow is subscribed to and consumed by the stream analytics engine in chronological order. The generated data is also continuous, so it can be continuously integrated into the stream analytics system.
Continuous and Efficient Calculation
The stream analytics service is triggered by events, which form the continuous data flow described above. Each time a new piece of data arrives in the stream analytics system, EnOS Edge initiates the stream analytics tasks.
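The event-triggered model can be sketched as a dispatcher that runs every registered task whenever a new data point arrives. The names below (`StreamEngine`, `register`, `on_event`) are illustrative assumptions, not EnOS Edge identifiers.

```python
class StreamEngine:
    """Minimal sketch of event-triggered stream processing: every new
    data point dispatched to the engine runs all registered tasks,
    rather than waiting for a batch schedule."""

    def __init__(self):
        self.tasks = []
        self.results = []

    def register(self, task):
        self.tasks.append(task)

    def on_event(self, point):
        # The arrival of a new piece of data is itself the trigger.
        for task in self.tasks:
            self.results.append(task(point))

engine = StreamEngine()
engine.register(lambda p: p * 1.8 + 32)  # e.g. Celsius -> Fahrenheit
for value in [0.0, 100.0]:
    engine.on_event(value)
print(engine.results)  # [32.0, 212.0]
```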
Real-time Data Flow Integration
The calculated result of a stream analytics task is stored in the target data storage according to the pre-configured policies.
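A storage policy of this kind can be pictured as a mapping from each result type to a target store. The policy table, metric names, and store names below are all hypothetical, chosen only to show the routing idea.

```python
# Hypothetical pre-configured storage policies: route each task
# result to a target store based on its metric name.
POLICIES = {"temp_avg": "tsdb", "raw_log": "file"}

stores = {"tsdb": [], "file": []}

def store_result(metric, value, policies=POLICIES):
    # Fall back to the time series database when no policy matches.
    target = policies.get(metric, "tsdb")
    stores[target].append((metric, value))

store_result("temp_avg", 21.5)
store_result("raw_log", "boot ok")
```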
Stream Analytics Workflow
Raw Data Processing
The protocol program collects data points from the target devices and converts them into measurement point data. The data is then sent to the edge computing modules, where it is filtered by pre-configured thresholds. The data filtered out by the thresholds is processed by the interpolation algorithm.
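The threshold-and-interpolation step can be sketched as follows. This is a simplified assumption about the algorithm: out-of-range readings are replaced by the average of their two valid neighbours, and readings at the edges of the series are left unchanged. The actual interpolation algorithm used by EnOS Edge may differ.

```python
def filter_and_interpolate(points, low, high):
    """Sketch: drop readings outside [low, high] and replace each
    dropped reading with the linear interpolation (midpoint) of its
    two valid neighbours; edge positions are left as-is."""
    valid = [low <= p <= high for p in points]
    result = list(points)
    for i, ok in enumerate(valid):
        if not ok and 0 < i < len(points) - 1 and valid[i - 1] and valid[i + 1]:
            result[i] = (points[i - 1] + points[i + 1]) / 2
    return result

print(filter_and_interpolate([10.0, 12.0, 500.0, 14.0], low=0.0, high=100.0))
# [10.0, 12.0, 13.0, 14.0]
```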
The filtered data is aggregated using the algorithm defined in the data processing policy.
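One common aggregation algorithm a data processing policy might define is averaging over fixed-size (tumbling) windows, sketched below. The function name and window scheme are illustrative assumptions, not the policy's actual configuration options.

```python
def tumbling_average(points, window_size):
    """Sketch of a simple aggregation: average each fixed-size
    (tumbling) window of the filtered measurements.  A trailing
    partial window is averaged over its actual length."""
    return [
        sum(points[i:i + window_size]) / len(points[i:i + window_size])
        for i in range(0, len(points), window_size)
    ]

print(tumbling_average([1.0, 3.0, 5.0, 7.0, 9.0], window_size=2))
# [2.0, 6.0, 9.0]
```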
The aggregated result flows into the downstream modules and is stored in the time series database (TSDB) or other target storages as per the pre-configured policies. Users can query the stored data by calling APIs.