While it’s commonly agreed that data is the currency of the IoT, there’s no ready-to-cook recipe on how this journey from data to value works, especially in the industrial IoT (IIoT) domain, characterized not only by the high number of dispersed and diverse devices, but also by the high volumes of non-homogeneous device data.
IIoT demands both timeliness of such device data on the operational technology (OT) side of the system and exploitation of the real-time awareness stemming from the aggregation and analysis of these various data streams at the edges of the IIoT system. To ultimately create business value, best-in-class edge and/or cloud-based information technology (IT) exploits this awareness and turns it into actionable insights for new business models, optimized asset utilization, employee productivity, etc.
Developing business-critical IIoT systems is challenging, as 'multi-dimensional engineering' is required to address often conflicting requirements with regard to complexity, scalability, timeliness, security, maintainability and evolvability.
As there's no out-of-the-box solution for building your IIoT system—addressing your business goals and exploiting your devices interconnected by your OT/IT infrastructure—a generic approach to the system architecture is needed that balances the required degrees of freedom with regard to (1) exploiting specific devices, (2) transforming their specific data into common awareness (aka information) and (3) allowing for real-time control of the system's assets based on best-in-class edge and/or cloud-based information technologies.
IIoT systems require a robust, secure architecture that stands the test of time while also addressing:
- Connectivity (for dispersed devices, edge-based real-time analytics-processing & control, cloud-based information technologies)
- Syntactic interoperability (common/understood awareness allowing for edge and/or cloud-based processing leading to actionable insights and value)
- Ubiquitous and fault-tolerant data availability (allowing the exploitation of any device data to create real-time awareness of assets and shared awareness/information anywhere, whether at the edge or in any private/public cloud)
- Unconstrained innovation (incremental change with regard to supporting devices, new target platforms, new engines for visualization/analytics, new control/optimization strategies, etc. without jeopardizing the currently running system)
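To make the syntactic-interoperability requirement above concrete, the sketch below (an illustrative example, not ADLINK code; all field names are assumptions) shows how heterogeneous device payloads can be mapped onto one common observation schema, so that edge or cloud processing can consume any device's data uniformly:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Observation:
    """Common schema shared by all producers and consumers."""
    device_id: str
    metric: str        # e.g. "temperature"
    value: float       # SI units after normalization
    unit: str
    timestamp: str     # ISO-8601, UTC

def normalize(raw: dict) -> Observation:
    """Map vendor-specific payloads onto the common schema.

    Hypothetical vendor formats: vendor A reports Fahrenheit under
    'temp_f', vendor B reports Celsius under 'temperature'.
    """
    if "temp_f" in raw:
        value = (raw["temp_f"] - 32) * 5 / 9   # vendor A: Fahrenheit
    else:
        value = raw["temperature"]             # vendor B: Celsius
    return Observation(
        device_id=raw["id"],
        metric="temperature",
        value=round(value, 2),
        unit="degC",
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

print(normalize({"id": "a1", "temp_f": 212.0}).value)       # 100.0
print(normalize({"id": "b7", "temperature": 21.5}).value)   # 21.5
```

Once every producer emits the common schema, downstream analytics need no per-vendor parsing logic, which is precisely what turns raw device data into "common/understood awareness."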
Moving Towards a Solution
To overcome the identified challenges of critical end-to-end IIoT implementations, the ADLINK Data River™ exploits the Industrial Internet Consortium (IIC) multi-tier reference architecture, supported by our Vortex DDS standards-based connectivity and data-sharing framework. The ADLINK Data River™ can be characterized by the following viewpoints.
All data, when aggregated over time and across multiple things, provides the basic awareness of the system and its behavior, and ultimately enables business-value creation by improving efficiencies or even creating new revenue streams. From the data perspective, this is realized through:
- Data Centricity: Modeling the system in data (devices/observations, awareness/analytics) and making such data ubiquitous, e.g., discoverable and available when/where it’s needed. Applying an architecture and underlying infrastructure that treats data as a first-class asset.
- Layered Data Bus Architecture: Self-forming, logical data planes at the machine (subsystem), system (factory), system-of-systems (site) or Internet (inter-site) level. System complexity is reduced by QoS-aware data management (timely distribution and robust availability) handled outside of, and transparently to, the applications that produce and/or consume data. System flexibility is maximized by distinguishing between different planes of data, including raw device or I/O data, normalized and filtered awareness data and actionable business data.
- Common Data Model for Things: The common thing model describes all things in the system, allowing the system to be captured as a hierarchy of things ranging from a simple sensor up to a complex, multi-machine system, and eventually up to assets distributed across the globe. This model takes into account metadata and various forms of descriptive data, such as state, telemetry and event data.
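The common thing model described above can be sketched as a single recursive structure, so a sensor, a machine and an entire site all share one representation. This is an illustrative sketch; the field names (`metadata`, `state`, `telemetry`, `events`) follow the text but are not ADLINK's actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Thing:
    """One node in the thing hierarchy: sensor, machine, site, ..."""
    thing_id: str
    kind: str
    metadata: dict = field(default_factory=dict)        # descriptive attributes
    state: dict = field(default_factory=dict)           # last-known values
    telemetry: List[str] = field(default_factory=list)  # streamed metrics
    events: List[str] = field(default_factory=list)     # discrete occurrences
    children: List["Thing"] = field(default_factory=list)

    def find(self, thing_id: str) -> Optional["Thing"]:
        """Depth-first lookup anywhere in the hierarchy."""
        if self.thing_id == thing_id:
            return self
        for child in self.children:
            hit = child.find(thing_id)
            if hit:
                return hit
        return None

# A sensor nested in a machine nested in a site -- one model throughout.
temp = Thing("press1/temp", "sensor", telemetry=["temperature"])
press = Thing("press1", "machine", children=[temp])
site = Thing("site-berlin", "site", metadata={"region": "EU"}, children=[press])
print(site.find("press1/temp").kind)   # sensor
```

Because every level of the hierarchy is the same type, tooling written against the model (discovery, visualization, analytics) works unchanged from a single sensor up to globally distributed assets.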
The Data River reduces the complexity of distributed processing, providing the right data at the right time in the right place with the right compute to make the right decisions. From a processing perspective, this is achieved by:
- Digital Twin Modeling: Exploiting virtual projections of real-world assets in the Data River to provide real-time control over asset behavior and to allow physical equipment to be replaced by simulators, either ahead of the equipment's actual availability or to replay earlier stored data for training and analysis.
- Distributed Processing: A dynamic set of applications or micro-services with maximum autonomy and minimal dependencies with the end goal of providing derived-value and business insights. Features include real-time data exchange and discovery of both data and users to enable spontaneous integration into a running system and QoS-driven data management for scalable, robust and real-time composition of the IIoT system components.
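The digital-twin substitution described above can be sketched as follows: consumers read asset state from a twin object, so the underlying feed can be a live device, a simulator, or a replay of recorded data without the consumer changing. This is a minimal illustrative sketch, not the Data River API; all names are assumptions:

```python
import random

class DigitalTwin:
    """Virtual projection of a real-world asset; tracks last observed state."""
    def __init__(self, asset_id, source):
        self.asset_id = asset_id
        self.source = source       # any callable yielding a reading dict
        self.state = {}

    def refresh(self):
        """Pull one reading from the current source into the twin's state."""
        self.state.update(self.source())
        return self.state

def simulated_feed():
    """Stand-in for a device that is not yet physically available."""
    return {"rpm": 1500 + random.randint(-50, 50), "status": "ok"}

def recorded_replay(samples):
    """Replay earlier stored data, one sample per refresh."""
    it = iter(samples)
    return lambda: next(it)

twin = DigitalTwin("motor-42", simulated_feed)
print(twin.refresh()["status"])    # ok

# Swap the source for earlier stored data -- consumer code is unchanged.
twin.source = recorded_replay([{"rpm": 1480, "status": "ok"}])
print(twin.refresh()["rpm"])       # 1480
```

The design choice this illustrates is indirection through the twin: because consumers never talk to the device directly, simulation, replay and live operation are interchangeable, which is what enables training and analysis before physical equipment exists.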