No matter where you are in your digital transformation process, connecting automated machines or systems for any kind of Industry 4.0 or Internet of Things integration or analysis requires data standardization. The reason is that data from these various systems is created in many different ways. To make each system’s data understandable, transferable, and digestible in real time by any other system, the originating system’s data must be standardized.
Aja adds that “UNS is not only here to stay but is becoming the default deployment option. The majority of our projects in the past several years have included partial or complete utilization of a UNS.”
While a UNS serves as the critical data connector, it is, at its core, an architecture. How the data within it is handled can vary considerably depending on how the UNS is built.
MQTT broker
Over the past several years, MQTT has become a preferred method for making plant floor data accessible to multiple sources in a way that does not impact the performance of the plant’s equipment. According to Harrington, the most frequently deployed UNS architecture involves building it in an MQTT broker. In this approach, data from multiple systems is published to the UNS via a topic structure in the MQTT broker. This basic structure usually follows the broad ISA-95 outline of company, site, area, line, work cell, and asset.
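To make the topic structure concrete, the sketch below builds an ISA-95-style topic path and shows how a tag value might be published to it. All names (the company, site, and asset identifiers, the broker address) are illustrative assumptions, and the commented-out publish step assumes the Eclipse Paho MQTT client and a reachable broker; this is not a specific vendor's configuration.

```python
# Sketch: publishing plant-floor data to a UNS topic that follows the
# broad ISA-95 hierarchy of company/site/area/line/work cell/asset.
from dataclasses import dataclass

@dataclass
class Isa95Path:
    company: str
    site: str
    area: str
    line: str
    work_cell: str
    asset: str

    def topic(self) -> str:
        # Join the hierarchy levels into an MQTT topic string.
        return "/".join([self.company, self.site, self.area,
                         self.line, self.work_cell, self.asset])

path = Isa95Path("acme", "dallas", "packaging", "line-2", "cell-4", "filler-1")
print(path.topic())  # acme/dallas/packaging/line-2/cell-4/filler-1

# With an MQTT client (assumption: paho-mqtt installed, broker reachable):
# import paho.mqtt.client as mqtt
# client = mqtt.Client()
# client.connect("broker.local", 1883)
# client.publish(path.topic() + "/temperature", payload='{"value": 72.4}')
```

Because every publisher follows the same hierarchy, any subscriber can use topic wildcards to pick up, say, every asset in a site without knowing the publishers individually.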
“The Intelligence Hub can consume data from multiple sources, assemble the data into logical payloads that follow ISA-95 structure or other standards, and publish the payload to the UNS,” he says. “The Intelligence Hub can also be used to subscribe to data in the UNS and then publish it to systems via MQTT or through REST APIs, SQL databases, or OPC namespaces for those systems that don’t communicate over MQTT.”
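The "assemble the data into logical payloads" step Harrington describes can be pictured as merging raw tags from separate sources into one contextualized object before it is published. The sketch below is a generic illustration, not Intelligence Hub configuration; the source systems, tag names, and field layout are all assumptions.

```python
# Sketch: combining raw PLC tags and MES context into a single
# ISA-95-shaped payload ready to publish to the UNS.
import json

# Hypothetical raw inputs from two different systems.
plc_tags = {"temp_f": 72.4, "rpm": 1180}
mes_data = {"work_order": "WO-1042", "product": "SKU-77"}

payload = {
    "enterprise": "acme",
    "site": "dallas",
    "asset": "filler-1",
    "state": {
        "temperature_f": plc_tags["temp_f"],
        "speed_rpm": plc_tags["rpm"],
    },
    # MES context attached to the machine state, so consumers get
    # the "what is being made" alongside the "how the machine is running."
    "context": mes_data,
}
print(json.dumps(payload))
```

The value of this standardization is that a subscriber sees one self-describing object rather than having to know each source system's native tag names.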
Data lakes
Rather than adjust existing system integrations, some companies prefer to send their data to a cloud-based data lake for use by analytics or dashboard visualization applications. Even here, data standardization and contextualization are still required because of the varied origins of the data, as noted above. In this case, Intelligence Hub can serve as a UNS gateway. “Models in the Intelligence Hub consolidate and standardize data into contextualized information objects, while flows in the Intelligence Hub target data to the specific location in the data lake or cloud broker where they are needed,” explains Harrington.
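The "flows target data to the specific location" idea can be sketched as a small routing function that maps a contextualized object to a partitioned data-lake key. The path convention and field names below are assumptions for illustration, not a product's actual flow syntax.

```python
# Sketch: routing a contextualized object to a partitioned data-lake key.
from datetime import datetime, timezone

def lake_key(obj: dict, ts: datetime) -> str:
    # Partition by site/asset/date so downstream analytics engines
    # can prune their scans to the partitions they need.
    return (f"uns/{obj['site']}/{obj['asset']}/"
            f"{ts:%Y/%m/%d}/payload.json")

obj = {"site": "dallas", "asset": "filler-1"}
print(lake_key(obj, datetime(2024, 5, 1, tzinfo=timezone.utc)))
# uns/dallas/filler-1/2024/05/01/payload.json
```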
Integration hub
This is the type of UNS implementation Aja references above. Here, the focus is not on creating a central broker but on integration, which helps explain the interest from system integrators, as it standardizes data for transfer among systems.
To further clarify why this hub-and-spoke approach is preferred to other methods of data centralization and transference for integration projects, Aja says, “Data historians, for example, often involve a complex series of drivers and connectors to roll the data up into the centralized storage, whereas the UNS model allows for a single connection to the UNS to collect all the data. The reporting platform can then query the UNS, which knows to query the data historian (and other databases if required) to return the desired data. This helps eliminate the complexity, points of failure, and maintenance costs required to keep vast amounts of connectors updated and operable.”
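The hub-and-spoke pattern Aja describes can be sketched as a single query interface that knows which backend holds each kind of data, so the reporting platform never maintains its own connectors to the historian or other databases. The backends below are stubbed as dictionaries, and every system and metric name is an illustrative assumption.

```python
# Sketch: one UNS query entry point dispatching to stubbed backends,
# replacing many point-to-point reporting connectors.

# Hypothetical backends, stubbed as in-memory lookups.
backends = {
    "historian": {"acme/dallas/filler-1/temperature": [71.9, 72.4]},
    "quality_db": {"acme/dallas/filler-1/defects": [0, 1]},
}

# One routing table maps each metric to the system that owns it.
routes = {
    "temperature": "historian",
    "defects": "quality_db",
}

def query_uns(topic: str):
    metric = topic.rsplit("/", 1)[-1]
    backend = routes[metric]  # the UNS, not the client, decides where to look
    return backends[backend][topic]

print(query_uns("acme/dallas/filler-1/temperature"))  # [71.9, 72.4]
```

When a backend changes, only the routing table is updated, which is the reduction in connector maintenance and points of failure the quote describes.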