Digital twin technology has garnered a great deal of attention recently for its ability to digitally render all the components of a machine, device, or system based on direct data inputs from its real-world physical counterpart. Benefits of digital twin technology include the ability to:
- Deliver early warning signs of issues indicating the need for maintenance before unplanned downtime occurs;
- Test equipment/system capabilities under differing operational parameters before implementing changes on the plant floor;
- Commission plant floor equipment virtually; and
- Apply insights for research and development.
Now we’re beginning to see a new use of digital twin technology in which it is applied to broader sets of asset-related data and information, i.e., the information digital twin.
To better understand the information digital twin and its potential impact on the process industry and industrial manufacturing operations, we connected with Sean Gregerson, vice president of asset performance management at Aveva, for a recent episode of the “Automation World Gets Your Questions Answered” podcast series.
Gregerson noted that a major driver behind the development of information digital twin technology is the fact that most manufacturing and processing companies still have a disconnected data environment in their plants where “we have these siloed functional islands of data connectivity and communication, as well as divisional and stakeholder silos across engineering, operations, and maintenance.”
As a result, people at all levels of industrial business operations “still struggle to find the information they need to make timely, informed, and accurate decisions [which contributes to] asset failures, an inability to always deliver on commitments to customers, loss of profits, and safety incidents,” he said.
An information digital twin can solve a lot of these problems, Gregerson said. “You construct the information digital twin by taking all the information you have about your industrial assets today—the design information, the operations information, the commissioning information, the asset management and financial information—and fuse this together into an information data model. Then you link that information data model back to the physical asset itself in 3D in the context of its connectivity within the plant. This allows for more informed, more timely, and more accurate decisions to be made.”
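To make the idea concrete, here is a minimal Python sketch of how siloed asset records might be fused into a single information data model keyed to the physical asset. The class and field names are illustrative assumptions, not Aveva's actual data model.

```python
# Minimal sketch (hypothetical names) of fusing siloed asset records
# into one information data model keyed to the physical asset.
from dataclasses import dataclass, field

@dataclass
class AssetInformationModel:
    """One consolidated record per physical asset."""
    asset_id: str                                      # links back to the physical asset / 3D model
    design: dict = field(default_factory=dict)         # drawings, specs, P&IDs
    operations: dict = field(default_factory=dict)     # process data, setpoints
    commissioning: dict = field(default_factory=dict)  # test and handover records
    maintenance: dict = field(default_factory=dict)    # work orders, history
    financial: dict = field(default_factory=dict)      # cost, depreciation

def fuse(asset_id: str, *sources: dict) -> AssetInformationModel:
    """Merge records from each functional silo into one model."""
    model = AssetInformationModel(asset_id=asset_id)
    for record in sources:
        # Each silo tags its payload with the section it belongs to,
        # e.g. {"section": "design", "data": {...}}.
        merged = {**getattr(model, record["section"]), **record["data"]}
        setattr(model, record["section"], merged)
    return model

# Example: fuse engineering and maintenance records for one pump.
pump_twin = fuse(
    "PUMP-101",
    {"section": "design", "data": {"rated_flow_m3h": 120}},
    {"section": "maintenance", "data": {"last_overhaul": "2023-04-12"}},
)
print(pump_twin.design, pump_twin.maintenance)
```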
Data democratization
A term closely associated with the information digital twin and Industry 4.0 in general is data democratization. Referencing recent research on industrial data, Gregerson said 50% of all the industrial data available today has been created in the last two years. And researchers estimate that 96 zettabytes of data have been captured, copied, and consumed in the last 12 months alone.
Though software is helping industry improve its use of data via data democratization, there's still a long way to go, Gregerson said. “Unfortunately, simply collecting more data does not translate to better decision making or improved profitability. A recent study by Seagate indicates that only 32% of all the industrial data we have available today is actually being put to work. And it's reasonable to expect, based on how quickly the amount of data that we have available is continuously growing, that this amount of unleveraged data will continue growing in the same way, unless we take some decisive steps to apply the advanced technologies we have available today.”
Defining data use
To shrink this gap between used and unused industrial data, Gregerson said it’s important for industrial users not to apply technology for technology’s sake.
“We see a lot of industrial operators building things like data lakes in the cloud to store data, but not first defining how that data is going to be used or who is going to use it,” he said. “How can that data be transformed into something that's meaningful for the consumer of the information if you don’t know who they are? Similarly, we see some industrial operators deploying AI platforms that are touted to solve any and every problem without first understanding what problems they're trying to solve and how that translates into something that's meaningful for a reliability engineer or a performance engineer.”
In addition to these factors, Gregerson pointed out that industrial companies are still not very good at sharing data. A study conducted by Gartner shows that industrial operators who do share information within their ecosystem of partners, suppliers, OEMs, and customers receive 3x the economic benefit of those that do not share information.
Dominion Energy’s experience
Providing an example of a company reaping the benefits of better data analysis and sharing, Gregerson pointed to Dominion Energy, a supplier of electricity and natural gas with operations across 16 states in the U.S.
He explained that Dominion Energy is currently transforming its energy generation operations and adding more renewables. As part of this transformation, the company has developed a new service for its residential and industrial customers that allows them to understand how their energy is being sourced and what their energy consumption patterns are.
A key technology enabling Dominion Energy to provide this information is the Aveva PI System, which collects operational data from Dominion’s assets at the edge.
“The PI System archives this information in high fidelity—second by second—compresses it, and then structures it through an asset information model that organizes and contextualizes the data so that it's consumable by all the applications and users connected to it. The contextualization of the data at the source level helps users understand what this information is and how it can be used to make better decisions,” said Gregerson.
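As a rough illustration of what that contextualization can look like (not the PI System's actual API), the following Python sketch arranges second-by-second tag samples under an asset hierarchy so every reading carries the context of the equipment it describes. All names are hypothetical.

```python
# Illustrative sketch (not the Aveva PI System API) of contextualizing
# raw second-by-second tag data through an asset hierarchy.
from collections import defaultdict
from datetime import datetime

class AssetNode:
    """A node in the asset model, e.g. Plant -> Unit -> Turbine -> tag."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.tags = defaultdict(list)   # tag name -> list of (timestamp, value)

    def record(self, tag, value, ts=None):
        """Archive one high-fidelity sample for a tag on this asset."""
        self.tags[tag].append((ts or datetime.utcnow(), value))

    def path(self):
        """Full context path, so consumers know what the data describes."""
        return self.name if self.parent is None else f"{self.parent.path()}/{self.name}"

# Hypothetical hierarchy and reading, for illustration only.
plant = AssetNode("GenerationPlant")
turbine = AssetNode("Turbine-7", parent=plant)
turbine.record("shaft_speed_rpm", 3598.2)
print(turbine.path(), dict(turbine.tags))
```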
As part of the PI System, Dominion Energy has also implemented self-service analytics and event management. Gregerson said self-service analytics allows the company to track any KPIs (key performance indicators) or analytics they need to drive more informed and timely decisions across the business. And with event management, Dominion can create events that respond to specific conditions. “For example, if they want to notify a group of people by email or text when a certain event occurs, or automatically trigger a work order in the enterprise asset management system based on a certain condition being met,” he said.
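A simplified, hypothetical sketch of this kind of event management is shown below: each rule pairs a condition with an action, such as notifying a team or raising a work order when a monitored value crosses a threshold. The function and system names are placeholders, not Aveva's event-management API.

```python
# Hedged sketch of condition-based event management: when a monitored
# value crosses a threshold, notify people or raise a work order.
from typing import Callable

class EventRule:
    def __init__(self, name: str, condition: Callable[[float], bool],
                 action: Callable[[str, float], None]):
        self.name = name
        self.condition = condition
        self.action = action

    def evaluate(self, value: float):
        if self.condition(value):
            self.action(self.name, value)

def notify_team(rule_name: str, value: float):
    # Placeholder for an email/SMS notification integration.
    print(f"ALERT [{rule_name}]: value {value} breached the limit; notifying on-call team")

def create_work_order(rule_name: str, value: float):
    # Placeholder for a call into an enterprise asset management system.
    print(f"Work order created by rule {rule_name} at reading {value}")

rules = [
    EventRule("bearing-temp-high", lambda v: v > 85.0, notify_team),
    EventRule("bearing-temp-critical", lambda v: v > 95.0, create_work_order),
]

for reading in (82.1, 88.4, 97.0):      # simulated sensor readings
    for rule in rules:
        rule.evaluate(reading)
```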
On top of these data collection, analysis, and communication layers is a “rich visualization layer that brings all this information together and puts it in the context of the consumer of the information and their role within the business,” Gregerson said. And these visual representations are automatically updated whenever new assets are added to the system.
Another step taken by Dominion Energy to better manage and understand all the data it collects is the use of Aveva Data Hub. Gregerson explained that Aveva Data Hub takes data collected at the edge and securely transports it to the cloud, where the data is further contextualized.
“This information is then made available to Dominion’s customers in a very secure way and on a selective basis,” said Gregerson. Companies can now selectively make this type of information available within their own ecosystem of partners, suppliers, and OEMs, who can then take that information and translate it back into some value proposition for their business.
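Conceptually, that selective sharing amounts to attaching access scopes to contextualized datasets so each partner sees only what it is entitled to. The sketch below is a generic illustration under that assumption, not a depiction of how Aveva Data Hub implements access control; dataset names and scopes are invented.

```python
# Generic sketch of selective data sharing: each dataset carries an access
# scope, and partners only receive datasets their role is entitled to read.
from dataclasses import dataclass

@dataclass
class SharedDataset:
    name: str
    scope: str          # e.g. "internal", "suppliers", "customers"
    payload: dict

# Which scopes each class of ecosystem partner is allowed to read.
ENTITLEMENTS = {
    "supplier": {"suppliers"},
    "customer": {"customers"},
    "internal_engineer": {"internal", "suppliers", "customers"},
}

def visible_datasets(partner_role: str, catalog: list) -> list:
    """Return only the datasets this partner role is entitled to see."""
    allowed = ENTITLEMENTS.get(partner_role, set())
    return [ds for ds in catalog if ds.scope in allowed]

catalog = [
    SharedDataset("energy_sourcing_mix", "customers", {"renewables_pct": 38}),
    SharedDataset("turbine_vibration_raw", "internal", {"rms_mm_s": 2.1}),
]
print([ds.name for ds in visible_datasets("customer", catalog)])
```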