Is the digital twin ready for widespread adoption?

The technology world is abuzz with talk of the digital twin. Gartner identified the digital twin as one of its top 10 strategic technology trends for 2017. Not only have GE and Siemens committed massive investments in this arena, but they are publicly tying their future earnings to the success of these initiatives.

What is a digital twin? In simple terms, a digital twin is a virtual simulation of a physical asset built from 3D CAD and physical models. Real-time sensor data is streamed from the physical equipment to its virtual counterpart so that the model stays “live.” In its early days, some of the more enthusiastic industry analysts considered the digital twin to be the Holy Grail of predictive asset maintenance.
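To make the “live” link concrete, the following is a minimal Python sketch of a twin object ingesting streamed sensor readings; the asset ID and sensor fields are invented for illustration, not drawn from any specific product.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class DigitalTwin:
        """A minimal virtual counterpart of one physical asset."""
        asset_id: str
        state: dict = field(default_factory=dict)
        last_update: Optional[datetime] = None

        def ingest(self, reading: dict) -> None:
            """Merge a streamed sensor reading into the live virtual state."""
            self.state.update(reading)
            self.last_update = datetime.now(timezone.utc)

    # Stream one reading from the physical pump to its virtual twin.
    twin = DigitalTwin(asset_id="pump-17")
    twin.ingest({"vibration_mm_s": 4.2, "bearing_temp_c": 71.5})
    print(twin.state, twin.last_update)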

Although the digital twin has tremendous potential, there are some factors that are likely to limit its widespread integration into the industrial sphere.

The digital twin is most appealing (and is likely to have the biggest financial impact) for mass-manufactured, high-ticket items. Let us take the automobile industry as an example. A vehicle’s digital twin can use sensors to track electrochemical changes in the vehicle and transmit this data to a service center. Depending on the scenario, repairs can be performed remotely or scheduled proactively.
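As a rough sketch of how such triage might work, consider the snippet below; the field names, fault categories, and 20 percent threshold are assumptions made for illustration only.

    def triage_fault(reading: dict) -> str:
        """Route a fault reported by a vehicle's digital twin.

        Field names and thresholds are illustrative assumptions.
        """
        if reading.get("fault_type") == "software":
            return "push remote fix"                       # repair remotely
        if reading.get("cell_degradation_pct", 0.0) > 20.0:
            return "schedule proactive service visit"      # repair proactively
        return "continue monitoring"

    print(triage_fault({"fault_type": "software"}))        # -> push remote fix
    print(triage_fault({"cell_degradation_pct": 27.5}))    # -> schedule proactive service visit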

In the automobile scenario, each new car model is designed with a single digital twin. Tesla, for instance, maintains a digital twin for every Vehicle Identification Number (VIN) it manufactures, and data flows continuously from the car to the factory. If there is a problem with a car’s electrical system, Tesla can push software updates to the customer’s car and fix the problem remotely. Creating the computerized (or digital) car model and deploying a machine-learning analysis tool is a one-time exercise for each new model. There may be some level of customization or calibration for each individual VIN, but the bulk of the work is done at the model level.
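One way to picture this division of labor is a model-level twin template that each VIN instantiates cheaply. The sketch below is hypothetical; the class, parameter names, and values are invented for illustration.

    class ModelLevelTwin:
        """Hypothetical template: the expensive modeling work happens
        once per car model, not once per car."""

        def __init__(self, model_name: str, base_params: dict):
            self.model_name = model_name
            self.base_params = base_params    # built once per model

        def instantiate(self, vin: str, calibration: dict = None) -> dict:
            # Per-VIN work is only a light override of model-level parameters.
            params = {**self.base_params, **(calibration or {})}
            return {"vin": vin, "model": self.model_name, "params": params}

    template = ModelLevelTwin("sedan-2017", {"battery_kwh": 75, "motor_kw": 192})
    vin_twin = template.instantiate("VIN-000123", {"battery_kwh": 72.5})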

In the case of an industrial plant, it is much harder to scale the deployment of a digital twin because it requires 3D and physical process modeling of a heterogeneous manufacturing facility, and no two facilities are the same. This modeling also requires access to accurate facility blueprints; if alterations were made to the facility but the blueprints were not updated, the digital twin will be inaccurate.

Similarly, industrial plants do not source from a single OEM supplier. For instance, an oil refinery has multiple sourcing options for gas processing, vacuum distillation and hydrotreating. Even within the same facility, as equipment is retired, its replacement can come from a new OEM. Procurement is typically decentralized and different organizations mandate different levels of standardization: in some cases, the purchase of new equipment is made at the discretion of the local facility, whereas some companies negotiate global sourcing agreements.

It is possible that, in the future, vendors will offer off-the-shelf machine assets with embedded digital twins. But we are stuck in the present. Today, the problem is that without significant investment in creating virtual models from machine blueprints, it is difficult to realize the value of a digital twin.

The good news is that the Gartner report mentioned above contains a second trend for 2017 that warrants investigation: alongside the digital twin, AI and advanced machine learning also top Gartner’s list of strategic technology trends. In other words, even if manufacturing facilities cannot yet tap into the potential of the digital twin, applying advanced machine learning to predictive asset maintenance is already accessible.
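To illustrate the distinction, here is a minimal sketch of machine-learning-based anomaly detection that works directly on historical sensor data, with no 3D plant model; it uses synthetic data and scikit-learn’s IsolationForest as a stand-in for a production pipeline.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Synthetic "healthy" history: temperature (C) and vibration (mm/s).
    rng = np.random.default_rng(0)
    healthy = rng.normal(loc=[50.0, 3.0], scale=[2.0, 0.3], size=(500, 2))

    # Learn what normal operation looks like from raw sensor history.
    detector = IsolationForest(contamination=0.01, random_state=0).fit(healthy)

    new_readings = np.array([[50.5, 3.1],    # within normal operating range
                             [68.0, 7.5]])   # drifting toward failure
    print(detector.predict(new_readings))    # 1 = normal, -1 = anomaly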

Eitan Vesely is CEO of Presenso.


Comments

  • Process unit-level process models are already available in some plants, and the process data is already being historized, so this level of digital twin is already in use in some process plants, for example in process simulation and optimization. The next step is a finer-grained digital twin, down to individual pieces of process equipment such as pumps, compressors, fans, blowers, cooling towers, heat exchangers, and air-cooled heat exchangers. Plants are therefore being instrumented to a much greater degree, with many more sensors collecting multiple data points on each piece of equipment, enabling an equipment-level digital twin. The historian should be able to logically group data such that all data associated with a plant area, process unit, or piece of equipment is grouped together. Learn more from this essay: https://www.linkedin.com/pulse/iiot-platform-middleware-integrate-dont-duplicate-decimate-berge
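    A hypothetical Python sketch of the logical grouping described above, with historian tags organized by plant area, process unit, and piece of equipment (all names invented):

        from collections import defaultdict

        # area -> unit -> equipment -> list of historian tags
        hierarchy = defaultdict(lambda: defaultdict(lambda: defaultdict(list)))

        def register_tag(area: str, unit: str, equipment: str, tag: str) -> None:
            hierarchy[area][unit][equipment].append(tag)

        register_tag("Area-1", "CrudeUnit", "P-101", "P-101.vibration")
        register_tag("Area-1", "CrudeUnit", "P-101", "P-101.bearing_temp")

        # Pull every tag associated with one piece of equipment:
        print(hierarchy["Area-1"]["CrudeUnit"]["P-101"])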
