You may not have heard of digital twins, but this intriguing example of technological convergence is actually a well-established concept. One of the first recorded uses of the idea dates to 2002, when Michael Grieves presented it at the University of Michigan, while NASA has been applying the principle to monitoring and modeling spacecraft since the ill-fated Apollo 13 mission in 1970. In recent years, however, the Internet of Things has brought the concept into the mainstream.
Real-world representations in real time
It’s important to understand that the phrase ‘digital twins’ refers to a concept, rather than a product or service. It’s unrelated to the ongoing process of digitization, where physical records are scanned into digital form so the originals can be discarded. Instead, it involves a physical asset or entity having a virtual equivalent – a replica of a product, process or service. The convergence of these physical and digital worlds creates a living model in which each side informs the other.
Some examples of digital twins:
- Wearable tech is packed with sensors, relaying data to a backend server where algorithms analyze it before generating personalized feedback.
- Modeling software creates virtual copies of real-world objects to permit stress testing, simulations, and prognostics.
- Engines generate extensive diagnostic data, enabling mechanics and engineers to identify failing components or potential improvements.
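The loop running through all three examples – sensors stream readings to the twin, software analyzes them, and advice flows back to the physical asset – can be pictured in a few lines. The class, asset name, and temperature threshold below are invented purely for illustration:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class DigitalTwin:
    """Virtual counterpart of a physical asset, updated from sensor data."""
    asset_id: str
    readings: list = field(default_factory=list)

    def ingest(self, temperature_c: float) -> None:
        # Each sensor reading updates the twin's state in near real time.
        self.readings.append(temperature_c)

    def feedback(self, limit_c: float = 90.0) -> str:
        # A simple analysis step: compare recent behavior against a threshold.
        avg = mean(self.readings[-10:])
        return "service recommended" if avg > limit_c else "operating normally"

twin = DigitalTwin("engine-42")
for reading in [85.0, 88.5, 93.0, 95.5]:
    twin.ingest(reading)
print(twin.feedback())  # prints "service recommended"
```

Real deployments replace the list with a telemetry pipeline and the threshold check with predictive models, but the shape of the loop is the same.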
Data is both the driver and the output. Digital twin data allows companies to redesign or optimize inefficient products, experiment with hypothetical scenarios, and offer better customer support. Testing is always conducted on the virtual twin first, with successful outcomes then applied to its real-world sibling. This supports consequence-free destruction testing in specialist fields like air wake modeling – establishing the wind force needed to overturn a turbine or pylon, for instance. That could drive down insurance costs or satisfy regulatory tests, while improved durability may also reduce long-term warranty and repair costs.
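The turbine example gives a sense of how such a virtual-only test might look. The sketch below uses deliberately crude rigid-body statics (tipping occurs when the wind’s overturning moment exceeds the weight’s restoring moment), and every number in it is invented – a real twin would run a far richer physics model:

```python
import math

def overturning_wind_speed(mass_kg: float, base_m: float,
                           height_m: float, face_area_m2: float) -> float:
    """Wind speed (m/s) at which this toy model predicts the structure tips.

    Simplified statics: drag force acts at mid-height, and tipping occurs
    when its moment about the base edge exceeds the restoring moment of
    the structure's own weight.
    """
    g, rho, drag_coeff = 9.81, 1.225, 1.2  # gravity, air density, flat-plate drag
    restoring = mass_kg * g * (base_m / 2)  # N*m holding the structure upright
    # drag = 0.5 * rho * Cd * A * v**2, applied at height_m / 2
    v_squared = restoring / (0.5 * rho * drag_coeff * face_area_m2 * (height_m / 2))
    return math.sqrt(v_squared)

# Run the "destruction test" on the twin -- nothing real is at risk.
print(f"{overturning_wind_speed(250_000, 20, 90, 60):.1f} m/s")
```

The value of the twin is that this experiment can be repeated across thousands of parameter combinations without touching the physical asset.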
Innovation without disruption
The principle of digital twins is now being supported by technological advances. Augmented reality overlays virtual data on physical environments, and in much the same way, digital twin data informs hardware development. The explosion in sensor usage (allied to declining unit prices and superior reporting capabilities) has also been crucial. Sensors are now capable of providing reams of feedback on both mechanical and electrical processes. Data may also be used to identify weaknesses or test what-if scenarios. Efficiency is often at the heart of these processes, supporting cost-effective innovation and dependable rollouts of new products or services.
Things aren’t what they used to be
The most significant event in the development of this concept has been the Internet of Things. This has given previously passive and offline devices the ability to engage in two-way communications with a server or database. This is normally achieved using a combination of WiFi, 4G and Bluetooth connectivity, though hardwiring is still occasionally necessary. The impending 5G rollout should streamline communications even further, giving devices a single (and always accessible) communication protocol, whether they’re inside or outside the home. Signal dropouts should become far rarer, making 5G an ideal conduit for transferring data between digital twins.
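That two-way link can be pictured as a pair of message channels: telemetry flowing from device to server, and commands flowing back. The stub below simulates the exchange in-process with plain queues (a real deployment would use a protocol such as MQTT over WiFi, 4G or 5G, and the device name, fields and threshold here are invented):

```python
from queue import Queue

telemetry, commands = Queue(), Queue()   # device -> server, server -> device

# Device side: a once-passive thermostat now reports its state upstream.
telemetry.put({"device": "thermostat-7", "temp_c": 23.4})

# Server side: the digital twin ingests the reading and replies with an instruction.
reading = telemetry.get()
if reading["temp_c"] > 22.0:
    commands.put({"device": reading["device"], "action": "cool"})

# Device side: act on the server's command.
print(commands.get())  # prints {'device': 'thermostat-7', 'action': 'cool'}
```

The point of the sketch is the symmetry: the same channel pattern works whether the endpoint is a thermostat, a turbine, or a jet engine.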
Generating reams of data doesn’t intrinsically mean a product or service has a digital twin, however. The process really begins when companies use accumulated data to deliver specific outcomes, such as greater efficiency or more durable products. It’s estimated that half of large industrial companies will have adopted digital twin processes by 2021, achieving an average 10% improvement in operating effectiveness. Indeed, manufacturing and engineering are predicted to be the industries that will benefit most from this greater visibility and insight into daily operations.