Dr Rajiv Desai

An Educational Blog

Digital Twin

Above is an image from the digital twin (DT) of BMW’s factory in Regensburg, Bavaria, created in NVIDIA’s Omniverse. There are two versions of a BMW factory in the medieval town of Regensburg, Germany. One is a physical plant that cranks out thousands of cars a year. The other is a virtual 3-D replica, accessed by screen or VR headset, in which every surface and every bit of machinery looks exactly the same as in real life. Soon, whatever is happening in the physical factory will be reflected inside the virtual one in real time: frames being dipped in paint; doors being sealed onto hinges; avatars of workers carrying machinery to its next destination. The latter factory is an example of a “digital twin”: an exact digital re-creation of an object or environment. The concept might at first seem like sci-fi babble or even a frivolous experiment: Why would you spend time and resources to create a digital version of something that already exists in the real world?  

_____

_____

Section-1  

Prologue:

For centuries, people have used pictures and models to help them tackle complex problems. Great buildings first took shape on the architect’s drawing board. Classic cars were shaped in wood and clay. Over time, our modelling capabilities have become more sophisticated. Computers have replaced pencils. 3D computer models have replaced 2D drawings. Advanced modelling systems can simulate the operation and behavior of a product as well as its geometry. Until recently, however, there remained an unbridged divide between model and reality. No two manufactured objects are ever truly identical, even if they have been built from the same set of drawings. Computer models of machines don’t evolve as parts wear out and are replaced, as fatigue accumulates in structures, or as owners make modifications to suit their changing needs. That gap is now starting to close. Fueled by developments in the internet of things (IoT), big data, artificial intelligence, cloud computing, and digital reality technologies, the recent arrival of digital twins (DTs) heralds a tipping point where the physical and digital worlds can be managed as one, and we can interact with the digital counterpart of physical things much like we would interact with physical things themselves in 3D space around us. Led by the engineering, manufacturing, automotive, and energy industries in particular, digital twins are already creating new value. They are helping companies to design, visualize, monitor, manage, and maintain their assets more effectively. And they are unlocking new business opportunities like the provision of advanced services and the generation of valuable insight from operational data. 

_

A Digital Twin is a virtual world that matches the real world in its complexity, scale, and accuracy. It’s an exact digital re-creation of an object or environment, like a road network or underground water infrastructure, and there are many things you can do in it. The ‘measure twice, cut once’ proverb in carpentry teaches that measurements should be double-checked to ensure accuracy before cutting the wood. That is, before taking any action, we must plan carefully so that we do not waste time, energy, or resources correcting mistakes. With a Digital Twin, city planners can see what would happen if they modified a city’s layout, planned a road, or changed the traffic systems. They can compute not just one possible future but many possible futures. And if it doesn’t work in the Digital Twin, it won’t work in the real world. Testing it out first means we prevent bad decisions. And the more ‘what-if’ situations we test, the more creative and effective the solution will be. It’s how we measure twice and cut once. Digital twins play the same role for complex machines and processes as food tasters for monarchs or stunt doubles for movie stars: they prevent harm that otherwise could be done to precious assets. Having made their way to the virtual world, these duplicates save time, money, and effort for numerous businesses, protecting the health and safety of high-value resources.
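To make the “many possible futures” idea concrete, here is a minimal sketch in Python. Everything in it is invented for illustration: the simulate_commute model and all of its numbers are hypothetical, not any real planning tool. The point is only the pattern of sweeping ‘what-if’ scenarios in a twin before committing to one in the real world:

```python
import random

def simulate_commute(extra_lane: bool, signal_timing_s: int, trials: int = 1000) -> float:
    """Toy what-if model: average commute time (minutes) under a planning choice.

    The numbers here are invented for illustration, not calibrated to any city.
    """
    total = 0.0
    for _ in range(trials):
        base = random.gauss(30, 5)              # baseline commute time, minutes
        base -= 4 if extra_lane else 0          # assume an extra lane saves a few minutes
        base += (signal_timing_s - 60) * 0.05   # assume mistimed signals add delay
        total += max(base, 5)
    return total / trials

# Test many futures in the twin before committing to one in the real world.
for lane in (False, True):
    for timing in (45, 60, 90):
        avg = simulate_commute(lane, timing)
        print(f"extra_lane={lane!s:5}  signal={timing:2d}s  avg commute ~ {avg:.1f} min")
```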

_

As Marshall McLuhan is often credited with observing, “First we build the tools, then they build us.” Building on artificial intelligence (AI), the Internet of Things (IoT), and 5G communications, advanced software systems are remaking the nature and complexity of human engineering. In particular, digital twin technology can provide companies with improved insights to inform the decision-making process. A Digital Twin can be defined as a software representation of a physical asset, system, or process designed to detect, prevent, predict, and optimize through real-time analytics in order to deliver business value. There are many definitions of a digital twin, but the general consensus centres on this one: “a virtual representation of an object or system that spans its lifecycle, is updated from real-time data and uses simulation, machine learning, and reasoning to help decision-making.” A digital twin is a virtual instance of a physical system (twin) that is continually updated with the latter’s performance, maintenance, and health status data throughout the physical system’s life cycle. A digital twin leverages artificial intelligence, the Internet of Things, big data, blockchain, virtual reality technologies, collaborative platforms, APIs, and open standards. By having better and constantly updated data covering a wide range of areas, combined with the added computing power that accompanies a virtual environment, digital twins can give a clearer picture and address more issues from far more vantage points than a standard simulation can, with greater ultimate potential to improve products and processes.
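As an illustration of this definition, here is a minimal sketch of the basic pattern: a virtual instance that is continually updated with the physical twin’s data and applies simple analytics to support decisions. The DigitalTwin class, the pump telemetry, and the 80 °C rule are all hypothetical, chosen only to show the shape of the idea:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class DigitalTwin:
    """Minimal digital twin: mirrors a physical asset's latest state and history."""
    asset_id: str
    state: dict = field(default_factory=dict)      # latest mirrored sensor readings
    history: list = field(default_factory=list)    # lifecycle record of past states

    def ingest(self, telemetry: dict) -> None:
        """Update the twin from real-time telemetry (the physical-to-digital link)."""
        self.state.update(telemetry)
        self.history.append(dict(self.state))

    def health_advisory(self) -> str:
        """Simple analytics over the mirrored data to support decision-making."""
        temps = [s["temp_c"] for s in self.history if "temp_c" in s]
        if temps and mean(temps[-5:]) > 80:        # invented threshold for the example
            return "schedule inspection"           # digital-to-physical recommendation
        return "nominal"

twin = DigitalTwin("pump-17")
for reading in ({"temp_c": 76}, {"temp_c": 83}, {"temp_c": 88}):
    twin.ingest(reading)
print(twin.health_advisory())                      # -> "schedule inspection"
```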

_

It is a common misconception that Digital Twins only come into use once the physical system has been created, but Digital Twins are digital replicas of real and potential physical assets (i.e., “Physical Twins”). The ability to develop and test new products using virtual models (digital twins) before producing them physically has saved companies valuable time and money. Digital twins are used throughout the programme lifecycle and have reduced engineers’ reliance on physical prototypes. A digital twin can be any of the following three types: 1) digital twin prototype (DTP); 2) digital twin instance (DTI); and 3) digital twin aggregate (DTA). A DTP is a constructed digital model of an object that has not yet been created in the physical world, e.g., the 3D modelling of a component. The primary purpose of a DTP is to build an ideal product, covering all the important requirements of the physical world. On the other hand, a DTI is a virtual twin of an already existing object, focusing on only one of its aspects. Finally, a DTA is an aggregate of multiple DTIs that may be an exact digital copy of the physical twin; for example, the digital twins of a spacecraft structure. The digital twin concept is a perfect example of how the physical and virtual worlds converge. Essentially, a digital twin makes use of real-world data to create a simulation via a computer program that predicts how a product or process will fare. These programs are easily integrated with the Internet of Things (IoT), artificial intelligence (AI), and software analytics, all in a bid to enhance output.
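The three types can be pictured as simple data structures. The sketch below is illustrative only; the class names and fields are hypothetical, not a standard API, and they exist just to show how a DTP describes a not-yet-built design, a DTI tracks one aspect of one built object, and a DTA collects many DTIs:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwinPrototype:
    """DTP: model of a product that does not yet exist physically (e.g., a 3D design)."""
    design_name: str
    requirements: list

@dataclass
class DigitalTwinInstance:
    """DTI: twin of one specific, already-built physical object (one aspect of it)."""
    serial_number: str
    aspect: str                 # e.g., "thermal", "vibration"
    latest_reading: float = 0.0

@dataclass
class DigitalTwinAggregate:
    """DTA: aggregation of many DTIs, e.g., the monitored aspects of a spacecraft structure."""
    instances: list = field(default_factory=list)

    def add(self, dti: DigitalTwinInstance) -> None:
        self.instances.append(dti)

proto = DigitalTwinPrototype("bracket v2", ["mass < 3 kg", "survives 10 g shock"])
fleet = DigitalTwinAggregate()
fleet.add(DigitalTwinInstance("SN-001", "thermal", 42.0))
fleet.add(DigitalTwinInstance("SN-001", "vibration", 0.3))
print(len(fleet.instances), "instances aggregated for", proto.design_name)
```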

_

Some examples of digital twin:   

-1. Twin Earth – NVIDIA Earth-2:

NVIDIA’s recently launched Earth-2 initiative aims to build digital twins of the Earth to address one of the most pressing challenges of our time: climate change. Earth-2 aims to improve predictions of extreme weather and projections of climate change, and to accelerate the development of effective mitigation and adaptation strategies, all using the most advanced and scientifically principled machine learning methods at unprecedented scale. Combining accelerated computing with physics-informed machine learning at scale, on the largest supercomputing systems available today, Earth-2 will provide actionable weather and climate information at regional scales.

-2. Tesla – a digital twin for every car made:

Tesla creates a digital simulation of every one of its cars, using data collected from sensors on the vehicles and uploaded to the cloud. These allow the company’s AI algorithms to determine where faults and breakdowns are most likely to occur and to minimize the need for owners to take their cars to servicing stations for repairs and maintenance. This reduces the company’s cost of servicing cars that are under warranty and improves the user experience, leading to more satisfied customers and a higher chance of winning repeat business.
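Tesla’s actual pipeline is proprietary, but the underlying idea, comparing each vehicle’s mirrored sensor data against the fleet to spot likely faults, can be sketched with a simple statistical outlier check. The vehicle IDs, readings, and threshold below are invented for illustration:

```python
from statistics import mean, stdev

def flag_outliers(fleet_readings: dict[str, float], z_threshold: float = 2.0) -> list[str]:
    """Flag vehicles whose reading deviates strongly from the fleet average.

    fleet_readings maps a vehicle id to one aggregated sensor metric,
    e.g., mean battery temperature (deg C) over the last week.
    """
    values = list(fleet_readings.values())
    mu, sigma = mean(values), stdev(values)
    return [vid for vid, v in fleet_readings.items()
            if sigma > 0 and abs(v - mu) / sigma > z_threshold]

readings = {"car-001": 31.2, "car-002": 30.8, "car-003": 31.5, "car-004": 30.9,
            "car-005": 31.1, "car-006": 31.4, "car-007": 30.7, "car-008": 44.7}
print(flag_outliers(readings))   # ['car-008'] -> candidate for proactive servicing
```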

-3. Digital twin of a city:

Do you know that the densely populated city of Shanghai has its own fully deployed digital twin? This was created by mapping every physical device to a new virtual world and applying artificial intelligence, machine learning, and IoT technologies to that map. The digital twin technique has been put to good use in pandemic control and prevention following the COVID-19 outbreak. Similarly, Singapore is preparing for a full deployment of its own digital twin.

_

Digital twins are now proving invaluable across multiple industries, especially those that involve costly or scarce physical objects. Created by feeding video, images, blueprints, or other data into advanced 3-D mapping software, digital twins are being used in medicine to replicate and study internal organs. They’ve enabled engineers to develop car and plane prototypes—including Air Force fighter jets—more quickly. They allow architects and urban planners to envision and then build skyscrapers and city blocks with clarity and precision. And in 2021, digital twins began to break into the mainstream of manufacturing and research. Chipmaker Nvidia launched a version of its Omniverse 3-D simulation engine that allows businesses to build 3-D renderings of their own—including digital twins. Amazon Web Services announced a competing service, the IoT TwinMaker. Digital twins could have huge implications for training workers, for formulating complicated technical plans without wasting physical resources, even for improving infrastructure and combatting climate change. Health care, construction, education, taking city kids on safari: it’s hard to imagine where digital twins won’t have an impact. In the future, everything in the physical world could be replicated in the digital space through digital twin technology.

In a lighter vein, my digital twin will keep writing articles for you after my death. 

______

______

Abbreviations and synonyms:

DT = digital twin

MDT = mobile digital twin

CDT = cognitive digital twin

CAD = computer aided design

CAE = computer aided engineering

CAM = computer aided manufacturing

AI = artificial intelligence

ML = machine learning

IoT = internet of things

VR = virtual reality

AR = augmented reality

XR = extended reality  

MR = mixed reality

CPS = cyber-physical systems

PLM = product lifecycle management

NPP = nuclear power plant

NRC = nuclear regulatory commission

O&M = operating and maintenance

PRA = probabilistic risk assessment

MBSE = model-based systems engineering

FEA = finite element analysis

PDM = product data management

ERP = enterprise resource planning

MRP = material requirements planning

CAFM = computer aided facility management

RUL = remaining useful life

BOM = bill of materials

ROI = return on investment

CVE = common vulnerabilities and exposures

BIM = building information modelling

OT = operational technology

DTO = digital twin of an organization

OEM = original equipment manufacturer

_____

_____

Section-2

Digital engineering, Industry 4.0 and digital twins:

_

Digital Engineering and digital twins:   

Digital Engineering is an umbrella term for the synergistic application of electronic and software technologies to facilitate the architecture, analysis, design, simulation, building, and testing of complex software-intensive systems-of-systems. The ultimate goal of Digital Engineering is to produce Digital Twins: digital replicas of real and potential physical assets (i.e., “Physical Twins”), covering both inanimate and animate physical entities, with broad uses, and where, for Systems Integration & Testing purposes, the former are largely indistinguishable from the latter.

_

The increasing reliance of our information-age economies and governments on software-intensive Systems-of-Systems makes us progressively more vulnerable to their failures. Due to advances in Artificial Intelligence (AI) and Machine Learning (ML), these Systems-of-Systems are growing exponentially in size and complexity, and traditional Model-Based Systems Engineering (MBSE) technologies are insufficient to manage them; consider, for example, the recent Lockheed Martin F-35 project problems and the Boeing 737 MAX MCAS project problems. Consequently, we must seek more robust and scalable solutions to the System-of-Systems Sprawl (cf. System Ball-of-Mud) anti-pattern, which is a fractal problem. Fractal problems require recursive solutions, and Digital Engineering’s recursive “Simulation-of-Simulations” approach to the System-of-Systems fractal problem represents a quantum improvement over traditional MBSE approaches.

_

Digital engineering and digital twins are closely related fields, as digital engineering techniques and tools are often used to develop and maintain digital twins. Digital engineering is an interdisciplinary field that uses a variety of techniques and tools, such as modelling, simulation, and analytics, to design, develop, and analyze complex systems and Systems-of-Systems. Digital twins are digital representations of physical systems or processes that can be used to simulate, monitor, and optimize the performance and behavior of those systems in real time. In other words, digital engineering provides the tools and techniques used to create, manage, and analyze digital twins, while digital twins provide a means of applying those techniques and tools to specific systems and applications. The two are often used together in applications such as engineering design, manufacturing, supply chain management, and maintenance. Overall, digital engineering and digital twins are distinct yet interrelated fields that frequently intersect and inform each other.

_

A Digital Twin is a real-time virtual replica of a real-world physical system or process (i.e., a “Physical Twin”) that serves as its indistinguishable digital counterpart for practical purposes, such as systems simulation, integration, testing, and maintenance. In order to pragmatically achieve indistinguishability between Digital Twins and their Physical Twin counterparts, massive coarse- and fine-grained Modelling & Simulation (M&S) is typically required. From a pragmatic Systems Engineering perspective, Physical Twins represent “Systems-of-Systems” and Digital Twins represent “Simulations-of-Simulations” of “Systems-of-Systems”.

Digital Twins are produced by systematically applying Digital Engineering technologies, including both dynamic (behavioral) and mathematical (parametric) M&S, in a synergistic manner that blurs the usual distinctions between real physical systems and virtual logical systems. From a Systems Integration & Testing perspective, a Digital Twin should be indistinguishable from its Physical Twin counterpart. Consequently, Digital Twins can be Verified & Validated (V&V) by the following pragmatic test:

A reasonable litmus test for a “Digital Twin” is analogous to the Turing Test for AI. Suppose you have a System Tester (ST), a Physical System (PS), and a Digital Twin (DT) for PS. If the ST can run a robust system Verification & Validation (V&V) Test Suite on both PS and DT but cannot reliably distinguish between them (i.e., with better than 80% probability), then DT is a bona fide Digital Twin of PS.
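As a toy illustration of that litmus test, the sketch below runs the same stub test suite against a physical system and its twin and accepts the twin only if every response falls within a tolerance band. The response curves, the 5% tolerance, and the pass criterion are all invented stand-ins; a real V&V suite would be far richer:

```python
import math
import random

def run_test_suite(system) -> list[float]:
    """Run the same V&V test suite against a system (physical or digital)
    and collect its responses. Here `system` is just a callable stub."""
    return [system(t) for t in range(20)]

def passes_litmus_test(physical, twin, tolerance: float = 0.05) -> bool:
    """Crude stand-in for the indistinguishability test: the twin passes if
    every test-point response matches the physical response within tolerance."""
    ps, dt = run_test_suite(physical), run_test_suite(twin)
    return all(abs(p - d) <= tolerance * max(abs(p), 1e-9) for p, d in zip(ps, dt))

# Hypothetical stand-ins: a physical step response and a high-fidelity model of it.
physical_system = lambda t: 100.0 * (1 - math.exp(-t / 5))
digital_twin    = lambda t: 100.0 * (1 - math.exp(-t / 5)) * (1 + random.uniform(-0.01, 0.01))

print("bona fide twin?", passes_litmus_test(physical_system, digital_twin))
```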

_

Digital Engineering Technical Architecture Framework™ (DETAF™) is a reference Enterprise Architecture Framework (EAF) for architecting, designing, building, testing, and integrating Digital Twins (virtual Simulations-of-Simulations) that are indistinguishable from their Physical Twin (physical Systems-of-Systems) counterparts. The architecture infrastructure of DETAF is the 6D System M-Model™ Enterprise Architecture Framework for Agile MBSE™, which has been customized to support the following core Digital Engineering technologies (figure below), which are listed as related technology pairs: Massive Mathematical & Dynamic M&S; Artificial Intelligence (AI) & Machine Learning (ML); 3D CAD & 3D Printing; and Virtual Reality (VR) & Augmented Reality (AR). The DETAF is designed to be highly scalable, simulatable, and customizable.

_

Digital Engineering and its associated Digital Twins promise to revolutionize how we architect future Systems-of-Systems. The following interrelated Engineering sub-disciplines will likely play key roles in the Digital Twin technology revolution:

  • Agile MBSE technologies provide the architecture infrastructure for recursively scaling and simulating Digital Twins. Agile MBSE technologies are the distillation of a quarter century of lessons-learned from Model-Driven Development (MDD), Model-Based Systems Engineering (MBSE), and Agile/Lean Development technologies, and can provide the recursive architecture and design patterns needed to address the fractal Digital Twin System-of-System challenge.
  • Massive Mathematical & Dynamic M&S technologies provide the large-scale Modelling & Simulation (M&S) infrastructure necessary for recursively simulating Physical Twins as Digital Twins. It is essential that mathematical (parametric) and dynamic (behavioral) simulations for Digital Twins are fully integrated and applied on a massive scale. Compare and contrast how massive data sets and big data fuel Machine Learning (ML) technology. Stated otherwise, if a Physical Twin is a “System-of-Systems”, then a Digital Twin is a “Simulation-of-Simulations” for a “System-of-Systems”.
  • Artificial Intelligence (AI) & Machine Learning (ML) technologies provide the software infrastructure needed for constructing intelligent (“smart”) Physical Twins, and by extension their Digital Twin counterparts. Since AI & ML technologies are rapidly transforming how software-intensive Systems-of-Systems interact with humans, these complementary technologies will play a critical role in the evolution of Digital Twin technology.
  • 3D CAD & 3D Printing (a.k.a. Additive Manufacturing) technologies provide the electro-mechanical infrastructure needed for the physical construction of Physical Twins, and by extension their Digital Twin counterparts. Since [3D CAD] Design = Implementation [3D Printed Product] is rapidly transforming how we design and manufacture Systems-of-Systems, these complementary technologies will also play a vital role in the evolution of Digital Twin technology.
  • Virtual Reality (VR) & Augmented Reality (AR) technologies provide the simulation infrastructure for the massive simulations (Simulation-of-Simulations) required by Digital Twins in order to be indistinguishable from their Physical Twin counterparts. Since VR/AR technologies are rapidly transforming how we interact with Systems-of-Systems, they will additionally play a critical role in the evolution of Digital Twin technology.

______

______

Industrial Revolutions:

First Industrial Revolution: 

The First Industrial Revolution was marked by a transition from hand-production methods to machines through the use of steam power and water power. Because the implementation of new technologies took a long time, the period it refers to spans roughly 1760 to 1820, or 1840, in Europe and the United States. Its effects were felt in textile manufacturing, which was the first industry to adopt such changes, as well as in the iron industry, agriculture, and mining; it also had societal effects, producing an ever-stronger middle class.

Second Industrial Revolution:

The Second Industrial Revolution, also known as the Technological Revolution, is the period between 1871 and 1914 that resulted from the installation of extensive railroad and telegraph networks, which allowed faster movement of people and ideas, as well as from the spread of electricity. Increasing electrification allowed factories to develop the modern production line. It was a period of great economic growth, with an increase in productivity, but it also caused a surge in unemployment, since many factory workers were replaced by machines.

Third Industrial Revolution:

The Third Industrial Revolution, also known as the Digital Revolution, occurred in the late 20th century, after the end of the two world wars, emerging from a slowdown in industrialization and technological advancement compared to previous periods. The production of the Z1 computer in 1938, which used binary floating-point numbers and Boolean logic, marked the beginning of more advanced digital developments. The next significant development was the supercomputer, and with the extensive use of computer and communication technologies in the production process, machinery began to reduce the need for human power.

Fourth Industrial Revolution:

Industry 4.0 is used interchangeably with “the Fourth Industrial Revolution” and represents a new stage in the organization and control of the industrial value chain. Cyber-physical systems form the basis of Industry 4.0 (e.g., ‘smart machines’). In essence, the Fourth Industrial Revolution is the trend towards automation and data exchange in manufacturing technologies and processes, including cyber-physical systems (CPS), the IoT, the industrial internet of things, cloud computing, cognitive computing, and artificial intelligence. CPSs are defined as systems that use various sensors to understand physical components, automatically transfer the captured data to cyber components, analyze the data, and then convert it via cyber processes into the information needed to make decisions and take action. Intelligent building systems are one example of CPSs. Machines cannot replace deep human expertise, but they tend to be more efficient than humans at performing repetitive functions, and the combination of machine learning and computational power allows machines to carry out highly complicated tasks.

_

Industry 4.0 is a concept that refers to the current trend of automation and data exchange in technology, which includes cyber-physical systems (CPSs), the Internet of Things (IoT), cloud computing, cognitive computing, and the development of smart businesses. The Fourth Industrial Revolution, 4IR, or Industry 4.0 conceptualizes rapid change to technology, industries, and societal patterns and processes in the 21st century due to increasing interconnectivity and smart automation. The term has been used widely in the scientific literature, and in 2015 it was popularized by Klaus Schwab, founder and executive chairman of the World Economic Forum. Schwab asserts that the changes under way are more than just improvements to efficiency; they express a significant shift in industrial capitalism.

The technological basis of Industry 4.0 is rooted in the Internet of Things (IoT), which proposes embedding electronics, software, sensors, and network connectivity into devices (i.e., “things”) in order to allow the collection and exchange of data through the internet. As such, IoT can be exploited at the industrial level: devices can be sensed and controlled remotely across network infrastructures, allowing a more direct integration between the physical world and virtual systems and resulting in higher efficiency, accuracy, and economic benefit. Although it is a recent trend, Industry 4.0 has been widely discussed and its key technologies have been identified, among which Cyber-Physical Systems (CPS) have been proposed as smart embedded and networked systems within production systems. They operate at both virtual and physical levels, interacting with and controlling physical devices, sensing and acting on the real world. According to the scientific literature, in order to fully exploit the potential of CPS and IoT, proper data models should be employed, such as ontologies, which are explicit, semantic, and formal conceptualizations of concepts in a domain. Ontologies are the core semantic technology providing the intelligence embedded in smart CPS, and they can help with integrating and sharing large amounts of sensed data. Through Big Data analytics, sensed data can be accessed with smart analytics tools for rapid decision-making and improved productivity.

_

The Importance of Digital Twins in Industry 4.0:

The fourth industrial revolution, or Industry 4.0, which embraces automation, data exchange, and new manufacturing technologies, is a major talking point of the business world. Digital twins are at the core of this new industrial revolution, bringing unlimited possibilities. The traditional approach of building something and then tweaking it in new versions and releases is now obsolete. With a virtually based system of designing, the best possible efficiency level of a product, process, or system can be identified and created simply by understanding its specific features, its performance abilities, and the potential issues that may arise.

The development cycle of a product with digital twins makes a complex process much simpler. From the design phase right through to deployment, organizations can create a digital footprint of their creation. Each aspect of these digital creations is interconnected and able to generate data in real time. This helps businesses better analyze and predict possible implementation challenges right from the initial design stage. Problems can be corrected in advance, or early warnings can be given to prevent downtime.

This process also opens up possibilities to create newer and improved products in a more cost-effective manner, as a result of simulations of real-world applications. The end result is a better customer experience. Digital twin processes incorporate big data, artificial intelligence, machine learning, and the Internet of Things, and they represent the future of the engineering and manufacturing spaces. In the Industry 4.0 era, the Digital Twin (DT), a virtual copy of a system that can interact with its physical counterpart bi-directionally, is a promising enabler for replicating production systems and analysing them in real time. A DT should be capable of guaranteeing well-defined services to support activities such as monitoring, maintenance, management, optimization, and safety.

______

______

Section-3

Origin and history of Digital Twin:

_

Digital twins were anticipated by David Gelernter’s 1991 book Mirror Worlds. The concept and model of the digital twin were first publicly introduced in 2002 by Michael Grieves, at a Society of Manufacturing Engineers conference in Troy, Michigan. Grieves proposed the digital twin as the conceptual model underlying product lifecycle management (PLM).

The figure above shows the early digital twin concept of Grieves and Vickers.

The digital twin concept, which has been known by different names (e.g., “virtual twin”), was subsequently given the name “digital twin” by John Vickers of NASA in a 2010 roadmap report. The concept consists of three distinct parts: the physical object or process and its physical environment, the digital representation of the object or process, and the communication channel between the physical and virtual representations. The connections between the physical and digital versions include information and data flows, such as sensor data passing between the physical and virtual objects and environments. This communication connection is referred to as the digital thread.

_

The concept of the “twin” dates to the National Aeronautics and Space Administration (NASA) Apollo program in the 1970s, when a replica of the space vehicle was built on Earth to mirror the condition of the equipment during the mission (Rosen et al., 2015; Miskinis, 2019). This replica was the first application of the “twin” concept. In 2003, the DT was proposed by Michael Grieves in his product lifecycle management (PLM) course as a “virtual digital representation equivalent to physical products” (Grieves, 2014). In 2012, the DT was applied by NASA to integrate ultra-high-fidelity simulation with a vehicle’s on-board integrated vehicle health management system, maintenance history, and all available historical and fleet data to mirror the life of its flying twin and enable unprecedented levels of safety and reliability (Glaessgen and Stargel, 2012; Tuegel et al., 2011a). The advent of IoT boosted the development of DT technology in the manufacturing industry: enterprises such as Siemens and GE developed DT platforms for real-time monitoring, inspection, and maintenance (Eliane Fourgeau, 2016). Recently, Tao and Zhang (2017) proposed a five-dimensional DT framework, which provides theoretical guidance for the digitalization and intellectualization of the manufacturing industry. From 2017 to 2019, the DT was repeatedly selected by Gartner as one of the top 10 technology trends with strategic value (Panetta, 2016, 2017, 2018). The history of the DT is briefly summarized in the figure below.

_

Digital manufacturing has brought considerable value to the entire industry over the last few decades. By virtually representing factories, resources, workforces and their skills, and so on, digital manufacturing builds models and simulates product and process development. Progress in information and communication technologies (ICTs) has greatly promoted the development of manufacturing. Computer-aided technologies, including CAD, CAE, CAM, FEA, and PDM, are developing quickly and playing an increasingly critical role in industry. Big data, the Internet of Things (IoT), artificial intelligence, cloud computing, edge computing, the fifth-generation cellular network (5G), wireless sensor networks, and related technologies are developing rapidly and show great potential in every aspect of industry. All of these technologies provide opportunities for the integration of the physical world and the digital world, an inevitable trend for addressing the growing complexity and high demands of the market. However, the full strategic advantage of this integration has not yet been exploited. The integration still has a long way to go, and the newest developments focus on the digital twin.

_

NASA’s Apollo space program was the first program to use the ‘twin’ concept. The program built two identical space vehicles so that the vehicle on Earth could mirror, simulate, and predict the conditions of the one in space. The vehicle that remained on Earth was the twin of the vehicle that executed the mission in space. The Apollo 13 mission was cut short when an oxygen tank failed and exploded two days into the mission. The implementation of a twin on the ground contributed to the success of the rescue mission, as engineers were able to assess and test all possible mission outcomes, troubleshooting steps, and solutions. The first use of the “digital twin” terminology appeared in Hernández and Hernández’s work, where a digital twin was used for iterative modifications in the design of urban road networks. However, it is widely acknowledged that the terminology was first introduced as a “digital equivalent to a physical product” by Michael Grieves at the University of Michigan in 2003. The concept of the “product avatar”, introduced by Hribernik et al. in 2006, is similar to the digital twin: it was intended to build an information management architecture supporting a bidirectional information flow from a product-centric perspective. Research on the product avatar can be found up to about 2015; since then, however, the product avatar concept appears to have been superseded by the digital twin.

_

For many years, scientists and engineers have created mathematical models of real-world objects, and over time these models have become increasingly sophisticated. Today the evolution of sensors and network technologies enables us to link previously offline physical assets to digital models. In this way, changes experienced by the physical object are reflected in the digital model, and insights derived from the model allow decisions to be made about the physical object, which can also be controlled with unprecedented precision. At first, the complexity and cost involved in building digital twins limited their use to the aerospace and defense sectors (see the timeline in the figure below), as the physical objects were high-value, mission-critical assets operating in challenging environments that could benefit from simulation. Relatively few other applications shared the same combination of high-value assets and inaccessible operating conditions to justify the investment. That situation is changing rapidly. Today, as part of their normal business processes, companies are using their own products to generate much of the data required to build a digital twin; computer-aided design (CAD) and simulation tools are commonly used in product development, for example. Many products, including consumer electronics, automobiles, and even household appliances, now include sensors and data communication capabilities as standard features.

While the digital twin concept has existed since the start of the 21st century, the approach is now reaching a tipping point where widespread adoption is likely in the near future. That’s because a number of key enabling technologies have reached the level of maturity necessary to support the use of digital twins for enterprise applications. Those technologies include low-cost data storage and computing power, the availability of robust, high-speed wired and wireless networks, and cheap, reliable sensors. As corporate interest in digital twins grows, so too does the number of technology providers supplying this demand. Industry researchers expect the digital twins market to grow at an annual rate of more than 38 percent over the next few years, passing the US$26 billion mark by 2025.

_____

_____

Section-4

Digital Twin versus other closely related technologies:

_

3D model versus digital twin:

Digital twins started as basic CAD documentation of the physical world in the late twentieth century, but with the growth of BIM working processes the concept of the digital twin has become a representation much closer to reality. With the ability to assign parametric data to objects, the representations could move beyond mere physical description to functional representation as well. More recently, with the growth of IoT technologies, it became possible to live-stream data from objects and systems in the physical world to a remote location, analyze the data, and react to modify the condition of the physical object in its actual location. This moves the CAD object from being just a 2D/3D representation arbitrarily positioned in space to being a representation of the physical object that demonstrates not only its form but its behavior as well. Digital twins can be made using 3D models, 3D scans, and even 2D documentation, as seen in the figure below. The requirement to qualify as a digital twin is the link between the physical and virtual worlds, where data is transmitted bidirectionally between the two.

CAD (3D model) technology is used in both simulation and digital twin technology, and both simulations and digital twins use digital models to replicate a system’s processes. The difference is the data link: a simulation is built from the physical counterpart’s 3D model with, at most, a one-time data exchange, whereas a digital twin is built on continuous, bidirectional data exchange between the 3D model and its physical counterpart.
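That contrast can be expressed in a few lines of Python. The snapshot-versus-loop structure is the point here; the model fields, the 90% load threshold, and the actuator are all invented for the example:

```python
def run_simulation(model: dict) -> dict:
    """Simulation: a one-time study on a snapshot of the model; nothing flows back."""
    snapshot = dict(model)                     # data exchanged once, at build time
    snapshot["predicted_wear"] = snapshot["load"] * 0.01
    return snapshot

def run_twin(model: dict, sensor_feed, actuator) -> None:
    """Twin: a continuous loop; sensor data flows in, decisions flow back out."""
    for reading in sensor_feed:                # physical -> digital, continuously
        model["load"] = reading                # the model tracks the asset's live state
        if model["load"] > 0.9 * model["rated_load"]:
            actuator("reduce throughput")      # digital -> physical, automatically

model = {"rated_load": 100.0, "load": 60.0}
print(run_simulation(model))                   # one-shot prediction on a snapshot
run_twin(model, sensor_feed=[60.0, 75.0, 95.0], actuator=print)
```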

_

Nowadays there are many players in the market that can create accurate 3D models of real-world structures, a process commonly called “reality capture”. Using laser scanners or photogrammetry, detailed three-dimensional, geometric representations of captured objects are created. The result is often referred to as a “digital twin”, but such a three-dimensional model is by no means a digital twin. The models certainly represent what is physically present and all the associated visible information. However, any building or technical installation also holds a large amount of information and data that is not visible: asset information stored in an ERP system, process data used by the CAFM system, and consumption and meter data from the building management system, for example. The energy management system or other control systems, as well as the multitude of sensors in modern “smart” buildings or factories, yield additional information. Only when the 3D geometry is combined with all this additional information in a seamlessly integrated, easy-to-understand, photorealistic model do we have a real digital twin.

_

3D models and digital twins are easily confused because they look similar at first glance. In both cases, what you see on the screen is a detailed visualisation of your physical asset in three dimensions. The difference, and it’s a big one, is the data that appears on the 3D model. A barebones 3D model without data has some use as a point of reference on a sprawling gas plant or oil rig. It basically does the same job as a traditional paper plan by providing orientation on the asset. With more advanced 3D visualisations, you might get some additional static data attached to the model. Static here means that the data cannot be updated easily; an example would be a PDF document containing an equipment spec sheet. Now in theory, static data ought to help people make better decisions and complete tasks more efficiently. For example, you might be able to locate a piece of equipment on the 3D model and see its ID number on the attached PDF. Armed with this identifier, you can then chase up the back office to investigate when it was installed, who installed it, and when it needs to be replaced. This is clearly more useful than a dog-eared paper map.

But is it going to transform efficiency and safety on the asset?

The answer is no.

3D models populated by static data simply aren’t that useful, particularly in operations. Quite often the data is out of date because busy staff and contractors have other priorities and don’t really see the point in spending time and energy on curating the model. In their eyes it’s just another digital initiative that adds to their workload. Because the data isn’t always updated, it isn’t 100% trusted. Because it isn’t trusted, people don’t bother to consult the 3D model. The net result is that people in operations quietly revert to their old ‘tried-and-tested’ processes.

Dynamic data:

The (huge) difference between the 3D model and a digital twin can be expressed in two words: dynamic data.

Unlike static data, dynamic data changes and updates in real-time on the 3D model as soon as new information becomes available. One example would be a sensor on a pipe that shows real-time temperature or pressure measurements on the 3D model, and is able to trigger an alert. From the point of view of someone on site with a maintenance task, this is extremely useful information that makes their job safer and more efficient. If that alert can be actioned promptly before a small problem becomes a big one, big savings can be achieved. This is the sort of thing digital workflows are for. They carry real time data insights from the central data repository to where they’re needed in operations, and carry back updates once a task has been completed. In this way the 3D model is constantly being updated with the best and most reliable information that everyone else on the asset can see.
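A minimal sketch of such a workflow might look like the following: a live reading breaches a limit, the alert raises a work order, and the completed task flows back to the shared repository. The pipe tag (P-104), the pressure feed, and the threshold are all hypothetical:

```python
def pipe_sensor_stream():
    """Hypothetical live feed from a pressure sensor on a pipe (bar)."""
    yield from [12.1, 12.4, 14.9, 15.6, 12.2]

ALERT_THRESHOLD_BAR = 15.0
work_orders = []

for pressure in pipe_sensor_stream():
    # Dynamic data: each new reading updates the 3D model's overlay immediately.
    if pressure > ALERT_THRESHOLD_BAR:
        work_orders.append({"task": "inspect pipe P-104",
                            "reading": pressure,
                            "status": "open"})
        print(f"ALERT: {pressure} bar exceeds {ALERT_THRESHOLD_BAR} bar")

# The digital workflow carries the completed task back to the central repository,
# so everyone on the asset sees the same up-to-date picture.
for order in work_orders:
    order["status"] = "completed"
print(work_orders)
```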

With dynamic data running through its workflows, the digital twin suddenly comes to life. The lights come on and people start using it, because it is making everybody’s lives easier, safer and more productive. With reliable data at their fingertips in one place, people can ‘connect the dots’ to solve problems that previously eluded them.

It doesn’t stop there. As confidence in the digital twin grows, the culture of the organisation becomes ripe for improvement and change. Enlightened operators grab the opportunity to re-organise staff into multi-disciplinary teams and experiment with new approaches.

The truth is that 3D models incorrectly labelled as digital twins have often disappointed operators, which has unfairly damaged the reputation of this transformative technology. But has the money spent on 3D models been entirely wasted? Fortunately not. The good news is that your existing 3D model can be upgraded to channel dynamic data by following a few logical steps.

First, you need to establish a central data repository that cleans and harmonises the data so that it can be accessed. Second, you need to install digital workflows. These act as the ‘wiring’ that carry data insights to where they’re needed in operations. Third, you need to give your workforce the chance to understand how useful a digital twin can be in their daily lives. This usually happens by allowing them to experiment with the technology in a safe space.

_

It is clear from the above discussion that a DT is different from computer models (CAD/CAE) and simulation. Even though many organizations use the term ‘Digital Twin’ synonymously with ‘3D model’, a 3D model is only one part of a DT. A DT uses data to reflect the real world at any given point in time and thus can be used for observing and understanding the performance of a system and for its predictive maintenance. Computer models, just like DTs, are also used for the generic understanding of a system or for making generalized predictions, but they are rarely used to accurately represent the status of a system in real time. A lack of real-time data makes these models or simulations static, which means that they do not change and cannot make new predictions unless new information is fed to them. However, having real-time data is not enough for a DT to operate; the data also need to be loaded into the DT automatically, and the flow between physical and digital should be bidirectional, as seen in the figure below:

The digital twin configuration above represents a journey from the physical world to the digital world and back to the physical world. This physical-digital-physical journey communicates, analyses, and uses information to drive further intelligent action back in the physical world.

_

A digital twin fuses design CAD, simulation, and sensor data to create actionable data, data that can be used to predict outcomes under given scenarios and to help businesses make productive decisions. It works by taking the 3D geometry of a part or system and connecting it with sensor data coming from operation, so that we can see a 3D digital replica of the behavior of the part or system in service. With this it is possible to run 2D and 3D simulations to predict behavior. The 2D side uses machine learning on historical data to make predictions, which supports better process control and predictive maintenance. The 3D side uses physics to run innovative virtual scenarios to find new ways of improving operational efficiency and reducing cost. Combining the two gives the user complete remote control of their physical asset. Digital twins have traditionally been very expensive to operate and were used only in the defense and aerospace industries, where competition is fierce, players are few, and those few players have relatively large resources.
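The “2D”, data-driven half of that combination can be as simple as fitting a trend line to historical sensor data and extrapolating to a maintenance threshold; the “3D”, physics half would require a solver and is not shown. All numbers below are invented for illustration:

```python
# Least-squares trend on historical vibration data, then extrapolation to
# estimate when the level will cross a (hypothetical) maintenance threshold.
history = [(0, 1.0), (30, 1.3), (60, 1.7), (90, 2.2)]   # (day, vibration in mm/s)

n = len(history)
sx = sum(d for d, _ in history); sy = sum(v for _, v in history)
sxx = sum(d * d for d, _ in history); sxy = sum(d * v for d, v in history)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)       # mm/s per day
intercept = (sy - slope * sx) / n

THRESHOLD = 4.0                                          # mm/s, assumed service limit
days_to_threshold = (THRESHOLD - intercept) / slope
print(f"trend: {slope:.4f} mm/s per day; threshold reached in ~{days_to_threshold:.0f} days")
```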

_

Traditionally, engineering design teams use computer-aided tools to design assets. A bicycle manufacturer might use Computer Aided Design (CAD), Computer Aided Engineering (CAE), and Computer Aided Manufacturing (CAM) across the entire product life cycle. These representations of assets are not new: early applications of CAE date back to the late 1960s, most notably when Dr. Swanson developed the first version of ANSYS, a CAE tool, in 1970. Designs of complex machines such as gas turbines, jet engines, internal combustion engines, locomotives, and automobiles rely upon CAx technologies to limit costly experimentation and testing, so designers can improve their confidence in how a part will behave in the field even before building a single part. Such computer-aided representations, in addition to insights gleaned from controlled experiments, allow us to evaluate how the asset will perform in the field, as well as optimize how it is manufactured, thereby serving as an important digital twin representation of the asset. These form the ‘as-designed’ and ‘as-built’ digital representations of the asset.

3D CAD (Computer Aided Design), CAE (Computer Aided Engineering), and CAM (Computer Aided Manufacturing) technologies are used to specify precise 3D models of mechanical, electrical, and electro-mechanical products for manufacturing. 3D Printing (a.k.a. Additive Manufacturing) technologies are capable of taking 3D CAD specifications and automatically constructing (“printing”) them directly, with neither human intervention nor a separate, traditional, human-intensive manufacturing process (a.k.a. Subtractive Manufacturing). 3D CAD & 3D Printing technologies provide the electro-mechanical infrastructure for the physical construction of Physical Twins, and by extension their Digital Twin counterparts. Since the [3D CAD] Design = Implementation [3D Printed Product] principle is rapidly transforming how we design and manufacture Systems-of-Systems, these complementary technologies will play a vital role in the evolution of Digital Twin technology.

3D Maps vs. 3D Models:

Many believe that a Digital Twin is built on two parts: 3D models and process models integrated with plant data. But some believe a successful Digital Twin should instead consist of 3D maps and digital P&IDs, integrated with plant data silos. Why? Fundamentally, 3D models are a flawed approach for creating a successful Digital Twin of the plant. Sure, the vision that’s often pitched with model-based digital twins seems incredible: a 3D model provides a fully interactive, simulated virtual plant. We’ve seen countless flashy demos and proofs of concept around augmented 3D process models, in which operators and engineers can don augmented reality glasses, tweak process conditions, change pressures and temperatures, or see the impacts of opening a valve or resizing a pump, all within a 3D model. The vision of these demos is to provide an interactive 3D model of the plant for engineers and operators to experiment with and find ways to drive more efficiency out of the operations. Basic process optimization.

But the question is, does all of that flashy 3D simulation actually produce results? If you ask most process engineers to debottleneck a process, they’ll pull up the two most common process models in industry: the Process Flow Diagram (PFD) and the Piping and Instrumentation Diagram (P&ID). These two diagrams are ubiquitous, foundational elements of the process engineer’s toolkit, and they give the easiest-to-understand view of how a complicated process unit operates. All engineers and operators understand the PFD and P&ID. Only the PFD can summarize all of the complicated flows of the plant on a single sheet of paper, and only the P&ID can show all of the tens of thousands of connections, pieces of equipment, and metadata in an easy-to-read format that the engineer’s eye can use to debottleneck the plant. In a 3D model, the user’s field of view is just too narrow; you can’t get an overall picture of how the unit operates.

A successful Digital Twin solution is one that is integrated with digital P&IDs and PFDs, and that is foundationally based on 3D mapping technology (not 3D modelling). For most users, 3D maps are easier to understand, navigate, and use than 3D models, and they can be updated for a fraction of the cost of a 3D model. 3D maps are the easiest-to-use, visually rich representation of the real 3D world at the plant.

_

A digital twin is different from the traditional Computer Aided Design/Computer Aided Engineering (CAD/CAE) model in the following important ways:

(a) It is a specific instance that reflects the structure, performance, health status, and mission-specific characteristics such as miles flown, malfunctions experienced, and maintenance and repair history of the physical twin;

(b) It helps determine when to schedule preventive maintenance based on knowledge of the system’s maintenance history and observed system behavior;

(c) It helps in understanding how the physical twin is performing in the real world, and how it can be expected to perform with timely maintenance in the future;

(d) It allows developers to observe system performance to understand, for example, how modifications are performing, and to get a better understanding of the operational environment;

(e) It promotes traceability between life cycle phases through connectivity provided by the digital thread;

(f) It facilitates refinement of assumptions with predictive analytics: data collected from the physical system and incorporated in the digital twin can be analyzed along with other information sources to make predictions about future system performance;

(g) It enables maintainers to troubleshoot malfunctioning remote equipment and perform remote maintenance;

(h) It combines data from the IoT with data from the physical system to, for example, optimize service and manufacturing processes and identify needed design improvements (e.g., improved logistics support, improved mission performance);

(i) It reflects the age of the physical system by incorporating operational and maintenance data from the physical system into its models and simulations.

_____

_____

Digital twin versus simulation:

_

A simulation is the imitation of the operation of a real-world process or system over time. Simulations require the use of models; the model represents the key characteristics or behaviors of the selected system or process, whereas the simulation represents the evolution of the model over time. Computers are used to execute the simulation.  A simulation is a model that mimics the operation of an existing or proposed system, providing evidence for decision-making by being able to test different scenarios or process changes. The terms simulation and digital twin are often used interchangeably, but they are different things. A simulation is designed with a CAD system or similar platform, and can be put through its simulated paces, but may not have a one-to-one analog with a real physical object. A digital twin, by contrast, is built out of input from IoT sensors on real equipment, which means it replicates a real-world system and changes with that system over time. Simulations tend to be used during the design phase of a product’s lifecycle, trying to forecast how a future product will work, whereas a digital twin provides all parts of the business insight into how some product or system they’re already using is working now.

_

Although simulation technology and digital twins share the ability to execute virtual simulations, they are not the same. While traditional simulation capabilities found in computer-aided design and engineering (CAD-CAE) applications are powerful product design tools, a digital twin can do much more. In both cases, the simulation is happening on a virtual model, but the model becomes a digital twin once the product is produced. When a digital twin is powered by an Industrial Internet of Things (IoT) platform, it can receive real-world data quickly and process it, enabling the designer to virtually “see” how the real product is operating. Indeed, when powered by an IoT platform, the model becomes an integrated, closed-loop digital twin that, once fully deployed and connected via the digital thread, is a business simulation tool that can drive strategy at every stage of the business.

_

Because of the similarities between simulation technology and digital twins, too many business and technology leaders fail to understand the profound differences between the two. Though the two may sound the same, the table below summarizes the key differences:

Features/Attributes | Digital Twin | Simulation

Scale | Smaller business systems to massive manufacturing plants. | Usually, businesses use a simulation for specific needs on a small scale, e.g., flying airplanes or simulating a cyberattack.

Live Data | Real-time data, collected from the environment where the digitized object performs, is at the core of digital twin technology. | A simulation does not require real-time data inputs.

Scope of Study | You can study a whole business or a minute task. | A simulation focuses on one task at a time.

Insight Sharing | A digital twin automatically shares output insights with the physical object or process. | A simulation does not share insights automatically; you need to implement the results manually.

CAD simulations are theoretical, static, and limited by the imaginations of their designers, while digital twin simulations are live and use actual dynamic data. Simulation is an important aspect of the digital twin: digital twin simulation enables the virtual model to interact with the physical entity bidirectionally in real time. This interaction means information can flow in either direction to create a cyber-physical system, where a change in one affects the other. Multi-physics, multi-scale simulation is one of the most important visions of the digital twin; you can’t have a digital twin without simulation. The advantages of a digital twin over a more basic, non-integrated CAD-based simulation are evident for monitoring valuable products such as wind turbines. However, digital twins can be costly, requiring the fitting of sensors and their integration with analytical software and a user interface. For this reason, digital twins are usually employed only for more critical assets or procedures, where the cost is justifiable to a business.

_

Case Studies:

To better understand the difference between simulation and digital twin, it is useful to look at some real-life case studies.

For example, while an advanced simulation can analyse thousands of variables, a digital twin can be used to assess an entire lifecycle. This was demonstrated by Boeing, which integrated digital twins into design and production, allowing it to assess how materials would perform throughout an aircraft’s lifecycle. As a result, it was able to improve the quality of some parts by 40%.

Tesla also uses digital twins in its vehicles to capture data that can be used to optimise designs, enhance efforts to create autonomous vehicles, provide predictive analytics, and deliver information for maintenance purposes. This actual, rather than theoretical, two-way flow of data could lead to a future where a vehicle delivers data directly to a garage ahead of a service, detailing performance statistics, parts that have been replaced, service records, and potential problems picked up by the sensors. This would deliver time and cost savings, as a mechanic could home in on any problems based on the data rather than having to fully inspect each vehicle.

_____

_____

Digital Twins versus BIM:

The architecture, engineering and construction (AEC) sector is notorious for its resource planning, risk management and logistics issues, which frequently lead to design flaws, project delays, cost overruns and contractual conflicts. Building information modelling (BIM) has been used over the last several decades to increase the efficiency of the construction process, eliminate waste during construction and enhance the quality of AEC projects. BIM enables the early identification and correction of possible issues before they reach the construction site. BIM provides a way of accessing detailed information about buildings and infrastructure assets, enabling the creation of 3D models which include data associated with the functional and physical characteristics of an asset. BIM is beneficial but insufficient, and the AEC sector needs something more substantial (Bakhshi et al., 2022; Rahimian et al., 2021). The existing constraints have prompted research into the use of powerful machine learning (ML) techniques, such as deep learning (DL), to diagnose and predict causes and preventive measures. Digital twins have quickly established themselves as the go-to method for developing reliable data models of all components of a building or city at various phases of its lifecycle. The AEC industry has often struggled to distinguish between these two critical technologies. The misconception stems primarily from BIM software’s emphasis on digitally represented physical space and the Digital Twin’s early categorization as a digital replica of a physical object or location. The main distinction is in how each technology is applied: building maintenance and operations are best served by digital twins, whereas construction and design are better addressed by BIM.

_

BIM stands for Building Information Modelling and as the name implies, the core of BIM is INFORMATION. BIM is an intelligent model-based process/methodology for creating 3D building design models that incorporate all of the information required for a building’s construction. It allows for easier coordination and collaboration among the project’s many teams. AEC experts can design, construct, and operate buildings and infrastructures more effectively.

Building Information Modeling (BIM) is an integrated process built on coordinated, reliable information, used to create coordinated digital design information and documentation; predict performance, appearance and cost; and deliver projects faster, more economically, and with reduced environmental impact.

Leading BIM software providers are now demonstrating to the AEC (architects, engineers, and contractors) industry the cost-saving benefits of having a central point of building reference in a 3D digital model. BIM allows collaboration and much easier design recalibration as projects progress.

_

It’s helpful to think about BIM as a layered view of a tower structure. The first layer might include design information about the tower structure, the next layer might be overlaid with an electrical diagram, and then information about manufacturer equipment, and so forth until every system is part of a complete model. BIM offers an up-to-date view of how a tower is designed, allowing personnel to peel back the layers to see how systems interact with one another. When a design change affects multiple tower systems, BIM provides a view of how the modification affects each layer and can simultaneously update information across layers after the change is complete.
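A rough way to picture this layered coordination in code is shown below. The layer names and update rules are invented for illustration, not a real BIM schema; the point is that one design change is propagated across every affected layer.

```python
# A minimal sketch of the "layered" BIM view described above; the layers
# and the propagation rules are illustrative assumptions.
tower_model = {
    "structure":  {"floors": 12, "frame": "steel"},
    "electrical": {"risers": 2, "panels_per_floor": 1},
    "equipment":  {"antennas": 8, "hvac_units": 4},
}

def change_floor_count(model, new_floors):
    """Propagate a single design change across every affected layer."""
    model["structure"]["floors"] = new_floors
    # Dependent layers are recalculated so all views stay coordinated.
    model["electrical"]["panels_per_floor"] = 1 if new_floors <= 15 else 2
    model["equipment"]["hvac_units"] = max(4, new_floors // 3)

change_floor_count(tower_model, 18)
print(tower_model)
```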

_

BIM is a fundamental part of building a digital twin, but the two should not be confused, because they are different. In many ways, digital twins are an evolution of BIM, enabling users to evolve their outputs and deepen the use of collaborative data. Over the years, BIM has proved a valuable tool for predictive maintenance, asset tracking, and facilities management. The highly detailed 3D models help to improve process visibility, establish a clear project vision, and enable contractors, engineers and stakeholders to collaborate and gain richer insights. For these reasons, BIM and digital twins are built on common principles, and both enable teams to look at assets as ongoing projects. But to improve and adapt projects for greater value, real-time insights are vital. And this is where digital twins come into play.

_

Often used solely as a 3D-modelling tool for design and construction phases, BIM builds static models which do not include the dynamics of a live digital twin. In comparison, digital twins harness live data to evolve and replicate the real world. In construction, a digital twin not only looks like the real building, but also acts like it – providing greater value through the asset lifecycle. At its core, a digital twin can be an output of a BIM process and is essentially a ‘living’ version of the project or asset view that BIM processes exist to create – able to evolve and transform using real-time data once the asset is in use.

The fundamental difference between digital twins and BIM is that the latter is a static digital model, while the former is a dynamic digital representation. In other words, BIM lacks the spatial and temporal context elements that characterize digital twins. Digital twins provide a realistic environment in which the model resides, instead of the model alone, and the twin is kept up to date based on real-time data, context, and algorithmic reasoning. BIM, in contrast, offers up a fixed model in isolation, independent of its surroundings and the people who use it. So BIM is not a digital twin. BIM is a small subset of a digital twin, frozen in time – typically during the design and construction phase. BIM is a finely tuned tool for the more accurate design, collaboration, visualization, costing and construction-sequencing phases of a building’s life. BIM is not designed for the operation and maintenance phases of a building’s life, nor is it designed to ingest or act on the vast amounts of live IoT data generated by a ‘breathing’ and ‘thinking’ smart building. Its primary purpose is to design and construct a building, and post-construction, it serves to provide a digital record of a constructed asset. BIM is focused only on buildings – not people or processes. However, BIM is a small but very useful input into a digital twin, as it provides an accurate digital asset register and location data and is a great starting point for both a smart building and a digital twin. Digital twin technology, combined with the Internet of Things (IoT) and Industrial IoT (IIoT), may eventually render BIM as we know it obsolete. For now, however, BIM isn’t going anywhere: between 2020 and 2027, its market size has a projected compound annual growth rate (CAGR) of 15.2%.

_____

_____

Types of digital twin: physics-based vs data-driven vs hybrid models:

Generally, there are two types of DTs — physics-based twins and data-based twins.

-1. Physics-based twins (also called virtual twins by some authors):

Physics-based twins rely on physical laws and expert knowledge. They can be built from CAD files and used to simulate the operation of comparatively simple objects with predictable behavior — like a piece of machinery on the production line. A virtual twin is an ideal model of the product based on simulation. Here, physics-based simulation uses analytical and numerical methods to model the behavior of products and systems. Finite Element Analysis (FEA) for structural simulation is one example.

The key downside is that updating such twins takes hours rather than minutes or seconds. So, the approach makes sense in areas where you don’t need to make immediate decisions. There are limits to physics-based simulation. It requires computer resources proportional to the size and complexity of the problem. Large systems (or a system of systems) cannot be practically simulated. This is because the simulations take too long and the required IT infrastructure costs too much. Furthermore, the simulation setup may not exactly match the operating conditions of the real world, introducing a source of divergence between the simulation and the real, operating product. Recent advances in numerical techniques, cloud computing, and GPU processing have helped address this issue to some extent. But limitations still exist. Solving large systems (or a system of systems) is still impractical for many physics-based simulations.
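As a minimal illustration of what “physics-based” means here, the sketch below simulates a machine part’s temperature from first principles (Newton’s law of cooling) with explicit Euler integration. The material and heat-transfer parameters are illustrative assumptions, not data for any real machine.

```python
# A toy physics-based model: a lumped thermal simulation built from
# physical laws rather than measured data. Parameter values are illustrative.
def simulate_temperature(t_start, t_ambient, heat_in_w, steps, dt=1.0,
                         mass_kg=5.0, c_p=500.0, h_a=2.0):
    """Explicit Euler integration of m*c_p*dT/dt = Q_in - h*A*(T - T_amb)."""
    temp = t_start
    for _ in range(steps):
        d_temp = (heat_in_w - h_a * (temp - t_ambient)) / (mass_kg * c_p)
        temp += d_temp * dt
    return temp

# Predict the part temperature after one hour under a 150 W heat load.
print(f"{simulate_temperature(20.0, 20.0, 150.0, steps=3600):.1f} °C")
```

Even in this tiny example the cost grows with the number of time steps and state variables, which hints at why large systems become impractical to simulate purely from physics.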

_

-2. Data-based twins (machine learning based twin):

In contrast to the physics-based type, data-based twins don’t require deep engineering expertise. Instead of capturing the physical principles behind the system, they use machine learning algorithms (typically, neural networks) to find hidden relationships between input and output. For example, you could have a digital twin of an operational aircraft engine sitting in the Pratt & Whitney data center. Here, the engine is the “product.” Sensors in the real engine send data to the digital twin. This data is stored and processed, creating a digital replica of the physical system. The sheer volume of transmitted data can be overwhelming.

The data-based method offers more accurate and quicker results and is applicable to products or processes with complex interactions and a large number of impact factors involved. On the other hand, to produce valid results it needs a vast amount of information not limited to live streams from sensors.

Algorithms have to be trained on historical data generated by the asset itself, accumulated from enterprise systems like ERP, and extracted from CAD drawings, bills of material, Excel files, and other documents.
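A minimal sketch of the data-based approach is given below, using invented historical records rather than real asset data: a machine learning model is fitted to past operating inputs and an observed output, then queried like a twin. The feature names and coefficients are assumptions for illustration.

```python
# A minimal data-driven twin: no physics encoded, only patterns learned
# from (synthetic) historical sensor data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Historical records: [rotational speed (rpm), inlet temperature (°C)]
X_hist = rng.uniform([1000, 20], [5000, 80], size=(500, 2))
# Observed output the twin must learn, e.g. an exhaust gas temperature.
y_hist = 0.1 * X_hist[:, 0] + 3.0 * X_hist[:, 1] + rng.normal(0, 5, 500)

twin = RandomForestRegressor(n_estimators=100, random_state=0)
twin.fit(X_hist, y_hist)

# Query the twin for an operating point never run on the real engine.
print(twin.predict([[4200, 55]]))  # predicted exhaust temperature
```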

Today, various combinations of the two methods, so-called hybrid twins, are often used to take advantage of both worlds.

_

-3. Hybrid twin: Marrying physics-based and data-based twins:

At the ESI Live 2020 event, ESI Group Scientific Director Francisco Chinesta introduced a new concept: the hybrid twin. The hybrid twin is a simulation model working at the intersection of virtual and digital twins.

It’s a clever idea. Physics-based simulation (which Chinesta also called the “virtual twin”) is often used to study the behavior of products and systems. But simulation has its limits as an emulator of reality: it represents an idealized version of an item. This is where the data-based digital twin comes into play.

Sensor data from smart, connected products is gathered and analyzed for anomalies, offering up-to-date insight into behaviors. But can simulation learn from this real-world data to move closer towards reality? Maybe—and the answer lies in the hybrid twin.

A virtual twin has limitations, deviating from reality with margins of error that can be significant. When you create a hybrid twin by using the data from a digital twin as an input to a virtual twin, you can drastically reduce or even eliminate these errors. According to Chinesta, even a small sample of the digital twin’s data can reduce errors significantly.

A hybrid twin increases the accuracy of simulations by reducing the errors to near-zero. As a result, it enables you to study large systems or a system of systems. This would be completely impractical using physics-based simulation models alone.
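One common way to realise this idea is sketched below with invented numbers: keep the physics model, and train a simple data-driven correction on the residuals between field measurements and simulation. This is only one hedged interpretation of the hybrid concept, not ESI’s actual implementation.

```python
# Hybrid sketch: physics prediction plus a learned residual correction.
import numpy as np

def physics_model(load_kn):
    """Idealised deflection model, e.g. from beam theory (mm per kN)."""
    return 0.8 * load_kn

# Field measurements reveal a systematic deviation the physics misses.
loads = np.array([10.0, 20.0, 30.0, 40.0])
measured = np.array([9.1, 17.9, 27.2, 36.0])
residuals = measured - physics_model(loads)

# Fit a simple linear correction to the residuals (least squares).
coeffs = np.polyfit(loads, residuals, deg=1)

def hybrid_twin(load_kn):
    return physics_model(load_kn) + np.polyval(coeffs, load_kn)

print(hybrid_twin(25.0))  # physics prediction plus learned correction
```

Note how even a small sample of measured data (four points here) is enough to pull the idealised model toward observed behaviour, which is the effect Chinesta describes.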

A hybrid twin model is a powerful design and product development asset. You can further improve and expand it by integrating artificial intelligence and machine learning algorithms. Such an approach will facilitate the development of improved, reliable, and resilient products of the future.

But who actually benefits from hybrid twins?

First, companies building complex and connected products stand to benefit. Many of today’s products are now embedded with sensors, which collect and transmit all kinds of data in real time. Unlike a digital twin alone, a hybrid twin can make real use of this data to improve a company’s simulation models and move them closer to reality.

Second, the hybrid twin is a boon to companies adopting simulation-driven product development. They can now use their products in the field to improve their simulation models. They can use more accurate simulations to make more informed decisions, boosting innovation. They can also verify product and system behaviors more precisely, reducing the reliance on prototyping and testing.

Lastly, companies building large systems or a system of systems can now use simulation to study those systems. This helps them better understand their systems and stress-test them, virtually.

______

______

Thin Digital Twin:

In information technology, “thin” is commonly associated with thin provisioning: storage that consumes physical space only as data is actually written. A thin digital twin provides an exact replica of your IT infrastructure, applications, and data without the overhead or complexity of traditional simulation and modelling solutions. To make the twins thin, the automation engine needs very deep integration with the advanced data-cloning technology in modern storage systems and the ability to map the zero-footprint clones to hypervisors, software-defined networking, and container platforms.

_____

Digital Twin versus virtual commissioning:

Virtual commissioning is the process of creating and analyzing a digital prototype of a physical system to predict its performance in the real world; that is, the behaviour is tested before physical commissioning. The line between Digital Twins and Virtual Commissioning is blurry, and many definitions and interpretations exist. One thing they have in common is that both use simulation technology. Virtual commissioning is the simulation of a production system to develop and test the behavior of that system before it is physically commissioned. A Digital Twin captures sensor data from a production system and feeds that information into the simulation in real time to copy the operation of the system in a virtual world. It is a common misconception, however, that Digital Twins only come into use once the physical system has been created; Digital Twins are digital replicas of real and potential physical assets (i.e., “Physical Twins”). A digital twin prototype (DTP) is a constructed digital model of an object that has not yet been created in the physical world. So virtual commissioning may be considered a type of digital twin.

____

Digital twins versus digital twinning:

The terms “digital twin” and “digital twinning” are often used interchangeably. But in reality, they’re very different concepts—and it’s important to recognise the difference.

Digital twinning is the process of digitising the physical environment. It’s a continuous journey that requires collecting and analysing large volumes of data to improve the physical environment around us. It’s a strategic transformation, and a process managed at the highest levels of a business or organisation.

Digital twins on the other hand are the outputs of digital twinning. Each digital twin is a digital reflection of an individual asset, process or system generated through the digital twinning process. By pursuing a strategy of digital twinning, you’ll eventually build up a vast ecosystem of digital twins.

_____

How DT differs from existing technologies: 

  • Simulation: no real-time twinning.
  • Machine Learning: no twinning.
  • Digital Prototype: not necessarily any IoT components.
  • Optimisation: no simulation and no real-time tests.
  • Autonomous Systems: not necessarily any self-learning (learning from its own past outcomes).
  • Agent-based modelling: no real-time twinning.

____

____

Digital Twin: Common Misconceptions:

Almost half of the world’s population has no access to the internet, and only a few countries are exploiting the frontier edge of technological advancement. A digital divide persists between developed and developing countries and regions, and across income levels, genders and ages; these exclusions further exacerbate already existing inequalities, especially for digitally disadvantaged groups. Despite continuous efforts to bridge the digital divide, rapid technological advances, like the digital twin, may have further accelerated human vulnerability in technology evolution. A lack of understanding, knowledge, and access to the frontier edge of developments increases the chances of misconceptions, false or inflated public expectations, adverse reputational outcomes, and misunderstandings, as well as greater risks of the most digitally vulnerable groups being exploited. Therefore, this section aims to clarify the common misconceptions about the digital twin, summarised in the table below. In brief, common digital twin misconceptions arise from the technologies closely related to digital twins: 2D/3D modelling, system simulation, validation computation, digital prototyping, and so on. Without a comprehensive understanding of the digital twin and its related technologies, confusion with one of its rooting technologies is common, often mistaking an element or step of the digital twin for the digital twin itself. The digital twin’s dynamic, real-time, bi-directional data connection is the key to distinguishing it, but also the most common source of misconception.

_

Common misconceptions of the digital twin:  

  • Digital shadow: A digital shadow involves a physically existing product and its virtual twin, but there is only a unidirectional data connection from the physical entity to its virtual representative, meaning the virtual twin merely reflects the physical product digitally.

  • Digital modelling: Modelling is an essential aspect of a digital twin but is not an alternative term for the digital twin as a whole. In digital modelling there are bi-directional data connections between the physical product and its virtual twin; however, the data is exchanged manually, meaning the virtual twin represents a certain status of the physical product under a manually controlled process of synchronisation.

  • Digital thread: The digital thread is the continuous, traceable digital record of a physical product across its lifetime, from the innovation and design stage to the end of its lifespan; it plays an important role in the digitalisation process and functions as an enabler of interdisciplinary information exchange.

  • Simulation: Simulation refers to the important imitating functionality of digital twin technology from the virtual twin’s perspective, and it covers a broader range of models. It is an essential aspect of the digital twin rather than an alternative term for it, as it does not involve real-time data exchange with the physically existing object.

  • Fidelity model/simulation: Fidelity refers to how closely a simulation model imitates the physical product it reproduces. Terms like high/low/core/multi-fidelity model or simulation describe different fidelity levels or considerations in building the simulation model. Researchers also frequently use high fidelity, or even ultrahigh fidelity, to describe the digital twin, given its real-time dynamic data exchange between the physical object and the virtual twin.

  • Cyber twin: Some researchers use cyber twin and digital twin interchangeably, understanding “cyber” as another term for “digital”. Terms like cyber digital twin, cyber twin simulation and cyber-physical system are also common. The key aspect a cyber twin or cyber-physical system addresses is the network (internet architecture), closely related to the advancement and implementation of the Internet of Everything (IoE). The cyber twin or cyber-physical system network architecture is also often mixed up with the digital thread.

  • Device shadow: Research on device shadows is common in cloud computing and Internet of Things (IoT) contexts. A device shadow highlights the virtual representation of a physically existing object; in brief, it refers to the service of maintaining a copy of the information extracted from a physical object connected to the IoT.

  • Product avatar: A distributed and decentralised approach to product information management with no feedback concept; it may capture information about only parts of the product.

  • Product Lifecycle Management (PLM): PLM is focused more on managing the components, products and systems of a company across their lifecycles, whereas a DT can be a set of models for real-time data monitoring and processing.

_____

_____

Data integration in Digital Model, Digital Shadow and Digital Twin:

 

  • Digital Model: data flow from physical to digital object is manual; data flow from digital to physical object is manual.
  • Digital Shadow: data flow from physical to digital object is automatic; data flow from digital to physical object is manual.
  • Digital Twin: data flow from physical to digital object is automatic; data flow from digital to physical object is automatic.

Digital Model:

A digital model has the lowest level of data integration. The term indicates a digital representation of an existing physical object characterised by the absence of automated data flow between the physical and digital object: data flows from the physical object to the digital object, and vice versa, manually. Consequently, a change in the physical element does not impact the digital element, and in the same way a modification of the digital element does not affect the physical element. In the construction sector, a digital model can range from a single building component to the whole building. In this case, it is used to represent and describe a concept digitally and to compare different options without intervening in the physical world. The term also covers simulation models of planned factories and mathematical models of new products.

Digital Shadow: 

Starting from the concept of the digital model, if there is an automated data flow from the physical element to the digital element, the digital representation takes the name of digital shadow. Hence, a change in the physical object contributes to a change in the digital object, but not vice versa. According to one study, most digital twin research articles in the manufacturing area stop at the digital shadow level of integration. In the construction sector, the term digital shadow can be associated with the building information modelling concept: the model can be characterised and enriched by simulations, but their output does not trigger automatic modifications in the building.

Digital Twin:

The highest level of integration is reserved for the digital twin. The data flow is automatic in both directions between the physical and digital object. In this context, a modification in the state of the physical object determines a modification in the state of the digital object, and vice versa. Unlike the digital shadow, the digital twin makes it possible to verify physical processes and activities prior to execution, reducing failures, and it can highlight differences between actual and simulated performance to optimise and predict behaviour. Furthermore, since data within the digital twin are derived not only from the physical environment but also from virtual models, with data elaborated through processes such as statistics and regression, the digital twin is richer in data than the digital shadow.

Examples:

Applying the concept of the digital twin to the construction environment, a building digital twin is not limited to 3D visual modelling, which is called a digital model or digital shadow according to the degree of data integration. The model becomes a digital twin if, for example, it performs automated or semi-automated thermal management control, procures construction-site components with physical delivery tracked by sensor devices and satellites, or optimises and schedules renovation or construction processes using connected on-site smart devices.
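The three integration levels can be captured compactly in code. The sketch below, with illustrative flag names of my choosing, classifies a system by the direction(s) in which its data flows are automated, mirroring the table above.

```python
# Classifying a system by its data-flow automation, per the table above.
from dataclasses import dataclass

@dataclass
class IntegrationLevel:
    name: str
    physical_to_digital_automatic: bool
    digital_to_physical_automatic: bool

LEVELS = [
    IntegrationLevel("Digital Model",  False, False),  # both flows manual
    IntegrationLevel("Digital Shadow", True,  False),  # only P->D automatic
    IntegrationLevel("Digital Twin",   True,  True),   # both flows automatic
]

def classify(p2d_auto, d2p_auto):
    """Return the integration level matching the observed data flows."""
    for level in LEVELS:
        if (level.physical_to_digital_automatic == p2d_auto
                and level.digital_to_physical_automatic == d2p_auto):
            return level.name
    return "Undefined combination"

print(classify(True, False))  # -> Digital Shadow
```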

_____

_____

Section-5

Overview of Digital Twin:     

Ever made a machine? If yes, how many attempts did it take to make it function flawlessly, to make it the ideal one? Plenty of unsuccessful attempts. It’s not only you: every manufacturer faces this troublesome situation. At times, a defect in a single component can result in the non-functioning of the whole device. That requires dismantling the assembly, figuring out the defective part, fixing it, and going back to day one. Ever wished you could find out how the machine is going to function before assembling all the components? What if we said that you can simulate your device on your desktop, exactly as it is going to perform in the real world? An exact replica of the device with all of its components, from the micro atomic level to the macro geometric level. This lies within the realm of possibility and can be realized with the help of a digital twin. The next significant thing in industrial services will be accurately foretelling the future of physical assets through their digital twins.

_

There are plenty of definitions of a “Digital Twin” floating around, but the simplest is: a Digital Twin is a real-time digital clone of a physical device. A Digital Twin of any device or system is a working model of all its components (at micro level, macro level, or both), integrated and mapped together using physical data, virtual data and the interaction data between them, to make a fully functional replica of the device or system on a digital medium. This digital twin is not intended to replace the physical system but to test its optimality and predict the physical counterpart’s performance characteristics. Using this concept, you can learn the system’s operational life course, the implications of design changes, the impact of environmental changes, and much more.

_

To create a digital twin of any system, engineers collect and synthesize data from various sources, including physical data, manufacturing data, operational data and insights from analytics software. Sensors connected to the physical product collect data and send it back to the digital twin, and this interaction helps the maintenance team optimize the product’s performance. Engineers integrate the Internet of Things, Artificial Intelligence, Machine Learning, and software analytics with spatial network graphs to gather all the relevant information and map it into a physics-based virtual simulation model; by applying analytics to these models, we get the performance characteristics of the physical asset. For most devices, the seamless exchange of data enables the best possible analysis, and the same is true for the digital twin. A digital twin therefore continuously updates itself from multiple sources to represent its near real-time status, working condition or position. It is a learning system: it learns from itself, using sensors that convey data about various aspects of its operating condition; from human experts, such as engineers with deep and relevant industry domain knowledge; from other similar machines; and from the larger systems and environment of which it may be a part. A digital twin also factors data from past machine usage into its digital model. The resulting digital model is then subjected to analytics, such as environmental-condition or device-interaction analytics, to detect anomalies and assess the lifecycle of the physical counterpart. The twin then determines an optimal process that boosts key performance metrics and provides forecasts for long-term planning, which helps optimize the business outcome.

_

Without any doubt, constructing a digital twin would be purposeless if there were no practical reasons for pursuing it. As already noted, one cannot make a perfect machine in just one try, and experimenting on physical products costs a great deal of money and time. Digital twins and IoT, together with artificial intelligence, help us analyze data and monitor systems to scrutinize and solve these problems. Where making a change in a physical product could be backbreaking, a digital replica can be swiftly revised to demonstrate amendments and to run simulations. If the outcome of a revised system does not meet our needs when tested on the digital twin prior to the physical machine, we have wasted no physical resources and saved time as well. By monitoring the status of a system or process and using multiple streams of real-time data to study its digital twin, engineers gain deep knowledge of how to enhance product lifecycles, streamline maintenance and sharpen optimization. Using a digital replica of the physical system not only accelerates development in various aspects but also helps analyze, observe and navigate every minute detail with such precision that there is little space for errors and inaccuracies, ensuring optimal production output. Yet another benefit is that digital twins allow experts to work on projects even when they are not in direct contact with the physical twin, protecting their safety and wellbeing with no risk of tragedy. A digital twin also lets engineers work on equipment that is already in space and completely inaccessible to them, without the hassle that comes with the physical accessibility of such equipment. Any update or alteration can first be tested for its outcomes and repercussions, avoiding any calamity that could result from implementing it directly in the physical world. In a nutshell, digital twins have the power to reshape the world.

_

A Digital Twin (DT) is a living digital representation of an individual physical system that is dynamically updated with data to mimic the true structure, state, and behavior of the physical system, to drive business outcomes.

The four key elements of a Digital Twin are the physical system, the digital representation, the connectivity between the two, and the business outcome. The first element, the physical system itself, can be an individual physical entity, an assembly of physical entities, a physical process, or even a person. It also doesn’t have to be an industrial system, as it could be biological, chemical, ecological, or any other system. The second is the digital representation, which is the model itself. In this case, by model, we don’t mean just a collection of data such as a data model, which is needed to represent the structure (or configuration) of the physical system, or an IoT data dashboard, which is helpful to represent the current state of the physical system. We mean a model that emulates the behavior of the physical system, such as a simulation, so that when you give it an input, the model returns a response output. This leads to the third element, connectivity, which is emphasized by the reference to “living.” The model must be regularly updated with data from the physical system (say, from sensors) to be a Digital Twin. A validated model provides a snapshot of the behaviour of the physical system at a moment in time, but a Digital Twin extends the model to timescales over which the physical system’s behaviour changes significantly from the original time. The frequency of the updates is dictated by the rate at which the underlying phenomena evolve. Some use cases require near real-time updates, whereas others require only weekly updates. Lastly, the Digital Twin must drive a specific outcome – some kind of economic or business value.

_

The key difference between a Digital Twin and existing modelling methods such as traditional 3D modelling (CAD), physics-based simulations, virtual worlds (3D/AR/VR), IoT dashboards of streaming sensor data, and realistic gaming environments is the information flow between the digital and physical systems. A common misconception is that a more complex, higher fidelity virtual representation is what makes a Digital Twin. Rather, it is the regular updating that is key, and directly impacts how data is collected throughout the life cycle and how the Digital Twins are constructed. A Digital Twin must consume the data streams to understand the present state of the system, learn from and update itself (or be updatable) with new observations of the system, and be able to make predictions of the current and future behavior of the system.

For example, a Digital Twin of a gas turbine blade ingests temperature and pressure IoT data to predict crack length, a non-observable quantity during operation. Visual borescope inspection results from periodic maintenance are used to update the Digital Twin. The Digital Twin is then used to make predictions of crack growth rate and remaining useful life (RUL) under different operational conditions and maintenance scenarios, enabling the operator to select the best dispatch schedule and maintenance plan. Output from the Digital Twin such as the crack length or RUL can then be shown to the user via a dashboard, a 3D rendering showing the crack in-situ, or some other context-relevant manner. Although the CAD models, IoT dashboards, 3D renderings/immersive walkthroughs, and gaming environments are not Digital Twins in themselves, they represent useful visualization and building blocks of Digital Twin solutions, and often represent the first steps in a customer’s Digital Twin journey.
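As a simplified illustration of the prediction step in this example, the sketch below integrates the classic Paris crack-growth law to compare remaining cycles under two operating scenarios. The material constants, geometry factor, and critical crack length are illustrative assumptions, not real turbine-blade data; the inspected crack length stands in for the borescope update described above.

```python
# Simplified RUL sketch using the Paris law da/dN = C * (dK)^m,
# with dK = Y * ds * sqrt(pi * a). All constants are illustrative.
import math

C, M, Y = 1e-12, 3.0, 1.1          # assumed Paris constants / geometry factor
A_CRITICAL = 5.0e-3                 # assumed critical crack length (m)

def remaining_cycles(a_inspected, stress_range_mpa):
    """Integrate crack growth cycle by cycle until the critical length."""
    a, cycles = a_inspected, 0
    while a < A_CRITICAL:
        delta_k = Y * stress_range_mpa * math.sqrt(math.pi * a)
        a += C * delta_k ** M        # growth per cycle
        cycles += 1
    return cycles

# Compare two operating scenarios to pick the gentler dispatch schedule.
for stress in (300.0, 400.0):
    print(stress, "MPa ->", remaining_cycles(1.0e-3, stress), "cycles")
```

Comparing the two outputs is exactly the kind of scenario analysis that lets an operator trade dispatch intensity against remaining useful life.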

_

While digital twins have great potential, they are not a necessity for every fabricated product. Not all objects are complicated enough to need the intense, constant flow of sensor data a digital twin requires, nor is it always financially worthwhile to invest significant resources in creating one. Therefore, the industrial sectors with a real need for digital twins are those that develop and produce complex, specialised products. The rapidly growing digital twin market suggests that while digital twins are already used in many industries, demand will continue to escalate in the immediate future. The major benefits of a non-static and highly realistic digital model of a physical object are practically unlimited: understanding individual aspects of real-life objects as well as designing new ones is supported, and the capacity to simulate and optimize is elevated. By recording and digitally acquiring the operating data of the model (digital twin) in real time, the behavioral patterns of the real-world system can be captured, and users can predict them to a greater degree.

_

Historically, the only way to gain knowledge of buildings was to have direct physical contact with the building itself. All the data about the building and its performance was directly contained within the building, stored in static documentation formats such as paper or computer files. Digital twins build the bridge between the physical and digital worlds to allow data to flow in real time or near-real time, so the data becomes alive. The digital twin concept was first noted in 2002 by Michael Grieves at the University of Michigan as part of Product Lifecycle Management (PLM). His idea was that the real-space and virtual-space worlds would be linked throughout the lifecycle of the system. Initially it was known as the Mirrored Spaces Model, later as the Mirrored Information Model, before becoming known as the digital twin. The digital twin is described as the bi-directional flow of data between “virtual space” (the digital representation) and “real space” (the physical asset). The data then needs to be accessible in real time or near-real time to build a complete digital picture of the physical asset.

_

Digital Twin Software:

Digital twin software refers to the tool necessary to create and manage an object’s virtual representation. This software often integrates a wide range of technologies including the Internet of Things and sensor data.

To be as effective as they are today, digital twins require a combination of technologies, including the following:

  • CAD
  • 3D modelling tools
  • Sensor data
  • IoT devices
  • Connected devices
  • Game engines
  • Version control
  • The Metaverse
  • Augmented reality
  • Virtual Reality
  • NFTs

Some digital twins also draw on blockchain-based systems to record information about an asset and help guarantee the authenticity of the information collected and stored.

_

Classification of digital twins according to levels of abstraction:

There are three basic types of Digital Twins: the DTP, the DTI and the DTA, shown in the figure below.

DTP is the Digital Twin Prototype, an image of a physical product in the prototype phase. The special feature here is that the virtual twin is created before a physical twin is produced. This allows tests and simulations to be performed on the virtual twin until it has an optimal design and behaves as desired throughout its lifecycle; only then is the physical prototype produced. Digital Twin Prototypes are particularly useful for products with high complexity.

The second type of Digital Twin is the DTI, the Digital Twin Instance. In this case, the virtual twin is assigned to a specific physical product, which it maps over its entire lifecycle. This type of Digital Twin is used, for example, to transmit information about necessary maintenance work.

The third type of Digital Twin, the Digital Twin Aggregate (DTA), is the collection of multiple DTIs of a product. The DTA does not collect data itself but aggregates the data of the DTIs in order to bundle and analyse it, giving an overall view of the performance of a particular product across its individual embodiments. The DTA gathers DTI information to determine the capabilities of a product, run prognostics and test operating parameters.

Besides these three main types of Digital Twins, there is another type that occurs in the field of artificial intelligence: the Intelligent Digital Twin (IDT). Not only does this digital twin mirror information, like the types already introduced, it is also capable of making predictions based on the data collected. An intelligent digital twin could predict how a product will behave in the near or distant future based on its current state, avoiding errors in advance and simulating the complete lifecycle of the product.
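The relationship between the three main types can be sketched as a small class hierarchy; the class and attribute names are illustrative, not any standard API. Note how the DTA holds references to DTIs rather than collecting data itself.

```python
# Illustrative sketch of the DTP / DTI / DTA relationship.
class DigitalTwinPrototype:
    """Created before any physical product exists (design/simulation)."""
    def __init__(self, design_name):
        self.design_name = design_name

class DigitalTwinInstance:
    """Tied to one specific physical product over its whole lifecycle."""
    def __init__(self, prototype, serial_number):
        self.prototype = prototype
        self.serial_number = serial_number
        self.operating_hours = 0.0

class DigitalTwinAggregate:
    """Collects DTIs to analyse fleet-wide behaviour of one product type."""
    def __init__(self, instances):
        self.instances = instances

    def mean_operating_hours(self):
        return sum(i.operating_hours for i in self.instances) / len(self.instances)

dtp = DigitalTwinPrototype("pump-v2")
fleet = [DigitalTwinInstance(dtp, f"SN-{n:03d}") for n in range(3)]
fleet[0].operating_hours = 1200.0
dta = DigitalTwinAggregate(fleet)
print(dta.mean_operating_hours())  # -> 400.0
```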

_

Virtual copies can be applied at all stages of product creation, including design, production, operation and disposal. At the design stage, engineers create versions of a computer model for the product under development to assess and select possible technical solutions. Then, they select the most appropriate model, called the Digital Twin Prototype (DTP), which contains the information needed to describe and create physical versions of product instances. At the production stage, the DTP helps achieve the required characteristics of the outcome. At the operation stage, Digital Twin Instance (DTI) is used. This is a virtual copy of a specific sample of a product with which the twin remains associated throughout its lifecycle. Twins of this type are created on the basis of DTP and additionally contain information on the history of product manufacturing, the use of materials and components, statistics of failures, repairs, replacement of units and assembly, quality control, etc. DTI undergoes similar changes as its physical instance during its operation. At the disposal stage, DTI is also used. There is also the Digital Twin Aggregate (DTA) — an information management system that has access to numerous digital twins of a product family.

These various types can offer a variety of uses including logistics planning, product development and re-design, quality control / management, and systems planning. A digital twin can be used to save time and money whenever a product or process needs to be tested, whether in design, implementation, monitoring or improvement. 

_

To utilize digital twin technology, three distinct steps are basically needed. First, the digital twin prototype should be made, taking into account data along with the processes carried out in the real-world system. After that, a digital twin of the properties of every object must be compiled. Finally, every relevant systemic property should be attributed, so that, via all the data acquired, further understanding and prognosis can be achieved. In conclusion, the digital twin should be able to realistically represent its physical real-world twin and show its condition in real time. For this to be feasible, innovative technologies are required, which are considered the foundations of today’s industrial revolution. The Digital Twin embraces four technologies to create visual representations, capture, store and analyse data, and provide valuable insights. These technologies are the IoT, Extended Reality (XR), Cloud, and Artificial Intelligence (AI).

  • IoT sensors to gather data about the physical object
  • XR to visualize physical objects in 3D
  • Cloud to store data gathered from IoT sensors
  • AI and ML-based manufacturing tools to analyze object data, generate insights, and make predictions

_

Digital twin is characterized by the following traits:

  • Driven by Sensor Data: Sensor data is captured from an existing physical product by an Internet of Things (IoT) system.
  • Digital Model: Sensor data is fed into a digital model of the product. This model can be a numerical model, a 1D simulation, or a 3D simulation. It may focus on a single engineering domain or span many. This model emulates or simulates the behavior of the existing physical product.
  • Predictive or Deeper Insights: The model mimics the performance of the existing physical product, offering insights into its future operation, or deeper insights into its current or past operation.
  • Real-Time or Offline: Digital Twins can run in real-time, meaning they mimic the behavior of existing physical products almost simultaneously. Digital Twins, especially ones that are more computationally intensive and thus cannot keep up in real-time, can also be run after the sensor data has been captured.

_

As we look at the definition of Digital Twin, we begin to understand the several technologies needed to develop and deploy Digital Twins at scale: data from the physical system, IoT connectivity, modelling methods, and at-scale computing. Each of these has been developed in parallel over the past 20 years, and it is only in the 2020s that we are seeing the convergence of the technologies needed for Digital Twins at scale.

The first technology has to do with measurements. With IoT sensors in particular, the average cost has dropped 50% from 2010 to 2020, and continues to decrease. Measurements that were cost-prohibitive just 10 years ago are now becoming a commodity. This will continue to drive more sensors gathering even more data.

Second is the ability to transmit this data so it can be analyzed and actioned on. If we look at wireless connectivity as a proxy, in 2010, 3G was the de facto standard at less than 1 Mbps. Throughout the 2010s, it was replaced with 4G at 100 Mbps, and now 5G at 10 Gbps is becoming the norm. That is more than a 10,000x increase in transmission speed. And 10 Gbps happens to be a milestone threshold for IoT devices, as it is fast enough to gather IoT data in near-real time (<10 ms latency).

The value of Digital Twins lies in using this data to derive actionable insights, which is achieved by modelling and at-scale computing, the third and fourth key technologies. The term “model” here is used in multiple contexts. For applications involving the prediction of future states and what-if scenario planning, we need scientific modelling techniques for predicting various phenomena (behaviours) such as fluid flow, structural deformation, biochemical processes, weather, and logistics. Methods including machine learning, high performance computing, and hybrid approaches such as physics-inspired neural networks are becoming practical to deploy at scale because of the compute power now available. Another type of modelling is used for visualization and creating realistic immersive environments. Over the past decade, advancements in the algorithms of spatial computing for creating and manipulating 3D content have been enabling immersive augmented reality, virtual reality, and the metaverse.

Lastly, the power of at-scale computing has been greatly enabled by the cloud. We have seen compute power grow exponentially, both at the chip level and in connecting chips together for massively scalable cloud computing, to the point where massive-scale, on-demand compute is becoming a commodity. No longer limited to governments and large corporations, small startups and even individuals can now access the compute necessary to innovate, invent new products and services, and improve our daily lives.

_____

Currently, there is no standard that focuses solely on the technical aspects of digital twinning. Standardization efforts are under development by the joint advisory group (JAG) of ISO and IEC on emerging technologies. ISO/DIS 23247-1 is at present the only standard that addresses digital twins, and it offers only limited information. In addition, there are other related standards that may facilitate DT creation: for example, the ISO 10303 STEP standard, the ISO 13399 standard, and the OPC unified architecture (OPC UA) technically describe ways to share data between systems in a manufacturing environment.

_____

Five-Dimensional Digital Twin (5D-DT):  

The five-dimension digital twin model can be formulated as:

M_DT = (PE, VM, Ss, DD, CN)

where PE are physical entities, VM are virtual models, Ss are services, DD is DT data, and CN are connections. According to this formula, the five-dimension DT model is shown in the figure below:

-1. Physical entities in digital twin:   

DT creates virtual models of physical entities in digital form to simulate their behaviors. The physical world is the foundation of DT. It may consist of a device or product, a physical system, an activity or process, or even an organization. Physical entities carry out activities according to physical laws and deal with uncertain environments. They can be divided into three levels according to function and structure: unit level, system level, and system-of-systems (SoS) level.

-2. Virtual models in digital twin:

Virtual models ought to be faithful replicas of physical entities, reproducing the physical geometries, properties, behaviors, and rules. Three-dimensional geometric models describe a physical entity in terms of its shape, size, tolerances, and structural relations. Based on physical properties (e.g. speed, wear and force), the physics model reflects physical phenomena of the entity, such as deformation, delamination, fracture and corrosion. The behavior model describes behaviors (e.g. state transition, performance degradation and coordination) and the entity’s response mechanisms to changes in the external environment. The rule models equip the DT with logical abilities such as reasoning, judgement, evaluation, and autonomous decision-making, by following rules extracted from historical data or provided by domain experts.

-3. Digital twin data:

Twin data is a key driver of DT. DT deals with multi-temporal scale, multi-dimension, multi-source, and heterogeneous data. Some data is obtained from physical entities, including static attribute data and dynamic condition data. Some data is generated by virtual models, which reflects the simulation result. Some data is obtained from services, which describes the service invocation and execution. Some data is knowledge, which is provided by domain experts or extracted from existing data. Some data is fusion data, which is generated as a result of fusion of all the aforementioned data.

-4. Services in digital twin:

Against the background of product-service integration in all aspects of modern society, more and more enterprises are beginning to realize the importance of service. Service is an essential component of DT in light of the Everything-as-a-Service (XaaS) paradigm. Firstly, DT provides users with application services concerning simulation, verification, monitoring, optimization, diagnosis and prognosis, prognostics and health management (PHM), etc. Secondly, a number of third-party services are needed in the process of building a functioning DT, such as data services, knowledge services, algorithm services, etc. Lastly, the operation of DT requires the continuous support of various platform services, which can accommodate customized software development, model building, and service delivery.

-5. Connections in digital twin:

Digital representations are connected dynamically with their real counterparts to enable advanced simulation, operation, and analysis. Connections between physical entities, virtual models, services, and data enable information and data exchange. There are six connections in DT: between physical entities and virtual models (CN_PV), between physical entities and data (CN_PD), between physical entities and services (CN_PS), between virtual models and data (CN_VD), between virtual models and services (CN_VS), and between services and data (CN_SD). These connections enable the four parts mentioned above to collaborate.
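Transcribed directly into a data structure, the five-dimension model might look like the following sketch, where the field contents are illustrative placeholders.

```python
# A transcription of M_DT = (PE, VM, Ss, DD, CN) into a Python structure.
from dataclasses import dataclass, field

@dataclass
class FiveDimensionDT:
    physical_entities: list          # PE: devices, systems, SoS
    virtual_models: list             # VM: geometry, physics, behavior, rules
    services: list                   # Ss: monitoring, PHM, optimization...
    data: dict                       # DD: physical, virtual, service, fusion
    connections: dict = field(default_factory=dict)  # CN: the six links

m_dt = FiveDimensionDT(
    physical_entities=["machine tool"],
    virtual_models=["geometry model", "behavior model"],
    services=["condition monitoring"],
    data={"physical": [], "virtual": [], "fusion": []},
    connections={"CN_PV": "sensor bus", "CN_VD": "model output store"},
)
print(m_dt.connections["CN_PV"])
```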

_____

_____

Definitions of digital twins:  

A first glance at the websites of companies that offer Digital Twin technology shows that many definitions of digital twins already exist; however, there is no commonly accepted definition.

  • Siemens, for example, defines Digital Twins as “a virtual representation of a physical product or process, used to understand and predict the physical counterpart’s performance characteristics”.
  • Dassault Systèmes takes a narrower view and defines Digital Twin technology as “a virtual representation of what has been produced.”
  • IBM limits the scope to real-time data and defines it as “a virtual representation of a physical object or system across its lifecycle, using real-time data to enable understanding, learning and reasoning”.

The commonality in most definitions is that Digital Twins are a “digital abstraction or representation of a physical system’s attributes and/or behavior”. The devil is in the details, because one could still be referring to a fully automated, dynamically recalibrating virtual representation of a physical system, or alternatively just one software element of that setup – or something else.

_

  • Achalesh Pandey, GE Digital: “A digital twin is a living, learning digital representation of an asset, process, system, or network used to achieve specific business outcomes. The twin harnesses the power of data and models to provide early warning detection, continuous prediction, and dynamic optimization capabilities. It also allows users to run simulations of potential future events and plan for a specific set of likely outcomes, increasing important metrics like efficiency, productivity, and, ultimately, profitability.”
  • Carsten Baumann, Schneider Electric: “A digital twin is not an end-result, product, outcome, or technology in and of itself. Instead, it’s a dynamic data-supported framework that functions as a business enabler leading to results, products, outcomes, or new technologies. It’s a means to an end, and the end is solving a real-world problem with real-world data. Digital twins give users the ability to model certain operations in a digital environment. Ideally, a digital twin represents the entire life cycle of a project. It incorporates all seven dimensions of digital construction. 3D represents the physical dimensions, the 4th dimension represents cost, the 5th schedule, the 6th sustainability, and the 7th the operation and maintenance.”

_

According to the Digital Twin Consortium, a digital twin is “a virtual representation of real-world entities and processes, synchronized at a specified frequency and fidelity.”

More specifically, a digital twin is a digital version of a real-world component, machine, or process that exactly replicates the real-world system. The digital twin is created from data regarding the physical and operational characteristics of the product, machine, or process — including everything from the bill of material and mechanical properties, to control logic and operational status, to machine performance and diagnostics. This data is transferred between the real-world system and its digital twin via a digital thread, allowing the digital twin to not only replicate the physical product or process, but also to exactly mimic its behavior.

The figure above shows that a digital twin is an exact virtual representation of a physical product or process, replicating the real-world system and its behavior.

A key component of the Digital Twin Consortium’s definition of the digital twin is synchronization at a specific frequency and fidelity. In other words, a digital twin is regularly updated — preferably in real-time — to ensure it remains in sync with the physical product, machine, or process. This is especially important for digital twins that are developed to help with design and commissioning, before a product is created or a process is implemented. In these scenarios, once the real-world version is in place, the digital twin will likely need to be updated to reflect any changes made in the real-world that weren’t captured by the digital version. Otherwise, the digital twin becomes an inaccurate representation of the real-world situation, and using it for future updates, modifications, or maintenance could lead to wasted effort, time, and cost.

_

The definitions of DT have been continuously evolving, as DT-enabling technologies (e.g., sensing technology, modelling technology, data management method, DT service technology and data connection technology, Qi et al., 2021) have been developed since the 2000s. Although the concept of DT was proposed by Michael Grieves as the “virtual digital representation equivalent to physical products” in 2003 (Grieves, 2014), the development of DT was stagnant until 2012, when NASA defined DT as an integrated multi-physics, multi-scale, probabilistic simulation of an as-built vehicle or system that employs the best available physical models, sensor updates, fleet history, etc., to mirror the life of its corresponding flying twin (Glaessgen and Stargel, 2012). Subsequently, the aerospace field became an important research branch of the DT. In 2015, Ríos et al. (2015) substituted “vehicle” with “product”, which extended the definition of DT for more general purposes. The DT has been defined specifically in many industrial fields. For instance, Tao et al. (2019), considered DT “a real mapping of all components in the product lifecycle using physical data, virtual data and interaction data between them” in product design engineering. In IoT engineering, DT is defined as an evolving digital profile of the historical and current behavior of a physical object or process that helps optimize business performance (Parrott and Warshaw, 2017).

_

The table below lists various DT definitions and their corresponding applied fields.

  • “The virtual digital representation equivalent to physical products” (applied field: general).

  • “An integrated multiphysics, multiscale, probabilistic simulation of an as-built vehicle or system that employs the best available physical models, sensor updates, fleet history, etc., to mirror the life of its corresponding flying twin” (applied field: aerospace engineering).

  • “A life management and certification paradigm whereby models and simulations consist of as-built vehicle states, as-experienced loads and environments, and other vehicle-specific history to enable high-fidelity modeling of individual aerospace vehicles throughout their service lives” (applied field: aerospace engineering).

  • “An integrated multiphysics, multiscale, probabilistic simulation of an as-built product that employs the best available physical models, sensor updates, history data, etc., to mirror the life of its corresponding physical twin” (applied field: general).

  • “An evolving digital profile of the historical and current behavior of a physical object or process that helps optimize business performance” (applied field: IoT).

  • “A replication of a real physical production system, which is utilized for system optimization, monitoring, diagnostics and prognostics via the integration of artificial intelligence, machine learning and software analytics with large volumes of data from physical systems” (applied field: manufacturing engineering).

  • “A digital representation of a physical item or assembly that uses integrated simulations and service data” (applied field: general).

  • “A real mapping of all components in the product lifecycle using physical data, virtual data and interaction data between them” (applied field: design engineering).

  • “A set of virtual information constructs that fully describes a potential or actual physical manufacturing product from the micro atomic level to the macro geometrical level” (applied field: general).

  • “A virtual instance of a physical system (twin) that is continually updated with the physical system’s performance, maintenance, and health status data throughout the physical system’s lifecycle” (applied field: general).

_

One thing that binds most definitions of DT, beyond being a virtual representation of a physical object, is the bidirectional transfer or sharing of data between the physical counterpart and the digital one, including quantitative and qualitative data (related to material, manufacturing, process, etc.), historical data, environmental data and, most importantly, real-time data. Using these data, DT can perform such tasks as:

  • In-depth analysis of the physical twin;
  • Design and validation of new or existing products/processes;
  • Simulation of the health conditions of the physical twin;
  • Improvement of the safety and reliability of the physical twin;
  • Optimization of a part, product, process, or production line;
  • Tracking of the physical twin’s status throughout its lifetime;
  • Prediction of the physical twin’s performance;
  • Real-time control over the physical twin.

_

Definitions of DT tend to overlook its longevity; however, some authors consider DT a cradle-to-grave model, meaning that it can be used over the entire life cycle, from the inception of the product until its disposal. Grieves and Vickers, who conceptualized the idea of DT, have defined a type of DT that is created even before its physical twin exists. Martin and Nadja, in their review, found eleven papers in which the DT precedes the physical twin. Grieves and Vickers also suggest that DT technology can hold information related to the safe decommissioning of the product during its disposal phase. In addition, after the product is disposed of, the DT of one generation can help in the design and production of the next generation.

____

____

Digital Twin System:

Figure below depicts a characteristic systemic layout of a digital twin.

The model of the figure above finds expression through a set of enabling components: sensors and actuators from the physical world, integration, data, analytics, and the continuously updated digital twin application. These constituent elements are explained below:

  • Sensors—Sensors distributed throughout the manufacturing process create signals that enable the twin to capture operational and environmental data pertaining to the physical process in the real world.
  • Data—Real-world operational and environmental data from the sensors are aggregated and combined with data from the enterprise, such as the bill of material, enterprise systems, and design specifications. Data may also contain other items such as engineering drawings, connections to external data feeds, and customer complaint logs.
  • Integration—Sensors communicate the data to the digital world through integration technology (which includes edge, communication interfaces, and security) between the physical world and the digital world, and vice versa.
  • Analytics—Analytics techniques are used to analyze the data through algorithmic simulations and visualization routines that are used by the digital twin to produce insights.
  • Digital twin—an application that combines the components above into a near-real-time digital model of the physical world and process. The objective of a digital twin is to identify intolerable deviations from optimal conditions along any of the various dimensions. Such a deviation is a case for business optimization; either the twin has an error in the logic (hopefully not), or an opportunity for saving costs, improving quality, or achieving greater efficiencies has been identified. The resulting opportunity may result in an action back in the physical world.
  • Actuators—Should an action be warranted in the real world, the digital twin produces the action by way of actuators, subject to human intervention, which trigger the physical process.

Clearly, the world of a physical process (or object) and its digital twin analogue are vastly more complex than a single model or framework can depict. And, of course, the model of figure above is just one digital twin configuration that focuses on the manufacturing portion of the product life cycle. But what our model aims to show is the integrated, holistic, and iterative quality of the physical and digital world pairing. It is through that prism that one may begin the actual process that serves to create a digital twin.
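To make this pairing concrete, here is a minimal sketch in Python of how the enabling components above might be wired together. All function names, readings, and the deviation rule are hypothetical stand-ins, not a reference implementation:

```python
import random
import statistics

# Hypothetical stand-ins for the enabling components described above.

def read_sensors() -> dict:
    """Sensors: sample operational and environmental data from the physical process."""
    return {"temperature_c": random.gauss(72.0, 1.5), "vibration_mm_s": random.gauss(2.0, 0.3)}

def integrate(raw: dict, enterprise: dict) -> dict:
    """Integration + data: merge sensor readings with enterprise data such as design specs."""
    return {**raw, **enterprise}

def analyze(record: dict, history: list) -> str | None:
    """Analytics: flag an intolerable deviation from the running baseline."""
    history.append(record["temperature_c"])
    baseline = statistics.mean(history)
    if abs(record["temperature_c"] - baseline) > 4.5:  # crude tolerance band
        return "reduce_heater_power"
    return None

def actuate(command: str) -> None:
    """Actuators: push a corrective action back to the physical process."""
    print(f"actuator command -> {command}")

# The digital twin itself: a near-real-time loop binding the components together.
history: list[float] = []
enterprise_data = {"design_max_temperature_c": 80.0}
for _ in range(10):
    state = integrate(read_sensors(), enterprise_data)
    action = analyze(state, history)
    if action:
        actuate(action)
```

In a real deployment each of these functions would be a subsystem in its own right; the loop simply shows where sensors, integration, data, analytics, and actuators sit relative to one another.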

_

A good example of a Digital Twin is a system that makes route recommendations to drivers of electric vehicles, including stop points at available charging stations. See figure below. For these recommendations, the system will need a representation of the vehicle itself (including charging status), as well as the charging stations along the chosen route. If this information is logically aggregated as a Digital Twin, the AI in the backend can then use this DT to perform the route calculation, without having to worry about technical integration with the vehicle and the charging stations in the field.

Similarly, the feature responsible for reserving a charging station after a stop has been selected can benefit if the charging station is made available in the form of a Digital Twin, allowing us to make the reservation without having to deal with the underlying complexity of the remote interaction.

The Digital Twin in this case provides a higher level of abstraction than would be made available, for example, via a basic API architecture. This is especially true if the Digital Twin is taking care of data synchronization issues.
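As an illustration of that abstraction, the sketch below models the vehicle and the charging stations as plain twin objects; the route planner and the reservation logic talk only to these objects, never to the field protocols behind them. All classes, distances, and numbers are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ChargingStationTwin:
    station_id: str
    km_from_start: float
    free_slots: int          # kept in sync with the physical station

    def reserve(self) -> bool:
        # The twin hides the remote interaction; here we just decrement locally.
        if self.free_slots > 0:
            self.free_slots -= 1
            return True
        return False

@dataclass
class VehicleTwin:
    charge_km_remaining: float   # synchronized charging status, expressed as range

def plan_stops(vehicle: VehicleTwin, stations: list[ChargingStationTwin], trip_km: float) -> list[str]:
    """Pick the furthest reachable station with a free slot until the trip is covered."""
    stops, reach = [], vehicle.charge_km_remaining
    candidates = sorted(stations, key=lambda s: s.km_from_start)
    while reach < trip_km:
        reachable = [s for s in candidates if s.km_from_start <= reach and s.free_slots > 0]
        if not reachable:
            raise RuntimeError("no reachable charging station")
        stop = reachable[-1]
        if stop.reserve():
            stops.append(stop.station_id)
            reach = stop.km_from_start + 300.0  # assume a full charge gives 300 km of range
            candidates = [s for s in candidates if s.km_from_start > stop.km_from_start]
    return stops

stations = [ChargingStationTwin("A", 150, 2), ChargingStationTwin("B", 280, 0),
            ChargingStationTwin("C", 410, 1)]
print(plan_stops(VehicleTwin(charge_km_remaining=200.0), stations, trip_km=600.0))  # -> ['A', 'C']
```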

_____

_____

Characteristics of Digital Twin:

Depending on the type of DT, it can possess properties distinct from those of others, but regardless, all DTs have a few characteristics in common:

  • High-fidelity: A DT needs to be a near-identical copy of its physical counterpart in terms of appearance, contents, functionality, etc., with a very high degree of accuracy. A super-realistic digital model helps DT in mimicking every aspect of its physical twin. Ultra-high fidelity computer models are considered the backbone of the DT. This level of detail allows DT simulation and prediction tools to be more reliable when presented with a set of alternative actions or scenarios.
  • Dynamic: The physical twin is dynamic, meaning it changes with respect to time. Thus, a DT also needs to change as the physical system changes. This is achieved through a seamless connection and continuous exchange between the physical and virtual worlds. The exchanged data can include dynamic data, historical static data, and descriptive static data. DT has been described as a ‘living model in 3D’. The objective of DT being dynamic is to mirror the physical twin and its behaviour realistically in the digital world.
  • Self-evolving: A DT evolves along with its physical counterpart throughout its life cycle. Any change in either the physical or the digital twin is reflected in its counterpart, creating a closed feedback loop. A DT is self-adapting and self-optimizing with the help of the data collected by the physical twin in real time, thus maturing along with its physical counterpart throughout its lifetime.
  • Identifiable: Every physical asset needs to have its own DT. During different stages of the product lifecycle, the data and information related to the product evolve, and so do the models, including 3D geometric models, manufacturing models, usage models, functional models, etc. Because such models are created for the DT, a DT can be uniquely identified from its physical twin, and vice versa, anywhere in the world and for the entirety of its life cycle.
  • Multiscale and Multiphysical: DT, being the virtual copy of its physical twin, needs to incorporate the properties of the physical twin at multiple scales or levels. Thus, the virtual model in DT is based on macroscopic geometric properties of the physical twin such as shape, size, tolerance, etc., as well as on microscopic properties such as surface roughness, etc. In other words, DT contains the set of information about the physical twin from micro atomic level to macro geometric level. DT is also multiphysical because, besides the aforementioned geometric properties, the model is also based on physical properties of the physical twin such as structural dynamics models, thermodynamic models, stress analysis models, fatigue damage models, and material properties of physical twin such as stiffness, strength, hardness, fatigue strength, etc.
  • Multidisciplinary: Industry 4.0 revolves around many disciplines, and DT, being the backbone of Industry 4.0, sees the fusion of disciplines such as computer science, information technology, and communications; mechanical, electrical, electronic, and mechatronic engineering; automation and industrial engineering; systems integration; and physics, just to name a few.
  • Hierarchical: The hierarchical nature of DT comes from the fact that the different components and parts that make up the final product all have their corresponding DT model; e.g., the DT of an aircraft comprises the airframe DT, the DT of the flight control system, the DT of the propulsion system, etc. Therefore, a DT can be seen as a series of integrated submodels.

_____

_____

Digital Twin Functions:

Digital twins can have various complexities, depending on requirements and the needed amount of information to be processed. So, when creating a digital twin, you should decide on its functions: whether it will just monitor the prototype, or alert you about abnormalities and suggest solutions based on advanced data analytics.

In general, virtual models can be used to monitor, analyze and optimize the performance of their physical prototypes. Figuratively, we can distinguish three groups of their functions, illustrated in the short sketch that follows this list:

  • To see — basic-level digital twins perform monitoring, enabled by the data obtained from sensors and devices and a software program that visualizes the situation.
  • To think — middle-level DTs equipped with what-if models can change operational settings in order to find the best asset or process configuration.
  • To do — advanced-level digital twins utilize intelligent algorithms to learn from collected data, detect issues, find several possible solutions to each one and choose the most appropriate solution. They provide predictive maintenance.
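A toy sketch of the three levels applied to a single temperature stream, assuming a hypothetical one-line process model: "see" only reports state, "think" evaluates a what-if setting, and "do" selects the best feasible option:

```python
def process_model(setpoint: float, load: float) -> float:
    """Toy model: resulting temperature for a given setpoint and load."""
    return setpoint + 0.8 * load

def see(reading: float, limit: float) -> str:
    return f"temperature {reading:.1f} (limit {limit})"          # monitoring only

def think(setpoint: float, load: float) -> float:
    return process_model(setpoint, load)                          # what-if outcome

def do(load: float, limit: float, candidates: list[float]) -> float:
    """Choose the highest setpoint whose predicted outcome stays within the limit."""
    feasible = [s for s in candidates if think(s, load) <= limit]
    return max(feasible)

print(see(78.4, limit=80.0))
print(think(setpoint=75.0, load=5.0))                                 # -> 79.0
print(do(load=5.0, limit=80.0, candidates=[70.0, 74.0, 76.0, 78.0]))  # -> 76.0
```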

_

DT has many strategic benefits. In particular, DT provides a unique way to reflect a physical entity in the digital world with respect to its shape, position, gesture, status, and motion. Together with sensory data acquisition, big data analytics, and AI and machine learning, DT can be used for monitoring, diagnostics, prognostics and optimization. Through the assessment of ongoing states, the diagnosis of historical problems, and the prediction of future trends, DT can provide more comprehensive support for decision-making across a wide spectrum of operations. Once integrated with the digital representation of facilities, environments, and people, DT can be used for the training of users, operators, maintainers, and service providers. Through DT, it is also possible to digitize expert experience, which can be documented, transferred, and modified throughout an enterprise to reduce the knowledge gap. Through simulation tools and virtual reality tools, DT can deepen the operator’s understanding of complex physical entities and processes. DT is an effective means to improve enterprise productivity and efficiency, as well as to reduce cost and time.

_____

_____

Implementation of digital twins:

One way to classify digital twins is by looking at their use case: product, production, and performance. Digital twins can be applied to discrete manufacturing ecosystems in these three distinct areas.

-1. The product digital twin is used to enable more efficient design, and to improve the product. In some cases, the product is the actual equipment and assets used in the production system. Virtual simulation modelling can validate product performance, while simulating how the product is currently behaving in a physical environment. This provides the product developer with a physical-virtual connection that allows them to analyze how a product performs under various conditions and make changes in the virtual design model to ensure that the physical product will perform as designed in the field. This eliminates the need for physical prototypes and reduces development time.

-2. Production digital twins are used in manufacturing and production planning. They can help to validate how well a manufacturing process will perform on the shop floor before the physical production equipment and work cells go into actual production. Today, the virtual commissioning of production automation – an established technology and process – is merging with the more expansive scope of the digital twin. Virtual commissioning is typically a one-time validation of an automated production system. In contrast, the digital twin represents an ongoing analytical and optimizing process that takes place in real time. By simulating the production process using a digital twin and analyzing the physical events across the digital thread, manufacturers can create a production environment that remains efficient under variable conditions.

-3. Performance digital twins are used to capture, analyze, and act on operational data. An important initial step when developing and implementing a digital twin is to identify the exact operational configuration of the product, asset, or production equipment that represents the physical components. When implementing, companies need to include context within the digital twin configuration. For predictive analytics or Industrial IoT to be effective, the context (physical configuration) of the asset and system must be known in order to determine exactly which operational and performance data to collect. Companies implementing any digital twin project should begin by capturing and managing the actual physical configuration of the asset.

______

______

Classification of digital twins according to level of hierarchy:

From a hierarchical perspective, DT can be divided into three different levels as well (see figure below), according to the magnitude involved in manufacturing:

  • Unit level: It is the smallest participating unit in manufacturing and can be a piece of equipment, material, or environmental factors. Unit-level DT is based on the geometric, functional, behavioural, and operational model of unit-level physical twin.
  • System level: It is an amalgamation of several unit-level DTs in a production system such as a production line, shop floor, factory, etc. Interconnectivity and collaboration among multiple unit-level DTs lead to a wider flow of data and better resource allocation. A complex product, e.g., an aircraft, can also be considered a system-level DT.
  • System of Systems (SoS) level: A number of system-level DTs are connected together to form an SoS-level DT, which helps different enterprises, or different departments within an enterprise (such as supply chain, design, service, and maintenance), collaborate. In other words, SoS-level DT integrates the different phases of the product throughout its life cycle.

_

Figure above shows hierarchical levels of DT in manufacturing.

The hierarchy of DT can also be classified as (i) Part/Component twin, (ii) Product/Asset twin, (iii) System twin, and (iv) Process twin, with part twin being the simplest. More sophisticated and comprehensive systems/processes can be achieved by putting together the lower-level twins.
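This composition idea can be sketched directly: a twin is either a leaf (a part twin) or an aggregation of lower-level twins, so a simple health metric rolls up from parts to systems. The class, the metric, and the numbers below are hypothetical illustrations only:

```python
# A twin is a leaf part or a composition of lower-level twins;
# health rolls up from parts to systems to processes.

class Twin:
    def __init__(self, name: str, health: float = 1.0, children: list["Twin"] | None = None):
        self.name = name
        self._health = health          # 1.0 = perfect condition (hypothetical metric)
        self.children = children or []

    def health(self) -> float:
        """A system twin is only as healthy as its weakest integrated submodel."""
        if not self.children:
            return self._health
        return min(child.health() for child in self.children)

# e.g., an aircraft system twin composed of subsystem twins composed of part twins
aircraft = Twin("aircraft", children=[
    Twin("flight_control_system", children=[Twin("actuator", 0.92), Twin("sensor_suite", 0.97)]),
    Twin("propulsion_system", children=[Twin("engine_1", 0.88), Twin("engine_2", 0.95)]),
])
print(aircraft.health())  # -> 0.88, driven by engine_1
```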

_____

Digital Twins classification according Maturity/Sophistication:

Based on the sophistication level of DT, i.e., the quantity and quality of data obtained from the physical twin and its environment, DTs can be grouped into:

  • Partial DT: It contains a small number of data points, e.g., pressure, temperature, humidity, etc., which is useful in determining the connectivity and functionality of DT.
  • Clone DT: It contains all significant and relevant data from the product/system that can be used for making prototypes and categorizing development phases.
  • Augmented DT: It utilizes data from the asset along with its historical data and at the same time derives and correlates the useful data using algorithms and analytics.

______

The sophistication level of DT can be improved as bigger sets of data accumulate over time in operation. For Azad M. Madni et al., the maturity level of DT is not limited to data; it also encompasses the sophistication of the virtual representation/model. On this basis, DT is divided into four levels:

  • Pre-Digital Twin: This is level 1, where DT is created prior to the physical asset for the purpose of making decisions on prototype designs to reduce any technical risk and resolve issues upfront by using a generic system model.
  • Digital Twin: Level 2 incorporates the data from the physical asset related to its performance, health, and maintenance. The virtual system model uses these data to assist high-level decision-making in the design and development of the asset, along with scheduling maintenance. The data transfer at this level is bidirectional.
  • Adaptive Digital Twin: This level 3 provides an adaptive user interface between physical and Digital Twin, and has the capability to learn from the preferences and priorities of human operators using supervised machine learning. Using this DT, real-time planning and decision-making during operations is possible.
  • Intelligent Digital Twin: In addition to the features of level 3, level 4 has unsupervised machine-learning capability, making it more autonomous than level 3. It can recognize patterns in the operational environment, and combining this with reinforcement learning allows for a more precise and efficient analysis of the system.

Table below presents these different levels along with the characteristics that define each level:

Level 1: Pre-Digital Twin
  • Model sophistication: virtual system model with emphasis on technology/technical-risk mitigation
  • Physical twin: does not exist
  • Data acquisition from physical twin: not applicable
  • Machine learning (operator preferences): no
  • Machine learning (system/environment): no

Level 2: Digital Twin
  • Model sophistication: virtual system model of the physical twin
  • Physical twin: exists
  • Data acquisition from physical twin: performance, health status, maintenance; batch updates
  • Machine learning (operator preferences): no
  • Machine learning (system/environment): no

Level 3: Adaptive Digital Twin
  • Model sophistication: virtual system model of the physical twin with adaptive UI
  • Physical twin: exists
  • Data acquisition from physical twin: performance, health status, maintenance; real-time updates
  • Machine learning (operator preferences): yes
  • Machine learning (system/environment): no

Level 4: Intelligent Digital Twin
  • Model sophistication: virtual system model of the physical twin with adaptive UI and reinforcement learning
  • Physical twin: exists
  • Data acquisition from physical twin: performance, health status, maintenance, environment; both batch and real-time updates
  • Machine learning (operator preferences): yes
  • Machine learning (system/environment): yes

_

The excellent report by ARUP uses yet another classification of digital twins by looking at their sophistication. This sophistication is expressed in terms of autonomy, intelligence, learning and fidelity. The levels range from 1, being a non-intelligent, non-autonomous digital twin, to 5, which is a digital twin that replaces human beings for certain non-trivial tasks. The framework moves through five levels, beginning with a simple digital model. As the model evolves, feedback and prediction increase in importance. At higher levels, machine learning capacity, domain-generality and scaling potential all come into play. By the highest levels, the twin is able to reason and act autonomously, and to operate at a network scale (incorporating lower-level twins, for example).

_____

_____

DT deployment throughout product life cycle: 

Industry 4.0 and digitalization provide countless subject areas that are constantly evolving. In addition to cross-company system integration, the virtualization of workpieces, components and assets – via digital twins – is a particular driver of the trend towards tight integration. In principle, every component and every process can be virtualized by a digital twin. The combination of both would then enable complex images of entire systems. However, this is only possible if the digital images go beyond traditional simulation models and are capable of permanent feedback with their real counterpart. Digital twins are now being deployed throughout the product lifecycle and are enabling new business models for companies with different functions.

Figure above shows product life cycle.

A digital twin is a digital representation of an intended or actual real-world physical product, system, or process (a physical twin) that serves as its effectively indistinguishable digital counterpart for practical purposes such as simulation, integration, testing, monitoring, and maintenance. The digital twin has been intended, from its initial introduction, to be the underlying premise for Product Lifecycle Management, and it exists throughout the entire lifecycle (create, build, operate/support, and dispose) of the physical entity it represents. Since information is granular, the digital twin representation is determined by the value-based use cases it is created to implement. The digital twin can, and often does, exist before there is a physical entity. The use of a digital twin in the create phase allows the intended entity’s entire lifecycle to be modelled and simulated. Once the product Digital Twin has been created, most system errors are already embedded within the product, limiting the ability to optimise for performance and financial success. To take full advantage of the benefits that Digital Twin technology has to offer, we must shift our focus towards developing Digital Twin technology which can be utilised across the whole system lifecycle.

_____

_____  

DT models:

Associated with the varied DT definitions, various DT models (or frameworks) have been proposed for different engineering fields. The early DT model proposed by Grieves (2014) consists of three components: the physical product, the virtual product and their connection. The virtual product contains not only geometrical information but also behavioral characteristics that show the system’s performance in response to external stimuli. NASA and the US Air Force Research Laboratory have applied a DT framework to their aircraft to achieve a more efficient design, greater aircraft capability, a reduction in unexpected cracks and better structural inspection (Tuegel et al., 2011a; Gockel et al., 2012).

Based on the original 3D model of DT, Qi et al. (2021) and Tao and Zhang (2017) proposed a five-dimensional DTS model for intelligent shop-floor design, including the physical entities, virtual models, services, DT data and connections. In reconfigurable manufacturing, Zhang et al. (2019b) proposed the reconfigurable digital twin (RDT) model, including the geometry, physics, capability, behavior and rule. The DTS model and RDT model are specified for shop-floor configuration to quickly implement CPS in smart manufacturing.

Other DT models, e.g. DT for waste electrical and electronic equipment (WEEE) (Wang and Wang, 2019), DT-enabled fault diagnosis framework (Tao et al., 2018d) and DT-driven product design (DTPD) framework (Tao et al., 2019) are referenced in the corresponding literature. Table below provides a concise summary of various DT models or frameworks, their key components and the corresponding references.

Summary of different DT models:

  • Original DT: physical products, virtual products, and the connection between physical and virtual products
  • Airframe digital twin (ADT): structural definition, structural models, material state evolution models, and flight dynamics
  • Five-dimensional DT model: physical entities, virtual models, services, DT data, and connections
  • Digital twin shop-floor (DTS): physical shop-floor, virtual shop-floor, shop-floor services, shop-floor DT data, and connection
  • Product manufacturing digital twin (PMDT): product definition model (PDM), geometric and shape model (GSM), manufacturing attribute model (MAM), behavior and rule model (BRM), and data fusion model (DFM)
  • Reconfigurable digital twin (RDT): physical layer, model layer, data layer and service layer
  • Digital twin for waste electrical and electronic equipment (WEEE): cyber world, service flow, DT knowledge, and physical flow
  • Digital twin enabled fault diagnosis framework: physical system, enabling technology, DT model, and predictive maintenance
  • DT-driven product design (DTPD) framework: planning and task clarification, conceptual design, embodiment design, detail design, and virtual verification

________

________

Digital Twin Data:

While many digital twins utilize a 3D visual representation of some kind, this is not essential. The data is essential, and many organizations choose to visualize that data through a 2D or 3D model of some kind – both on PC displays and through augmented reality (AR) – but strictly speaking you do not have to do this. Deploying a digital twin isn’t as simple as purchasing and installing software, in part because, while digital twins are products, it is also beneficial to see them as ecosystems, since each twin requires physical and virtual components working together. Not all ecosystems are healthy, and problematic environments do not function nearly as well as their less troubled counterparts. For instance, the key to a successful digital twin is data, specifically data clarity. Digital twins need information to function, so if they cannot properly read and extract what they need, they will not deliver any ROI, at least nothing meaningful. Sourcing data, or a lack of data standardization, is a real struggle for many organizations when it comes to digital twins. Before you build a tower, you need a strong foundation, and information is the foundation of a digital twin initiative. Investing in an accessible data infrastructure is part of the upfront cost of digital twin deployment and should never be overlooked. Even one silo can greatly reduce the potential ROI – and when you’re already spending millions, you likely can’t afford to leave money on the table.

_

The 6 Types of Digital Twin Data for industrial DT:

-1. Physics-based models (FEM, Thermodynamic, Geological)

-2. Analytical Models (Predictive Maintenance)

-3. Time series data & historians

-4. Transactional data (ERP, EAM)

-5. Master Data (EAM, AF, BPM)

-6. Visual Models (CAD, AR, VR, BPM, BIM, GIS, and GEO)

You will select data from each of these categories based on the problem you want your twin to solve. By combining data from multiple types and sources, you can create a digital twin that gives you a holistic real-time view of the entity you are monitoring.

_

Digital Twins & Heterogeneous Data Sources:

There’s no single repository or database that contains all the information for a digital twin. There are too many data types generated by different systems and stored in various formats. Creating a single massive repository would also lead to duplication and potential errors in the digital twin data. However, it is vital to create a link between the different data sources for the digital twin and to understand their relationships. The relationship between the data from various sources could be linear, in which case a point-to-point view can connect them. Alternatively, a graph-based structure can map how the data relates to the entity and what the relationships within the data are.
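A minimal sketch of the graph-based option follows: instead of one mega repository, the twin stores only links between records that remain in their source systems. The node identifiers, URI schemes, and relationship names below are hypothetical:

```python
# node -> [(relationship, node)]; each node is a pointer into a source system,
# not a copy of the data itself.
graph: dict[str, list[tuple[str, str]]] = {}

def link(src: str, relation: str, dst: str) -> None:
    graph.setdefault(src, []).append((relation, dst))

link("pump_101", "designed_in", "cad://models/pump_101.step")
link("pump_101", "listed_on", "erp://bom/assembly_7/line_42")
link("pump_101", "measured_by", "historian://tags/pump_101.flow")
link("erp://bom/assembly_7/line_42", "supplied_by", "erp://vendors/acme")

def related(node: str, relation: str) -> list[str]:
    """Follow one relationship type outward from a node."""
    return [dst for rel, dst in graph.get(node, []) if rel == relation]

print(related("pump_101", "measured_by"))  # -> ['historian://tags/pump_101.flow']
```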

Data Standardization Initiatives:

There are several standardization initiatives currently being developed for digital twins. You don’t want to create a mega database, because there are many different data formats, structures, and standards emerging.

In manufacturing, for example, Industry 4.0’s Asset Administration Shell is looking at how to represent this data from a manufacturing perspective. In aerospace, there are STEP initiatives, and in oil and gas, the OSDU is describing subsurface data in a standardized format.

_

A digital twin relies on three key elements:

  • Past Data: This is a historical log of performance data for specific systems, individual machines and overall processes.
  • Present Data: This comprises real-time data coming from various sensors, manufacturing systems, platforms, distribution chains and so on. Present data also includes information from other business units like customer support and sales teams.
  • Future Data: This is modelled performance based on machine learning algorithms and includes inputs from engineers.

A digital twin receives data from a physical asset, process or system in real time. All the tests, analyses and assessments performed by engineers are based on real-world conditions. It’s not a synthetic test environment. So, digital twins allow for testing machines and systems in the digital environment but under real-world conditions.

With further advancements in machine learning and big data, these virtual machine models are becoming a key ingredient in modern engineering, as they will continue to drive future innovations.

_

Digital twin and Lifecycle data:

The lifecycle data of industrial devices can be classified as engineering technology (ET) data, information technology (IT) data and operational technology (OT) data. This lifecycle data is often stored in different places and in different formats due to requirements of functionality, needs of diverse users, company mergers, etc. These data silos lead to a lack of interoperability at multiple data access levels and require error-prone and time-consuming manual data exchanges. Combining data for harnessing by analytics applications is also made difficult.

These problems can be solved by digital twins, which may be deployed locally or in the cloud. Here, the digital twin can offer a common information model for defining otherwise incompatible ET, IT and OT data. This model serves as the basis for application programming interfaces (APIs) to access data and to define semantic correlations between data sets that would ordinarily be dispersed. The digital twin can offer unified APIs for querying various types of lifecycle data, regardless of whether the data is stored in the cloud or in external data sources.
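A minimal sketch of that idea, with hypothetical silos standing in for real ET, IT and OT systems: each silo keeps its own storage and format, while the twin exposes a single query interface over all of them:

```python
# Hypothetical data silos for one device, in their own formats.
ET_SOURCE = {"valve_7": {"cad_revision": "C", "material": "316L"}}                     # engineering data
IT_SOURCE = {"valve_7": {"purchase_order": "PO-1184", "warranty_ends": "2026-01-31"}}  # business data
OT_SOURCE = {"valve_7": {"cycles": 48210, "last_fault": "none"}}                       # operational data

class LifecycleTwin:
    """Unified API: callers ask for a device, not for a silo."""
    def __init__(self, *sources: dict):
        self.sources = sources

    def describe(self, device_id: str) -> dict:
        merged: dict = {}
        for source in self.sources:
            merged.update(source.get(device_id, {}))
        return merged

twin = LifecycleTwin(ET_SOURCE, IT_SOURCE, OT_SOURCE)
print(twin.describe("valve_7"))  # one record spanning ET, IT and OT data
```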

The maturity level of digital twins can be increased further by expressing correlations between different models embodied within the digital twin and deriving more reasoning from this information. Digital twin content can be extended even further by using machine learning and simulation models. Such enhancement increases the intelligence of the digital twin and allows better reasoning with regard to the status of the physical twin. It also provides support to real-time simulation models. More advanced use cases can be achieved if multiple models are combined – for example, to have intelligent simulation models to predict the status of a device.

In the IIoT era, technologies such as the cloud, edge computing, 5G connectivity and augmented reality move the digital twin concept to the next level by enabling improvements in digital technologies, development and standardization of architectures, creation of innovative interactions between systems or users, and establishment of business models.

_

Virtual and physical data flow:

During the system lifecycle, there are two flows of data. The first accompanies the creation of the physical system, where data flows forward in the traditional way: from creation to production to operation to disposal. Data for a digital twin, however, flows in reverse: data from a later phase informs the previous stage. This data can be used to improve the performance of the systems by finding the weaknesses and failures that need refinement.

Figure above shows virtual and physical data flow.

_

Drones Data and Digital Twins:

Data is currency when it comes to building digital twins. These ‘living’ and constantly evolving models need to have a high level of validity, and up-to-date data is crucial to this. Drones have a role to play here, as they provide an effective data capture tool, collecting highly accurate information safely and efficiently. This is especially important for hard-to-reach assets, or in environments where asset downtime needs to be kept to the absolute minimum. Drones are capable of collecting complete, reliable and well-organised baseline datasets to flow into the complex and layered process of building a digital twin. The ease of deployment and cost effectiveness of drones enables organisations to conduct the regular surveys needed to help digital twins constantly evolve to stay relevant and up to date. GIS data is also fundamental for building a digital twin, especially if the goal of the twin is to represent historical accuracy, view performance, or predict future state. And drones have become an essential tool for collecting GIS and survey-grade data. Key tools include the DJI M300 RTK with the P1 photogrammetry camera or L1 LiDAR sensor, or the low-altitude mapping drone, the DJI Phantom 4 RTK. So, drones can be a key tool in the digital twin process, helping to collect data quickly, safely and accurately.

Drones can collect following data and these outputs be used to build and evolve a digital twin:

-1. Point Clouds   

A point cloud dataset is a digital representation of a survey site, or asset, made up of thousands of points, each one a geometric coordinate. Put together, this mass of single spatial measurements comes together to form a fleshed-out model in 3D space. From a point cloud, you can make observations (and measurements) about an object’s depth, elevation, geometry, and location in space. Point clouds tend to be generated using 3D laser scanners and LiDAR (light detection and ranging), with each point representing a single laser scan measurement. However, photogrammetry can also be used to build point clouds. When it comes to point clouds, the more points collected, the denser the model. The DJI surveying ecosystem of the M300 RTK drone, L1 sensor, and DJI Terra software package is an end-to-end mapping workflow for LiDAR and photogrammetry point cloud generation, while the P1 camera (45MP full-frame sensor) is a higher-resolution photogrammetry sensor. Point clouds provide the basis for detail to be added and made into a digital twin.

-2. 3D Mesh

A 3D mesh is the evolution of a point cloud, whereby small triangles are created between the points in the point cloud. Mapping software then creates a texture for each of the thousands of triangles by extracting a small portion of the original photograph of each point, and builds a 3D mesh or model. While a 3D mesh is not a digital twin, such a model can provide detailed and valuable visual representations, and can also be viewed via augmented reality – covered later in this blog.

-3. 2D Orthomosaic

Drones can be used for 2D orthomosaic generation. A 2D orthomosaic is a top-down map of a survey site, or asset, built by stitching hundreds or thousands of digital photos – collected by the drone – together. Drones and high-resolution cameras, such as the DJI M300 RTK and P1 45MP full-frame photogrammetry sensor, can achieve sub 1cm ground sample distance, to create highly detailed and crystal clear orthomosaics. These real-world, geo-referenced 2D maps offer a stepping stone in the generation of 3D meshes that can be incorporated to provide an alternative perspective within a digital twin.

-4. DTM Data

Drones – typically with LiDAR capabilities – can collect data to build a Digital Terrain Model (DTM). A DTM is a model of the bare earth (without any objects/buildings in it) containing elevation data of the terrain. This is different to a Digital Surface Model (DSM), which represents the earth’s surface and includes all objects on it. LiDAR and photogrammetry can be useful applications for this. DTM drone data can be fed into digital twins to ensure the topographical elevation changes are accurately modelled.
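The relationship between the two models can be shown with a small worked example: subtracting the bare-earth DTM from the DSM leaves only the heights of objects standing on the terrain (often called a normalized DSM). The elevation grids below are made-up samples, not survey data:

```python
dsm = [[102.0, 103.5, 110.2],   # surface elevations incl. buildings/vegetation (m)
       [101.8, 108.9, 109.8]]
dtm = [[101.9, 102.1, 102.3],   # bare-earth elevations from ground returns (m)
       [101.7, 101.9, 102.2]]

# Normalized DSM: DSM - DTM = height of objects above the terrain.
object_heights = [
    [round(s - t, 2) for s, t in zip(dsm_row, dtm_row)]
    for dsm_row, dtm_row in zip(dsm, dtm)
]
print(object_heights)  # ~7-8 m values where a structure stands, ~0 on bare ground
```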

______

______

The Working of Digital Twins:

Theoretically, you can build a digital twin for almost everything. In practice, it’s far from feasible to create a replica that covers every single aspect of a product or manufacturing process. If you’ve already jumped to the conclusion that your business will benefit from DTs or at least want to test the idea, choose a single component or operation that is most vulnerable or crucial for your business. Once you understand what you are going to twin in the first place, the next steps may be the following.

Choose the type of digital twin: physics-based vs data-driven vs hybrid models:

Generally, there are two types of DTs — physics-based twins and data-based twins. The former rely on physical laws and expert knowledge. They can be built from CAD files and used to simulate the work of comparatively simple objects with predictable behavior — like a piece of machinery on the production line.

The key downside is that updating such twins takes hours rather than minutes or seconds. So, the approach makes sense in areas where you don’t need to make immediate decisions.

Contrasted with the physics-based type, data-based twins don’t require deep engineering expertise. Instead of understanding the physical principles behind the system, they use machine learning algorithms (typically, neural networks) to find hidden relationships between input and output.

The data-based method offers more accurate and quicker results and is applicable to products or processes with complex interactions and a large number of impact factors involved. On the other hand, to produce valid results it needs a vast amount of information not limited to live streams from sensors.

Algorithms have to be trained on historical data generated by the asset itself, accumulated from enterprise systems like ERP, and extracted from CAD drawings, bills of material, Excel files, and other documents.

Today, various combinations of two methods — or so-called hybrid twins — are often used to take advantage of both worlds.
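A minimal sketch of the hybrid idea for a toy heating element: a first-principles model predicts temperature, and the simplest possible data-driven correction (a constant offset fitted to historical residuals) nudges the prediction toward observed behavior. All constants and data are hypothetical:

```python
def physics_model(power_kw: float) -> float:
    """First-principles estimate: steady-state temperature rise over ambient."""
    ambient_c, k = 20.0, 12.5          # k degC per kW, from design documents
    return ambient_c + k * power_kw

# Historical (power, observed temperature) pairs from the physical asset.
history = [(1.0, 33.9), (2.0, 47.1), (3.0, 60.4), (4.0, 73.2)]

# Data-driven part: fit the mean residual (observed minus physics), i.e. the
# simplest model of what the physics misses (sensor bias, insulation drift, etc.).
residuals = [obs - physics_model(p) for p, obs in history]
bias = sum(residuals) / len(residuals)

def hybrid_model(power_kw: float) -> float:
    return physics_model(power_kw) + bias

print(round(physics_model(2.5), 1), round(hybrid_model(2.5), 1))
```

In practice the data-driven component would usually be a trained neural network or other ML model rather than a fitted constant, but the division of labor is the same: physics supplies the structure, data supplies the correction.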

_

To build digital twins, conceptual models are imported (through BIM, CAD, or GIS) or physical objects are scanned in the real world for visualisation and analysis with enterprise and Internet of Things (IoT) data. Real-time 3D, a computer graphics technology that renders interactive content faster than human perception, can power a digital twin that filters, arranges, and presents various data sources (both information and models) as realistic, interactive visualisations. Digital twins are virtual depictions of the forces, interactions, and movements that assets may experience in the real world. This enables users to interact with three-dimensional, dynamic content that responds in real time to their actions. They can accurately mimic real-world situations, what-if scenarios, and any situation imaginable in this virtual environment, and instantaneously view the results on any platform, including mobile devices, computers, and augmented, mixed, and virtual reality (AR/MR/VR) devices.

_

To create a digital twin, the engineers collect and collate all kinds of data. This includes physical, manufacturing and operational data, which is then synthesized with the help of Artificial Intelligence (AI) algorithms and analytics software. The end result is a virtual mirror model for each physical model that has the capability of analyzing, evaluating, optimizing and predicting. It is important to constantly maintain synchronous state between the physical asset and the digital twin, so that the consistent flow of data helps engineers analyze and monitor it. The basic architecture of digital twin consists of the various sensors and measurement technologies, IoT and AI. From a computational perspective, the key technology to propel a digital twin is integration of data and information that facilitates the flow of information from raw sensory data to high-level understanding and insights.

Every deployment of a digital twin is different. Deployments frequently happen in stages, with the complexity and commercial effect increasing with each step. A digital twin can be an exact depiction of a network or system as large as a city, with each of its components constantly linked to engineering, construction, and operating data.

_

Tools needed to develop Digital Twins:

The tools you need to build digital twins in a manufacturing setting will depend on the purpose of the digital twin. Some tools that are useful to have include:

CAD or 3D Modelling Tools:

Engineering and design teams will already have CAD or other 3D design programs as a foundation of their manufacturing process. Companies building digital twins create pipelines in which CAD designs are exported into other tools in the pipeline. Such a pipeline has many advantages. One of them is not having to make the same changes twice in two different tools. If a change has to be made in a CAD design, it can be brought over seamlessly into the enhanced digital twin. Another benefit is the speed of overall development.

IoT/Connected Devices:

IoT & connected devices can be combined to enhance digital twins. They supply continuous, real-time data that is emulated in the digital twin. Combined with digital twin technology, IoT & connected devices can make a huge difference in communication, especially between teams.

Game Engines:

Because of their powerful rendering abilities and advanced physics engines, game engines like Unreal supercharge the capabilities of digital twins. They make visualizations easier to understand by bringing them to life and enabling an emotional connection with the thing they represent.

Version control:  

All of the data collected and created in the digital twin process (and there is a lot of it) needs to be managed carefully. A version control system allows you to manage changes to files over time and store these modifications in a database.

The key function of a digital twin implementation, through physics-based models and data-driven analytics, is to provide accurate operational pictures of the assets. The Industrial IoT system carries out real-time data acquisition through its smart gateway and edge computing devices. The digital twin thus combines modelling and analytics techniques to create a model of a specific target.

_

Steps involved in creating a digital twin: 

  • Physical to digital: Engineers capture all sorts of data from the physical asset with the help of various sensors and convert it into a digital record
  • Digital to digital: use advanced analytics, scenario analysis and Artificial Intelligence (AI) to gain and share meaningful information
  • Digital to physical: apply algorithms to translate digital-world decisions into effective data in order to spur action and change in the physical world

A few popular software tools used for this purpose are PTC ThingWorx for Industrial IoT and PTC Vuforia for Augmented Reality. 

With the converging of digital technologies like AR, VR and AI in IIoT, digital twins are gaining importance, especially in the context of Industry 4.0. They are drivers of innovation and performance, providing technicians with the most advanced monitoring, analytical and predictive capabilities.

_

The Digital Twin Consortium, an industry association working to build the market and recommend standards, adds an important phrase to the basic definition: “synchronized at a specified frequency and fidelity.”

These qualifiers refer to three key aspects of the technology; a short sketch after this list shows them in code.

  • Synchronization is about making sure the digital twin and the represented entity mirror each other as closely as possible.
  • The frequency, or speed, at which data gets updated in a digital twin can vary enormously, from seconds to weeks to on demand, depending on the purpose.
  • Fidelity is the degree of precision and accuracy of the virtual representation and the synchronization mechanism.
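A minimal sketch of both qualifiers, assuming a hypothetical read_asset() feed: the polling period sets the synchronization frequency, and the tolerance sets the fidelity at which the mirror is considered in sync:

```python
import time

def read_asset(step: int) -> float:
    return 50.0 + 0.4 * step           # hypothetical slowly drifting measurement

def synchronize(period_s: float, fidelity_tolerance: float, steps: int) -> None:
    mirrored = None
    for step in range(steps):
        measured = read_asset(step)
        # Only update the twin when the mirror drifts beyond the fidelity tolerance.
        if mirrored is None or abs(measured - mirrored) > fidelity_tolerance:
            mirrored = measured
            print(f"t={step}: twin updated to {mirrored}")
        time.sleep(period_s)           # the synchronization frequency

synchronize(period_s=0.01, fidelity_tolerance=1.0, steps=10)
```

Tightening the tolerance or shortening the period raises fidelity and frequency, and with them the data volume; the right settings depend entirely on the twin's purpose.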

_

Digital twin interoperability:  

To be useful, digital twins must work in a variety of enterprise applications.

While they usually start in CAD and PLM, some digital twins are also managed in ERP and material requirements planning (MRP) software. ERP and MRP store the bill of materials (BOM), a comprehensive inventory of the materials and parts needed to make a product and typically a major contributor of digital twin data. ERP and MRP together also run many of the supply chain and production processes that go into manufacturing a product; along the way, they collect much of the data that goes into the digital twin.

Another common source of digital twin data is the manufacturing execution system (MES) that many companies use to monitor, control and optimize production systems on the factory floor.

Enterprise asset management software, which is increasingly the preferred system companies use to manage the purchasing, monitoring and maintenance of their most valuable equipment, also has to integrate with digital twins.

In addition, cloud providers that offer digital twin services, such as Amazon Web Services and Microsoft Azure, must have some sort of integration with the other enterprise applications. It’s even possible for digital twins that reside in different levels of the hierarchy — asset twins and process twins, for example — to be spread across different cloud services.

_

What to consider before implementing digital twins:

-1. Security protocols need to be updated. According to Gartner’s estimate, 75% of digital twins for IoT-connected OEM products will utilize at least five different kinds of integration endpoints by 2023. The amount of data collected from these numerous endpoints is huge, and each endpoint represents a potential area of security vulnerability. Therefore, companies should assess and update their security protocols before adopting digital twin technology. The areas of highest security importance include:

  • Data encryption
  • Access privileges, including a clear definition of user roles
  • Least privilege principles
  • Addressing known device vulnerabilities
  • Routine security audits

-2. Make sure your data is high quality. Virtual models of digital twins depend on data transmitted by thousands of sensors. These sensors are remote and can communicate over unreliable networks. You will need to be able to eliminate bad data and manage gaps in data flows.

-3. Your team must be well qualified. If you intend to use a digital twin, your team of engineers must be prepared to completely change the way they work, which can potentially lead to problems in creating new technical capabilities. You need to make sure that your employees have the necessary skills and tools to work with DT models.

_

Consider investments to be made:

Let’s presume that your company already has sensors and CAD or CAE software to create a basic representation of your assets. At the next phase, you will need to invest in

  • additional hardware — for example, edge computing devices to process data on the periphery, closer to IoT sensors;
  • services of a data management or IoT platform or other middleware to ingest and process data from disparate systems and store it in one place;
  • simulation software;
  • analytics solutions;
  • domain experts to run physics-based simulations; and/or
  • data scientists if you opt for data-driven or hybrid methods.

It’s worth noting that large cloud providers and leaders in digital twinning offer services that cover many aspects of the process.

_

The digital twin trade-off:

Digital twins allow us to carry out simulations at multiple points in the product lifecycle to improve decision making, but there is a trade-off. Simulations are, by necessity, bounded and an approximation, so we need to understand the business value and impact of creating digital twins before we invest a great deal of time and money. We must first answer the simple question of “why are we doing this?” Sometimes the answer is obvious: to reduce project costs, to get a product to market faster, or to achieve regulatory compliance. But in other instances, the value is not so clear. In all cases, the three considerations before undertaking a digital twin project should be:

-1. Complexity: How expensive (time and cost) will it be to create a digital twin?

-2. Breadth: How generic or how specific will the digital twin be?

-3. Depth: How detailed and accurate will the results from the digital twin be?

In reality, due to the wide variety of problems that digital twins can address and the trade-offs inherent in each one, it is likely that you will end up with multiple, federated digital twins addressing different needs, such as representing various phases in the product lifecycle or answering different “what-if” questions. And areas of your business, and possibly other organizations in your ecosystem, will need to share data, and that data may need to be integrated in real time to ensure changes to any one digital twin are correctly represented in another.

_

Explore ready-to-use solutions:

Here’s a short overview of the DT products from industry leaders that will save you time and effort.

IBM Digital Twin Exchange works as a virtual shop where organizations may search for, purchase, and download digital twins, or data related to them, from different manufacturers. The assortment includes 3D CAD models, bill of materials (BOM) lists, engineering manuals, etc. They can be teamed with the IBM Maximo asset management solution to predict asset performance and schedule maintenance operations based on fresh data.

Azure Digital Twins is a platform as a service (PaaS) for visualizing physical environments with all connected devices, locations, and occupants involved. The relationships between these objects are represented with spatial graphs. The service is paired with Azure IoT Hub that collects data from IoT sensors and other Azure services.

GE Digital Twins software allows companies to rapidly create digital twins and get value from them, using “blueprints” from their catalogue. The three core areas covered by GE are manufacturing assets, grid networks, and production processes. DT software is powered by machine learning and integrated with GE’s IIoT platform called Predix.

Oracle IoT Digital Twin Framework enables you to create both physics-based and data-driven (predictive) digital twins. The former compare observed and desired parameters to detect existing problems. The latter run machine learning algorithms to forecast future issues and prevent or prepare for them.

Keep in mind, though, that DT tools, elements, and blueprints work well when bought from one vendor. Otherwise, compatibility and integration issues are more than possible. As with other emerging and evolving technologies, digital twinning lacks generally accepted standards, leading to poor interoperability between systems.

_

Remember that DT is not off-the-shelf technology:

In any case, each DT is as unique as the product or process it represents. Ready-to-use infrastructures, platforms, and models can facilitate the development, but they won’t do all the work. You will still need experts in data, machine learning, and cloud technologies, and, of course, engineers capable of integrating the different parts of the hardware and software puzzle.

_

What to expect?

Despite all promises and even proven examples of success, digital twins still don’t see wide adoption. To some extent, the complexity of their creation is to blame. Another reason is a scarcity of industry standards that restrains communication and data exchange across platforms and apps from different vendors.

Hopefully, this will change soon: More and more tech leaders including Microsoft, GE Digital, and Dell have become part of the Digital Twin Consortium to facilitate the development, adoption, and interoperability of DTs. This goal is hard to achieve without clear technical guidance and agreed-upon frameworks.

_______

_______

Control tower based on digital twin:

A Control Tower or Command Center collects and analyses multiple data sources and provides teams with visibility, root cause identification, predictions, alerts, response agility and performance management, shifting focus from reactionary management to responsive planning.

Gartner defines Control Tower as:

“The concept combining five elements – people, process, data and organization – supported by a set of technology-enabled capabilities for transparency and coordination”

The aim is to capture and use data — structured and unstructured, internal and external — to provide enhanced visibility for short and midterm decision-making aligned with strategic organizational objectives.

The next-generation Control Tower will be based on a Digital Twin of the Organization. Many organizations in the manufacturing and retail industries are currently facing unprecedented supply chain issues, causing significant disruptions, plant shutdowns and margin losses. In this environment, organizations that can connect and orchestrate the entire ecosystem, from suppliers to customers, to build a digital twin of the organization (DTO) and create advanced solutions such as Control Towers will outpace others and accelerate their top- and bottom-line growth.

A capable digital twin solution should be paired with a control tower to:

  • Enable near-real-time visibility within the internal and external network
  • Ingest data across multiple supply chain functions such as planning, logistics, manufacturing and warehousing
  • Simulate and model network disruptions or the effect of near-term plans
  • Provide decision automation and recommendation engine capabilities

These capabilities transform business response during crucial moments. For example, during a supply disruption, an integrated system could detect the event, understand the timeline and impact of the disruption, and recommend either finding alternative sources, putting finished product on fair-share allocation, or increasing order confirmation lead times.
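A toy sketch of that decision pattern, with hypothetical fields and thresholds: given a detected disruption, a simple rule picks one of the three responses named above:

```python
def recommend(disruption: dict) -> str:
    """Map a detected disruption to one of three candidate responses."""
    days_of_cover = disruption["inventory_units"] / disruption["daily_demand_units"]
    if disruption["alt_supplier_available"]:
        return "switch to alternative source"
    if days_of_cover < disruption["outage_days"] * 0.5:
        return "put finished product on fair-share allocation"
    return "increase order confirmation lead times"

event = {"inventory_units": 1200, "daily_demand_units": 300,
         "outage_days": 14, "alt_supplier_available": False}
print(recommend(event))  # 4 days of cover vs a 14-day outage -> fair-share allocation
```

A production control tower would of course derive these inputs from live network data and weigh many more factors; the point is only the detect-assess-recommend shape of the logic.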

_____

_____

Using digital twins to solve real world problems:

Digital twins can be used to address various problems that organizations face, including but not limited to:

  • Capturing requirements.

Digital twins can ensure that the requirements that are captured during the earliest stages of a product lifecycle are maintained, verified, and validated as the product evolves, is built, enters service, and is ultimately retired, decommissioned, and recycled.

  • Designing products.

One of the benefits of digital design over the last 30+ years is the ability to evaluate alternatives during the ideation phases and then quickly discard concepts that don’t meet the original intent. In addition, a design digital twin can be used to simulate and test the design before any manufacturing work takes place. Also, 3D models can be visualized in context to produce a configurable digital mock-up of the final product for early user acceptance testing, which has been a commonplace practice in automotive and aerospace industries for several decades.

  • Project planning.

A project planning digital twin can be used to compare different lifecycle plans based on impact from other digital twins as they evolve to assist with contingency and resilience management to ensure the plan is achievable.

  • Reliability engineering.

The ability to digitally reflect sensor information from a real-world instance of an asset has become increasingly feasible over the last few years with the evolution of Industrial IoT solutions; again, this is not a new concept but one that has improved in scalability, security, cost, and resilience in recent years. Being able to monitor asset performance in as near to real time as is needed means that reliability engineers are able to make better decisions about asset maintenance and replacement, thus improving overall asset performance, increasing system efficiency, and optimizing asset behavior, all of which allows the reliability engineer to predict and manage risk based on high-quality data rather than assumptions based solely on experience.

  • Training.

As assets become ever more complex and experienced knowledge workers get closer to retirement, the usefulness of a digital twin as an aid to training is gaining significant momentum. Long apprenticeships or mentoring are no longer needed when all the information is available to a new user through a digital twin; assistance will certainly often be needed by new team members, but the digital twin has been proven in many instances to allow teams to fix things right the first time.

  • Real-time decision making.

Digital twins allow decision makers to rapidly understand the implications of any changes that are made to an asset at any point in its lifecycle. For example, if a material change is made, what will the impact be on the project plan model, the design model for mass and centre of gravity, the cost model for overall financial impact, and so on. The digital twin thus enables an organization to execute simulations to answer “what if” questions, sometimes repeatedly, with adjustments to the parameters, rather than going through the process of creating physical prototypes (a minimal sketch of such a what-if run appears after this list).

  • Decommissioning resources.

With only finite levels of certain resources available globally, there has been a significant focus in recent years on how assets are recycled, decommissioned, or scrapped to encourage a circular economy. For example, with steel being a finite resource, there is great focus by major steel producers to understand where their products are being used, how long they will be used for, how they will be maintained, and what condition they will be in at the end of their first life, to ensure that they can be reused (possibly at a lower grade) for future products. In addition, there are global initiatives to monitor plastics and other hazardous materials to ensure safe usage and disposal, which means that the digital twin can be used to improve reporting and regulatory compliance.
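Returning to the what-if example above, here is a minimal sketch of such a run, assuming a hypothetical simulate() standing in for the twin's integrated models: one material parameter is swept, and its impact on linked mass, cost, and schedule metrics is collected:

```python
def simulate(material_density_kg_m3: float) -> dict:
    """Toy stand-in for the twin's linked project, design, and cost models."""
    volume_m3 = 0.02                               # fixed part volume
    mass_kg = material_density_kg_m3 * volume_m3
    return {
        "mass_kg": round(mass_kg, 1),
        "cost_usd": round(mass_kg * 7.5, 2),       # toy cost model: $7.5 per kg
        "schedule_days": 30 + (5 if material_density_kg_m3 < 3000 else 0),  # lighter alloy: longer lead time
    }

# Repeated "what if" runs with adjusted parameters, no physical prototypes needed.
for density in (2700.0, 4500.0, 7800.0):           # e.g., aluminium, titanium, steel
    print(density, simulate(density))
```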

______

______

Application fields of digital twin:

Through the integration with mobile Internet, cloud computing, big data analytics and other technologies, DT is potentially applicable for many fields where it involves the mapping, fusion, and co-evolution between the physical and virtual spaces. As shown in figure below, the DT applications can be found in smart city, construction, healthcare, agriculture, cargo shipping, drilling platform, automobile, aerospace, manufacturing, electricity, etc.

Figure above shows different application fields of digital twin.  

As a relatively new technology, the application of DT was pioneered by leading enterprises (e.g., GE, PTC, Siemens, ANSYS, Dassault, etc.). For civil engineering, Dassault used its 3DEXPERIENCE platform to build a “Digital Twin Singapore” to support urban planning, construction, and services. Intellectsoft is exploring DT applications on construction sites to detect potential problems and prevent dangerous operations. In the healthcare field, Sim&Cure developed a patient-based digital twin for treating aneurysms, and Dassault conducted the “Living Heart Project (LHP)” toward a human heart DT. According to Microsoft’s whitepaper on DT, the technology has the power to accelerate agricultural business and support agricultural sustainability. DNV GL established a “virtual sister ship” (i.e., a vessel DT) to increase reliability, reduce operational cost, and improve safety throughout the vessel’s lifecycle. A drilling platform DT for the Blue Whale 1 in China enabled visualization, operational monitoring, and design training. Tesla set out to develop a DT for each electric car to enable simultaneous data transfer between car and plant. In the aviation industry, Airbus, Boeing, AFRL, and NASA used DT to mirror actual conditions, identify defects, predict potential faults, and solve the problem of airframe maintenance. LlamaZOO used DT to enable mine supervisors to monitor their operators’ vehicles. Based on the Predix platform, GE built a digital wind farm, creating a DT for every wind turbine, to optimize maintenance strategy, improve reliability, and increase energy production. Finally, many DT applications can be found in the manufacturing field. For example, SAP and Dassault relied on DT to reduce the deviation between functional requirements and actual performance; Siemens and PTC relied on DT to improve manufacturing efficiency and quality control; and GE, ANSYS, Tesla, and Microsoft focused on real-time monitoring, prognostics and health management, and manufacturing services.

_

Retail, manufacturing, supply chain, healthcare, renewable energy, construction, aerospace, and climate change are all served by digital twins. The technology has evolved into a tool for developing improved virtual, real-time, AI-driven models of real-world applications, systems, or objects. It is also being utilized to incorporate much of the real-world environment into the Metaverse, a virtual system. We should expect the use of digital twins to expand and grow as enterprises continue to recognize their potential.

Improved understanding and insight into complex systems:

Organizations may use digital twins to better understand the behaviour and performance of complex systems such as industrial processes or transportation networks. This can help companies identify inefficiencies and opportunities for improvement.

Enhanced ability to predict and optimize outcomes:

By simulating the behaviour of a system using a digital twin, organizations can make more informed decisions and take actions to maximize performance. For example, a digital twin of a manufacturing plant could be used to predict the outcome of changes to the production process and identify the optimal configuration.

Improved Collaboration and Communication:

By offering a shared digital image of a system or process, digital twins help teams collaborate and communicate more effectively and make better decisions.

_

In the era of Industry 4.0, when several industries are experiencing a digital transformation, the Digital Twin (DT) is considered no less than a linchpin for gaining a competitive and economic advantage over competitors. DT saw its origins in the aerospace industry, and it is expected to revolutionize other industries. The main applications of DT across sectors are designing/planning, optimization, maintenance, safety, decision making, remote access, and training, among others. It can be a great tool for companies to increase their competitiveness, productivity, and efficiency. DT has the ability to link the physical and virtual worlds in real time, which provides a more realistic and holistic measurement of unforeseen and unpredictable scenarios.

The value DT brings to any sector, by reducing time to market, optimizing operations, reducing maintenance cost, increasing user engagement, and fusing information technologies, is indisputable. As an emerging technology, the development of DT matches the profound strategic plans of some leading manufacturing countries. For example, in accordance with the development and application of cyber-physical systems (CPS), DT has become a key component in the Made in China 2025 plan, which targets the development of intelligent control systems; industrial application software, fault diagnosis software and related tools; and sensor and communication system protocols to realize real-time connection, accurate identification, effective interaction, and intelligent control of manufacturing equipment and products (State Council of China, 2015). DT has been extensively investigated in different industry fields. DT allows companies to keep a virtual copy of their products across their full lifecycles, rapidly detect defects, solve physical issues sooner, predict outcomes more accurately, and better serve their customers (Parrott and Warshaw, 2017). Tuegel (2012) proposed an airframe digital twin (ADT) specified for aircraft. Tao and Zhang (2017) established DT shop-floor (DTS) technology for intelligent shop-floor design. General Electric (GE) has developed the platform “Predix” for a DT wind farm (Noel and Jackson, 2015). DTs for healthcare and for smart cities are reviewed in Fuller et al. (2019) and Qi et al. (2021), respectively.

_

Digital twin technology processes enormous amounts of sensor data using machine-learning algorithms designed to identify patterns within the data. Artificial intelligence and machine learning deliver advanced analytics on how to optimize operations, carry out maintenance, maximize efficiency, and reduce emissions.

Some examples of companies that are working with digital twins are:

  • Microsoft with Azure Digital Twins, which is an IoT platform
  • Siemens digital twins – which lets you analyze how a product performs under various conditions
  • Unity digital twins for 3D projects
  • Amazon Web Services (AWS) digital twin is also an IoT service
  • IBM digital twin experience – for asset-intensive industries
  • Tesla digital twin – the company creates a digital twin of every vehicle it sells
  • Google Supply Chain twin which is a Google cloud solution

_

The reason for unprecedented growth is the increasing importance of digital twins across several industries and niches, including:

  • The construction of physically large structures such as bridges
  • The construction of mechanically complex projects, including automobiles and aircraft
  • The design and construction of power systems and dangerous equipment
  • Boosting waste reduction, safety, and efficiency across the supply chain

Just some of the industries that have adopted digital twinning technologies include automobile and aircraft manufacturing, building and construction, and power utilities.

_

Is Digital Twin Free?

Digital twins vary in accessibility and price depending on what they are created for. For example, platform-as-a-service (PaaS) offerings such as Azure Digital Twins allow you to create digital environments on a pay-as-you-go basis. Conversely, major companies creating digital twins and environments on demand can charge from $1.2 to $4.2 million for projects involving digital twins of large buildings and infrastructure.

____

____

Digital twins create value:

Digital twins can be used in different ways to add value to a product, process, user, or organization. The value available, and the investment required to capture it, are highly application dependent. Most fall into one or more of the following broad categories.

Descriptive Value.

The ability to immediately visualize the status of an asset via its digital twin is valuable when those assets are remote or dangerous – examples include spacecraft, offshore wind turbines, power stations, and manufacturer-owned machines operating in customer plants. Digital twins make information more accessible and easier to interpret from a distance.

Analytical Value.

Digital twins that incorporate simulation technologies can provide data that is impossible to measure directly on the physical object – for example, information generated inside an object. This can be used as a troubleshooting tool for existing products and can help to optimize the performance of subsequent product generations.

Diagnostic Value.

Digital twins can include diagnostic systems that use measured or derived data to suggest the most probable root causes of specific states or behaviors. These systems can be implemented in the form of explicit rules based on company know-how, or they may leverage analytics and machine learning approaches to derive relationships based on historical data.
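As a concrete illustration of the explicit-rules form, here is a minimal Python sketch; the sensor names, thresholds, and root causes are hypothetical stand-ins for company know-how.

RULES = [
    # (condition over the twin's state, probable root cause)
    (lambda s: s["vibration_mm_s"] > 7.0 and s["temperature_c"] > 80,
     "bearing wear"),
    (lambda s: s["current_a"] > 1.3 * s["rated_current_a"],
     "mechanical overload or jam"),
    (lambda s: s["temperature_c"] > 80,
     "cooling or lubrication fault"),
]

def diagnose(state: dict) -> list:
    # Return probable root causes, most specific rule first.
    return [cause for check, cause in RULES if check(state)]

reading = {"vibration_mm_s": 8.2, "temperature_c": 85,
           "current_a": 10.0, "rated_current_a": 9.0}
print(diagnose(reading))   # ['bearing wear', 'cooling or lubrication fault']

A machine-learning variant would replace the hand-written rules with relationships derived from historical data, as the paragraph above notes.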

Predictive Value.

The likely future state of the physical asset can be predicted using its digital twin model. One example is GE’s use of digital twins in wind farms to predict power output. The most sophisticated digital twins do more than merely predict an issue that may occur; they also propose the corresponding solution. Digital twins will play a significant role in the development of future smart factories capable of making autonomous decisions about what to make, when, and how, in order to maximize customer satisfaction – and profitability.

_

Early adopters of digital twins commonly report benefits in three areas:

-Data-driven decision making and collaboration

-Streamlined business processes

-New business models

Because each digital twin presents a single visualization to key decision makers, it provides a single source of truth for an asset that drives stakeholder collaboration to resolve problems expediently. Digital twins can be used to automate tedious error-prone activities such as inspections, testing, analysis, and reporting. This frees teams to focus on higher-value activities. Digital twins are a major driver of product-as-a-service business models or servitization – this is when companies abandon the one-time sale of a product to instead sell outcomes by managing the full operation of the asset throughout its lifecycle. Digital twins allow manufacturers to monitor, diagnose, and optimize their assets remotely, helping to improve availability and reduce service costs.

_

Digital twins enable you to optimize, improve efficiencies, automate, and evaluate future performance. You can use the models for other purposes such as virtual commissioning or to influence next-generation designs.

Digital twin models are commonly used in several areas:

-1. Operations optimization: Using variables like weather, fleet size, energy costs, or performance factors, models are triggered to run hundreds or thousands of what-if simulations to evaluate readiness or necessary adjustments to current system set-points. This enables system operations to be optimized or controlled during operation to mitigate risk, reduce cost, or gain any number of system efficiencies.

-2. Predictive maintenance: In Industry 4.0 applications, models can determine remaining useful life to inform operations of the most opportune time to service or replace equipment.

-3. Anomaly detection: The model runs in parallel to the real assets and immediately flags operational behavior that deviates from expected (simulated) behavior; see the sketch after this list. For example, a petroleum company may stream sensor data from offshore oil rigs that operate continuously. The digital twin model will look for anomalies in the operational behavior to help avoid catastrophic damage.

-4. Fault isolation: Anomalies may trigger a battery of simulations to isolate the fault and identify the root cause so engineers or the system can take appropriate action.
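A minimal Python sketch of the anomaly-detection pattern referenced above follows; the expected_pressure() function is a hypothetical stand-in for the twin's simulation output, and the 10 percent tolerance band is an illustrative assumption.

def expected_pressure(t):
    # Stand-in for the twin's simulated (expected) signal at time t.
    return 200.0 - 0.5 * t

def detect_anomalies(measurements, tolerance=0.10):
    # Flag readings that deviate from the simulation beyond tolerance.
    anomalies = []
    for t, measured in measurements:
        simulated = expected_pressure(t)
        if abs(measured - simulated) > tolerance * abs(simulated):
            anomalies.append((t, measured, simulated))
    return anomalies

stream = [(0, 199.5), (1, 199.7), (2, 170.0), (3, 198.4)]
for t, measured, simulated in detect_anomalies(stream):
    print(f"t={t}: measured {measured} vs expected {simulated:.1f} -> flag")

In a real deployment the flagged deviation would then trigger the battery of fault-isolation simulations described in item 4.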

______

______

Digital twin framework: 

To help understand where digital twins can be used within a smart factory, a framework created by IoT Analytics breaks down the use cases and capabilities along three dominant dimensions: the hierarchical level of the digital twin (six levels), the lifecycle phase in which the digital twin is applied (six phases), and the use or capability of the digital twin (seven uses). A fourth dimension can be added that specifies the data type used by the digital twin: real-time, historical, or test data. For simplicity, this fourth dimension is not visualized. There are 252 potential combinations of the 6 levels, 6 phases, and 7 most common uses (6 x 6 x 7 = 252), although research indicates that many digital twin initiatives cater to more than one combination; a short sketch after the Dimension 3 list below enumerates this cube. The resulting digital twin cube explains why a digital replica used for “product simulation during the design phase” is completely different from one used for “process parameter prediction during manufacturing operations”. Both are called digital twins but have only limited overlap.

Figure above shows digital twin classification framework.

The 3 Dimensions of digital twin technology:

  • Dimension 1: Hierarchical levels

There are 6 hierarchical levels of a digital twin.

-1. Informational: Digital representations of information e.g., an operations manual in digital format.

-2. Component: Digital representations of individual components or parts of a physical object e.g., virtual representation of a bolt or a bearing in a robotic arm.

-3. Product: Digital representations of the interoperability of components/parts as they work together at a product level e.g., virtual representation of a rotating robotic arm.

-4. Process: Digital representations enabling the operation and maintenance of entire fleets of disparate products that work together to achieve a result at a process level e.g., virtual representation of a manufacturing production line process.

-5. System: Digital representations twinning multiple processes and workflows, not just limited to physical objects, enabling the optimization of operations at a system level e.g., virtual representation of an entire manufacturing system.

-6. Multi-system: Digital representations of multiple systems working together as one unified entity to enable unprecedented insight, testing, and monitoring of key business metrics in a data-driven manner e.g., virtual representation of several systems working in unison, such as systems for industrial manufacturing, supply chain, traffic control, communication, HR, etc.

  • Dimension 2: Lifecycle phases:

There are 6 lifecycle phases in which digital twins are applied.

-1. Design: In the design phase, requirements are gathered and one or more designs (e.g., for components, products, processes, or systems) are developed with which the required outcome can be achieved e.g., using digital twins as the source of all data, such as object properties and parameter values, on which virtual representations can be built.

-2. Build: The build phase takes the code requirements outlined previously and uses those to build the actual software-based digital twin. This phase also covers data management, configuration provisioning, repository management and reporting e.g., using digital twins to virtually build and simulate prototypes without having to build more cost intensive physical counterparts for testing.

-3. Operate: The operation phase is when actual users start using online digital twins in active deployments. Typical operative tasks include extracting sensor data or orchestrating devices remotely e.g., using digital twins to extract real-time sensor data from a rotating robotic arm in a production line or updating the device configuration over-the-air.

-4. Maintain: The maintenance phase involves making changes to hardware, software, and documentation to support its operational effectiveness. It includes making changes to correct problems, enhance security, or address user requirements e.g., using digital twins to do regular maintenance tasks such as sending OTA updates for system configuration or cybersecurity.

-5. Optimize: The optimize phase requires the use of existing capability information and a statistical approach to tolerancing. This can be used to improve the development of detailed design elements, predict performance, and optimize operations e.g., using digital twins to run a large number of tests that generate insights to help predict future performance and failures.

-6. Decommission: The decommissioning phase involves the removal of a digital twin release from use. This activity is also known as system retirement or system sunsetting e.g., using digital twins to remotely decommission devices no longer in use and subsequently retire the corresponding digital twin.

  • Dimension 3: Most common uses:

A multitude of uses exist for digital twins, 7 of the most common uses are highlighted here:

-1. Digitize: Any digitized information

-2. Visualize: Basic digital representation of a physical object

-3. Simulate: Simulation model of a physical system in its environment

-4. Emulate: Emulation model of the physical system with real software

-5. Extract: Extraction model of real-time data streams, physical to virtual system

-6. Orchestrate: Orchestration model for virtual control/updating of physical devices

-7. Predict: Prediction model to predict future behavior of the physical system
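To make the framework concrete, the following minimal Python sketch enumerates the classification cube from the three dimensions above and checks the two example cells discussed earlier; the labels are taken directly from the framework.

from itertools import product

LEVELS = ["informational", "component", "product",
          "process", "system", "multi-system"]
PHASES = ["design", "build", "operate", "maintain",
          "optimize", "decommission"]
USES = ["digitize", "visualize", "simulate", "emulate",
        "extract", "orchestrate", "predict"]

cube = set(product(LEVELS, PHASES, USES))
print(len(cube))   # 6 x 6 x 7 = 252 potential combinations

# The two examples from the text occupy very different cells:
print(("product", "design", "simulate") in cube)    # True
print(("process", "operate", "predict") in cube)    # True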

One of the best examples of how to implement digital twins in manufacturing to support smart factory initiatives is Unilever. In 2019, Gartner named the consumer goods giant one of the industry’s best-performing supply chain leaders. Unilever implemented digital twins of its manufacturing production line process to increase productivity, reduce waste, and make better operational decisions. Its digital twins are a type described by the IoT Analytics framework as process × operate × orchestrate. According to a 2019 article in the Wall Street Journal, devices send real-time information on physical variables, such as temperatures and motor speeds, into the cloud. Advanced analytics process this data and simulate conditions to ultimately map out the best operational conditions to control and adjust the production process. This results in better quality and productivity. Unilever worked with Microsoft to implement digital twins of dozens of its 300 global plants, and each twin reportedly was implemented in three or four weeks.

Another interesting example is digital twins in the field of maintenance prediction. Using the digital twin, a company can develop predictive maintenance strategies based on the digital replica of a machine or group of machines. With this technology, maintenance specialists can simulate future operations of the machine, create failure profiles, calculate the remaining useful life of the machine and plan maintenance activities based on the simulation results. All of this happens without the machine being stopped. The digital twin collects machine data from the machine controller and external sensors; this data is fed into a simulation model that uses algorithms and data analysis technologies to predict the health status of the asset. According to the IoT Analytics framework, this type of digital twin is a product × maintain × predict digital twin.
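A minimal Python sketch of the remaining-useful-life idea follows, assuming an illustrative normalized health indicator and failure threshold; real twins would use physics-based or machine-learned degradation models rather than a straight-line fit.

def estimate_rul(hours, health, failure_threshold=0.2):
    # Least-squares line through (hours, health), extrapolated to the
    # failure threshold; returns remaining hours past the last sample.
    n = len(hours)
    mean_x, mean_y = sum(hours) / n, sum(health) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(hours, health))
             / sum((x - mean_x) ** 2 for x in hours))
    if slope >= 0:
        return None   # no degradation trend detected
    intercept = mean_y - slope * mean_x
    return (failure_threshold - intercept) / slope - hours[-1]

hours = [0, 100, 200, 300, 400]
health = [1.00, 0.93, 0.87, 0.80, 0.74]   # normalized health indicator
print(f"Estimated RUL: {estimate_rul(hours, health):.0f} operating hours")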

______

______

Real-life examples of digital twins:   

Here are real-life examples of how digital twins work in different industries across all levels.

-1. Aircraft industry: pinpointing the right time for engine maintenance

Up to 70 percent of the world’s airplanes fly on engines produced by General Electric (GE), a fact that makes the corporation partially responsible for the safety of millions of passengers. To forecast the degradation of the aircraft’s heart over time, GE created a digital twin for the GE90, its most popular engine, which powers the long-range Boeing 777 and has surpassed 100 million flight hours. The twin represents not the entire mechanism but its composite fan blades, which are prone to spallation, the peeling off of fragments of material under rough conditions. This is especially relevant in regions like the Middle East, where engines are exposed to the additional damaging factor of sand. The DT helps pinpoint the right time for maintenance before any issues arise.

-2. Automotive industry: running Tesla car replicas for remote diagnostics

Each new car produced by Tesla has its own digital twin. Sensors embedded in a vehicle constantly stream data about the environment and performance to the virtual copy that lives in the cloud. AI algorithms analyze these feeds to identify whether the car works as expected. If not, the problems are fixed by sending over-the-air software updates. In this way, Tesla adapts the vehicle’s configurations to different climate conditions, virtually improves its performance, and provides remote diagnostics, minimizing the need for visiting service centers.

-3. Tire manufacturing: reducing the wear of wheels

Bridgestone, the world’s top tire and rubber manufacturer, regularly takes advantage of digital twins to understand how speed, road conditions, driving style, and other factors affect the performance and lifespan of its products. Armed with these insights, the company helps fleets select the best options for their specific needs and advises on what can be done to prevent breakages and extend the life of the tires. The industry leader also uses digital twinning to design and test new types of tires. According to Bridgestone’s estimates, this approach cuts development time by 50 percent.

-4. Power generation: predicting the performance of gas turbines

Europe’s largest industrial manufacturing company and a digital twinning pioneer, Siemens developed a virtual avatar of the gas turbine and compressor business it purchased from Rolls-Royce. The digital twin, called ATOM (Agent-Based Turbine Operations and Maintenance), represents the production and servicing of the turbine fleet, spanning the supply chain operations.

The ATOM digital twin reflects complex interactions across the entire lifecycle of the gas turbine, as seen in the figure above. ATOM digests live data from multiple sources to thoroughly model the web of engine parameters, performance metrics, maintenance operations, and logistics steps across the entire turbine lifecycle. By running different what-if scenarios and visualizing their results, it helps stakeholders make better investment decisions.

-5. Supply chain simulation: bringing visibility to logistics

In September 2021, Google presented a new service that enables companies to build digital twins of their physical supply chains. The solution focuses on organizations in the retail sector, healthcare, manufacturing, and automotive industry. It aggregates information from multiple sources into one place and helps customers get a complete and clear view of their logistics. Google claims that the DT paves the way to much faster data analysis: tasks that previously took up to two hours now take just a few minutes. With its fresh offering, the company is snapping at the heels of IBM, Amazon, and Microsoft, all of which launched supply chain and other digital twin options a little bit earlier.

-6. Urban planning: creating profiles of buildings to reduce energy consumption

Automatic Building Energy Modeling, or AutoBEM for short, enables the generation of digital twins for any building in the US. The project took developers from the Department of Energy’s Oak Ridge National Laboratory five years and became available in 2021.

AutoBEM relies on public information like satellite imagery, street views, light detection and ranging (LiDAR), prototype buildings, and standard building codes to generate energy profiles of structures. A twin reflects all critical external and internal characteristics, including a building’s height, size, and type, the number of windows and floors, building envelope materials, roof type, and heating, ventilation, and cooling systems.

Advanced algorithms behind the twin predict which technologies should be implemented to save energy, including modern water heaters, smart thermostats, solar panels, and more. AutoBEM is expected to be widely used in urban planning and maintenance, as there is much concern about energy consumption in cities across the US.

-7. Azure Digital Twins is an IoT platform used to create a digital representation of real-world things, places, business processes, and people. Its purpose is to simulate, and thus understand, control, and analyze, real-world operations, costs, products, and experiences through an open modeling language. Using IoT sensors, log files, and other information, it collects real-time data to replicate assets virtually, which are then combined with AI analytics tools. The digital assets can be created even before their physical counterparts exist.

-8. Omniverse Replicator for Digital Twin Simulations

NVIDIA unveiled Omniverse Replicator to help develop digital twins. It is a synthetic-data-generation engine that produces physically simulated data for training deep neural networks. Along with this, the company introduced two implementations of the engine for applications that generate synthetic data: NVIDIA DRIVE Sim, a virtual world for hosting the digital twin of autonomous vehicles, and NVIDIA Isaac Sim, a virtual world for the digital twin of manipulation robots. Autonomous vehicles and robots developed using this data can master skills across an array of virtual environments before applying them in the real world. As shown in BMW Group’s factory of the future, Omniverse’s modularity and openness allow it to work with several other NVIDIA platforms, such as the NVIDIA Isaac platform for robotics, NVIDIA Metropolis for intelligent video analytics, and the NVIDIA Aerial software development kit, which brings GPU-accelerated, software-defined 5G wireless radio access networks to these environments. It also works with third-party software, so users and companies can continue to use their own tools.

-9. The oil and gas company Chevron Energy Technology is using Asset Digital Twin software to fix small issues before they become big problems. Through continuous monitoring with GE Digital’s Industrial Managed Services (IMS) and Digital Twin software, it uses data, analytics, and knowledge to increase availability, reliability, efficiency, and profitability. Asset Digital Twins are used to quickly and efficiently expand the company’s predictive diagnostics footprint. By applying predictive diagnostics, Chevron will decrease downtime and maintenance costs by deploying the right amount of service to the right asset at the right time, maximizing the company’s return on investment.

This type of Digital Twin is an increasingly common tool for operators of large equipment to optimize their maintenance schedules and to predict and avoid unplanned downtime. Operators and owners in these industries are now looking for easier, faster ways to get to value from their Digital Twin investments and understand how they can get Digital Twin assets to ‘communicate’ with other systems at the plant level.

_______

_______

Section-6

Technology of digital thread and digital twin:

_

Digital thread:

The digital thread was first proposed by Lockheed Martin of the United States. In the production of the F-35, the company fed model-based definition (MBD) data directly into computer numerical control machine tools to machine components, or completed the laying of composite materials through the programming system, and called this new working mode the “digital thread.” The digital thread saved 6,000 sets of tooling across the three configurations of the F-35. It also eliminated the time required to manage that tooling and configure the parts, as well as the time it took to distribute the tooling and load it onto the machines.

The US Department of Defense regards the digital thread as the most important foundational technology for digital manufacturing. Boeing has been pushing forward digital thread technology in which product data interacts from a single data source. The National Center for Manufacturing Sciences (NCMS) has identified digital manufacturing as “one of America’s largest and most potential competitive assets” and as a key strategy for the future. The digital thread, as its name suggests, is undoubtedly the main theme of digital manufacturing, and the Industrial Internet Alliance treats it as a key technology on which it needs to focus. It is no exaggeration to say that the digital thread is key to the revitalization of the US manufacturing industry. The digital thread is a strong connection between OEMs (manufacturers), operation and maintenance service providers, suppliers, and end users. The digital thread is generated on a “model-centered” basis, where the model is a digital model with complete, rich information, established in accordance with uniform open standards, norms, and semantics, that can be read reliably and unambiguously by a machine (or system). On this basis, the digital thread integrates and drives the modern product design, manufacturing, and support process, allowing the models at each stage to synchronize in time and communicate key data bidirectionally.

_

The digital thread is defined as a communication network that enables a connected flow of data, as well as an integrated view of an asset’s data across its lifetime, through various otherwise isolated functional perspectives. This concept promotes transmission of the correct information to the correct place at the correct time. In a manufacturing process, huge amounts of data are produced and analyzed to obtain efficiencies and reduce defects. According to LNS Research, the digital thread increases supply chain efficiency by 16 percent. Moreover, this new trend in manufacturing enables delivery of new products to the market 20 percent faster. A good number of manufacturers have yet to integrate the digital thread into their systems: according to the National Institute of Standards and Technology of the US Department of Commerce, only 10 percent of small manufacturers employ this concept, while the rest use conventional 2D methods to model their products. Another substantial application of the digital thread is in corporate accountability, although the depth to which government regulators should be able to access pre-product data is still open for discussion. For instance, it is arguable that the Volkswagen emissions scandal could have been caught sooner had government regulators had access to the digital thread, since the development data in the automaker’s digital thread could have shown information contrary to what was the reality on the roads.

_

The digital thread is the communication framework that links all information that belongs to a product instance across the development process and IT systems and ultimately enables manufacturers to re-purpose, reuse and trace product information throughout the product development lifecycle and supply chain. When you integrate your designs into a ‘digital thread,’ partners, suppliers and vendors are all connected to consistently updated and traceable product data. Whether you are working on 3D models or other data files, your product development team will always have access to the most updated part and can communicate and collaborate effectively through all domains. Successfully managing the digital record of information throughout the product lifecycle gives you complete interoperability and traceability.

The digital twin represents an asset, including its metadata, design, and models. The 3D model helps you understand how your digital model will and can act in real-world scenarios. In other words, digital twins make it possible to simulate the behavior of your physical asset, whether it be a car, an airplane, or a manufacturing plant. The digital twin enables you to monitor the asset for continuous improvement, since there is a constant feedback loop. The digital twin’s primary purpose is to give you better insight into the system and how it behaves in the real world.

You can’t have the digital twin without the digital thread. While the digital twin reflects the digital representation of a real object or asset, its real-world scenario information is communicated to your systems via the digital thread, the link to interoperability.

_

The digital thread refers to the communication framework that allows a connected data flow and integrated view of the asset’s data throughout its lifecycle across traditionally siloed functional perspectives. The digital thread concept raises the bar for delivering “the right information to the right place at the right time.”

The digital twin is the design, as-built manufacturing and operational data for each physical asset — aircraft with tail number N123 has its specific and unique digital twin N123.  The digital thread (figure above) enables bidirectional flow of data — forward to build the digital twin and also feedback to continuously improve design, manufacturing, operation, etc.

_

Digital thread is a bridge between physical and virtual worlds:

Having all the required components in hand, you can interconnect physical systems and their virtual representations into a closed loop known as a digital thread. Within it, the following iterative operations are performed.

The digital thread (figure above) is a communication framework that captures disparate data types and formats from a product or process across its lifetime, from development and design through service and decommissioning. This data is extracted from systems such as CAD, PLM (product lifecycle management), IIoT devices, ERP (enterprise resource planning), and MES (manufacturing execution systems). In addition to capturing and consolidating data, the thread feeds machine learning and AI that analyze the data and make it available to stakeholders in a consistent, reliable way. In the context of digital twins, the digital thread is the bridge between the real-world product, machine, or process and its digital twin.

Steps within a digital thread:

-1. Data is collected from a physical object and its environment and sent to the centralized repository.

-2. Data is analyzed and prepared to be fed to the DT.

-3. The digital twin uses fresh data to mirror the object’s work in real time, test what will happen if the environment changes, and find bottlenecks. At this step, AI algorithms can be applied to tweak the product design or spot unhealthy trends and prevent costly downtimes.

-4. Insights from analytics are visualized and presented via the dashboard.

-5. Stakeholders make actionable, data-driven decisions.

-6. The physical object parameters, processes, or maintenance schedules are adjusted accordingly.

Then the process is repeated based on the new data.
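The loop can be traced in a minimal Python sketch; every function body below is a placeholder assumption, since in practice each step is backed by IIoT pipelines, analytics services, and dashboards rather than plain function calls.

def collect(asset):            # step 1: sensor data to the repository
    return {"temperature_c": asset["temperature_c"]}

def prepare(raw):              # step 2: clean and align data for the DT
    return raw

def twin_update(twin, data):   # step 3: mirror the object, spot issues
    twin["state"] = data
    twin["alert"] = data["temperature_c"] > 75
    return twin

def decide(twin):              # steps 4-5: visualize, then decide
    return {"reduce_load": twin["alert"]}

def actuate(asset, decision):  # step 6: adjust the physical object
    if decision["reduce_load"]:
        asset["temperature_c"] -= 10
    return asset

asset, twin = {"temperature_c": 82}, {}
for _ in range(3):             # then the process repeats on new data
    twin = twin_update(twin, prepare(collect(asset)))
    asset = actuate(asset, decide(twin))
print(asset)   # {'temperature_c': 72} once the loop settles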

Digital twins reduce the complexity of the real world to the information necessary for decision-making. This makes the technology welcome across many industries.

_

Digital threads and digital twins are used to create the virtual environments needed to implement model-based systems engineering (MBSE), especially for complex cyber-physical systems. Digital threads are complete records of all details of specific aspects of product definition, development, and deployment, from conception through end of life. A digital twin is a comprehensive virtual representation of a physical object or system that spans its lifecycle, is updated from real-time data, and is tracked by the digital thread. Digital twins are used for simulation and virtual testing of systems under development.

Digital threads provide digitized traceability from product definition and conceptualization through end-of-life (figure below). Digital threads can include multiple digital strands. There can be digital strands for quality engineering (including safety, security, and reliability engineering), continuous design verification and validation, integrated program planning and execution, product lifecycle management, and so on. Those strands are linked with each other and with the physical world to create a complete digital thread.

Figure above shows that digital thread is a chain of data moving through time that captures all relevant information needed to support MBSE. 

_

Digital threads comprise all of a digital twin’s temporal details, including design data; simulation, testing, and performance data; software and hardware development strands; supply chain data; production data; and actual performance data after the product is in use. Digital threads were initially developed to support MBSE, but their utility has since expanded. For complex systems, there is often a unique digital thread associated with each system built, such as an individual aircraft. These threads are used for managing product field performance and reliability. They can provide valuable data that can be incorporated into future MBSE-based design, engineering, production, and deployment efforts, accelerating subsequent development processes, reducing costs, and producing more robust results.

Digital threads tie the MBSE process together by embodying all data related to digital twins. Digital twins provide a common description of the system that can be accessed by the various design disciplines, such as electrical, electronic, mechanical, software, etc. The digital twin eliminates paper-based documentation and provides a single source of ‘truth’ about the system and its development status. It breaks down engineering ‘silos’ and gives all stakeholders access to a complete digital model of the system, including all simulation, testing, and performance data.

A digital twin is dynamic through time, even after the system enters production, and the corresponding digital thread captures the history of all changes to the digital twin. The combined digital twin and digital thread comprise a complete system history.

_

Cons of a Digital Thread:

The shortfall with a Digital Thread is that it is still difficult to attribute your production outcomes to the correct cause of variation in the process. For instance, let’s say one of your product batches fails quality inspection. With a Digital Thread, it can be difficult to know where the quality misstep took place. As a result, it’s nearly impossible to trace the data back and understand what happened during production that may have had an impact on the batch’s quality.

Even with asset data from the time that a batch was in production, teams don’t have a clear understanding of where the batch was in the production process at a given time. This makes it difficult to understand where they should look within the production process to identify a possible machine issue or improvement. In other words, a Digital Thread can make it harder to troubleshoot production issues because you can’t connect key input variables to the outcome (key output variables). 

Additionally, if the quality issue was caused by something other than a machine—say, the batch sat too long in a storage tank before moving to the next phase of production—the data may not be captured at all. This makes root cause analysis difficult for teams to perform, hindering your capacity to continuously improve your processes. 

Lastly, it may be difficult to modify or change a Digital Thread if you have to change out a piece of equipment or add a new piece of equipment. You may have to go back to your third-party vendor and have them make changes to the Digital Twin model to account for any asset changes on your line. This can be a costly adjustment, especially as future asset changes occur on the shop floor.

____

____

Digital twin technology: 

Figure below shows the core technologies of the digital twin and its application service:

Core digital twin technologies can be classified mainly into: 

  • Visualization and operation technology
  • Analysis technology
  • Multi-dimensional modelling and simulation technology
  • Connection technology
  • Data and security technology
  • Synchronization technology

_

The following are the general purposes of the utilization of digital twin technology:

  • Process optimization: What-if simulations based on digital twin behavior models can help find improved operation processes according to any change of associated personnel, equipment, production procedure, components, etc.
  • Problem prevention: To prevent real-world problems in advance, past and present information collected from the real world is analyzed in the virtual world, and risk factors are identified.
  • Efficient product design: Product design simulations by digital twin, using real-world data learned from how existing equipment, processes, and products perform over time, can support experiments with design iterations, more informed design and engineering decisions, and overall product roadmap enhancement.
  • Cause analysis: The behavior models of a digital twin can reproduce the events happening to its physical entity. Reproductive simulation results based on past and log data can help analyze why these events occurred.
  • Multi-disciplinary decision making: The federated interworking of digital twins can make it easier to identify the causes of a co-related and composite problem, analyze co-relations and mutual side effects occurring between industrial domains, and collaborate among stakeholders throughout the industrial ecosystem.

______

The successful adoption of the digital twin requires the support of key enabling IoT technologies, including reliable sensors, high-speed networks, low-cost data storage, and Big Data analytics. In addition, key technologies related to Industry 4.0, including PLM, CAD, and VR/AR, are required for successful integration of the digital twin.

As a majority of the above-mentioned technologies have reached an optimum level of maturity, the adoption of digital twins across the industrial manufacturing domain is expected to rise significantly in the next 3–5 years. These technologies combine in several different ways to support digital twins across the industrial manufacturing ecosystem. In addition, the following technological advancements are increasing the adoption of the digital twin:

-Collection of data is getting cheaper: IoT sensors are getting cheaper, thereby reducing the data collection cost significantly.

-Advancements in data analytics: Progress in Big Data and machine learning is making analysis and forecasting of data easier and more reliable.

-HMI is becoming more user-friendly: Evolution of chatbots, virtual assistants, speech recognition, and virtual & augmented reality is making it easier for factory employees to work with digital twins.

_____

_____

To construct DT, a variety of enabling technologies should be implemented, as demonstrated in figure below.

The physical object ought to be capable of perceiving the outside world, so sensing technology is needed to gain a full understanding of the environment. Once data are acquired, the virtual model should be adjusted to keep track of the changes in the physical entity. To replicate the physical world as realistically as possible, the virtual model should contain an intact feature set consisting of geometrical, physical, behavioral, and rule information (Tao and Zhang, 2017; Qi et al., 2021; Tao et al., 2018d). Since massive multisource heterogeneous data are generated during the operation of a physical object, big data analytics technologies are required to collect, transmit, store, and process the data. DT services provide the concrete functions that depend on the usage of the physical object. To deliver the original and processed data, data transmission technologies, such as different communication protocols and IoT technologies, should be employed. Data-driven technology is also essential for controlling the physical object so it responds to commands from the upper level. The environment is an important component of DT: it provides the information necessary to keep the physical entity and virtual model consistent, to collect and integrate information on all elements, and to precisely predict changes in the environment. Thus, environment coupling technology is required to account for the effect of environmental factors.

-1. Technologies for physical objects:

Physical objects are key components of the DT because they are the sources of massive multisource heterogeneous data from the real world. To perceive real-world data, such as geometric shape, physical properties, and mechanical precision, sensing and measurement technologies are implemented for DT, including IoT sensing technologies, reverse engineering, image recognition measurement, particle sensing technology, etc. (Qi et al., 2021). In aerospace engineering, Hochhalter et al. (2014) embedded sensory material, which produces a phase transition when sufficient strain occurs, into a structural alloy to improve the reliability of crack detection.

Because physical objects are used to perform designated tasks, the control technologies of the physical objects need to be considered when implementing the DT, including power selection (e.g. electrical power and hydraulic power), mechanical transmission design (e.g. gear drive, belt drive, and connecting rod drive), and control technologies (e.g. programmable control, supervisory control, simulation-based control, etc.) (Qi et al., 2021; Zhuang et al., 2018). For control technologies, Wang et al. (2017) utilized IEC 61499-based function blocks to compile the algorithm into robot control codes. Atorf et al. (2017) controlled the robot arm of an assembly line in a simulation-based manner, where users can implement complex abort conditions and objective functions to control the automated simulations.

-2. Data construction and management technologies:

Both the physical entities and the virtual models are driven by data, which are the media by which DT understands, responds and interacts with the real world. The whole lifecycle of DT data consists of data generation, data storage, data transmission and data processing.

A high-fidelity DT model contains intricate information, such as geometry information, physical information, condition information, etc., so a high volume of data needs to be stored intact. Some IoT technologies, such as bar codes, quick response (QR) codes, and radio frequency identification (RFID), can be utilized to store a certain amount of information. With the development of big data storage frameworks, such as the MySQL database, HBase, NoSQL databases, etc., a large amount of data can be correctly arranged and utilized. In MySQL, data are stored in the form of tables, where each row is a record and each column holds concrete data values; many rows and columns make up a table, and several tables compose a database (Ongo and Kusuma, 2018). HBase employs the Hadoop Distributed File System (HDFS) as its file storage system, Hadoop MapReduce provides HBase with high-performance computing power, and Zookeeper provides stable service and a failover mechanism for HBase (Bhupathiraju and Ravuri, 2014). The defining feature of NoSQL is that it removes the relational constraints of relational databases; NoSQL databases deliver very high read-write performance, especially at large data volumes.
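As a small illustration of the table-based storage described above, here is a Python sketch using the standard library's sqlite3 module as a stand-in for MySQL; the table layout and readings are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE telemetry (
        asset_id TEXT,
        ts       REAL,
        sensor   TEXT,
        value    REAL
    )
""")
conn.executemany(
    "INSERT INTO telemetry VALUES (?, ?, ?, ?)",
    [("robot_arm_1", 0.0, "joint_temp_c", 41.2),
     ("robot_arm_1", 1.0, "joint_temp_c", 41.9)],
)

# Fetch the latest reading of one sensor for the twin to consume.
row = conn.execute(
    "SELECT value FROM telemetry WHERE asset_id = ? AND sensor = ? "
    "ORDER BY ts DESC LIMIT 1",
    ("robot_arm_1", "joint_temp_c"),
).fetchone()
print(row[0])   # 41.9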

The basic purpose of data processing is to extract and derive data that are valuable and meaningful to particular people from large, potentially cluttered, incomprehensible amounts of data. Raw data are useless unless they undergo data cleaning, compression, smoothing, transformation, reduction, etc. Big data analytics may be divided into the following aspects: analytic visualizations, data mining algorithms, and predictive analytic capabilities. Data visualization aims to communicate clearly and effectively with the aid of graphical means; data can be visualized in different forms of tables and graphics (histograms, bar charts, pie charts, etc.). Data mining generally refers to the process of algorithmically searching large amounts of data for the information hidden within them. Commonly employed data mining algorithms include K-means (Kapil et al., 2016), support vector machines, the apriori algorithm, the expectation–maximization algorithm, the nearest neighbor approach (Dröder et al., 2018), the naive Bayesian model, and classification and regression trees (CART), among others. Predictive analytics are advanced analytic techniques that leverage historical data to uncover real-time insights and predict future events. These techniques combine a variety of advanced analytics capabilities, including ad hoc statistical analysis, predictive modeling, data mining, text analysis, optimization, real-time scoring, and machine learning.
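For a flavor of the data-mining step, below is a minimal pure-Python sketch of K-means on a one-dimensional vibration feature; the values and the two-cluster choice are illustrative.

def kmeans_1d(values, k=2, iters=20):
    # Spread the initial centers across the sorted values.
    centers = sorted(values)[::max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

vibration = [1.1, 0.9, 1.0, 5.8, 6.1, 1.2, 6.0]
centers, clusters = kmeans_1d(vibration)
print(centers)   # roughly [1.05, 5.97]: a "normal" and an "elevated" group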

To address multisource heterogeneous data, data fusion is necessary for the collection, transmission, synthesis, filtering, and correlation of useful information from various information sources. There are three levels of data fusion: signal-level fusion, feature-level fusion, and decision-level fusion (Liu et al., 2018). Data fusion methods include Kalman filtering (Li et al., 2017), image regression, principal component transform (PCT), the K-T transform, the wavelet transform, etc. Despite the availability of these data fusion methods, few articles discuss the implementation of these concrete algorithms or technologies in DT.
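Of the fusion methods listed, Kalman filtering is the most widely cited; here is a minimal one-dimensional Python sketch that fuses a noisy scalar stream into a smoothed state estimate, with illustrative noise parameters.

def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    # q: process noise, r: measurement noise, x0/p0: initial state.
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                # predict: uncertainty grows over time
        k = p / (p + r)          # Kalman gain weighs model vs sensor
        x = x + k * (z - x)      # update the estimate with measurement z
        p = (1 - k) * p
        estimates.append(x)
    return estimates

noisy = [20.3, 19.6, 20.9, 20.1, 19.8, 20.4]
print([round(v, 2) for v in kalman_1d(noisy, x0=20.0)])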

-3. Virtual modelling technologies:

As explained in Tao and Zhang (2017) and Tao et al. (2018d), a complete virtual model contains the geometry, physical, behavioral, and rule models. Geometry information includes shape, size, position, and assembly relationships. Diverse, mature computer-aided design (CAD) packages can visualize the geometrical information of a physical object, such as UG, AutoCAD, SolidWorks, and Creo.

Physical information contains the tolerances (dimensional tolerance, geometrical tolerance, surface roughness, etc.), material properties (density, Young’s modulus, Poisson’s ratio, etc.) and other information. Verner et al. (2018) employed Creo to build a robot DT model, in which the geometry information is recorded and its balance characteristics were calibrated using “center of gravity analysis” and “sensitivity analysis” features of Creo.

A behavioral model describes how the virtual model responds to external stimuli, for example, changes in the outside world or interaction with other objects. Numerous physics-based theories/models have been established to reveal the mapping relationship between input and behavior, such as the computational fluid dynamics (CFD) model, the finite element model (FEM) (Tuegel et al., 2011b), the robot dynamics model, etc. In shop-floor design, Tao and Zhang (2017) suggest that behavioral models describe the mechanical response of production equipment under the circumstances of a given numerical control program and disturbances, such as human interference, and propose that descriptions of behavior can be implemented specifically using finite element models and neural networks.

The rule model involves associations and constraints, which can be applied to analyze, optimize and predict the object performance. To extract rule information, several technologies and algorithms can be utilized, such as the data mining algorithm (K-means (Tao and Zhang, 2017) and neural network (Tao et al., 2018d)), semantic data analytics (Abramovici et al., 2016) and XML-based specific data format (AutomationML (Schroeder et al., 2016) and CityGML (Ruohomäki et al., 2018)). Tao et al. (2018d) proposed that in the wind turbine rule model, the limitation of wind speed can be calculated by force analysis and correlations between parameters can be explored via neural networks.

It is important to apply verification, validation and accreditation (VV&A) technology to evaluate the accuracy of the virtual model (Tao et al., 2018c). Tao and Zhang (2017) applied VV&A technology to verify the accuracy in the model to the corresponding programming code, simulation confidence, sensitivity and simulation accuracy, etc.

-4. Services technologies:

DT services technologies aim to fulfill different objectives in different applications. For example, in aerospace engineering, DT services refer to aircraft structural life prediction (Tuegel et al., 2011b) while in healthcare, DT services focus on monitoring, diagnosing and predicting the health conditions of the elderly (Liu et al., 2019b). The diversity of services leads to the demand for expertise. To generate a service, corresponding data, knowledge and algorithms must be encapsulated where service description and encapsulation technology is needed (Tao and Zhang, 2017; Qi et al., 2018).

A service description refers to an accurate statement of a specific demand, e.g. production planning in manufacturing engineering (Tao and Zhang, 2017) or structural monitoring and fault prediction (Tuegel, 2012). Since multiple services are often embedded in a complex system, decomposing services into corresponding subservices and making smart service selections are important. The real-time visualization of DT services is a key objective of DT, which requires computer graphics processing technologies, such as computer graphics, 3D rendering, and image processing.

-5. Connection and data transmission technologies:  

To realize real-time control and virtual-real state mapping, high-fidelity connection methods are necessary for DT. There are abundant connection protocols for data exchange between the physical space and the DT, and inside the cyber space among different software. Existing data transmission methods consist of wired and wireless transmission. Wired transmission includes twisted pair (categories 5 and 6), coaxial cable (coarse and fine), and optical fiber (single-mode and multimode), while wireless transmission includes ZigBee, Bluetooth, Wi-Fi, ultra-wide band (UWB), and near-field communication (NFC) (Cheng et al., 2018). For long-distance wireless transmission, GPRS/CDMA, digital radio, spread-spectrum microwave communication, wireless bridges, and satellite communication are available. An extensive range of application program interfaces (APIs) is commonly employed for data exchange between different software to realize transmission at the software level. More recently, 5G technology can be applied to satisfy the demand for high data rates, high reliability, high coverage, and low latency (Cheng et al., 2018).
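As a small software-level example, the sketch below publishes one telemetry message over MQTT using the paho-mqtt package (1.x-style client API; version 2.x additionally requires a callback API version argument). The broker host, topic, and payload are placeholders.

import json
import paho.mqtt.client as mqtt

client = mqtt.Client()                         # paho-mqtt 1.x style
client.connect("broker.example.com", 1883)     # placeholder broker

reading = {"asset_id": "pump_7", "pressure_bar": 4.2}
client.publish("plant/line1/pump_7/telemetry", json.dumps(reading))
client.disconnect()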

-6. Environment coupling technologies:

Similar to the DT virtual model, the virtual environment model also contains geometric, physical, and behavioral information. Geometric information describes the environment in terms of its geometry and appearance and presents it in a data format that can be processed by a computer. Physical information contains the mechanical parameters of the environment, which are essential when performing physics-based simulation, e.g. finite element analysis and hydrodynamic analysis. For example, in marine engineering, the mechanical parameters of the ocean, such as density, viscosity, etc., determine the influence of the marine environment on the operation and performance of marine equipment, such as ships and oil wells. Environmental mapping techniques, sensor acquisition techniques, and full-element digital definition techniques can be used to describe geometric and physical information about the environment. Environmental mapping technologies include remote sensing, the seismic wave method, polarization, radio navigation systems, hydroacoustic positioning systems, 3D modelling technology, etc. Behavioral information reflects the change of the environment in response to the operation of the DT model. For example, during tunneling, the cutting head of the tunnel boring machine (TBM) churns up the rocky soil, which has a significant impact on the subsurface environment. Finite element analysis and computational fluid dynamics can be used to simulate the influence exerted by virtual models on the environment, while neural networks and surrogate models can also be employed to predict the future state of the environment. The visualization of the environment is as important as the visualization of the virtual model. Due to the large volume of the environment, a multi-channel immersive stereoscopic display through wearable devices (e.g. head-mounted displays, tactile gloves) combined with virtual reality (VR) and mixed reality (MR) is well suited to environment visualization.

_____

_____

Key technologies for digital twin arise from three perspectives: data related technologies, high-fidelity modelling technologies, and model-based simulation technologies.

Figure below presents the technology architecture for digital twin.

-1. Data related technologies:

Data is the basis of the digital twin. Sensors, gauges, RFID tags and readers, cameras, scanners, etc. should be chosen and integrated to collect total-element data for the digital twin. Data should then be transmitted in a real-time or near-real-time manner. However, the data a digital twin needs are usually of big volume, high velocity, and great variety, which makes them difficult and costly to transmit to a digital twin in a cloud server. Edge computing is therefore an ideal method to pre-process the collected data, reducing the network burden and the chances of data leakage, while real-time data transmission is made possible by 5G technology. Data mapping and data fusion are also needed to understand the collected data. The most common data mapping technology used in the literature is XML.
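
A minimal sketch of the edge pre-processing idea: instead of streaming every raw sample to the cloud-hosted twin, an edge node collapses each window of samples into a compact summary and escalates full detail only on an alarm. The window size, fields, and threshold are illustrative assumptions:

    from statistics import mean

    def edge_preprocess(raw_window, alarm_threshold=80.0):
        # Collapse one window of raw sensor samples into a compact summary.
        # Only when an alarm condition appears is the full window escalated.
        summary = {
            "n": len(raw_window),
            "mean": mean(raw_window),
            "min": min(raw_window),
            "max": max(raw_window),
        }
        if summary["max"] > alarm_threshold:
            summary["raw"] = raw_window   # send full detail on anomaly only
        return summary

    # 100 raw samples collapse into a 4-field summary under normal conditions.
    window = [72.0 + 0.01 * i for i in range(100)]
    print(edge_preprocess(window))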

He et al. surveyed practical industrial IoT and signal processing algorithms in digital twins for real-time and multisource data collection. Ala-Laurinaho examined which application-layer protocols and communication technologies are most suitable for sensor data transmission from a physical twin to a digital twin. The developed platform allowed easy addition of sensors to a physical twin and provided an interface for configuring them remotely over the Internet. The examined application-layer protocols were HTTP, MQTT, CoAP, XMPP, AMQP, DDS, and OPC UA. The examined communication technologies were 4G, 5G, NB-IoT, LoRaWAN, Sigfox, Bluetooth, 802.11ah, 802.11n, ZigBee, Z-Wave, and WirelessHART. Angrish et al. used document-oriented unstructured schemas (MongoDB) to store streaming data from various manufacturing machines. The data-related technologies used in the literature, covering data collection, data mapping, data processing, and data transmission, vary with the application. In addition, static data are of different types and formats in different fields, so standard data interfaces are required to transform these initial data into data catering to the digital twin.

-2. High-fidelity modelling technologies:

The model is the core of the digital twin. Digital twin models comprise semantic data models and physical models. Semantic data models are trained on known inputs and outputs using artificial intelligence methods. Physical models require a comprehensive understanding of the physical properties and their mutual interaction, so multi-physics modelling is essential for high-fidelity modelling of a digital twin. In the literature, the most common multi-physics modelling language is Modelica.

One key issue a digital twin model must address is the contradiction between a simplified virtual model and the complex behavior of the physical object. A compromise approach is to implement flexible modelling in a modular way. Negri et al. proposed adding black-box modules to the main simulation model; different behavior models of the digital twin were activated only when needed, and the modules interacted with the main simulation model through standard interfaces. To balance computational effort and accuracy, before creating the digital twin model of a complex system, engineers should identify which components are crucial for the system's functionality and define the modeling level of each component. A high-fidelity digital twin model can then be built at the appropriate modeling level for each component.
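
The sketch below illustrates the modular idea in the spirit of Negri et al.'s approach (a generic reconstruction, not their actual code): behavior modules implement one standard interface and are attached to the main simulation only when their fidelity is needed:

    from abc import ABC, abstractmethod

    class BehaviorModule(ABC):
        # Standard interface every plug-in behavior model must implement.
        @abstractmethod
        def step(self, state: dict) -> dict: ...

    class ThermalModel(BehaviorModule):
        def step(self, state):
            # Toy first-order cooling law; a real module might wrap an FEA solver.
            state["temp"] += 0.1 * (state["ambient"] - state["temp"])
            return state

    class WearModel(BehaviorModule):
        def step(self, state):
            # Toy cumulative wear proportional to load.
            state["wear"] = state.get("wear", 0.0) + 1e-4 * state["load"]
            return state

    class TwinSimulation:
        def __init__(self):
            self.modules = []
        def activate(self, module):
            self.modules.append(module)      # attach extra fidelity only when needed
        def step(self, state):
            for m in self.modules:
                state = m.step(state)
            return state

    sim = TwinSimulation()
    sim.activate(ThermalModel())             # start with coarse physics
    sim.activate(WearModel())                # activate a black-box module on demand
    state = {"temp": 90.0, "ambient": 25.0, "load": 50.0}
    for _ in range(3):
        state = sim.step(state)
    print(state)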

For complex manufacturing phenomena, Ghosh et al. addressed the construction of digital twins using hidden Markov models; the models encapsulated the dynamics underlying the phenomenon using a set of discrete states and their transition probabilities. Sun et al. fused a theoretical model and a physical model to commission the assembly of high-precision products: the theoretical model was based on Model Based Definition (MBD) techniques and the physical model was built by point-cloud scanning. For complex manufacturing systems, Lu and Xu studied resource virtualization as a key technology for creating the digital twin of a smart factory; the semantic web language OWL and the Jena framework were recommended for modeling. Digital twin modeling can usually start with physics-based modeling; black-box modeling using data, or grey-box modeling using a combination of physics and data, are also feasible.

-3. Model based simulation technologies:

Simulation is an important aspect of the digital twin. Digital twin simulation enables the virtual model to interact with the physical entity bi-directionally in real time. To realize bidirectional interaction, Schroeder et al. presented a high-level model based on AutomationML for easy data exchange between heterogeneous systems; the attributes necessary to exchange data were placed in an IoT middleware, where other systems could access them. Talkhestani et al. defined the Anchor-Point as the data of a mechatronic component from interdisciplinary domains; based on a PLM IT platform and the Anchor-Point method, variances in the mechatronic data structure between the digital models and the physical system can be systematically detected. Unlike traditional simulation, digital twin simulation uses real-time data of the physical system, collected and recorded from physical space via IoT. Qi et al. recommended using image recognition and laser measurement technology to measure the parameters of the physical world, and using electrical control, programmable control, embedded control, and network control technology to control the physical world. An in-depth literature review shows that most existing works focus on unidirectional data flow, from physical to digital. The data flow from digital to physical after executing a digital twin simulation requires deeper research.

Multi-physics, multi-scale simulation is one of the most important visions of the digital twin. Quantified uncertainties and interfaces between different simulations therefore need to be researched. Simulation models over different levels of detail, over all involved disciplines, and over lifecycle phases must be integrated: the digital twin should provide an interface to different models and data at different granularities and keep them consistent. To solve multi-scale, multi-physics problems, one strategy is to partition the solution domain along disciplines, with each domain exchanging data through an interface. Existing work on digital twin multi-physics simulation usually treats the integration of different models as a simple input-output coupling, showing better results than traditional methods. However, in some application scenarios the mutual influence between different disciplines cannot be neglected, and the problem of differing temporal and spatial scales must be addressed appropriately.
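
A toy sketch of the partitioning strategy described above: two single-discipline solvers (here, stand-in thermal and structural updates with invented coefficients) each solve their own domain and exchange interface quantities every step:

    def thermal_solver(temp, deformation, dt=0.1):
        # Toy thermal update; deformation feeds back into heat generation.
        return temp + dt * (5.0 + 0.5 * deformation - 0.1 * temp)

    def structural_solver(deformation, temp, dt=0.1):
        # Toy structural update; thermal expansion drives deformation.
        return deformation + dt * (0.02 * temp - 0.3 * deformation)

    # Partitioned co-simulation: each discipline solves its own domain,
    # then the interface quantities are exchanged before the next step.
    temp, deformation = 20.0, 0.0
    for step in range(200):
        temp_new = thermal_solver(temp, deformation)
        deformation_new = structural_solver(deformation, temp)
        temp, deformation = temp_new, deformation_new    # interface exchange

    print(f"after 200 coupled steps: T={temp:.1f}, u={deformation:.3f}")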

_____

_____

Combining Blockchain & Digital Twins:

Blockchain was named one of Gartner’s Top 10 Strategic Technology Trends for 2019. When it comes to digital twins, the primary use case for distributed ledger and blockchain technology is storing information from the digital twin, and its interactions, on the blockchain. The second use case is to let digital twins engage in smart contracts with other digital twins or systems.

Storing Digital Twin Information:

Here are the key benefits of storing digital twin data in a distributed ledger:

  • Information is not managed by a single central organization.
  • Records are immutable and require validation, which reduces the risk of tampering.
  • Information can be transferred seamlessly while maintaining the integrity of the data.

When assets change ownership, the corresponding records about that asset all need to move to the new owner. One example is when a mining organization sells one of its processing plants or mines. Using digital twins and distributed ledger technology can help simplify this transfer process.

Another example is keeping statutory records where there are multiple stakeholders involved. In cases like this, government agencies, certification bodies, and the owners/operators of the plant equipment all need to have confidence that no one tampered with the records.
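
To show the tamper-evidence property in miniature, the sketch below hash-chains digital twin state records so that altering any historical record invalidates the chain. This is only a toy stand-in for a real distributed ledger, which would add validation across many nodes; the asset name and fields are invented:

    import hashlib
    import json
    import time

    def record_hash(body):
        return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

    class TwinLedger:
        # Toy hash-chained log of digital twin state records: altering any
        # historical record breaks the chain, making tampering evident.
        def __init__(self):
            self.chain = []

        def append(self, state):
            prev = self.chain[-1]["hash"] if self.chain else "genesis"
            record = {"ts": time.time(), "state": state, "prev": prev}
            record["hash"] = record_hash({k: record[k] for k in ("ts", "state", "prev")})
            self.chain.append(record)

        def verify(self):
            for i, rec in enumerate(self.chain):
                body = {k: rec[k] for k in ("ts", "state", "prev")}
                if rec["hash"] != record_hash(body):
                    return False
                if i and rec["prev"] != self.chain[i - 1]["hash"]:
                    return False
            return True

    ledger = TwinLedger()
    ledger.append({"asset": "mill-7", "vibration": 2.1})   # hypothetical asset records
    ledger.append({"asset": "mill-7", "vibration": 2.4})
    print(ledger.verify())                                 # True
    ledger.chain[0]["state"]["vibration"] = 9.9            # tamper with history
    print(ledger.verify())                                 # False: tampering detected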

______

______

Technology of the Mobility Digital Twin (MDT):

The Digital Twin concept was first born in the aerospace domain, when the National Aeronautics and Space Administration (NASA) adopted it as a key element in its 2010 technology roadmap. Along with its rapid development in different domains over the past decade, including aeronautics and space, robotics, manufacturing, and informatics, the Digital Twin also has huge potential in the transportation domain.

The emergence of connected vehicle technology introduces another platform to implement the Digital Twin. Since the level of connectivity within our vehicles has greatly improved, these equipped vehicles are able to “talk” with other entities, such as with other connected vehicles through vehicle-to-vehicle (V2V) communications, with traffic infrastructures through vehicle-to-infrastructure (V2I) communications, and with cloud servers through vehicle-to-cloud (V2C) communications. Specifically, V2C communications allow connected vehicles to 1) upload their data to the cloud server, enabling Digital Twins to be built in the digital (cyber) world based on their counterparts in the physical world; and 2) offload their onboard computations to the cloud server, enabling Digital Twins to build models and calculate guidance information through powerful cloud computing, which can then be fed back to connected vehicles.

_

The Mobility Digital Twin (MDT) framework is defined as an Artificial Intelligence (AI) based, data-driven, cloud-edge-device framework for mobility services. The MDT framework is built on top of three different planes: 1) the physical space that has human beings, vehicles, and traffic infrastructures; 2) the digital space that has the digital replicas of the aforementioned physical entities; and 3) the communication plane between these two spaces. Given the connected nature of this framework, it transforms connected vehicles into an Internet of Vehicles (IoV) by leveraging IoT technologies. The cloud-edge architecture is built with Amazon Web Services (AWS) to accommodate the proposed MDT framework and implement its digital functionalities of storage, modeling, learning, simulation, and prediction. The effectiveness of the MDT framework is shown through a case study of three digital building blocks and their key microservices: the Human Digital Twin with user management and driver-type classification, the Vehicle Digital Twin with cloud-based Advanced Driver-Assistance Systems (ADAS), and the Traffic Digital Twin with traffic flow monitoring and variable speed limits.

_

The MDT framework, shown in the figure below, consists of three planes: 1) the lower plane, highlighted in yellow, stands for the physical space where human beings, vehicles, and traffic infrastructures reside; 2) the upper plane, highlighted in blue, represents the digital space where the digital replicas of those physical entities are located; 3) between these two planes, the communication plane (in grey) plays a crucial role in this framework, allowing real-time and non-real-time data streaming both upstream and downstream.

Three entities are considered in this MDT framework: Human, Vehicle, and Traffic. Given the existence of the communication plane, each entity can be connected to the digital space (e.g., the Internet) and exchange data with the others. This MDT framework is therefore a good representation of the IoT, and it allows connected vehicles to act as an IoV built on IoT technologies.

Figure above shows an illustration of the Mobility Digital Twin (MDT) framework for connected vehicles, which consists of a physical space, a digital space, and a communication plane between the two spaces.

______

______

Section-7

Digital twin and IoT:  

First of all, it is important to clearly set out the distinctions that describe each concept in its entirety.

In essence, IoT describes physical objects embedded with sensors, processing ability, software, and other technologies that connect and exchange data with other systems and devices over the Internet or other communications networks. The Internet of Things (IoT) encompasses hundreds of millions of devices that can be connected and controlled via the Internet or other communications networks. For instance, house lights that can be turned on through your phone via a Bluetooth connection count as IoT. The concept essentially covers every device that is either connected, or could be connected, to the Internet or another communications network and controlled that way. Manufacturing firms have embraced IoT, using sensors to improve business functions, creating the term Industrial Internet of Things (IIoT).

On the other hand, a digital twin is a virtual replica of an object, system, process, product or service that is digitalized and put on a simulation platform. Digital twinning is the process of creating and building a reality-based entity on an online platform. Unlike the Internet of Things, which must meet the criteria of being connectable and controllable over a network, a virtual twin can be anything you can imagine. Moreover, the concept doesn’t necessarily have to be an object, because “twinning” can involve replicating a system, process or anything else that is intangible but crucial for business. As can be deduced from its etymology, a Digital Twin (DT) is defined as ‘an integrated multi-physics, multiscale, probabilistic simulation of a complex product, which functions to mirror the life of its corresponding twin’. The major objective of this technology is to serve as a panoptic reflection of a physical body in the digital world. While it is sometimes mistaken for IoT or Computer-Aided Design (CAD), it is fundamentally different: IoT is characterized solely by the physical implementation, whereas CAD focuses exclusively on a stand-alone representation in the digital domain. Only digital twin technology uniquely focuses on the bilateral interdependency between the virtual and physical representations. This offers various inherent benefits, as the physical product can adapt and modify its real-time behaviour in response to the feedback generated by the digital twin. Conversely, the bridging allows the simulation to precisely mirror the real-world condition of the physical body. For the realization of a true cyber-physical system, the digital twin must have a dynamic, interconnected relationship with the physical model, which can be achieved through real-time sensory data. In this process, the environment of the machine must also be dynamically replicated in the digital world, including all relevant parameters that affect its function.

_

IoT is the Backbone for Digital Twin:

IoT is a key strategic consideration for realizing the full potential of digital twins of physical products, operational processes, or people’s tasks. The physical-world experiences of these three ‘P’s’ (products, processes, and people) are captured through sensors, and IoT is a fundamental requirement of a true digital twin. ‘Twin’ implies that what happens to one happens to the other, in both directions, which makes IoT the bi-directional link that enacts this and empowers the transformative use cases that come with it. IoT and digital twin technology are interconnected: you need to deploy IoT devices to gather live data about the real object and transfer the data to the digital replica on a computer server. Since IoT devices can communicate among themselves and with a central server, it becomes easy to collect holistic data about the physical object you monitor. IoT devices constantly feed data so the digital twin model can analyze it and present performance insights. Clearly, the rise of IoT sensors is partly responsible for making digital twins possible. As IoT devices are refined, digital twin scenarios can involve smaller and less complicated objects.
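
A minimal sketch of this bi-directional link: a twin object mirrors one physical pump from incoming IoT telemetry (upstream) and, when the mirrored state crosses a threshold, issues a command back to the device (downstream). The message fields, device ID, and threshold are illustrative assumptions:

    class PumpTwin:
        # Minimal digital twin mirroring one physical pump from IoT telemetry.
        def __init__(self, device_id, max_temp=85.0):
            self.device_id = device_id
            self.max_temp = max_temp
            self.state = {}          # last known mirror of the physical pump

        def on_telemetry(self, message):
            # Upstream link: an incoming IoT message updates the mirrored state.
            self.state.update(message)
            return self.decide()

        def decide(self):
            # Downstream link: the twin analyses its state and may command the device.
            if self.state.get("temperature_c", 0) > self.max_temp:
                return {"device": self.device_id, "command": "throttle", "target_rpm": 900}
            return None

    twin = PumpTwin("pump-42")
    print(twin.on_telemetry({"temperature_c": 70.0, "rpm": 1450}))   # None: all normal
    print(twin.on_telemetry({"temperature_c": 91.5, "rpm": 1450}))   # throttle command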

Similarly, IoT manufacturers can also create a digital replica of one device, or a whole home or office, and test the efficacy of various IoT devices they plan to develop. For example, an IoT manufacturer can create a simulation of a smart home using CAD and 3D scanning of the physical home. Then, on its digital twin, it can place various IoT devices in different positions and find out if they can connect with the home Wi-Fi or need improvements.

_

Digital twin capabilities are not new. Since the early 2000s, disruptive companies have explored ways to use digital replicas to enhance their products and processes. NASA was the first to use digital models: it could not operate, monitor, and repair its systems physically, as they were in space, so it created digital representations to simulate and analyze those systems back on Earth. Thanks to the Internet of Things, however, digital twins have become more affordable to create and deploy. A digital twin in IoT can provide a holistic view of all the capabilities an object has, which helps orchestrate different aspects of an IoT device. By providing a unified model and API, digital twins pair with and work alongside physical assets smoothly. With additional software and data analytics, digital twins can optimize an IoT deployment for maximum efficiency, help engineers figure out how things will operate before they are physically deployed, and thus future-proof enterprise technology solutions.

______

______

3 Ways IIoT is enhancing Digital Twins:  

-1.  Digital Twin of Products: IoT offers visibility into the Full Product Lifecycle:

Smart connected products are replacing assumptions with facts; real-world IIoT data closes the feedback loop with product usage data, which then informs future iterations – and even business model changes, including product-as-service. Product telemetry also gives engineers and product designers behavioral characteristics of deployed products or fleets of products.

Providing a frame of reference to compare the ‘as-is’ versus ‘as-used’ product usage is an extremely powerful IIoT-enabled insight that can inform the development of future product iterations. Its applicability can range from replacing or modifying certain features to drilled-down insights into the specific performance of part(s).

Expanding visibility into the product lens through cross-functional collaboration can also drive downstream efficiencies. This includes change management in manufacturing and service processes, which lowers scrap, rework, and lead times.

Real-World Example:

Whirlpool is achieving data-driven design by connecting deployed appliances through IIoT and analyzing operating performance metrics (torque, drum speed, motor temperature, etc.) across fleets of products to improve future iterations.

-2. Digital Twin of Processes: IoT unlocks deeper Operational Intelligence:

Many operational processes are plagued by two factors: disparate and black-boxed information sources. IIoT unlocks these unknown insights and threads them together with other sources, both in real time and in historical systems of record. Twins of these connected assets and workers, and of how they interact, are critical to constructing a process lens: essentially a system-wide view of an industrial environment.

IIoT through a process lens can drive critical manufacturing KPIs. For example, improving the uptime of a single asset on a factory floor through IIoT-driven predictive insights can drastically improve throughput while a twin of a production line can reduce bottlenecks through enhanced operational visibility.

This connected operational intelligence from diverse assets creates the real-time 360-degree visibility manufacturers need to be flexible and agile – a necessity in today’s changing markets and shifting customer demands.

Real-World Example:

Woodward is gaining operational visibility by integrating its technologies and workers through the IIoT across its factories. An IIoT platform contextualizes myriad information sources in its production facilities, including connected devices (torque wrenches, pressers, etc.), manufacturing execution systems (MES), and product-centric software (CAD, PLM), to give an end-to-end operational view.

-3. Digital Twin of Service: IoT optimizes Maintenance:

Much of a product’s operational condition and performance in the end user’s environment has traditionally not been accessible to the manufacturer or customer. With maintenance and service being critical functions for reducing asset downtime and differentiating offerings, digital twins with IoT can drastically improve these metrics and enable new revenue streams.

Digital twins can bolster remote service IIoT use cases where software updates, patches or reboots for deployed assets can negate the need to send a technician on-site. IIoT’s flexibility can enable mission-critical systems to sample data every second to inform services, or less frequently to optimize resources, all depending on the digital twin use case.

Telemetry data can also feed into the deployed asset’s digital twin to gain a baseline of its health and apply next-generation predictive maintenance modules blending machine learning and physics-based simulation techniques. Simulating historical patterns of machine performance with design expectations against real-time sensor data will reduce unplanned downtime and add another layer of intelligence, which further maximizes asset utilization.

Real-World Example:  

Howden is helping customers succeed through its ‘Data Driven Advantage’ program. This initiative has embedded IIoT-driven insights into customers’ service and maintenance workflows to save millions in unplanned downtime and reduce business risk.

_____

_____

Digital twins help IoT systems in the following ways:

-1. Status of the device

In IoT, since the Internet interconnects all devices and machines, a digital twin helps determine how they are operating in real time. This allows one to access information about their status much more easily and quickly, a boon in sectors like patient healthcare.

-2. Documentation and communication

Every machine has its own set of behaviors and processes that are unique to it. Creating a digital twin model helps understand these behaviors better and document that information appropriately. No more relying on physical records!

-3. Predictive modeling

Digital twin technology helps analyze the future state of a machine or device. This allows providers of IoT development services to create a functional IoT model that can predict whether the IoT machine in question will be in a better condition within a specific timeframe, and whether it will adapt to changing processes later.

-4. Measurement of different outcomes

The digital twin model of a machine or device can be used to measure different possible outcomes for a process by changing the input variables. This way, digital twin models help one estimate data without losing time and money in IoT product development systems.

-5. Risk reduction

IoT gives access to a large population of devices simultaneously. A minor security loophole can provide hackers with room to gain unauthorized access to the IoT network. This risk is magnified when the actual physical devices are deployed in production. Digital twins eliminate that risk and allow developers to safely experiment with multiple scenarios before arriving at one, which is operationally feasible and secure.

-6. Integration of systems

The supply chain consists of a large set of operations, starting from production, storage, transportation, and the shipment of goods. A variety of backend applications can be interconnected to receive accurate data about the supply chain operations in real-time.

-7. Efficient experiments

Experiments of any kind are tedious and consume expensive resources. Since IoT is a relatively new technology, there is considerable scope for experimentation, which needs to be carried out with reasonable resource usage. Digital twins offer the virtual infrastructure needed to conduct multiple experiments when not many physical devices are available.

-8. Digital twins anchor the IoT

With the IoT growing larger and more complex for businesses, digital twins are only growing more important. They’re the anchor for sensors and beacons. They’re the repository for de-siloed data. They’re the backbone for workplace management systems. Without digital twins, the IoT involves a lot more networking between points of data origin and points of data use. Just like your workplace brings the company together, digital twins centralize all its data.

______

______

Section-8

Digital twin and metaverse:    

The Metaverse is a term that refers to a virtual world that exists entirely in a digital form. It’s a collective space where users can interact with each other, engage in activities, and experience virtual reality. The metaverse is a vision of what many in the computer industry believe is the next iteration of the internet: a single, shared, immersive, persistent, 3D virtual space where humans experience life in ways they could not in the physical world. Today, when people say “the metaverse,” that generally refers to a 3D digital world where people work and play, experiencing the blend of the physical and virtual world in a way that feels real and permanent. As of yet, it would be more accurate to say that there are many metaverse efforts. Few would mistake them for “real,” in part because current metaverses lack true interoperability. But as the metaverse concept evolves and matures, people will be able to more easily and seamlessly engage with objects and each other in ways that blur the digital and real worlds. Individuals will be able to use avatars to try on digital versions of actual clothes; train for and practice complex work tasks such as high-risk surgeries; and engage in simulated experiences like skydiving while feeling the actual physical sensations. Some metaverse-type worlds now exist, especially in the virtual environment of game worlds, and the technologies creating the metaverse — such as virtual reality, blockchain and artificial intelligence — are already in enterprise use and enabling elements of this future digital space.

An example of a metaverse is a 3-D virtual world where users can buy, create and explore NFT-based plots of land using the MANA cryptocurrency, which is Ethereum-based. Non-Fungible Tokens (NFTs) can be used to establish ownership and authenticity. The popular online video game from Epic Games offers an immersive digital gaming and social space that is an example of a real-life metaverse-like environment.

_

The world of technology is headed towards a paradigm shift, and the application of artificial intelligence (AI), augmented reality (AR), and virtual reality (VR) is becoming ubiquitous. The obvious way forward seems to be the next iteration of the internet. This metaverse is a single, shared, immersive, persistent, 3D virtual space where humans experience life, even in the absence of physical proximity. Fusing technologies like AI, AR, and VR with continuously evolving connectivity like 5G networks helps build online experiences that are more immersive, interactive, and experiential. Blockchains would legitimize the storage, credibility, and transfer of crypto-value once the metaverse gets going. The metaverse is an overlap of the real and the virtual world, creating a 3D universe that connects multiple virtual domains. The combination of artificial intelligence, augmented reality, web 3.0 technologies, and virtual reality creates apt conditions for innovative solutions. Though still in a nascent stage of development, it is set to shift the perspective of the healthcare ecosystem through its immersive experience. In the field of medicine, the metaverse is set to be a game-changer. It advances Healthcare 4.0 by enhancing patient safety outcomes, enabling patients to participate, and giving them a clear picture of their health status. Hospital chains are increasingly veering towards the metaverse space to improve their patient care and services. The use of the metaverse in healthcare can significantly improve patient outcomes by opening up brand-new channels for the distribution of reasonably priced medications.

_

Digital Twin is an essential building block of Metaverse:

In a world where everything is increasingly inclining to become digital and virtual, the Metaverse holds the power to transform our digital lives. It is a convergence of the real and virtual worlds where people can have real and vivid experiences of the digital world. In other words, the Metaverse can be touted as the next generation of the internet. As the next step in advancing the web and social media, the Metaverse brings us nearer to completely simulated virtual reality through disruptive transformation. However, the Metaverse requires a digitalized copy of the real world as an entry point to provide fully connected, immersive, and engaging 3D experiences. Many businesses and enterprises are now exploring and building on metaverse fundamentals to introduce new possibilities and experiences for digitally driven consumers. By deploying digital twins, organizations can introduce dimensionally precise real-life spaces into the metaverse’s virtual mirror world. Digital twins are one of the metaverse’s core building blocks because of their intrinsic qualities. While the metaverse can help us create virtual worlds and experiences beyond our dreams, it will also be useful for constructing exact replicas of reality. With their inherent features and functionalities, digital twins can bring realism to the digital world.

_

The Metaverse and digital twin technology together can bring realism into the virtual world and experiences beyond our imagination, creating exact replications of reality. Just imagine entering the virtual store of a fashion e-commerce company to try on clothes before buying them. It would be best to let your digital twin avatar try the clothes first to match your real measurements. In a professional setting, a meeting in a metaverse-powered meeting room will be productive if the participants of the virtual meeting can interact with a replica of the company’s instruments, equipment and information systems. Similarly, a metaverse-led technical training program will add value if technicians can operate 3D representations of complex systems. Digital twin technology can turn all these ideas into reality and help build a metaverse that is more interwoven with reality. Digital twin and simulation technology will empower the Metaverse to support remote maintenance workshops for machines that need to be serviced, potentially connected with or mapped onto a real workshop. These intrinsic properties make digital twins one of the essential building blocks of the Metaverse.

_____

_____

Four Tiers of the Metaverse:

If the physical world can be considered Tier 0 of the metaverse, digital twins are Tier 1, which some would consider the (Industrial) Internet of Things. All the protocols running on top of that are Tier 2. Above that, there can be many layers of applications, either in the 2D virtual world (e.g., mobile apps), virtual reality, or using augmented reality, that interact with either of these layers, which can be considered Tier 3.

This might sound counterintuitive (why place digital twins as a tier prior to protocols and applications?), but the sensors of digital twins collecting the raw data enable the applications, and to have universal interaction with those applications, we would require open standards and protocols.

The (raw) data moves between these layers, is analysed by the applications and fed back to the digital twin for additional insights. Combined, this would deliver value to the physical and digital worlds.

_

No metaverse without data:

Data is what makes digital twins dynamic. The more protocols, in the sense of global standards, that exist, the more value the applications can deliver to the global economy. For example, a city can be made smart by using digital twins. By ensuring that the data of those various digital twins is easily accessible using universal standards, anyone can create applications that deliver value to cities, businesses, and inhabitants.

All the applications in tier 3 delivering value can be perceived as lenses, where each lens provides the user with a different perspective or experience of reality. These lenses can only exist if the data exists, so there is no metaverse without data.

These lenses can be for entertainment, offering the user digital art linked to a certain physical location. It can be a monitoring application of a solar farm that allows the user to trade its energy using crypto. It can be a communications lens for users to join a hybrid meeting using VR. It can be a predictive maintenance lens of an aeroplane’s jet engines or an electrical lens for city officials to manage their smart city. There can be countless applications and capabilities or lenses. Some will be accessible to all, others only after payment and others only with the proper credentials. Ideally, all are secured on the blockchain.

_

Complexity levels of Digital Twins:

Regarding tier 1, there are different levels of complexity for digital twins, each increasing in data generated and insights delivered. The simplest variant of a digital twin is the digital representation of a single object, for example, a connected wearable, a simple robot, or a machine in a factory. These product digital twins can be used to design new products by analysing how a product performs and by enabling digital prototyping.

One level up are production digital twins that simulate a process, e.g., a manufacturing process, consisting of multiple product digital twins. The more complex performance digital twins capture data from a system of objects, such as an aeroplane or an entire (dark) factory.

The next level comprises entire systems of systems, such as supply chains that cover the globe or city-scale digital twins. The most complex of them is the digital twin of the earth, as developed by the European Space Agency (ESA), which aims to build a dynamic digital replica of our planet.

First and foremost, digital twins enable the optimisation of processes through their synchronised systems in the digital and real world. The various applications of tier 3 can be used to monitor and analyse a process or system, run simulations to optimise its physical counterpart, or collaborate with multiple people to prototype and create new physical products.

_

Digital Twin Visualisations:

No matter how simple or complex, each digital twin offers value and benefits to its users. The value and level of collaboration possible depend on the digital twin’s visualisation level. These can be simple visualisations—descriptive, predictive, or prescriptive analytics—that provide the user insights into the status of the digital twin and give the ability to change levers to adjust its behaviour.

Visualisation can also be more advanced 2D visual representations of the digital object or system, such as AutoCAD models used by architects or engineers. These advanced 2D representations enable users to view the digital twin from different angles and collaborate remotely, to further improve or develop either the digital twin or the physical counterpart.

The most advanced visual representation is a detailed 3D digital replica that can be explored or interacted with within virtual reality or using augmented reality. 3D digital representations enable users to explore the digital twin from different perspectives, drill deep into its inner workings, see real-time insights from its sensors collecting data, make changes that reflect in real-time in the physical world, or collaborate with others to design and create a digital prototype of a future physical object, such as a car, or fix a physical problem remotely. The more advanced the visual representation, the more value can be achieved.

______

______

The industrial metaverse:

Even as technologists are trying to envision what the metaverse will bring for businesses and consumers, the industrial metaverse is already transforming how people design, manufacture, and interact with physical entities across industries. The industrial metaverse combines physical-digital fusion and human augmentation for industrial applications, and contains digital representations of physical industrial environments, systems, assets and spaces that people can control, communicate with, and interact with. While definitions abound and it remains to be seen how the industrial metaverse will fully unfold, digital twins are increasingly viewed as one of its key applications. Experts say a convergence of maturing technologies is fueling the growth of the industrial metaverse. Foremost among these is 5G, which creates interesting new vectors of capability, enabling lower latency (delay) and more precise exchange of data, both key for driving metaverse applications.

_

Creating digital twins is just one of the many advantages of the industrial metaverse. The industrial metaverse can reach a much larger scale, with increasing complexity, by creating digital twins of entire systems such as factories, airports, cargo terminals, or cities, not just digital twins of individual machines or devices as we have seen so far. Nokia Bell Labs’ technology partnership with indoor vertical farming company AeroFarms, started in 2020, is an example of how the industrial metaverse’s immersive reality, sensing, and machine-learning capabilities can be used to gain operational insights. By combining its AI-based autonomous drone-control solution and advanced machine-learning capabilities with machine vision tools, Nokia Bell Labs has created a technology that can track the growth of millions of plants. It has developed a completely autonomous drone solution, with multiple drones flying through the farm, that allows the farm to monitor details such as the height and color of its plants, spot poor growth areas, and predict the production yield. Nokia Bell Labs has in fact built a complete digital twin of the farm that gives the growers a real-time picture of production throughout the farm. With data analysis, the farm can optimize its water, energy, and nutrient consumption; speed up troubleshooting; improve accuracy in yield forecasts; and maintain consistently high quality.

_____

_____

Digital twin and Augmented/Virtual Reality:

While learning about the Digital Twin and the Metaverse, we cannot skip Augmented Reality and Virtual Reality, since they are the interface for human interaction.

Augmented Reality (AR):

Augmented reality builds on the existing real-world environment and overlays virtual data on it to enhance the user experience. It is a type of technology that allows digital images and information to be displayed in the physical world.

Example:

In Pokémon Go, users search for the animated character that pops up on their phone in their real-life neighborhoods.

Virtual Reality (VR):

Virtual Reality or VR is a computer modeling and simulation that enables a person to interact with an artificial 3D visual or sensory environment. Virtual reality immerses the users in an animated scene. It replaces the real-life environment with a simulated one.

Example:

Through a virtual reality headset, a person can walk around Italy as if they were present there.

_

Extended Reality (XR) refers to all combined real and virtual environments and human-machine interactions, and is therefore to be understood as the “reservoir” for representative forms such as Augmented Reality (AR) and Virtual Reality (VR) and the interpolated areas between them.

Mixed reality (MR) is a blend of physical and digital worlds, unlocking natural and intuitive 3D human, computer, and environmental interactions. This new reality is based on advancements in computer vision, graphical processing, display technologies, input systems, and cloud computing.

_

Digital twins involve ingesting large volumes of data in order to arrive at actionable insights. Visualization is equally important, so that managers and executives can fully understand the data and drive actions based on the insights provided. Augmented Reality (AR) and Virtual Reality (VR) offer immersive experiences for visualizing such insights. A stakeholder can interact with a wearable or hand-held AR/VR device to consume insights and queries in a context-aware manner. For example, the digital twin experience of a car will be different for an operations manager in the auto manufacturing plant than for a service technician in a dealership. The figure below illustrates how an engine technician at a dealership can experience the digital twin of a car engine using AR.

Figure above shows a custom AR experience for an engine technician, created to ingest insights from the digital twin of a car engine.

While AR and VR constitute an avenue for data ingestion and assimilation, they are not essential building blocks of the digital twin, but rather convenient technologies for obtaining a holistic, immersive understanding of the asset by leveraging the digital twin paradigm.

______

Digital Twins and Augmented & Virtual Reality can improve your Business:

AR: 

Using Augmented Reality technology, data taken from IoT sensors in combination with the digital twin can present real-time information to on-site workers to notify them of any potential issues. Data can be overlaid onto the real world, allowing for a better level of visualisation by placing data on, for example, a table or other flat surface, all while the user is using their own mobile phone. Smart glasses can also be used, overlaying data on what the user is currently seeing. On-site workers, such as individuals working on oil rigs, could use something like the Microsoft HoloLens in combination with a digital twin to get real-time information when performing maintenance and repairs.

VR:

Using Virtual Reality is an even more impressive way to present the data created by the digital twin’s simulations, as well as the data being fed into it by the sensors. Presenting data in VR has in the past been incredibly effective for industries like oil and gas and banking, and VR can be used with digital twins in the same way. By using VR headsets and 3D digital representations of large sites like an oil rig or an energy plant, on-site workers can be presented with data from the digital twin. Rather than working with raw data, VR allows workers to immerse themselves and experience the site in real time or while simulations are running.

Whether it’s using VR to present data from a digital twin in an immersive and engaging experience or supporting on-site workers through augmented reality headsets and AR mobile phone technology, digital twins are a technology that you’ll see more and more large business enterprises use in the years to come.

____

What are some areas where AR/VR or digital twins still need work?

The user experience of AR/VR, as well as the accuracy of the hardware, still has room for improvement. For example, the VR headset itself is heavy, has low battery life, and presents challenges for those who wear glasses. Both VR and AR devices provide spatial accuracy in centimeters. This level of accuracy limits the use cases, especially when mechanical precision is critical.

Digital twins will keep evolving as new technologies and products are continuously introduced. Digital twins work best when a fully connected digital thread is present, connecting product design, manufacturing engineering, the production shop floor, and product usage. This is an area that technology suppliers and users need to work on together in order to exploit the digital twin’s full potential.

_____

_____

Section-9

Digital twin and Artificial Intelligence (AI):   

Artificial intelligence (AI) is the digital replication of three human cognitive skills: learning, reasoning, and self-correction. Digital learning is a collection of rules, implemented as a computer algorithm, which converts real historical data into actionable information. Digital reasoning focuses on choosing the right rules to reach a desired goal. Digital self-correction is the iterative process of adopting the outcomes of learning and reasoning. Every AI model follows this process to build a smart system that performs a task that normally requires human intelligence. Most AI systems are driven by machine learning, deep learning, data mining, or rule-based algorithms, while others follow logic-based and knowledge-based methods. Nowadays, machine learning and deep learning are the most widely used AI approaches.

_

It is often confusing to differentiate between artificial intelligence, machine learning, and deep learning techniques. Deep learning is a subset of machine learning, which is a subset of artificial intelligence. Machine learning (ML) is an AI method which searches for particular patterns in historical data to facilitate decision-making. The more data we collect, the more accurate the learning process becomes (reflecting the value of big data). Machine learning can be 1) supervised learning, which accepts data sets with labeled outputs in order to train a model for classification or future predictions; 2) unsupervised learning, which works on unlabeled data sets and is used for clustering or grouping; and 3) reinforcement learning, which accepts data records with no labels but, after performing certain actions, provides feedback to the AI system. Examples of supervised learning techniques are regression, decision trees, support vector machines (SVMs), naive Bayes classifiers, and random forests. Similarly, K-means and hierarchical clustering, as well as mixture models, are examples of unsupervised learning. Finally, Monte Carlo learning and Q-learning fall under the reinforcement learning category. On the other hand, deep learning is a machine learning technique that is motivated by biological neural networks, with one or more hidden layers of digital neurons. During the learning process, the historical data are processed iteratively by the different layers, making connections and constantly weighing the neuron inputs for optimal results.
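
The contrast between the first two learning styles can be shown in a few lines. In the hedged sketch below, the same toy sensor data is handled once with labels (a random forest classifier) and once without (K-means clustering); the data and operating regimes are invented for illustration, and scikit-learn is assumed:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Toy sensor data with two operating regimes: normal and hot-running.
    normal = rng.normal([60.0, 2.0], [2.0, 0.2], size=(50, 2))
    hot = rng.normal([85.0, 3.5], [2.0, 0.2], size=(50, 2))
    X = np.vstack([normal, hot])

    # Supervised: labels are known, so a classifier is trained on them.
    y = np.array([0] * 50 + [1] * 50)
    clf = RandomForestClassifier(random_state=0).fit(X, y)
    print(clf.predict([[84.0, 3.4]]))      # -> [1], recognized as hot-running

    # Unsupervised: no labels; the two regimes are discovered by clustering.
    clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(sorted(set(clusters)))           # two groups found without any labels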

_

How AI & Machine Learning are used in Digital Twins:

Embedding AI and machine learning in the DNA of your digital twins can help your organization be more competitive at real-time operations.

Real-Time Analytics:

As a digital twin ingests data in real-time, it can apply AI and machine learning to look for anomalous behavior, predict future states, and optimize production. This advanced real-time analytics is the first step to getting the most value out of your digital twin.
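
As a hedged sketch of such real-time anomaly detection, the snippet below trains an isolation forest on a history of normal readings and scores each new reading as it arrives. The readings, fields, and contamination rate are illustrative assumptions; scikit-learn is assumed:

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(1)
    # History of normal readings ingested by the twin: [temperature, pressure].
    history = rng.normal([60.0, 5.0], [1.5, 0.2], size=(500, 2))

    detector = IsolationForest(contamination=0.01, random_state=1).fit(history)

    # As new readings stream in, each one is scored immediately.
    incoming = np.array([[60.4, 5.1],     # normal
                         [59.1, 4.8],     # normal
                         [74.0, 2.2]])    # abnormal combination
    for reading, label in zip(incoming, detector.predict(incoming)):
        status = "ANOMALY" if label == -1 else "ok"
        print(reading, status)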

Decision Support:

This additional layer of intelligence can be used to display predictions from your digital twin. It provides decision support for your engineers when they need to make real-time decisions. By providing details and predictions about metrics like remaining useful life or stock levels, you can empower your team to respond faster to critical business events.

Prescriptive Analytics & Recommendations:

The final way to leverage AI and machine learning in your digital twins is to use them for prescriptive analytics and to create recommendations on the best action to take next based on their predictions. In this scenario, your digital twin will do more than provide you with real-time status updates. It will help you and your team take the actions that are most likely to produce the best result based on real-time data.

You might use a digital twin to predict stock levels at different nodes of your supply chain or predict the remaining useful life of a machine. AI and machine learning provide the essential capabilities to help you maximize the value you get from your digital twins.

_

The differences between ML-enabled DT and AI-enabled DT are:

  • ML-enabled DT is a subset of AI-enabled DT.
  • ML-enabled DT involves algorithms such as ANN, RF, kNN, whereas AI-enabled DT involves algorithms such as genetic algorithm, ant colony optimization, and particle swarm optimization, in addition to ML algorithms.
  • ML-enabled DT is primarily used for process control, scheduling and prediction, whereas AI-enabled DT is primarily used for optimization, scheduling, and resource allocation.
  • ML-enabled DT is more abundant in the literature than AI-enabled DT.

_____

_____

Relationship between IoT, big data, AI-ML and digital twins:

Big data has remained one of the top research trends over the last few years. It differs from ordinary data in its high volume, high velocity, and heterogeneous variety; researchers have named these characteristics ‘‘the 3Vs of big data,’’ i.e., volume, velocity, and variety. Big data analytics is the process of analysing big data and converting it into valuable information, using state-of-the-art mathematical, statistical, probabilistic, or artificial intelligence models.

Emerging sensor technologies and IoT deployments in industrial environments have paved the way for several interesting applications, such as real-time monitoring of physical devices and indoor and outdoor asset tracking. IoT devices facilitate the real-time data collection that is necessary for creating a digital twin of a physical component, and they enable the optimization and maintenance of that component by linking the physical environment to its virtual image (using sensors and actuators). Note that the above-mentioned IoT data is big in nature, so big data analytics can play a key role in the development of a successful digital twin. The reason is that industrial processes are very complex, and identifying potential issues in their early stages is cumbersome with traditional techniques. Such issues can, however, be extracted easily from the collected data, which brings efficiency and intelligence into industrial processes. Handling this enormous amount of data in the industrial and DT domains requires advanced techniques, architectures, frameworks, tools, and algorithms. For instance, Zhang et al. proposed a big data processing framework for smart manufacturing and maintenance in a DT environment.

_

Oftentimes, cloud computing is the best platform for processing and analyzing big data. Additionally, an intelligent DT system can only be developed by applying advanced AI techniques on the collected data. To this end, intelligence is achieved by allowing the DT to detect (e.g., best process strategy, best resource allocation, safety detection, fault detection), predict (e.g., health status and early maintenance), optimize (e.g., planning, process control, scheduler, assembly line), and take decisions dynamically based on physical sensor data and/or virtual twin data. In short, IoT is used to harvest big data from the physical environment. Later, the data is fed to an AI model for the creation of a digital twin. Then, the developed DT can be employed to optimize other processes in the industry.

The overall relationship among IoT, big data, AI, and digital twins is presented in figure below:

_

_

Digital twin based smart manufacturing using big data analytics and AI-ML is depicted in figure below:

_____

_____  

Artificial intelligence is one of the key drivers of Industry 4.0 and has been revolutionizing manufacturing. Many manufacturers use AI-enabled algorithms to optimize processes. Integrating digital twins with AI helps identify the outcomes of complex virtual scenarios, thereby improving product quality and efficiency. AI, machine learning, and deep learning also help in understanding complex virtual data, allowing the creation of multiple variables that would not be possible with real-world data alone.

The real boom for digital twins comes from AI and its predictive capabilities. In the past, creating spatial models digitally was exciting – but little more than a way of visualising an object statically. Today, all the data we have from sensors, historical performance, and inputs on behaviour can be linked to the spatial model and predict future behaviour by changing different inputs. In fact, the data and the predictive capabilities bring the spatial model to life.

The first advantage of a digital twin is the ability to generate simulated data. A virtual environment can be subjected to an infinite number of repetitions and scenarios. The simulated data generated can then be used to train the AI model (e.g. as part of an AI development platform). In this way, the AI system can be taught potential real-world conditions that might otherwise be rare or still in the testing phase. An AI development platform enables efficient training of AI models based on simulated data from digital twins through a fully integrated workflow system.
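
A minimal sketch of training on simulated data: a toy twin model (with invented physics) generates thousands of scenarios, including rare extremes that would be unsafe to produce on the physical asset, and an AI model is trained entirely on the result. All quantities and coefficients are illustrative assumptions:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def twin_simulation(load, ambient_temp, rng):
        # Invented twin physics: bearing temperature as a function of load
        # and ambient temperature, with process noise. Purely illustrative.
        return 0.6 * load + 0.8 * ambient_temp + rng.normal(0, 1.0, np.shape(load))

    rng = np.random.default_rng(3)
    # Run the twin over thousands of scenarios, including rare extremes that
    # would be costly or unsafe to reproduce on the physical asset.
    load = rng.uniform(0, 120, 5000)
    ambient = rng.uniform(-10, 45, 5000)
    temp = twin_simulation(load, ambient, rng)

    # Train the AI model entirely on the simulated data.
    X = np.column_stack([load, ambient])
    model = RandomForestRegressor(n_estimators=50, random_state=3).fit(X, temp)
    print(model.predict([[110.0, 40.0]]))   # rare high-load, hot-day scenario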

The second advantage is the possibility to plan and test new functions. The digital twin should depict reality – but it can also provide a glimpse into the future. Should investments be made in a new warehouse and dispatch centre? Or is machine learning being considered to augment new data operations? The big advantage of this: You can create the world of tomorrow virtually and test various scenarios. The tests can be optimised and run as many times as necessary in order to find the optimal solution.

Finally, the addition of machine learning to an industrial process will make the process smarter by obtaining more accurate data and predictions as well as understanding visual and unstructured data. Integrating machine learning into the workflow not only opens up opportunities to discover previously unseen patterns in the data but also creates new possibilities for optimising processes.

_____

_____

Digital Twin and Machine Learning:  

For simple applications, digital twin technology offers value without having to employ machine learning. Simple applications are characterized by a limited number of variables and an easily discoverable linear relationship between inputs and outputs. However, most real-world systems that contend with multiple data streams stand to benefit from machine learning and analytics to make sense of the data. Machine learning, in this context, means any algorithm that is applied to a data stream to uncover patterns that can subsequently be exploited in a variety of ways. For example, machine learning can automate complex analytical tasks. It can evaluate data in real time, adjust behavior with minimal need for supervision, and increase the likelihood of desired outcomes. Machine learning can also contribute to producing actionable insights that lead to cost savings. Smart buildings are an excellent example of applications that stand to benefit from machine learning capabilities in the digital twin. Machine learning uses within a digital twin include: supervised learning (e.g., using neural networks) of operator/user preferences and priorities in a simulation-based, controlled experimentation testbed; unsupervised learning of objects and patterns using, for example, clustering techniques in virtual and real-world environments; and reinforcement learning of system and environment states in uncertain, partially observable operational environments.

_____

How Machine Learning supercharges Real-Time Digital Twins:  

When tracking telemetry from a large number of IoT devices, it’s essential to quickly detect when something goes wrong. For example, a fleet of long-haul trucks needs to meet demanding schedules and can’t afford unexpected breakdowns when a fleet manager has thousands of trucks on the road. With today’s IoT technology, these trucks can report their engine and cargo status every few seconds to cloud-hosted telematics software. How can this software sift through the flood of incoming messages to identify emerging issues and avoid costly failures? Can the power of machine learning be harnessed to provide predictive analytics that automates the task of finding problems that are otherwise very difficult to detect?

Real-time digital twins offer a powerful software architecture for tracking and analyzing IoT telemetry from large numbers of data sources. A real-time digital twin is a software component running within a fast, scalable in-memory computing platform, and it hosts analytics code and state information required to track a single data source, like a truck within a fleet. Thousands of real-time digital twins run together to track all of the data sources and enable highly granular real-time analysis of incoming telemetry. By building on the widely used digital twin concept, real-time digital twins simultaneously enhance real-time streaming analytics and simplify application design.

_

Incorporating machine learning techniques into real-time digital twins takes their power and simplicity to the next level. While analytics code can be written in popular programming languages, such as Java and C#, or even using a simplified rules engine, creating algorithms that ferret out emerging issues hidden within a stream of telemetry still can be challenging. In many cases, the algorithm itself may be unknown because the underlying processes which lead to device failures are not well understood. In these cases, a machine learning (ML) algorithm can be trained to recognize abnormal telemetry patterns by feeding it thousands of historic telemetry messages that have been classified as normal or abnormal. No manual analytics coding is required. After training and testing, the ML algorithm can then be put to work monitoring incoming telemetry and alerting when it observes suspected abnormal telemetry.

_

To enable ML algorithms to run within real-time digital twins, ScaleOut Software has integrated Microsoft’s popular machine learning library, ML.NET, into its Azure-based ScaleOut Digital Twin Streaming Service™. Using the ScaleOut Model Development Tool™ (formerly called the ScaleOut Rules Engine Development Tool), users can select, train, evaluate, deploy, and test ML algorithms within their real-time digital twin models. Once deployed, the ML algorithm runs independently for each data source, examining incoming telemetry within milliseconds after it arrives and logging abnormal events. The real-time digital twin can also be configured to generate alerts and send them to popular alerting providers, such as Splunk, Slack, and PagerDuty. In addition, business rules can optionally be used to further extend real-time analytics.

_

The following diagram illustrates a real-time digital twin hosting an ML algorithm for each truck in a fleet to track its engine and cargo parameters. When the ML algorithm detects abnormal parameters (illustrated by the spike in the telemetry), the real-time digital twin records the incident and sends a message to the alerting provider:

Training an ML algorithm to recognize abnormal telemetry requires only a training set of historic data that has been classified as normal or abnormal. Using this training data, the ScaleOut Model Development Tool lets the user train and evaluate up to ten binary classification algorithms supplied by ML.NET using a technique called supervised learning. The user can then select the appropriate trained algorithm to deploy, based on the metrics generated for each algorithm during training and testing. (The algorithms are tested using a portion of the data supplied for training.)
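
The same train-evaluate-select workflow can be sketched outside ML.NET. The following Python/scikit-learn analogue (the synthetic telemetry, candidate models, and accuracy metric are all stand-ins, not ScaleOut's implementation) trains two binary classifiers on labeled telemetry and picks the better one on a held-out split:

```python
# Illustrative analogue of training several binary classifiers on labeled
# telemetry and deploying the best by a held-out metric. Data is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
normal = rng.normal([90.0, 35.0], 3.0, size=(500, 2))    # engine temp, cargo temp
abnormal = rng.normal([115.0, 45.0], 5.0, size=(50, 2))
X = np.vstack([normal, abnormal])
y = np.array([0] * 500 + [1] * 50)                       # 1 = abnormal telemetry

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
scores = {}
for name, clf in candidates.items():
    clf.fit(X_tr, y_tr)                                   # supervised learning
    scores[name] = accuracy_score(y_te, clf.predict(X_te))

best = max(scores, key=scores.get)
print(scores, "-> deploy:", best)
```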

_____

Machine Learning-Based Digital Twin for Predictive Modeling in Wind Turbines, a 2022 study:

Abstract:

Wind turbines are one of the primary sources of renewable energy, offering a sustainable and efficient energy solution that releases no carbon emissions to pollute our planet. Monitoring wind farms and predicting their power generation is a complex problem due to the unpredictability of wind speed, which limits the management team’s ability to plan energy consumption effectively. The authors’ proposed model addresses this challenge by utilizing a 5G-Next Generation-Radio Access Network (5G-NG-RAN) assisted, cloud-based digital twins framework to virtually monitor wind turbines and form a predictive model that forecasts wind speed and predicts the generated power. The developed model is based on the Microsoft Azure digital twins infrastructure as a 5-dimensional digital twins platform. The predictive modeling is based on a deep learning approach: a temporal convolution network (TCN) followed by non-parametric k-nearest neighbor (kNN) regression. Predictive modeling has two components. First, it processes the univariate time-series wind data to predict wind speed. Second, it estimates the power generation for each quarter of the year over horizons ranging from one week to a whole month (i.e., medium-term prediction). To evaluate the framework, experiments were performed on publicly available onshore wind turbine datasets. The obtained results confirm the applicability of the proposed framework, and comparative analysis with existing classical prediction models shows that the authors’ designed approach obtains better results. The model can assist the management team.
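
As a greatly simplified, hedged sketch of the study's regression stage, the Python snippet below fits a k-nearest-neighbor regressor to lagged wind-speed features of a synthetic series; the paper's temporal convolution network, 5G-NG-RAN transport, and Azure digital twin layers are omitted:

```python
# Greatly simplified sketch of the second modeling stage: kNN regression over
# lagged wind-speed features. (The study runs a temporal convolution network
# before the kNN step; that part is omitted here. Data is synthetic.)
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(1)
# Synthetic hourly wind-speed series (m/s) standing in for the real dataset
speeds = 8 + 2 * np.sin(np.arange(500) / 24 * 2 * np.pi) + rng.normal(0, 0.5, 500)

LAGS = 6   # use the previous 6 hours to predict the next hour
X = np.array([speeds[i - LAGS:i] for i in range(LAGS, len(speeds))])
y = speeds[LAGS:]

split = 400   # train on the first 400 samples, evaluate on the rest
model = KNeighborsRegressor(n_neighbors=5).fit(X[:split], y[:split])
pred = model.predict(X[split:])
mae = np.mean(np.abs(pred - y[split:]))
print(f"MAE on held-out hours: {mae:.2f} m/s")
```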

_____

_____

Section-10

Digital twin components, platforms, architecture, and systems:  

_

Components of digital twin:

A digital twin (DT) is a detailed and dynamically updated virtual replica of physical objects or processes, made to monitor performance, test different scenarios, predict issues, and find optimization opportunities. Unlike traditional computer-aided design and engineering (CAD/CAE) models, a DT always has a unique, real-world counterpart, receives live data from it, and changes accordingly to mimic the origin through its lifecycle.

The twinning, however, doesn’t happen out of thin air. This process involves numerous pieces working as a uniform system.

A digital twin system contains hardware and software components with middleware for data management in between.

Figure above shows components of the digital twin system.

_

-1. Hardware components.

The key technology driving DTs is Internet of Things (IoT) sensors, which initiate the exchange of information between assets and their software representation. The hardware part also includes actuators that convert digital signals into mechanical movements, as well as network devices such as routers, edge servers, and IoT gateways.

-2. Data management middleware.

The second component is data management middleware. This layer stores and processes the information the digital twin model needs. Its bare-bones element is a centralized repository that accumulates data from different sources. Ideally, the middleware platform also takes care of tasks such as connectivity, data integration, data processing, data quality control, data visualization, and data modeling and governance. Examples of such solutions are common IoT platforms and industrial IoT (IIoT) platforms, which often come with pre-built tools for digital twinning.

-3. Software components.

Data collection is useless without a way to observe and evaluate it. Software takes the sensor data gathered and stored by the middleware and turns those observations into valuable insights about the process. The software uses the collected data to form a digital twin model that accurately reflects what is happening in real time. The crucial part of digital twinning is the analytics engine, which turns raw observations into valuable business insights; in many cases it is powered by machine learning models. Other must-have pieces of the DT puzzle are dashboards for real-time monitoring, design tools for modeling, and simulation software.
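
Tying the three components together, here is a minimal sketch of the software layer pulling sensor rows out of the middleware's repository (SQLite stands in for it here) and turning them into a twin-state snapshot plus a simple insight; the schema, asset names, and threshold are illustrative:

```python
# Minimal sketch of the software layer: read sensor rows from the middleware's
# centralized repository (SQLite stands in here) and form a twin-state snapshot
# plus a simple insight. Schema and threshold are illustrative.
import sqlite3
from statistics import mean

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (asset TEXT, metric TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [("pump-1", "vibration_mm_s", v) for v in (2.1, 2.3, 2.2, 4.8)],
)

values = [row[0] for row in conn.execute(
    "SELECT value FROM readings WHERE asset=? AND metric=?",
    ("pump-1", "vibration_mm_s"))]

twin_state = {"asset": "pump-1", "vibration_mm_s": values[-1], "avg": mean(values)}
# Analytics engine: flag readings far above the running average
if twin_state["vibration_mm_s"] > 1.5 * twin_state["avg"]:
    twin_state["insight"] = "vibration spike: schedule inspection"
print(twin_state)
```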

_____

_____

Best free software tools for modeling and simulating Digital Twins:

There are a variety of popular Free & Open Source Software (FOSS) and Commercial Off-the-Shelf (COTS) Digital Engineering software tools that can be used to model and simulate Digital Twins. Some examples of popular open source tools include OpenModelica, OpenDSS, and OpenFOAM, which are used to model and simulate physical systems in a range of domains, including electrical, mechanical, and fluid dynamics. Other examples of popular open source tools include Gephi, NetworkX, and Neo4j, which are used to model and simulate complex networks and systems of interactions.

In addition to open source tools, there are also a number of popular commercial Digital Engineering software tools that can be used to model and simulate digital twins. Some examples of popular commercial tools include Cameo/Cameo Simulation Toolkit, MATLAB/Simulink, Mathematica, MapleSim, Comsol Multiphysics, and Ansys, which are used to model and simulate physical systems in a range of domains, including electrical, mechanical, and fluid dynamics. Other examples of popular commercial tools include IBM Watson Studio, SAP Leonardo, and Siemens MindSphere, which are used to model and simulate complex networks and systems of interactions, as well as to integrate Digital Twins with other data and analytics tools.

Overall, the specific Digital Engineering software tools that are used to model and simulate Digital Twins will ultimately depend on the specific domain and application of the Digital Twin, as well as the specific modeling and simulation requirements of the particular system. Some tools may be more suitable for particular types of systems or applications, while others may be more versatile in nature and therefore applicable to a wider range of domains.

_____

Best Digital Twin Software to try in 2023.

  • aPriori Digital Manufacturing Simulation Software
  • SAP Leonardo Internet of Things
  • Predix
  • Ansys Twin Builder
  • Digital Twin Intelligent Automation Platform
  • iLens
  • cohesion
  • Akselos
  • IoTIFY
  • Autodesk Digital Twin

_____

Digital Twin Software Cost:

The cost of digital twin software varies significantly depending on the features and capabilities the system offers, as well as the size of the organization and the industry it operates in. Additionally, the cost of the software will depend on the vendor and the number of users.

For smaller organizations, the cost of digital twin software may range from around $20,000 to $50,000 for a basic system, depending on the vendor and the features offered. For larger organizations, the cost of the software may range from $100,000 to $200,000 or more for a system with more advanced features.

For organizations that require additional support, such as custom development, maintenance, and training services, the cost of digital twin software can be even higher. In these cases, the cost of the software may range from $500,000 to millions of dollars, depending on the complexity of the system and the services offered.

The cost of digital twin software continues to evolve as new features and capabilities are added, and as vendors continue to compete for market share. As such, the cost of digital twin software should be evaluated on a case-by-case basis, taking into account the needs of the organization and the features offered by vendors.

_____

_____

Digital twin platforms:

Digital twin software platforms incorporate IoT sensor data and other data to monitor asset performance and run simulations. Scalable digital twin platforms have six key features:

-1. Manage the digital twin lifecycle. Digital twins are the instantiations of “digital threads” or “digital masters,” i.e., the requirements, parts, and control systems that make up a physical asset. For instance, a windmill’s digital master consists of the engineering diagrams, bill of material, software versions, and other artifacts used to create a windmill. A digital twin platform for windmills needs to leverage that digital master and provide tools to test, deploy and manage the digital twin based on the digital master. These tools need to be able to scale to hundreds of digital masters and thousands of digital twins.

-2. Single source of truth. Platforms need to be able to update and provide the exact state for each digital twin. For example, routine maintenance might lead one asset to have a different part or firmware version than another asset. A platform must be able to update the exact state for each object and asset as soon as it changes.

-3. Open API. As the interface and integration point for an industrial IoT solution, the platform must provide an open API that allows any system to interact with it (a minimal sketch follows after this list).

-4. Visualization and analysis. The platform must allow the organization to create visualizations, dashboards, and in-depth analyses of live data from the digital twin. The live data should be linked to the digital master.

-5. Event and process management. The platform must allow users to create events and business processes that can be executed based on the platform’s data.

-6. Customer and user perspective. The platform needs to enable collaboration between the stakeholders of a digital twin. It should reflect which organization owns or operates each twin and which users are allowed to access its data.
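
To make feature 3 concrete, here is a minimal sketch of what an open twin API could look like, written in Python with FastAPI (and uvicorn to serve it); the endpoints, twin IDs, and state fields are invented for illustration and are not any particular vendor's API:

```python
# Sketch of an "open API" for twin state: any system can read or update a
# twin over REST. Assumes FastAPI/uvicorn are installed; the in-memory dict
# stands in for the platform's state store. All names are illustrative.
from fastapi import FastAPI, HTTPException

app = FastAPI()
state = {"windmill-7": {"firmware": "2.4.1", "rotor_rpm": 14.2}}

@app.get("/twins/{twin_id}")
def read_twin(twin_id: str):
    if twin_id not in state:
        raise HTTPException(status_code=404, detail="unknown twin")
    return state[twin_id]

@app.put("/twins/{twin_id}")
def update_twin(twin_id: str, patch: dict):
    # Single source of truth: apply each state change as soon as it happens
    state.setdefault(twin_id, {}).update(patch)
    return state[twin_id]

# Run with: uvicorn open_api_sketch:app --reload
```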

_

Kongsberg Digital is a company delivering PaaS and SaaS services for the energy, oil and gas, and maritime industries. Kongsberg partners with key cloud vendors, including Microsoft Azure, to provide enabling information technology services. In 2017, Kongsberg launched its open digital ecosystem platform, called KognifAI. It combines a group of cloud applications focusing on optimal data accessibility and processing. KognifAI is built on cybersecurity, identity, encryption, and data integrity, and its main infrastructure can be used to scale applications and services easily. KognifAI offers digital twin solutions in maritime, drilling and wells, renewable energies, and more. The Kongsberg dynamic digital twin combines safety with fast prototyping and implementation, and connects offshore and onshore users in the oil and gas industry. It can provide a model not only of an existing physical system, but also of a planned installation (a greenfield digital twin) or of maintenance and repair (a brownfield one).

_

In 2018, MapleSim, a product of MapleSoft, added new features for developing digital twins. MapleSim is powerful at creating an accurate dynamic model of a machine from CAD data, with all forces and torques included. The major focus of the MapleSim digital twin module is to implement virtual plant models that do not necessarily require expert knowledge. To test motor sizing for a new screwing machine at Stoppil Industrie, a digital twin of the machine was created using MapleSim. The initial motor was found to be undersized by a factor of 10, which would have caused machine failure and excessive losses. MapleSim was integrated into B&R Automation Studio to facilitate the use of digital twins for machine development, and MapleSim in conjunction with B&R software was used to build model-based feedback during motor sizing for an injection molding machine.

_

Cognite provides full-scale digital transformation services to heavy industries such as oil and gas, power, original equipment manufacturers (OEMs), and shipping companies. Cognite has built a software package named Cognite Data Fusion that extracts useful information from data. One of the main features of Cognite Data Fusion is its APIs, SDKs, and libraries, which are open source to its customers. Developers and analysts can build the applications and ML models that best suit their operational needs. These applications can include large CAD models and complex asset plans, and can run on phones and tablets. Cognite Data Fusion also offers customizable permission hierarchies for sharing data with partners and suppliers. For example, the Cognite Digital Platform has helped Framo (an OEM for pumping systems) and their customer Aker BP communicate and share live operational equipment data more efficiently. This enabled Framo to create their own applications and monitor the status of equipment to plan maintenance. Using the operational data of their equipment, OEMs can advise their customers on how to improve equipment performance. The integration of the Siemens information management system (IMS) and the Cognite data platform has benefited Aker BP by optimizing offshore maintenance and reducing costs. With the availability of live data, artificial intelligence, and ML algorithms, Siemens delivered powerful analyses of each piece of equipment with advanced visualizations.

_

The Siemens digital twin leads the industry by offering diverse computational tools in CAE, CAD, manufacturing, and electronic design, connecting information from all of these domains through a seamless digital thread to give companies tremendous insight into products and designs. The Siemens Plant Simulation (PS) digital tool was successfully interfaced (as a digital copy) with a production line manufacturing pneumatic cylinders in the automotive industry to promote the concepts of Industry 4.0. The production line simulation model was optimized by a genetic algorithm provided by the Siemens PS tool, which adjusted the simulation model and then simulated the digital twin. Smart factories are designed with machines that can operate based on their manufacturing environment, control production processes, and share information with each other in the form of knowledge graphs. Generally, knowledge graphs are incomplete and missing data has to be inferred. A Siemens digital twin powered by machine learning tools (e.g., recurrent neural networks) was demonstrated to complete the knowledge graph and synchronize the digital and physical representations of a smart factory.

_

ANSYS introduced its digital twin builder in ANSYS 19.1. ANSYS Twin Builder provides developers with several features, such as creating multi-domain systems, multi-fidelity and multiphysics solvers, efficient reduced-order model (ROM) construction capabilities, third-party tool integration, embedded software integration, and system optimization. ANSYS, along with other companies, built a digital twin for a pump that uses real-time sensor data to improve performance and predict failures. General Electric also used a customized version of the ANSYS digital twin to design megawatt-scale electric circuit breakers.

_

Akselos, founded in 2012, offers near-instantaneous physics-based simulations and analyses of critical infrastructure, calibrated with sensor data, for asset-heavy industries. The key benefits of the Akselos digital twin are asset performance optimization and life extension, failure prediction and prevention, and contingency planning. Akselos owns a structural analysis tool that is fast enough to integrate, recalibrate, and re-analyze sensor data. Its framework uses reduced-basis finite element analysis (RB-FEA), a state-of-the-art reduced-order modeling approach that is considerably faster than conventional FEA, with accuracy ensured by a posteriori accuracy indicators and automated model enrichment. Akselos provides a cloud-based platform for developing digital twin frameworks for any number of users across geographic locations and organizations. The structural models developed by Akselos can also incorporate localized nonlinearities. Existing case studies include offshore asset life extension, optimization of floating production storage and offloading, inspection, return on investment, and ship-loader life extension. Akselos unlocked 20 years of structural capacity for assets operated by Royal Dutch Shell in the North Sea.

_

General Electric (GE) has been developing a digital twin environment integrated with different components of the power plant that takes into account customer-defined Key Performance Indicators (KPIs) and business objectives by measuring asset health, wear, and performance. Their Digital Twin runs on the Predix platform, designed to operate on large volumes of sensor data at industrial scale. The platform offers advanced distribution management solutions, geospatial network modeling solutions, grid analytics, and asset performance management for power and utility services, using next-generation sensing technologies, digital threading, artificial intelligence, advanced control, and edge computing. Many world-renowned companies have applied these technologies in diverse industrial fields such as automotive, food and beverage, chemicals, digital energy, steel manufacturing, equipment manufacturing, pulp/paper manufacturing, and semiconductors.

_

Oracle IoT Cloud offers digital twins through three pillars: (i) the virtual twin, where the physical asset or device is represented virtually in the cloud; (ii) the predictive twin, using either physics-based models (FEM/CFD) or statistical/ML models, supported by Oracle products such as Oracle R Advanced Analytics for Hadoop (ORAAH) and Oracle Stream Explorer; and (iii) twin projections, where the insights generated by the digital twin are projected into back-end applications, supported by Oracle ERP (supply chain, manufacturing, and maintenance applications) and CX (service).

_____

_____

Architecture of digital twin:

The basic architecture of digital twin technology comprises four elements:

  • The physical systems for which a Digital Twin will be created.
  • The IoT sensors to collect and transmit the data.
  • The virtual model that mirrors the real-world physical system.
  • Software that monitors and analyzes the data in depth.

_

Steps involved in the creation of digital twin:

  • Physical-to-digital data conversion: Sensors attached to the physical system capture all sorts of data from the physical assets and convert it into digital data.
  • Digital-to-digital data conversion: The twin technologies use advanced analytics, artificial intelligence, and scenario analysis to acquire and share information.
  • Digital-to-physical data conversion: Algorithms in the digital twin translate digital-world decisions into data (in human-readable form) to spur action and change in the physical, real world (a minimal sketch of this loop follows below).
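
As a toy illustration of these three conversions, the following Python sketch runs one pass of the loop; the calibration factor, decision threshold, and valve command are invented purely for illustration:

```python
# Minimal sketch of the three conversions above. Sensor values, the decision
# rule, and the actuator command are all illustrative.
def physical_to_digital(raw_voltage: float) -> dict:
    """Sensor reading converted into digital data for the twin."""
    return {"temperature_c": raw_voltage * 20.0}   # toy calibration

def digital_to_digital(twin_data: dict) -> dict:
    """Analytics/scenario analysis inside the twin produces a decision."""
    too_hot = twin_data["temperature_c"] > 80.0
    return {**twin_data, "decision": "open_valve" if too_hot else "hold"}

def digital_to_physical(decision: dict) -> str:
    """Decision translated into a human-readable action for the real world."""
    return f"ACTUATE: {decision['decision']} (temp={decision['temperature_c']:.1f} C)"

print(digital_to_physical(digital_to_digital(physical_to_digital(4.6))))
```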

_

An architectural overview of the DT concept:

In the DT concept, each physical object has its virtual counterpart. These virtual counterparts are called virtual mirror models. These virtual mirror models have built-in capabilities to analyze, evaluate, predict, and monitor physical objects.

This innovative architecture of the DT concept creates a powerful communication mechanism between the physical (material) and the virtual world by using data.

In DT architecture, physical and virtual components integrate synchronously to create a closed loop.

Digital twin initiatives are the practical implementation of Cyber-Physical Systems (CPS) architecture in engineering and computer science domains. To understand the technical aspect of the digital twins, we need to know the CPS architecture.

In a nutshell, CPS is a technical construct converging physical and virtual domains. The core architecture of CPS is to embed communication and computing capacity into physical objects so that physical objects can be coordinated, controlled, and monitored via virtual means.

CPS integrates physical processes, computing, and networking into a single entity called an embedded object. We can use embedded objects in various devices and appliances; prime examples are medical devices, scientific instruments, toys, cars, fitness clothes, and other wearables.

CPS requires architectural abstraction and modeling. Based on the models, we can develop designs leveraging computation power, data, and application integration for monitoring physical phenomena such as heat, humidity, motion, and velocity.

CPS can leverage the IoT (Internet of Things) technology stack, and CPS can be part of the IoT ecosystems operated by global IoT service providers.

The scalability and capacity of the embedded objects are critical. 

_

Simplified digital twin reference architecture is depicted in the figure below:

_

_

Digital twin conceptual architecture is depicted in the figure below:

The conceptual architecture may be best understood as a sequence of six steps, as follows:

-1. Create:

The create step encompasses outfitting the physical process with myriad sensors that measure critical inputs from the physical process and its surroundings. The measurements by the sensors can be broadly classified into two categories: (1) operational measurements pertaining to the physical performance criteria of the productive asset (including multiple works in progress), such as tensile strength, displacement, torque, and color uniformity; (2) environmental or external data affecting the operations of a physical asset, such as ambient temperature, barometric pressure, and moisture level. The measurements can be transformed into secured digital messages using encoders and then transmitted to the digital twin.

The signals from the sensors may be augmented with process-based information from systems such as manufacturing execution systems, enterprise resource planning systems, CAD models, and supply chain systems. This would provide the digital twin with a wide range of continually updating data to be used as input for its analysis.
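
As a hedged illustration of the encoding mentioned above (a sensor measurement turned into a secured digital message before transmission to the twin), here is a minimal Python sketch using an HMAC signature; the key handling, field names, and transport are simplified assumptions, not a production design:

```python
# Sketch of the create step's encoding: a sensor measurement becomes a signed
# digital message before transmission to the twin. Key and fields are illustrative.
import hmac, hashlib, json, time

SHARED_KEY = b"replace-with-provisioned-device-key"

def encode_measurement(sensor_id: str, kind: str, value: float) -> dict:
    payload = {"sensor": sensor_id, "kind": kind, "value": value, "ts": time.time()}
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": signature}

# An operational measurement (e.g., torque) signed for transmission
msg = encode_measurement("torque-03", "operational", 41.7)
print(msg["sig"][:16], "...")  # the twin verifies this signature on receipt
```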

-2. Communicate:

The communicate step enables seamless, real-time, bidirectional integration/connectivity between the physical process and the digital platform. Network communication is one of the radical changes that have enabled the digital twin; it comprises three primary components:

  • Edge processing:

The edge interface connects sensors and process historians, processes signals and data from them near the source, and passes data along to the platform. This serves to translate proprietary protocols to more easily understood data formats as well as reduce network communication. Major advances in this area have eliminated many bottlenecks that have limited the viability of a digital twin in the past.

  • Communication interfaces:

Communication interfaces help transfer information from the sensor function to the integration function. Many options are needed in this area, given that the sensor producing the insight can, in theory, be placed at almost any location, depending on the digital twin configuration under consideration: inside a factory, in a home, in a mining operation, or in a parking lot, among myriad other locations.

  • Edge security:

New sensor and communication capabilities have created new security issues, which are still developing. The most common security approaches are to use firewalls, application keys, encryption, and device certificates. The need for new solutions to safely enable digital twins will likely become more pressing as more and more assets become IP enabled.

-3. Aggregate:

The aggregate step supports ingesting data into a data repository, where it is processed and prepared for analytics. The data aggregation and processing may be done either on premises or in the cloud. The technology domains that power data aggregation and processing have evolved tremendously over the last few years, in ways that allow designers to create massively scalable architectures with greater agility and at a fraction of past cost.

-4. Analyze:

In the analyze step, data is analyzed and visualized. Data scientists and analysts can utilize advanced analytics platforms and technologies to develop iterative models that generate insights and recommendations and guide decision making.

-5. Insight:

In the insight step, insights from the analytics are presented through dashboards with visualizations, highlighting unacceptable differences in the performance of the digital twin model and the physical world analogue in one or more dimensions, indicating areas that potentially need investigation and change.

-6. Act:

The act step is where actionable insights from the previous steps can be fed back to the physical asset and digital process to achieve the impact of the digital twin. Insights pass through decoders and are then fed into the actuators on the asset process, which are responsible for movement or control mechanisms, or are updated in back-end systems that control supply chains and ordering behavior—all subject to human intervention. This interaction completes the closed loop connection between the physical world and the digital twin.
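
To illustrate the act step, a minimal Python sketch follows; the insight format, actuator name, and human-approval gate are hypothetical, but they show how an analytics insight can be decoded into a control action subject to human intervention:

```python
# Sketch of the act step: an insight from the analytics layer is decoded into
# an actuator command, gated by optional human approval. Names are illustrative.
from typing import Optional

def decode_insight(insight: dict) -> Optional[dict]:
    """Map an analytics insight onto a concrete control action."""
    if insight.get("anomaly") == "paint_thickness_low":
        return {"actuator": "paint_flow_valve", "setpoint": insight["target"]}
    return None

def act(command: dict, require_human_approval: bool = True) -> None:
    if require_human_approval:
        print(f"PENDING APPROVAL: {command}")   # human-in-the-loop gate
    else:
        print(f"SENT TO ACTUATOR: {command}")   # closes the physical-digital loop

cmd = decode_insight({"anomaly": "paint_thickness_low", "target": 2.5})
if cmd:
    act(cmd)
```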

The digital twin application is usually written in the primary system language of the enterprise, which uses the above steps to model the physical asset and processes. In addition, throughout the process, standards and security measures may be applied for purposes of data management and interoperable connectivity.

The computation power of big data engines, the versatility of the analytics technologies, the massive and flexible storage possibilities of the aggregation area, and integration with canonical data allow the digital twin to model a much richer, less isolated environment than ever before. In turn, such developments may lead to a more sophisticated and realistic model, all with the potential of lower-cost software and hardware.

It is important to note that the above conceptual architecture should be designed for flexibility and scalability in terms of analytics, processing, the number of sensors and messages, etc. This can allow the architecture to evolve rapidly with the continual, and sometimes exponential, changes in the market.

_____

Digital twins are not standalone applications. Digital twins integrate into the organization’s existing enterprise application suite to support the intended business outcomes. When it comes to specific uses, methods, protocols and even enabling technologies, DT concepts will vary for each domain. This is mostly due to the nature of information from each domain. Each domain will determine the rationale for deploying a DT within a built environment by answering business-case questions. However, there is a general framework for the DT architecture which is composed of three main elements: the physical world, the virtual world and the connectivity between the two. Each element will integrate a variety of components dependent on the designer’s needs and requirements. However, some basic components include sensors in the physical world (to gather information from the real environment), a physical twin, edge processing capabilities, data security, the digital twin itself, data processing capabilities (enabled by machine learning (ML), artificial intelligence (AI), big data, etc.) and communication interfaces such as the internet, Bluetooth, satellite, etc. An important part of this DT architecture also includes data visualization for the user. This is showcased in figure below, in which the physical world is composed of the physical object or process, sensors, actuators and processing capabilities. The digital world is composed of the digital twin itself, machine learning and data processing capabilities and databases. Both are connected in the communication element where several protocols and interfaces are available such as WiFi, Bluetooth and wired connections. For the user, this architecture allows constant monitoring and visualization.

Figure above shows general DT architecture.

It is vital to note that the digital twin does not stand alone; it must be integrated with the overall enterprise architecture. As a matter of fact, some elements that are used in the digital twin are likely to already exist within the organization and they can be extended or re-purposed to support the digital twin models.

_____ 

More detailed architectural concepts and frameworks have been proposed for the implementation of DTs. An overview of these concepts is given and summarized in table below that also gives a classification based on their level of abstraction. A high level of abstraction means that a more general concept is presented, whereas a low level indicates a more concrete architecture or framework, targeting the implementation of a DT.

Overview of concepts, architectures, and frameworks for Digital Twins (DT):

Name | Target Domain | Structure | Main Parts | Level of Abstraction
3D-DT | Life-cycle Management | component-based | 3 components | high
5D-DT | Manufacturing | component-based | 5 components | high
5C Architecture | CPS in manufacturing | layer-based | 5 layers | high
Intelligent DT | Production Systems | component-based | 4 interfaces & 9 components | low
Ref. Framework for DT | CPS in general | component-based | 4 main components | low
COGNITWIN | Process Industry | components & layers | 5 layers & 19 components | low
Conceptual DT Model | CPS in general | layer-based | 6 layers | medium
AAS | Manufacturing | only meta-model | ongoing work | (not stated)

Note:

Detailed description of architecture of various types of digital twins is beyond scope of this article.

_____

_____

DT System:

Figure below shows a schematic diagram of a Digital Twin System, consisting of a physical asset, its digital twin, and a third part that represents all-things-human. Even the humans who may physically be inside the physical asset are schematically inside the circle representing humans. High-fidelity simulations are an integral part and a necessity for DTs; high- and low-fidelity simulations are represented in the ‘simulator’ part of the digital twin. The flows of information, data, actions and recommendations, and sensory data are represented by arrows. The physical asset is expected to be highly instrumented, and its operational history is also expected to be part of the DT. Applications of AI/ML are distributed across the DT as well as the human part of the system.

Figure above shows schematic diagram of a digital twin system.

_

DT with ChatGPT

The next frontier in this space is expected to be a marriage between DT (encompassing traditional simulations with AI/ML/DA/VR and sensory data) and intelligent natural language processing (NLP) systems, such as ChatGPT. Given the storm ChatGPT has caused in the months since its release, and the ways in which it is already being used, any attempt to predict its long-term impact is likely to fall short. In the context of this article, however, it is easy to visualise DTs being available to human operators (possibly at locations far from the physical assets) in a VR environment, linked with knowledge management systems comprising general-purpose systems such as ChatGPT as well as specialised (maybe even proprietary) knowledge bases specific to the physical asset. Information extraction would happen via spoken commands. The general-purpose and system-specific knowledge bases would intelligently integrate responses to specific commands before passing them to the humans. Responses could be in voice or text form. When appropriate, this hybrid, intelligent system could also make ranked recommendations for actions, along with associated costs, benefits, and risks. This information could be made available through VR or in AR HMIs. Voice commands and voice responses are likely to become much better integrated with query-and-response than they currently are. In addition to responding to specific queries, smart AI-based NLP systems could also make unprompted recommendations, in text or voice form, based on continuous analysis of sensory data processed through the AI and data analytics parts of the system.

______

______

Section-11

Challenges and limitations of digital twins:

_

Realizing a mammoth technology like DT comes with its own challenges. The challenges that arise when developing a DT depend on its scale and complexity, but some barriers are common to all. The table below gives an overview of common challenges to digital twins and the enabling technologies that will address those challenges.

_____

_____

Shared challenges:

It is becoming more evident that Digital Twin technology runs in parallel with AI and IoT technology, resulting in shared challenges. The first step in tackling these challenges is to identify them. Some of the common challenges are found in both data analytics and the Internet of Things, and the aim here is to identify the shared challenges for Digital Twins.

Table below shows a summary of challenges for both data analytics and I/IoT while showing the overarching combined challenges for a Digital Twin.

Shared challenges:

Digital Twin (combined) | Data Analytics | Industrial IoT/IoT
IT Infrastructure | IT Infrastructure | IT Infrastructure
Data | Data | Data
Privacy | Privacy | Privacy
Security | Security | Security
Trust | Trust | Trust
Expectations | Expectations | Expectations
Connectivity | | Connectivity

_____

_____

Key challenges in constructing Digital Twin:

The main challenges in constructing DT can be summarized as the following five aspects:

-1. High-fidelity modeling.

Due to the variability, uncertainty, and fuzziness of physical space, building models in virtual space that mirror entities with high fidelity is a fundamental issue. Virtual models ought to be faithful replicas of physical entities, reproducing physical geometries, properties, behaviors, and rules. Current modeling is usually limited to geometric consistency, and much work is needed at the other three levels. As equipment operates, the physical entity changes or degrades to a certain extent, and the built model becomes inconsistent with the entity. When inconsistencies between models and entities appear, it is difficult to appropriately identify and update them. Current modeling generally focuses on a specific life stage of a product. Building a virtual model of a product throughout its entire life cycle, including design, manufacturing, operation and maintenance, and recycling, is valuable but challenging.

-2. Data acquisition and processing.

Data comprise another key driver of DT, consisting of multi-temporal-scale, multidimensional, multisource, and heterogeneous data. The whole data lifecycle includes data collection, transmission, storage, processing, fusion, and visualization. To solve these problems, we need to integrate sensors, machine vision, the Internet, IoT, databases, data fusion, and other technologies. Some data processing approaches, such as signature-based techniques and various neural networks, have low accuracy, are time-consuming, and consume an excessive amount of computing resources. To ensure real-time and reliable simulation analysis results, we need to develop fast data analysis methods with high accuracy. In addition, we have to address the problems of fleet data, considering both the common characteristics and the individual differences of batch products.

-3. Real-time, two-way connection between the virtual space and the real space.

The virtual model obtains real-time data of physical entities, and the analysis results are utilized to guide the physical entities in real time. Due to the large amount of data, network transmission delays, model analysis time, etc., it is difficult for DT to achieve a real-time, two-way connection. We also need to solve problems such as visualization and human-equipment interaction.

-4. Unified development platform and tools.

Due to different formats, protocols, and standards, current tools may not be simultaneously integrated and applied for a particular objective. Therefore, the development of universal design and development platforms and tools for DT is required in the future.

-5. Environmental coupling technologies.

The current DT lacks association with the external environment. The mechanism by which a physical object interacts with its environment has not been fully embodied in virtual models. Many research studies have explored how physical entities interact with their environment in reality; however, there is still an urgent need for a corresponding digital expression method, which would enable efficient and accurate prediction in future DTs.

______

______

Challenges to Implement Digital Twin: 

DT technology being in its infancy means that, even though it has great potential, its implementation carries complications that can be either engineering- and technology-related or commercial. Such complications include ambiguity surrounding the definition or concept of DT, a lack of appropriate tools, expensive investment, data-related issues, a lack of rules and regulations, and more. The following paragraphs compile the most common themes currently preventing a streamlined adoption of DT technologies by industry.

-1. Novelty of Technology

As DT is still an emerging technology, there is a lack of clear understanding about the value it can bring to individuals, businesses, or industries. A shortage of technical and practical knowledge is also hindering the progress of the technology, as is the lack of case studies of successful practices or business models implementing DT in company activities, or of realistic estimates of the costs involved in this implementation.

Several technologies come together to make DT a reality, such as 3D simulation, IoT/IIoT, AI, big data, machine learning, and cloud computing. Since these technologies are themselves still in developing phases, this impedes the evolution of DT. The infrastructure for implementing DT needs to be improved to enhance the efficacy of the technology. Further research is needed in areas such as high-performance computing, machine learning, real-time virtual-real interaction, and intelligent perception and connection technologies, among others, in order to implement DT.

Along with the infrastructure, there is a need for supporting software. There are a plethora of software packages providing DT solutions, such as Predix by General Electric (GE), Azure Digital Twin by Microsoft, PTC, 3D Experience by Dassault Systèmes, ABB Ltd, Watson by IBM, and the Digital Enterprise Suite by Siemens, which makes it harder to identify and choose the platform that can deliver the most appropriate service for the specific needs of the interested industries/businesses.

-2. Time and Cost

One of the biggest challenges DT needs to overcome to reach its full potential is the high cost associated with its implementation. Developing ultra-high-fidelity computer models and simulating processes to create a DT is a time-consuming and labour-intensive exercise that also requires a huge amount of computational power, making DT an expensive investment. On top of that, fitting the existing system with sensors for data collection, along with the requirement for high-performance IT infrastructure (hardware as well as software for storing and processing that data), adds further cost. Gartner analyst Marc Halpern has also raised concerns over the cost- and time-related aspects of DT at the PDT Europe conference in Gothenburg, Sweden, noting that bringing DT concepts together can take more time and resources than one can imagine. A paper published by West and Blackburn gives a glimpse of the scale of cost and time involved in bringing DT into reality: the authors estimate that it could cost trillions of dollars and take hundreds of years to completely implement Digital Threads/Digital Twins of weapons systems for the U.S. Air Force, making it impractical to fully realize the technology. This makes it crucial for industries to perform a cost-benefit analysis before implementing DT. Because of the expense of DT implementations, their accessibility is limited by the availability of resources, which is often poor in developing countries.

-3. Lack of Standards and Regulations

As there are a plethora of DT models and architectures available in the literature, there is a need for defining a consistent framework for DT throughout the industry that includes shared and mutual understanding of interfaces and standardization for uniformity together with efficient design of data flow to make accessibility of data easier without compromising its security. Standardizing models, interfaces, protocols, and data is essential for efficient third-party communication, product and human safety, data security, and integrity, especially in industries such as aerospace, automobile, healthcare, etc. Besides that, standards and standards-based interoperability need to be developed to address the social and organizational challenges unfolded by digital transformation within industries. A lack of device communication and data collection standards can compromise the quality of data being processed for DT, which will be reflected on its performance. Since the technologies including big data and AI are also still in their infancy, the laws and regulations around them are yet to be formalized. There is a standardized framework ISO 23247 (Digital Twin Manufacturing Framework) under development which is aimed at providing guidelines and methods for developing and implementing Digital Twins in the manufacturing sector. This framework will have four parts: (i) Overview and general principles, (ii) Reference architecture, (iii) Digital representation of physical manufacturing elements, and (iv) Information exchange. Articles that explore the benefits, define concepts and architectures of DTs and review the technology’s state of the art are important for adopting a widespread, concrete understanding of DTs and their relevance. Furthermore, targeting this specific challenge with surveys and literature reviews, researchers may impact lower levels of the TRL to make basic principles and concepts widely known.

-4. Data Related Issues

As DT technology deals with the data, one of the biggest concerns that arises is about privacy, confidentiality, transparency, and ownership of these data.

Owning and sharing data is influenced by company policies as well as by the mindset of people and society about data ownership, placing a limitation on DT that goes beyond the complexities of technology and engineering. Not having proper policies in place for sharing data internally (within the organization) or externally (with stakeholders across the supply chain) can lead to data silos within different departments of an organization, which can be detrimental to the value chain, as data silos lead to inconsistency and synchronization issues. Another issue that needs to be considered is how to share data among different DTs, i.e., data interoperability. In a setting with multiple DTs at different hierarchical levels, each generating a different type of data and feeding the others, complicated relationships between data sets can arise that cause interoperability issues. Cybersecurity cannot be neglected when it comes to handling the data: while having data silos can affect the overall performance of the DT, not having them makes the DT more vulnerable to cybercrime.

Another major challenge regarding data involved in DT is its convergence and governance. Projects involving big data are likely to fail due to lack of data governance and management to tackle the challenges related to big data, which include identifying and accessing data, transforming data from different sources, poor quality of data, translation loss, etc.

Another challenge is the use of AI and big data to satisfy the long-term and large-scale requirements for data analysis. With the large amount of data generated and analyzed in DT systems, big data algorithms and the IoT technology are powerful allies that can provide support to a great extent to successful DT implementations. Furthermore, information flowing from various levels of indicator systems presents a challenge for developing common policies and standards.

-5. Communication network-related obstacles

There is a need to build faster and more efficient communication interfaces, such as 5G, to enable real-time data connectivity and operational efficiency for the DT. Large-scale deployment of digital twins requires the ability to connect many more sensors and devices, high-speed ubiquitous connectivity, improved reliability and redundancy, and ultra-low power consumption.

-6. Life-Cycle Mismatching

An additional concern over DT technology relates to products with long life cycles, such as buildings, aircraft, ships, machinery, or even cities. The life cycles of such products are far longer than the validity of the software used for designing or simulating the DT, or for storing and analyzing its data. This means there is a high risk that, at some point in the future, the formats used by the software will become obsolete, or that owners will be locked in with the same vendor for new versions of software or other authoring tools.

______

______

Limitations of digital twins:

DT is an emerging technology with significant potential. However, the following limitations hinder the thriving of DT:

(1) Most DT models contain only geometric models. Although some recent research has considered physical models, behavioral modeling and rule modeling are still lacking.

(2) Delays and distortions occur in data transmission. Current data transmission methods fail to satisfy the demand for high accuracy and high speed due to the large amount of data that needs to be transmitted simultaneously.

(3) Current data analysis algorithms and methods need improvement in both accuracy and rapidity.

(4) There are various DT platforms for different applications, especially in complex equipment monitoring and IoT fields. However, because of the diversity in communication protocols and different service demands, the development period of these platforms is too long to form a unified DT platform.

_

Limitations of AI in the digital twin:

Regardless of its benefits, AI doesn’t come without its challenges. There are several shortcomings and limitations that engineers are likely to encounter when using AI techniques to implement a digital twin:

  • AI technology needs trusted, ‘clean’ mass data to teach the AI system the expected properties/behavior. Such data is often hard to obtain and verify, especially for newly designed systems. Moreover, the verification and certification of AI-based behavior in safety-critical applications, such as autonomous driving, is a challenge yet to be solved.
  • A digital twin mirrors the properties/behavior of a concrete asset, while AI systems essentially mirror statistical properties/behavior. This can make AI twins problematic.
  • AI technologies have no physical background. They can show deviations between real and expected behavior but cannot easily explain the physical reasons behind them.
  • Finally, customers are always concerned with how to protect their IP and the data associated with their assets while sharing data.

There are two basic approaches to addressing the limitations when combining conventional AI techniques with physics models:

  • Sequential approaches.

Using AI to detect anomalies before a simulation is employed for detailed analysis. Conversely, simulations can be used to teach the AI before it is applied as a fast surrogate model for simulations (a minimal surrogate sketch follows after this list).

  • Parallel approaches.

AI and simulation models are combined: for example, by substituting well-understood components with AI surrogates in a system simulation, using AI to calibrate simulation models, or using AI to automatically analyze complex simulation results.
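
The sequential approach mentioned above can be sketched as follows (Python with scikit-learn; the "simulation" is a stand-in formula, and the model choice and inputs are illustrative assumptions, not a specific vendor's method):

```python
# Sketch of the sequential approach: a cheap AI surrogate is trained on
# (input, output) pairs produced by a physics simulation, then used in place
# of the slow simulation. The 'simulation' here is a stand-in formula.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def expensive_simulation(load_kn: np.ndarray) -> np.ndarray:
    """Stand-in for a slow physics solver (e.g., deflection vs. load)."""
    return 0.02 * load_kn**1.5 + 0.1 * np.sin(load_kn / 10.0)

rng = np.random.default_rng(0)
train_loads = rng.uniform(0, 100, 300)         # sampled simulation inputs
train_out = expensive_simulation(train_loads)  # simulation teaches the AI

surrogate = GradientBoostingRegressor().fit(train_loads.reshape(-1, 1), train_out)

test_loads = np.array([[25.0], [60.0], [95.0]])
print("surrogate: ", surrogate.predict(test_loads))
print("simulation:", expensive_simulation(test_loads.ravel()))
```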

_______

Don’t allow bad data to ruin your digital twin plans: 

Old machinery retrofitted with sensors, multiple IoT devices from varying manufacturers, event streaming, and the sheer volume of collected data are creating a perfect storm of problems for any business looking to use digital twin technology to simulate and monitor real-world activities. The problem is that so much of this data is inconsistent, inaccurate, or incomplete that it cannot be trusted as a source for creating simulations. It is a risk that needs addressing urgently to avoid wasted budgets and poor decisions.

For digital twins, poor data quality is the Achilles heel. IDC predicts in its Global DataSphere that the volume of data generated annually will more than double from 2021 to 2026; left unaddressed, this will only make the situation worse, as data quality tends to decrease as volumes increase. For the burgeoning digital twin market, this could be disastrous. Digital twins are being widely adopted across industries and rely on quality data to create accurate simulations of real-world scenarios. In manufacturing, construction, city planning, environmental monitoring, transportation, and healthcare, digital twins have already found a home, and they are only going to grow in influence. As the technology becomes bigger, more widespread, and more mission-critical to so many industries and organisations, the ability to replicate, measure, monitor, predict, test, and simulate in real time makes accurate data essential.

It’s easy to see why. Imagine a scenario where the underlying machine learning and AI algorithms are designed using poor-quality data, running within digital twins built using the same poor-quality data. This exacerbates the problem and leads to inaccurate anomaly alerts and predictions that cannot be trusted. In short, a waste of time and money with what is otherwise an excellent technology for saving both.
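
A simple defence is to put an explicit data-quality gate in front of the twin. The Python sketch below (field names, bounds, and the staleness limit are invented for illustration) rejects incomplete, out-of-range, or stale telemetry before it can distort the model:

```python
# Sketch of a data-quality gate in front of a twin: rejects incomplete,
# out-of-range, or stale telemetry before it reaches the model. Field
# names and bounds are illustrative.
import time

BOUNDS = {"engine_temp_c": (-40.0, 150.0), "rpm": (0.0, 8000.0)}
MAX_AGE_S = 60.0

def validate(msg: dict) -> list[str]:
    problems = []
    for field, (lo, hi) in BOUNDS.items():
        if field not in msg:
            problems.append(f"missing {field}")
        elif not (lo <= msg[field] <= hi):
            problems.append(f"{field}={msg[field]} out of range [{lo}, {hi}]")
    if time.time() - msg.get("ts", 0) > MAX_AGE_S:
        problems.append("stale message")
    return problems

# A faulty message: missing rpm, absurd temperature
msg = {"engine_temp_c": 620.0, "ts": time.time()}
issues = validate(msg)
print("rejected:" if issues else "accepted", issues)
```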

______

What a Digital Twin cannot do:

Caution is advised if too much time and effort is put into the Digital Twin approach, such that it is never really finished and ready for use. This not only leads to frustration for developers and product owners but also endangers entire projects, or at least slows down their execution.

It also becomes problematic when different business units, each with their own specific requirements, expect too many capabilities from the Digital Twin. Different applications could instead be developed in individual projects or sub-projects, but this can lead to frustration when tasks and teams proceed at different paces, and the added value of a shared Digital Twin is lost when every application is started and carried out in isolation. Sufficient attention should therefore be paid in advance to the shared use of infrastructure by different users and their respective requirements.

A Digital Twin is also not suitable as middleware, such as an Enterprise Service Bus or a queuing system. A device can mirror its data into the Twin, but the reverse flow is limited. According to experts, a Digital Twin should therefore not “bring” data to third-party systems like classic middleware. Instead, it is recommended to let the Twin record the presence of new data in an event queue while continuing to focus on its core competence.
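
This recommendation can be made concrete with a small Python sketch: instead of pushing payloads to third-party systems, the twin merely publishes a lightweight "new data available" event to a queue (the queue choice and event shape are illustrative assumptions):

```python
# Sketch of the recommendation above: the twin publishes a lightweight
# "new data available" event to a queue instead of pushing payloads to
# third-party systems itself. Queue and event shape are illustrative.
import queue, time

event_queue: "queue.Queue[dict]" = queue.Queue()

def on_twin_updated(twin_id: str, changed_fields: list) -> None:
    # Notify only; consumers fetch the data they need from the twin's API
    event_queue.put({"twin": twin_id, "changed": changed_fields, "ts": time.time()})

on_twin_updated("pump-1", ["vibration_mm_s"])
print(event_queue.get())   # a downstream system consumes the notification
```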

It should also be noted that a Digital Twin stores data and history only to a certain extent. Just as a device does not store all data, for capacity reasons not all data can be stored in a Digital Twin; a complete archive of all transaction data would, for example, overload a Twin. To store complete histories, it is recommended to use a Data Lake. In general, be aware that this technology generates a very large amount of data and that corresponding database technologies are required.

Another aspect to keep in mind is the security risk that a Digital Twin can pose, even though security issues always play a role in IT and are an integral part of discussions involving cloud technologies. Ultimately, it is always a trade-off between security and efficiency.

_____

_____

Legal implications of digital twins:

At the most basic level, a digital twin could simply act as a central repository of information, incorporating data about how a specific asset – say a building – has been designed and constructed, into which further data is added about how it performs and ages over time. This could be used to inform the management, operation and maintenance of the building.

However, at the most advanced level, a digital twin could be something far more complex and multi-layered, incorporating virtual projections of almost anything. This is the scale which the UK government is advocating – an intricate model of our current infrastructure which could be used to inform decisions and test solutions to population growth, congestion, climate change, you name it. A model of this complexity would be very challenging for a contractual framework to govern. Issues such as data ownership, causation and liability could all potentially be unclear, difficult to unravel, and contentious. The range of potential legal issues will no doubt expand as the use of digital twins evolves.

_

  • Data ownership:

In many cases digital twins may incorporate copyright material, meaning that the intellectual property provisions of contracts will need to be updated to reflect the now wider range of use of the data for the digital twin. In some cases, this may be as simple as including the digital twin within the contractual definition of permitted use, but this will vary from contract to contract.

Any licence granted in relation to use of the data should be for a suitably long period so as not to expire before the end of the life of the twin. This is likely to be the full life cycle of the asset, process or system, so potentially many years.

There also needs to be legal clarity on who is the rightful owner of the data held within the model. It is important that the rights of individual parties making a distinct contribution continue to be recognised, particularly where the data shared incorporates copyright material. However, complex situations may arise where more than one party has contributed, as the end product of the data sharing might result in a situation of joint ownership.

Unless there are specific contractual provisions covering this, the rights of joint owners may not be clear. Further, if ownership of the individual data contributions sits with the party who shared the data, where does ownership of the digital model as a whole lie? This will need to be established.

_

  • Data sharing and confidentiality:

By their very nature, digital twins rely on data sharing, and the contracts that govern them must allow for that. This runs contrary to current norms that oppose non-essential sharing of data. This prevailing cultural resistance to data sharing, particularly where the benefits of doing so are technical, complex, or difficult to understand, must be broken down if the full benefits of digital twins are to be realised. The case will need to be made (where it can be) that the lasting benefits to the public are a legitimate reason for sharing data.

There is a related issue of confidentiality. Given the number of stakeholders who may have access to the digital twin, some parties may not feel comfortable with sharing confidential information, such as trade secrets. This may be further compounded by the fact that with any data-sharing platform there is always a risk of security breaches and data losses, and any vulnerabilities associated with such systems will increase substantially when different digital twins are amalgamated.

Where the data is considered to be confidential, appropriate non-disclosure clauses may be required within individual contracts, or a project-wide confidentiality agreement may need to be signed by all the parties. Where there are many users of the twin, different access permissions may be required to allow confidential data to only be viewed by certain users. Similarly, some parties may request the redaction of certain data. However, this should be proportionate. Although it may not be essential that all information be embedded into the model, a digital twin will only ever be as good as the data that goes into it.
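To make the access-permission idea concrete, here is a minimal, hypothetical Python sketch of role-based redaction of a twin's data; the field names and roles are invented for illustration and are not drawn from any real platform:

def visible_fields(record, role):
    # Hypothetical rule: confidential commercial fields are hidden from
    # everyone except the data owner.
    confidential = {"unit_cost", "supplier_terms"}
    return {k: v for k, v in record.items()
            if role == "owner" or k not in confidential}

asset = {"id": "pump-01", "status": "ok",
         "unit_cost": 1250.0, "supplier_terms": "NET-30"}
print(visible_fields(asset, "contractor"))  # confidential fields redacted
print(visible_fields(asset, "owner"))       # full view for the owner

In practice such rules would sit in the twin platform's access-control layer rather than in application code, but the principle of per-user views is the same.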

_

  • Liability:

Probably the most complex issue to touch on is that of liability. Digital twins are interconnected systems in which changes in one item of data will impact other parts of the model, and as the digital twin evolves, more and more parties may be using and relying on data that could include errors. In situations where there are multiple parties and data sources, the digital twin is likely to require a single organisation to act as gatekeeper to the data, to prevent unauthorised changes. However, where there is an error, it may still be difficult to establish where the liability lies.

Equally, the blame may not lie with one party alone, or it may be hard to prove that the original data provided was not of sufficient quality to begin with. The fact that different parties are relying on the accuracy of data provided by one another may also lead to trust issues.

The Gemini Principles, a series of values published by the Centre for Digital Built Britain’s Digital Framework Task Group to guide the creation of the national digital twin, place heavy emphasis on clarity of purpose, trust, openness, quality and the effective function of the twin.

Given the complexity of potential liability issues, it would be beneficial to ensure that all contracts are clear about the purpose and function of the data, perhaps with reference to the Gemini Principles, with the aim of fostering trust between the parties involved and to encourage the sharing of data which meets the same high standards.

The NEC suite of contracts, which provides that parties must act in a spirit of mutual trust and cooperation, is certainly evidence that contracting can encourage more collaborative and open ways of working. Although it may be difficult to enforce such obligations (especially given that the common law imposes no general duty to act in good faith), having clear terms of reference may go some way towards avoiding disputes between the parties later down the line.

_____

_____

Section-12

Digital twin purpose, benefits and pitfalls:  

_

Why digital twin?

Decision cycles across industrial environments are becoming increasingly disrupted by the proliferation of data, new data sources, and growing compute speeds within an increasingly volatile business environment. The digital twin is the key to effective decision making in this new world. Making better decisions, faster, and executing them well every time is vital for delivering sustained, superior results. However, this is easier said than done. Every individual perspective is underpinned by a set of cognitive biases that drive swift action in adversity but make it a challenge to weigh evidence accurately, assess probabilities, and decide logically; look no further than the persistent gap between strategic planning and ambition on the one hand and realized results on the other. A single view of the truth and of analytics is therefore key to situational awareness and effective organizational decision making. Yet many players in the industry are stuck on determining what type of analytics they need. The answer should be driven by the problem, not by how much analytics can be thrown at data in the hope that it will both find the problem and solve it. The desired outcome should determine the type of analytics being sought and the available analytics technology that is fit for purpose.

_

Basic analytics technology can move data around and display key performance indicators (KPIs) to the right people at the right time to enable decision making. It works well for understanding, in hindsight, what happened. However, increasing plant complexity requires more sophisticated ways of approaching KPIs and targets; in some cases, a rudimentary approach to KPI setting and monitoring can even become ineffective and counterproductive. Here, deeper analytics technology, using digital twins, is necessary to account for the multidimensional factors and nonlinear trade-offs that make effective decision making a challenge. The digital twin allows “What if?” and “What’s best?” scenarios to be run automatically on actual plant data to determine the available strategies that maximize profitability. Experts can then review the recommended strategies to assess the effect of each approach without disrupting the live process.
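As a toy illustration of the “What if?”/“What’s best?” idea, the Python sketch below evaluates candidate operating strategies against a surrogate plant model; the model and all of its numbers are assumptions invented for the example, not a real process model:

from itertools import product

def predicted_profit(feed_rate, reactor_temp):
    # Hypothetical surrogate model calibrated from plant data: yield peaks
    # near 350 degrees, while energy cost rises with feed rate and temperature.
    yield_frac = max(0.9 - 0.0004 * (reactor_temp - 350) ** 2, 0.0)
    revenue = 120 * feed_rate * yield_frac
    energy_cost = 0.02 * feed_rate * reactor_temp
    return revenue - energy_cost

# "What if?": evaluate candidate strategies on the twin, not on the live plant.
candidates = product(range(80, 121, 10), range(330, 381, 10))
best = max(candidates, key=lambda c: predicted_profit(*c))
print("Best (feed rate, temp):", best, "profit:", round(predicted_profit(*best), 1))

A real digital twin would replace the toy model with a validated process model fed by live plant data, but the decision loop is the same: simulate many futures, then let experts review the best ones.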

_

A digital twin works in the present, mirroring the actual device, system, or process in simulated mode, but with full knowledge of its historical performance and an accurate understanding of its future potential. The digital twin can therefore exist at any level of the traditional ISA-95 architecture and can be defined as a decision support tool that enables improved safety, reliability, and profitability in design or operations. It is a virtual/digital copy of a device, system, or process that accurately mimics actual performance in real time, is executable, and can be manipulated, allowing a better future to be developed. A digital twin is useful across the entire life cycle of an asset. It is ideally created during the initial study to evaluate the feasibility and process model of the asset. It is then used and further developed during the design, construction, and commissioning of the asset, facilitating the optimal design of the asset and the training of the staff who will operate it. During the bulk of a plant’s life cycle, operation and maintenance, the digital twin can be employed for optimization and predictive maintenance.

_

The digital twin enables everyone to see inside assets and processes and perceive things that are not being directly measured. They are wired so that insights are instantly available without end users having to wrangle data and models, and they run in a consistent way that everyone can understand and agree on. In this way the digital twin drives agility and convergence in understanding and action across the whole business, for example from engineering to operations, operations to supply chain, reservoir to facilities, and shop floor to board room. The digital twin aims to be an accurate representation of a device, system, or process over its full range of operation and its full life cycle. Ideally, the digital twin should be able to transition from design to operations with ease.

_

To achieve the desired levels of accuracy, source data must be gathered in real time and be validated and reconciled to ensure that all physical and chemical laws are respected. Electronic noise and dynamic effects must be eliminated through filtering. Only through this approach can data quality issues be identified and mitigated, and the digital twin be trusted to reflect reality and be relied on for the quality and accuracy of its predictions. Although individual point-solution digital twins exist today, a future digital nirvana has one multipurpose digital twin. Getting to that future state in one step is unrealistic; it is more likely to be achieved by connecting valuable, high-performing individual elements. The mantra therefore has to be one of agility: think big, start small, scale fast, and drive adoption.
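Here is a minimal sketch of that validate-filter-reconcile step, with invented readings: a smoothing filter suppresses electronic noise, and a steady-state mass balance (flow in should roughly equal flow out) flags data quality issues:

def ema(values, alpha=0.2):
    # Exponential moving average: a simple noise filter.
    smoothed, s = [], values[0]
    for v in values:
        s = alpha * v + (1 - alpha) * s
        smoothed.append(s)
    return smoothed

flow_in = [100.2, 99.8, 130.0, 100.1]   # 130.0 is a suspect spike
flow_out = [99.9, 100.0, 100.3, 99.7]

for t, (fi, fo) in enumerate(zip(ema(flow_in), flow_out)):
    if abs(fi - fo) > 0.05 * fo:  # assumed 5% reconciliation tolerance
        print(f"t={t}: mass balance violated (in={fi:.1f}, out={fo:.1f})")

Industrial data reconciliation uses far more rigorous, model-based methods, but the principle is the one shown: no measurement enters the twin until it has been checked against the physics it must obey.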

Some examples of what digital twins are mirroring today include:

  • instrument/device
  • control system
  • 3D design and engineering
  • worker
  • process/optimization
  • energy/utilities
  • supply chain

Considering the above, some can understandably believe that “digital twin” is a marketing term used to repackage technologies that have long been available in the market. To some extent that might be true, but not all digital twins are made equal. Their perceived use value varies: for example, a 3D computer-aided design model of a plant may be of less value to a process engineer than a digital copy of the plant’s operating conditions and of the way in which molecules behave and transform. If anything, the term has been a catalyst for driving clarity and understanding of the value it represents. Comprehensive digital twin solutions have been developed for integrated production management systems. These operate across the entirety of the process manufacturing supply chain and asset life cycle to align production management and reliability, energy and supply chain optimization, and strategic asset investment planning.

_____

Using a digital twin enhances insight into and understanding of how your systems work and interact, helping evaluate parameters and interdependencies. As a virtual environment, a digital twin also allows low-cost, no-risk experimentation. Three factors matter in the digital twin: it allows us to understand, to predict, and to optimize. These three components drive positive business outcomes. The main reasons for using a digital twin are that it provides:

-1. system insight

Inspect how processes are operating and how they interact. See how resources are being utilized and understand workflows. For example, a digital twin for improving project management at a turbine manufacturer.

-2. what-if analysis

Test ideas and explore possibilities for your processes and systems and see how they perform based on current or historical data. For example, digital twin simulations that help manufacturing maintenance decision making.

-3. clarity

A digital twin model can output statistics and provide visuals, including 3D animations, to help verify analyses, increase understanding, and more easily communicate findings and proposals. For example, a simulation based digital twin for well construction process optimization.

___

According to Deloitte, digital twin technology can deliver specific business value in the following areas:

  • improve quality of products and processes and help predict and detect quality defects more quickly
  • improve customer service by enabling a better understanding of equipment and determining warranty costs and claim issues more accurately
  • reduce operating costs by improving product design, equipment performance, and by streamlining operations and reducing process variability
  • create record retention of serialized parts and raw materials to support tracking and quality investigation
  • reduce time to market and cost to produce a new product by reducing lead times of components and optimizing supply chain performance
  • create new revenue growth opportunities by helping to identify new products and improving efficiency and cost to service.

_____

Building on a report from Oracle, the following eight value additions of the digital twin can be identified:

-1) Real-time remote monitoring and control:

Generally, it is almost impossible to gain an in-depth view of a very large physical system in real time. A digital twin, owing to its very nature, can be accessed anywhere. The performance of the system can not only be monitored but also controlled remotely using feedback mechanisms.
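A toy sketch of that monitor-and-control loop through a twin follows; the values, threshold, and proportional correction are all illustrative assumptions, not a real control law:

class RemoteTwin:
    # Toy remote view of an asset.
    def __init__(self):
        self.telemetry = {"pressure_bar": 8.7}
        self.commands = []

    def read(self, key):
        return self.telemetry[key]   # monitor from anywhere

    def send_command(self, cmd):
        self.commands.append(cmd)    # control via the twin's feedback channel

twin = RemoteTwin()
target = 8.0
reading = twin.read("pressure_bar")
if abs(reading - target) > 0.5:
    # Simple proportional correction as a placeholder for a real control law.
    twin.send_command({"valve_adjust": round(0.1 * (target - reading), 3)})
print(twin.commands)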

-2) Greater efficiency and safety:

It is envisioned that digital twinning will enable greater autonomy, with humans in the loop as and when required. This will ensure that dangerous, dull, and dirty jobs are allocated to robots, with humans controlling them remotely. This way, humans will be able to focus on more creative and innovative work.

-3) Predictive maintenance and scheduling:

Comprehensive digital twinning will ensure that the multiple sensors monitoring the physical assets generate big data in real time. Through smart analysis of these data, faults in the system can be detected well in advance, enabling better scheduling of maintenance.

-4) Scenario and risk assessment:

A digital twin, or more precisely a digital sibling of the system, enables what-if analyses that result in better risk assessment. It becomes possible to perturb the system to synthesize unexpected scenarios and to study both the system’s response and the corresponding mitigation strategies. This kind of analysis, carried out without jeopardizing the real asset, is only possible via a digital twin.

-5) Better intra- and inter-team synergy and collaborations:

With greater autonomy and all the information at their fingertips, teams can better utilize their time to improve synergies and collaboration, leading to greater productivity.

-6) More efficient and informed decision support system:

The availability of quantitative data and advanced analytics in real time will support more informed and faster decision making.

-7) Personalization of products and services:

With detailed historical requirements, the preferences of various stakeholders, and evolving market trends and competition, the demand for customized products and services is bound to increase. A digital twin, in the context of the factories of the future, will enable faster and smoother gear shifts to account for changing needs.

-8) Better documentation and communication:

Readily available real-time information, combined with automated reporting, will help keep stakeholders well informed, thereby improving transparency.

_____

Socioeconomic impact:

Digital twinning will bring about unprecedented automation in the management of any physical asset. One of the first concerns that can become a stumbling block for the adoption of digital twins, just as for any other automation technology, is its acceptability to the workforce. The fear of losing jobs seemed very logical a few decades ago; however, even at that time contrary opinions, based on systematic studies, were prevalent. In fact, Sheridan showed that automation merely results in a redistribution of work without much impact on employment. Such studies highlight the vulnerability of workers with lower qualifications in repetitive jobs. There are also positive aspects of automation: with careful task allocation between humans and machines, it can enable greater safety and creativity at the workplace by delegating dirty, dumb, and dangerous (3D) jobs to machines and artificial intelligence. Such task allocation has been extensively studied. Humans should remain in the loop, not only to coordinate AI developments but also to check AI results.

At this point it is worth remembering Lisanne Bainbridge’s reflection on the ‘‘ironies of automation’’. It states that as automation takes over the simpler work, the role of humans in managing the more complicated, unpredictable tasks becomes even more critical. How to prepare humans to deal with unexpected events after prolonged phases of inactivity will be a major challenge. Moreover, for economic reasons there will be less motivation to recruit professional users of the relevant technologies, so a base of non-professional users might be a natural outcome; this will require strategies to handle security and hacking. As the world gears towards the greater autonomy that digital twinning brings, efforts will have to be made to create opportunities for all, not just for a select few. Nevertheless, with good guiding intentions, the technology will improve the quality of work at workplaces and, with good training and career counselling, will enable workers to focus on more creative work.

______

______

Are digital twins living up to the hype?

In a recent survey of 300 executives across a range of industries, the responses showed that it is getting there. Organizations are pursuing digital twins: only 5% of respondents did not have digital twins as part of their larger digital transformation strategy, and 86% reported that digital twins are an achievable goal worth pursuing or investing in today. Companies aren’t using digital twins as a solution for just one problem; they’re implementing the concept to achieve a variety of improvements.

The Benefits of Digital Twin:

-1. Improved customer satisfaction

We are in an increasingly customer-centric marketplace, and delivering customer satisfaction can be a competitive differentiator. Customers expect companies to understand their needs, and research shows customers will spend more if they know they will receive excellent service. Digital twins can support improved customer satisfaction through use cases like predictive maintenance, but because they collect real-time data on the product, they can also enable smoother customer service and repair operations while informing future product improvements. A recent survey found that this benefit is a top priority for the aerospace and defense, industrial machine and electrical equipment, and pharmaceutical industries.

-2. Improved product quality

This benefit comes with time and data collection through the digital twin. After initial investments have been made, generational improvements of a product – based on real-world operational data from many digital twins – can inform engineers and designers when developing a new product or version. This benefit is a top priority within the following industries: Automotive, Chemicals, High-tech, Industrial Machine and Electrical Equipment (tie), and Medical Devices.

-3. Reduced time to market

With digital twins, companies receive continuous insights into how their products are performing in the field. With these insights, they can iterate and innovate products faster and with more efficiency. When used with digital models and simulation tools, digital twin enables engineers to validate product performance before physical prototyping. For costly products, this results in significant savings in both costs and time. Digital twins can help sidestep late-stage redesign and reduce time-to-market. Given the acceleration of innovation over the past decade, this benefit offers a competitive advantage.

-4. Inform and drive sustainability efforts

An increasingly prevalent goal for organizations is to drive sustainability in as many efforts as possible. There are opportunities across the value chain with digital twins. It can mean swapping out product materials for more sustainable options, reducing carbon emissions or scrap in the manufacturing process, or decreasing the number of service truck rolls. When integrated into a greater digital transformation strategy, digital twin can provide the data, visibility, and visualization necessary to improve performance across multiple dimensions related to sustainability. For example, a company could apply a plant-wide digital twin model to identify potential areas for efficiency by integrating disparate data models and uncover collective improvement opportunities. From these previously unavailable insights, the company can then take steps to reduce energy consumption or find ways to increase production capacity without additional resources.

-5. Enhance supply chain agility and resilience 

Supply chain disruptions are plaguing companies in the wake of the COVID-19 pandemic and the Russian invasion of Ukraine, both of which have put a spotlight on agility and resilience. A combination of emerging technologies and platforms have made it possible to pursue a digital twin of the physical end-to-end supply chain. With this type of digital twin, companies gain visibility into their supply chain, such as lead times, and can make real-time adjustments internally and with their partners.

-6. Enable new business models (i.e., product as a service) 

Digital twins sometimes have a secondary benefit if you’re able to think about the possibilities. Celli creates beverage dispensers that are used all around the world. They incorporated digital twin technology into their dispensers to reduce maintenance costs and product downtime. However, they soon realized the data collected as part of the digital twin wasn’t just relevant and valuable to their business – it could help improve their customers’ business. For example, the data revealed patterns in drink consumption, like which beverages were selling best and where they were most popular. They were then able to deliver these analytics for customers on a subscription basis. Restaurant and bar owners can use this data to better anticipate orders and stock inventories more effectively.

-7. Drive operational efficiency 

Businesses want to be able to execute with greater efficiency, but it’s hard to know where to start. Digital twins offer the insights necessary to gain those operational efficiencies across the value chain. Particularly relevant to process digital twins, organizations can bring together different data sets to capture real-time information on asset and production performance. Not only can they see where there might be bottlenecks, but also how potential solutions could impact the overall process.

-8. Improve productivity

The challenge of employee turnover and retention is nearly universal across industries. When a skilled employee leaves, they almost always take their knowledge with them, creating a barrier that slows productivity. With digital twin, organizations can mitigate some of these challenges through remote monitoring and assistance. Offsite experts can observe active equipment through systems that utilize digital twin technology. If an issue is detected, an onsite employee is contacted and could potentially be guided through the maintenance procedure with the help of augmented reality. If AR guidance is not available and a service technician needs to be deployed, they already have the information necessary to make the repair. This can reduce truck rolls and increase first-time-fix-rates. Harpak-Ulma uses digital twin in this way. They’ve seen productivity improvements between 30% and 60%, depending on the specific application.

_____


______

Advantages of Digital Twin: 

The main reason DT technology is seen as a cornerstone of Industry 4.0 is its plethora of advantages, including the reduction of errors, uncertainties, inefficiencies, and expenses in any system or process. It also removes the silos between processes and organizational units that otherwise work in isolation within the compartments and divisions of more traditional industrial structures.

Some of the advantages reported for DT include: 

  • Speedy prototyping and product re-design:

Since simulations allow the investigation of a number of scenarios, the design and analysis cycles shorten, making the whole process of prototyping or re-designing easier and faster. Once implemented, the DT can be used at different stages of the product design process, from conceptualizing the idea of the product to testing it. Besides that, it also creates an opportunity to customize each product based on users’ needs and usage data. Since the DT is connected to its physical twin throughout its lifetime, the actual and predicted performance can be compared, allowing engineers and product designers to reconsider the assumptions on which the product was designed.

  • Cost-effective:

Because a DT involves mostly virtual resources, the overall cost of prototyping decreases with time. In traditional prototyping, redesigning a product is time-consuming as well as expensive because of the use of physical materials and labour; on top of that, a destructive test means the end of that costly prototype. Using a DT, products can be recreated and put through destructive tests without any additional material cost. Thus, even if the costs are equal at the start, the physical costs keep increasing with inflation while the virtual costs decrease significantly as time progresses. A DT allows the testing of products under different operating scenarios, including destructive ones, without additional cost. Moreover, once implemented, a DT can reduce operating costs and extend the life of equipment and assets.

  • Predicting Problems/System Planning:

Using a DT, we can predict problems and errors in future states of its physical twin, giving us an opportunity to plan systems accordingly. Thanks to the real-time data flowing between the physical asset and its DT, problems can be predicted at different stages of the product lifecycle. This is especially beneficial for products that have multiple parts, complex structures, and multiple materials, such as aircraft, vehicles, and factory equipment, because as the complexity of a product increases, it becomes harder to predict component failures using conventional methods.

  • Optimizing Solutions and Improved Maintenance:

Traditional methods of maintenance are based on heuristic experience and worst-case scenarios rather than on the specific material, structural configuration, and usage of an individual product, making them reactive rather than proactive. A DT, however, can foresee defects and damage in a manufacturing machine or system and can thus schedule maintenance in advance. By simulating different scenarios, the DT provides the best possible solution or maintenance strategy, making maintenance of the product or system much easier. In addition, the constant feedback loop between the DT and its physical counterpart can be used to continuously validate and optimize the system’s processes.

  • Accessibility:

The physical device can be controlled and monitored remotely using its DT. Unlike physical systems, which are restricted by their geographical location, virtual systems such as DTs can be widely shared and remotely accessed. Remote monitoring and control of equipment and systems become a necessity where local access is limited, as during the COVID-19 pandemic, when lockdowns were enforced by governments and working remotely or contact-free was the only viable option.

  • Safer than the Physical Counterpart:

In industries such as oil and gas or mining, where working conditions are extreme and hazardous, the capability of a DT to remotely access its physical twin, as well as its predictive nature, can reduce the risk of accidents and hazardous failures. The advantage of remote access is not limited to preventing accidents, however: during the global COVID-19 pandemic, avoiding human contact and in-person monitoring was also a way to guarantee safety. According to a recent Gartner survey, almost one-third of companies are using DTs amid the pandemic to increase the safety of employees and customers through remote monitoring.

  • Waste Reduction:

Using a DT to simulate and test product or system prototypes in a virtual environment significantly reduces waste. Prototype designs can be probed and scrutinized virtually, under a variety of different test scenarios, to settle the final product design prior to manufacture. This not only saves on material waste but also reduces development costs and time to market.

  • Documentation and Communication:

To create a DT, it is important to synchronize data scattered across different software applications, databases, hard copies, etc., which simplifies the process of accessing and maintaining the data in one place. DTs enable a better understanding of system reactions and can thus be used to document and communicate the behaviour and mechanisms of the physical twin.

  • Training:

DTs can be used to develop safety training programmes that are more efficient and illustrative than traditional ones. Before working on a high-risk site or with hazardous machinery, operators can be trained on a DT to reduce the dangers: exposing them to, and educating them about, different processes and scenarios makes them confident in dealing with the same situations in person. For example, mining is a high-risk environment where new employees can be trained using a DT on machinery operations as well as on how to deal with emergency scenarios. A DT can also be a great tool for closing the knowledge gap between experienced workers and newcomers.

_____

_____

Digital Twin Disadvantages:

Here are the potential disadvantages of adopting digital twin:

-1. Security vulnerability

As per Gartner’s estimate, by 2023 over three-fourths of the digital twins implemented for IoT-connected manufacturing products will have a minimum of five different kinds of integration endpoints. Each of these endpoints represents a potential security vulnerability. Given the multitude of data collected by digital twins, a breach can prove catastrophic for the organization unless all security concerns are addressed beforehand.

-2. Stringent workforce training requirements

Manufacturing is predominantly labour-intensive, and adopting sophisticated technology can lead to a plethora of unforeseen bottlenecks. Moreover, to make the most of digital twins, organizations have to adapt to new ways of working and integrate technical data more than ever. As a result, there can be a reluctance to work with digital twins.

-3. Using the same platform for different applications

Although it’s tempting to try and repurpose a digital twin platform, doing so can lead to incorrect data at best and catastrophic mistakes at worst. Each digital twin is completely unique to a part or machine, therefore assets with unique operating conditions and configurations cannot share digital twin platforms.

-4. Going too big, too fast

In the long run, a digital twin replica of your entire production line or building is possible and could provide incredible insights, but it is a mistake to try and deploy digital twins for all of your pieces of equipment or programs all at once. Not only is doing too much, too fast costly, but it might cause you to rush and miss critical data and configurations along the way. Rather than rushing to do it all at once, perfect a few critical pieces of machinery first and work your way up from there.

-5. Inability to source quality data

Data collected in the field is subject to quality errors due to human mistakes or duplicate entries. The insights your digital twin provides you are only as valuable as the data it runs off of. Therefore, it is imperative to standardize data collection practices across your organization and to regularly cleanse your data to remove duplicate and erroneous entries.
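A minimal sketch of such cleansing follows, with invented field data: duplicates are dropped on a (device, timestamp) key, and entries outside an assumed plausible range are discarded:

readings = [
    {"id": "m1", "ts": 1, "temp_c": 71.0},
    {"id": "m1", "ts": 1, "temp_c": 71.0},    # duplicate entry
    {"id": "m1", "ts": 2, "temp_c": -999.0},  # sensor error code
    {"id": "m1", "ts": 3, "temp_c": 72.4},
]

seen, clean = set(), []
for r in readings:
    key = (r["id"], r["ts"])
    if key in seen:
        continue                              # drop duplicates
    if not (-40.0 <= r["temp_c"] <= 150.0):   # assumed plausible range
        continue                              # drop erroneous entries
    seen.add(key)
    clean.append(r)

print(len(clean), "of", len(readings), "readings kept")  # 2 of 4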

-6. Lack of device communication standards

If your IoT devices do not speak a common language, miscommunications can muddy your processes and compromise your digital twin initiative. Build an IT framework that allows your IoT devices to communicate with one another seamlessly to ensure success.
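As a small illustration of what such a framework does at the data level, the sketch below maps two invented vendor payload dialects onto one common schema, so downstream twin logic never sees vendor-specific field names or units:

def normalize(payload):
    if "tempF" in payload:            # hypothetical vendor A: Fahrenheit
        return {"device": payload["dev"],
                "temp_c": (payload["tempF"] - 32) * 5 / 9}
    if "temperature_c" in payload:    # hypothetical vendor B: Celsius
        return {"device": payload["id"],
                "temp_c": payload["temperature_c"]}
    raise ValueError("unknown device dialect")

msgs = [{"dev": "a-17", "tempF": 154.4}, {"id": "b-03", "temperature_c": 68.0}]
print([normalize(m) for m in msgs])   # both arrive as {"device": ..., "temp_c": ...}

Real deployments solve this with standards (e.g., OPC UA, or MQTT with agreed payload schemas) rather than hand-written adapters, but the goal is the same: one language for all devices.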

-7. Failing to get user buy-in

A successful digital twin strategy includes users from across your product value chain. It is critical that your users understand and appreciate the value your digital twin brings to them individually and to your organization as a whole. Lack of buy-in due to skepticism, lack of confidence, or resistance can lead to a lack of user participation, which can undermine all of your efforts. 

-8. High Levels of Investment required

A major drawback of digital twins is the high level of investment needed to implement them. Not every business can justify the investment when financial resources are tight. For virtual twin technology to become prevalent across all industries, adoption and installation costs have to be minimized, or different options must be offered to suit every company’s budget.

-9. Danger of inaccurately representing an object using digital twin

The biggest concern of most business owners interested in this technology is the risk of misrepresenting the object or system they want to replicate, particularly given that there is not much information about how accurately a twin matches its physical counterpart.

-10. Accuracy of future scenario simulation using digital twin

The other major concern troubling most company owners keen on trying out digital twins is doubt about the overall accuracy of the simulations the technology will build.

_____

When digital twin becomes the evil twin:

Most digital twins exist in public clouds, for the obvious reason that they are much cheaper to run and can access all of the cloud’s storage and processing, as well as special services such as AI and analytics, to support the twin. Moreover, the cloud offers purpose-built services for creating and running twins. The ease of building, provisioning, and deploying twins has led to a few issues where the digital twin becomes the evil twin and does more harm than good. Some of the examples include:

  • In manufacturing, the twin over- or understates the proactivity needed to fix issues before they become real problems. Companies are fixing things identified in the twin simulation that actually don’t need to be fixed. For example, they replace hydraulic fluid three times more often than needed in a factory robot. Or worse, the twin suggests configuration changes that result in overheating and a fire. That last one really happened.
  • In the transportation industry, a digital twin could shut down a jet engine because of what the twin simulated to be a fire but what turned out to be a faulty sensor.
  • In healthcare, a patient was flagged as having factors likely to lead to a stroke, but this was determined to be a problem with the predictive analytics model.
  • The Pentagon and industry have turned to “digital twins” to support development of some of their most critical programs. But experts warn that the technology could also be used by adversaries for ransomware, phishing and even cyber warfare.

The point is that attaching simulations to real devices, machines, and even humans leaves a great deal of room for error. Most of these failures can be traced back to people creating twins for the first time who do not find their mistakes until shortly after deployment. The problem is that you could crash an airplane, scare the hell out of a patient, or have a factory robot catch fire.

With the cloud making the use of digital twins much more affordable and speedier, issues like these are increasing.

_____

_____

Section-13

Applications of digital twins:

_

The more general applications of DTs can be found in industrial settings, inside lifecycle management platforms, in predictive maintenance, and in the automotive industry. The most recent and/or future DT applications include agriculture, healthcare, business, construction, education, mining, natural disaster detection, communication, and security. Another very complete review of DT technology remarks on the use of DTs in applications for logistics, robotics, design, manufacturing (production, modeling, experiments, and process), and products. The use of DTs for supply networks and service in industrial operations is also mentioned in another review.

In a review study, researchers collected academic publications containing digital twin as a keyword for the years 2017 to 2022. Among these articles, the top digital twin use cases were found in urban spaces and smart cities, as seen in the figure below:

Figure above shows the publications on digital twin by use case/application.

_______

_______

The six most common Digital Twin applications

As part of the research, IoT Analytics looked at 100 digital twin case studies. The result was that six clusters of digital twin activity stand out. We call them digital twin applications:

 

Digital twin application – Description – % of projects

  • 1. Twins for system prediction – a digital twin geared toward predicting complex systems – 30%
  • 2. Twins for system simulation – a digital twin geared toward simulating complex system behavior – 28%
  • 3. Twins for asset interoperability – a digital twin geared toward common data formats and streamlined data extraction in complex systems – 24%
  • 4. Twins for maintenance – a digital twin geared toward assisting with maintenance-related use cases – 21%
  • 5. Twins for system visualization – a digital twin geared toward visualizing a complex system (e.g., in 3D) – 20%
  • 6. Twins for product simulation – a digital twin geared toward simulating the behavior of (future) products (mostly during the design phase) – 9%

Note:

The total adds up to more than 100% because many projects fall into several clusters at the same time.

______

______

Top digital twin applications and use cases:

-1. Industrial applications of digital twin:

Through in-depth analysis and summary, digital twin applications in different lifecycle phases can be identified, including three applications in the design phase, seven in the manufacturing phase, and five in the service phase, as seen in the figure below. Research on digital twin applications in the retire phase is insufficient in the existing literature.

Figure above shows industrial applications of digital twin in different lifecycle phases.

(1. Applications in design phase:

Digital twin enables fusion between the information model and the product’s physical model, and their iterative optimization, thereby shortening the design cycle and reducing rework cost. Commonly, the design process comprises four steps: a) task clarification, b) conceptual design, c) embodiment design, and d) detail design. Teri et al. took a village cobbler as an example. The cobbler was responsible for everything throughout the design, manufacture, repair, and recycling of the shoes, whereby data integrity, product traceability, and knowledge accessibility were enhanced. In this simple ‘cobbler model’, the cobbler knew the customers’ requirements and design constraints; he also knew which material was required for each type of product and the process to operate on it. The digital twin is a possible replacement for ‘the cobbler’s mind’ amid the current trend of product complexity and variability. Tao et al. claimed that the digital twin gives designers a complete digital footprint of products throughout design; it serves as an ‘engine’ to convert big data into useful information, which designers can use directly to make informed decisions at different design phases. Schleich et al. stated that the digital twin allows checking the conformance of the product specifications with the design intent and customer requirements.

(2. Applications in manufacturing phase:

Traditionally, manufacturing refers to an industrial production process through which raw materials are transformed into finished goods. However, with rising demands on product quality and rapid market response, manufacturing has shifted from primary processes to smart processes. Modern manufacturing requires physical and digital interaction in a closed-loop manner, and the core of the digital twin is to realize communication and interaction between the physical world and the digital world. We can visualize the actual manufacturing processes, compare the formation of the physical product to the virtual product to ensure that what we are producing is what we want to produce, and collaborate with others to have up-to-the-minute knowledge of the products being produced. In the past decades, with the development of advanced mechanical and electrical technology, manufacturing automation was achieved: fixed, carefully engineered sequences of actions are executed automatically. Nowadays, manufacturing elements are becoming context-aware, communicating with their surroundings and making intelligent decisions without being explicitly programmed. The digital twin is the key enabling factor towards this vision, and many new concepts, paradigms, and frameworks are being proposed for the manufacturing phase. Rosen et al. showed how a digital twin works to transform a cyber-physical production system into an autonomous system. Tao and Zhang explored the novel concept of the digital twin shop-floor (DTS), which includes four key components: the physical shop-floor, the virtual shop-floor, the shop-floor service system, and shop-floor digital twin data. Park et al. designed and implemented a digital twin to solve problems of personalized production and distributed manufacturing systems; their digital twin was designed to monitor the present in real time, track information from the past, and support operational decision-making for the future.

(3. Applications in service phase:

In the service phase, products are usually decentralized and out of the control of manufacturers and suppliers. As a result, it is difficult to manage and gain access to their data, or to realize a closed-loop data stream. In addition, the existing virtual model may be an accurate representation of the product’s design, but it has no link to a specific manufactured part; and a manufactured part has no link to a specific used product, meaning that products of the same batch may perform differently in different service environments. In this phase, users are mainly concerned with the reliability and convenience of the product, while manufacturers and suppliers are mainly concerned with the real-time product operation state, maintenance strategies, and so on.

Tao et al. argued that with the digital twin methodology, degradation and anomalous events can be understood and unknowns can be foreseen in advance. Tuegel et al. proposed an aircraft structural life prediction process that utilized an ultrahigh-fidelity model of an individual aircraft by tail number (a digital twin). A reasonable estimate of the flight trajectory and the maneuvers to be flown during a mission is established; the likelihood of the airframe satisfactorily surviving the demands of the mission can then be assessed to decide whether to send that particular aircraft on that specific mission. NASA’s vision is that the digital twin integrates various best-physics models with one another and with on-board sensor suites. As a result, decision-makers will understand the physical processes related to degradation at the material, structural, and system level throughout the whole lifecycle. The systems on board the digital twin are also capable of mitigating damage or degradation by activating self-healing mechanisms or by recommending changes to the mission profile to decrease loadings.

(4. Applications in retire phase:

The retire phase is often ignored as an actual phase. Knowledge about a system’s or product’s behavior is often lost when it is retired, and the next generation of the system or product often has similar problems that could have been avoided by using knowledge about its predecessor. In the retire phase, the digital twin contains the whole lifecycle information of the physical twin and can be retained at little cost in virtual space. Wang and Wang presented a novel digital twin-based system for WEEE (Waste Electrical and Electronic Equipment) recovery to support manufacturing/remanufacturing operations throughout the product’s life cycle, from design to recycling. The product information models were developed from design to remanufacturing based on international standards. Liu et al. addressed the uncertainties in the remanufacturing process and constructed an architecture for a digital twin-based remanufacturing shop-floor. The future remanufacturing operation paradigm for automobiles based on digital twins was also explored.

______

Optimize Industrial Automation Systems using Digital Twins:

Engineers can have a tricky time changing up a production line. To keep up with market demands, they have to adjust existing factory automation systems to produce the latest designs and implement new processes. Updating a production facility could shut down a line for weeks, even months, as engineers perform physical tests, maintenance, modifications and safety checks. Engineers can easily update factory automation systems if they use simulation-based digital twins. The idea is to enable engineers to design and test the updated manufacturing line using simulation-based digital twins. By optimizing the line digitally, engineers can reduce downtime and physical testing. Engineers will also be able to use digital twins to gain insights from the factory floor that can improve product designs early in development. This enables them to design products that are easier to assemble and manufacture. Digital twins also enable engineers to predict manufacturing issues that can arise as they run and update the facility. This technology can improve a factory’s lines, product yield and throughput. Digital twins could be the key to successful automation.

______

Digital Twin enables Predictive Maintenance:

Predictive maintenance is, quite simply, the ability to fix what isn’t broken…yet. Through numerous advancements in data gathering and compiling, it is now possible to accurately predict when and how certain hardware components will break down. The key, as with almost everything in digital transformation (DX), is data. Information is essential in the 21st century, and it is about more than having a bunch of data points and sensor readouts: it is knowing how to process them quickly and efficiently. This is at the crux of predictive maintenance, as well as of the digital twin.

Digital twins are made up of numerous different technologies, from IoT sensors to 3D CAD files to, potentially, augmented reality (AR) visualization; the twin is really the product of an ecosystem of data communication. With all this constant visibility and measuring, a concept like predictive maintenance becomes not only possible but practical. Sensors constantly monitor equipment at the component level, identifying and evaluating each aspect of operation. Should a component break down, the incident is recorded and logged in a complete and comprehensive history of the product in question. As this happens over and over again, across one or more devices, patterns start to emerge. These patterns allow the digital twin to predict, with remarkable accuracy, when and where the next breakdown will occur. The more digital twins there are, the more complete the picture.

What is happening now was simply not possible before, not even with an expert technician assigned to monitor the equipment as a full-time job. There is no downtime and there are no gaps in the data stream. By providing the most complete picture, digital twins offer full transparency into the product life stream. Predictive maintenance is only possible with this level of visibility.
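As a toy illustration of how logged breakdowns turn into a prediction (the numbers below are invented, and real systems use far richer statistical and physics-based models), this sketch estimates when the next failure is likely from the history of intervals between past failures:

from statistics import mean, stdev

# Logged breakdown times (operating hours) for one component, across twins.
failures = [1180, 2410, 3550, 4760, 5900]
intervals = [b - a for a, b in zip(failures, failures[1:])]

mu, sigma = mean(intervals), stdev(intervals)
low = failures[-1] + mu - 2 * sigma   # schedule maintenance before this point
high = failures[-1] + mu + 2 * sigma
print(f"Next failure expected near hour {failures[-1] + mu:.0f} "
      f"(window {low:.0f}-{high:.0f})")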

_

That said, digital twin benefits, amazing as they are in the predictive arena, are not confined to it. Remember, this is essentially a complete and total digitized readout of every important component in the physical, real-world instance. Even if an organization chooses not to pursue predictive maintenance capabilities, this readout will still be very useful for corrective maintenance operations. For those unfamiliar with the term, corrective maintenance describes the traditional process of cause and effect. A machine breaks and then it is fixed. The reality is this can be a very economically and culturally draining situation, as breakdowns in products and processes often lead to serious consequences. As such, it’s important to get the repair right as quickly as possible. 

The first-time fix rate (FTFR) is an enormously important stat for many executives and decision makers. It informs them of just how often the technician was able to solve the problem on the first try. Any downtime is costly, but downtime prolonged by second, third, and fourth attempts is far more expensive. As such, anything that can improve diagnosis accuracy is appreciated. Since digital twins provide the most complete picture to date of the physical realities they reflect, this makes them valuable tools in the overall maintenance process, corrective or predictive.

_

The Decentralization of Maintenance:

The development and increasing deployment of digital twins will not just lead to more cases of predictive maintenance, which by itself is already a positive development; for organizations fully embracing these practices, this shift will empower a decentralization of maintenance efforts overall. This means more effectively hiring, maintaining, and deploying a technician force based on the actual reality of products and processes, rather than estimating what is needed and where the staff should be stationed. One technician with perfect information and complete visibility could, in theory, better service three manufacturing centers than three technicians in one plant operating on incomplete and frequently outdated data. This is not to say that fewer technicians will be needed, just that these skilled workers can now be better utilized for maximum efficiency and effectiveness.

Predictive maintenance as a concept hinges on a consistent stream of accurate information, and the digital twin provides this foundation. It is possible to have digital twins without deploying predictive maintenance procedures; it is very difficult to do predictive maintenance effectively without the data that digital twins provide.

______

It is straightforward to find industrial applications for creating and managing manufacturing digital twins.

Companies such as Siemens, GE, IBM, and others market their digital twin solutions. There are commonalities across these providers’ explanations of their digital twins, such as accurate representations of products and processes as they currently exist, and the ability to forecast and predict future states such as needed maintenance, breakdown of components, and areas of cost savings. These firms depict the necessary technology, information, and infrastructure from an already-installed, operational, and successful perspective.

A virtual product combines individual component designs into a nominal, complete, digital product. The virtual product allows for a simulation of production (i.e., “virtual production”) by digitally representing production aspects such as work cell layout, machine positions, and design tolerances. The simulated production mimics real production through the use of accurate machine and tool models, along with knowledge of machine controllers and the use of programmed NC code. The virtual product and virtual production work hand in hand by exchanging design specifications and design verifications back and forth to improve the product design. This feedback loop enhances the virtual product and, in turn, improves the virtual production. 

Next, the ideal virtual production connects with real production, through leveraged process automation, to facilitate a process optimization feedback loop. The real product, an output of real production, is finally analyzed to pass maintenance data back to the virtual product. This data helps predict needed maintenance, estimate costs, and further optimize product design. 

A digital twin can address the three manufacturing issues raised thus far: digitalization of data, timeliness of data, and scope of data (i.e., across a product’s full lifecycle). Service providers of digital twin platforms market the concept as a means of achieving “increased reliability and availability, reduced risk, lower maintenance costs, improved production, faster time-to-value”. Obtaining specific, comprehensive benefits or quantified results is not as easy. For instance, the marketing for IBM’s Watson IoT (which includes digital twin functionality) contains generic phrases including “intelligent innovation,” “speed and agility,” and “value creation”. These are hardly concrete enough to build a capital-expense project budget around, and they reiterate the buzzword status mentioned previously.

Fortunately, General Electric (GE), an early adopter of digital twins in the energy and aerospace industries and an extensive user and promoter of the technology, has provided greater detail on results and real-world use cases. In a whitepaper discussing GE’s involvement in the digital twin realm, GE highlights the scenario of a combined-cycle power plant featuring a “natural gas-powered turbine as its primary electricity generator, and a steam turbine that uses the byproduct of gas-fired production (i.e., steam) as its secondary generator” (Stephanie, 2018). Due to the extreme complexity of the system and the vast quantity of components, each varying in recommended maintenance frequency, durability, environment, and other factors, traditional methods of managing the process are not sufficient. Because it is “impossible to pinpoint the proper time to take a turbine offline for maintenance using the traditional mean time between failures (MTBF) estimates,” plant managers have historically relied on expensive scheduled maintenance, a disruptive process that relies more on the best guesses of a few expert employees than on any quantifiable means (Stephanie, 2018). By installing individual sensors on critical components and assemblies, GE can build a digital twin that monitors factors like system load, ambient temperature, and air quality in real time. In this way, the overall process’s health can be monitored, and forecasts become feasible: advanced statistical tools can schedule maintenance predictively and eliminate unnecessary, disruptive downtime.

_

Real benefits exist for companies that implement a digital twin. For example, as shown in the figure below, assuming the physical and virtual costs of implementation are the same today and forecasting them over time, the costs “diverge with physical costs increasing at the rate of inflation and virtual cost decreasing on an exponential basis” (Grieves & Vickers, 2017).

Figure above shows real versus virtual costs (Grieves & Vickers, 2017)
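A back-of-the-envelope reproduction of that divergence follows; the inflation and decay rates here are assumed for illustration and are not taken from Grieves & Vickers:

# Physical prototyping cost inflates each year; virtual cost decays exponentially.
physical = virtual = 100.0          # equal cost index today
inflation, decay = 0.03, 0.20       # assumed annual rates

for year in range(11):
    if year % 5 == 0:
        print(f"year {year:2d}: physical={physical:6.1f}  virtual={virtual:6.1f}")
    physical *= 1 + inflation
    virtual *= 1 - decay

Even with modest assumptions, the physical cost index climbs while the virtual index collapses toward a small fraction of its starting value, which is the shape of the curve the authors describe.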

Furthermore, using digital twin patterns, GE has realized these specific benefits: 93%–99.49% increased reliability in less than 2 years, 40% reduced reactive maintenance in less than 1 year, 75% reduced time to achieve outcomes, and $11M of lost production avoided by detecting and preventing failures.

____

Digital twin in Manufacturing:

As emerging trends such as the fourth industrial revolution (4IR) continue to gain traction, manufacturers are using digital twin technology to transform their product lifecycle. From faster time-to-market in product development to increased productivity among frontline workers, many manufacturers are already reaping the benefits. Over 80% of companies that implemented immersive technologies identified improvements in their ability to innovate and collaborate in their production, manufacturing, and operations work phases, according to a Forrester Consulting study commissioned by Unity.

Top use cases of digital twins in manufacturing:

  • Factory design and layout – Optimize machine layouts, assembly flows, employee interactions, and more by spatially mapping factories.
  • Robotics simulation – Build fundamentally safer systems by training robots in simulated environments.
  • Operator training – Increase the efficiency of knowledge transfer with immersive, interactive training applications that maximize safety and reduce costs.
  • Monitoring, guided maintenance and repair – Transform routine, time-consuming procedures into seamless processes with remote-enabled AR technologies.

_

In industrial manufacturing, digital twins are used to simulate the production process. Based on data collected from industrial IoT solutions, sensors connected to machines, and manufacturing tools, manufacturers can create virtual representations of a real-world product, equipment elements, a production process, or a whole system. For production purposes, such simulations help track machine operation and adjust it in real time. Augmented with machine learning algorithms, digital twins help manufacturing companies identify problems before they occur and predict future outcomes. For maintenance purposes, digital twins allow equipment health to be monitored and potential anomalies to be recognized in a timely way. They capture real-time data on equipment operations and augment it with historical data on failures along with contextual maintenance data. With the help of machine learning and artificial intelligence, the solution predicts when maintenance work will be necessary. Based on this data, companies can take proactive measures to prevent production stoppages.

_

Benefits:

Manufacturing companies using digital twins on their production lines report the following effects:

  • Enhanced product quality: Boeing achieved a 40% improvement rate in the first-time quality of parts using the digital twin concept.
  • Higher production efficiency: Deloitte reduced the time spent on changeovers for their manufacturing clients by 20%.
  • Improved profitability: Challenge Advisory outlined that their automotive client was able to improve annual profit margins by up to 54%.

_

Real-Life Example: Consumer Goods Manufacturing:

Unilever PLC is using digital twins to make the production process more efficient and flexible. The company has created virtual models of its factories. At each location, IoT sensors feed real-time performance data such as temperature and motor speed into the enterprise cloud. Using advanced analytics and machine learning algorithms, an IoT digital twin simulates complex what-if scenarios to identify the best operational conditions. This helps manufacturers use materials more precisely and limit waste from products that don't meet quality standards. Unilever has seen significant success from deploying digital twins to greatly increase consistency in the production of soaps and detergents. By using AI, the company has also been able to reduce the number of false alerts requiring action in its plants: previously the company received 3,000 such alerts per day, and it has reduced this number by 90%. Right now, Unilever is operating eight digital twins across North America, South America, Europe, and Asia.
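One plausible mechanism behind such false-alert reduction is simple debouncing: raise an alert only after several consecutive out-of-range readings. The sketch below is a hypothetical illustration; the three-reading rule and the rpm limits are assumptions, not Unilever's actual logic:

```python
# Hypothetical alert-debouncing sketch: ignore momentary excursions,
# alert only on a sustained out-of-range run. All numbers invented.

def filtered_alerts(readings, low, high, consecutive=3):
    """Yield indices where `consecutive` readings in a row are out of range."""
    run = 0
    for i, value in enumerate(readings):
        run = run + 1 if (value < low or value > high) else 0
        if run == consecutive:
            yield i

motor_speed_rpm = [1480, 1510, 2100, 1490, 2150, 2160, 2140, 1500]
print(list(filtered_alerts(motor_speed_rpm, low=1400, high=1600)))
# -> [6]: the lone spike at index 2 is ignored; the sustained run alerts once.
```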

Real-Life Example: Pharma Manufacturing:

A good example of the digital twin technology in pharma is a collaboration between Atos, a digital transformation consultancy, and Siemens, the engineering firm. The companies are working on a digital twin solution for the pharmaceutical industry. The system creates a digital representation of a specific production step. Connected with IoT sensors installed on the actual plant, it gives an instant view of all operational details. Powered by AI and Advanced Analytics, the solution also offers optimized measures for process quality and reliability.

_

Digital Twins and the Future of Manufacturing:

Digital twins will only get more popular as companies learn how to use them to improve productivity and reduce costs. A 2020 study by Research and Markets indicates:

  • Up to 89% of all IoT platforms will include digital twins by 2025
  • Digital twinning will be a standard IoT feature by 2027
  • Nearly 36% of executives across a variety of industries understand the benefits of digital twinning, with about half of them planning to use it in their operations by 2028

______

-2. Digital twins in Aerospace:  

Aerospace tasks are intrinsically complex. End products like aircraft and spacecraft are massively expensive to design and build, making it all the more imperative to get work done right the first time in order to avoid costly delays. From design and engineering all the way through to assembly and maintenance, digital twins improve decision-making by allowing teams to visualize and interact with computer-aided design (CAD) models and other datasets in real-time 3D.

Top use cases of digital twins in aerospace:

  • Product development and prototyping – 3D visualization enables designers, engineers and other stakeholders to better collaborate and evaluate design and manufacturing alternatives for complex systems.
  • Simulation and training – Engaging training experiences in interactive 3D or augmented or virtual reality (AR and VR) enable better knowledge transfer and safer workplaces.
  • Maintenance and operations – Creating work instructions in mixed reality from as-built models from design and manufacturing can simplify and optimize inspection, maintenance, and repair activities.
  • Sales and marketing – Virtual showrooms and 3D product configurators empower buyers to explore every variation of aircraft and make purchasing decisions more confidently.

Real-life example: At Boeing, digital twins are used to design aircraft: a digital twin is created for a new plane, after which simulations are run that predict the performance of various aircraft components over the lifecycle of the product. As a result, Boeing engineers can predict when components are expected to fail. According to the company, it has achieved a 40% improvement rate in the first-time quality of parts by using a digital twin. The company plans to digitize all of its engineering and development systems in the future and also plans to share this information with its supply chain. Another use case being explored for Boeing aircraft is using a digital twin to achieve a perfect cargo load balance. For example, a Boeing 737-800 has a maximum cargo load of 80,000 kilograms, but many planes fly with less cargo than this because weight figures are calculated manually. By using IoT sensors feeding a digital twin, a precise yet safe cargo load can be determined, increasing cargo revenue per flight. Boeing also created an AR-powered aircraft inspection application using a digital twin of one of its planes. The twin enabled this aerospace industry leader to generate over 100,000 synthetic images to better train the machine learning algorithms of the AR application.
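Once per-pallet weights come from sensors rather than hand calculations, the load-balance idea reduces to straightforward arithmetic. A minimal sketch follows; the pallet weights and the 2% safety buffer are invented assumptions, while the 80,000 kg maximum is the figure cited above:

```python
# Toy cargo-capacity check from sensor-reported pallet weights.
# Safety margin and weights are illustrative assumptions.

MAX_CARGO_KG = 80_000  # maximum cargo load cited for the 737-800 above

def remaining_capacity(pallet_weights_kg, safety_margin=0.02):
    """Return how much more cargo can be loaded, keeping a small buffer."""
    loaded = sum(pallet_weights_kg)
    usable = MAX_CARGO_KG * (1 - safety_margin)
    return max(usable - loaded, 0.0)

pallets = [3_120, 2_980, 4_005, 3_870]  # sensor-reported weights, kg
print(f"loaded: {sum(pallets)} kg, "
      f"can still take {remaining_capacity(pallets):.0f} kg")
```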

______

-3. Digital twins in Architecture:

At the start of a project, architects produce design materials, including renderings and models, to allow clients to evaluate and approve the design. The problem is that there is no shared, collaborative environment in which stakeholders can make decisions in real time. Communicating design intent during traditional reviews is difficult: static 2D and 3D models cause details to be lost in translation, renderings aren't flexible enough, and not everyone is on the same page. Digital twins solve these problems, helping avoid costly mistakes.

Top use cases of digital twins in architecture:

  • Design review – Architects and designers can easily bring design models into an immersive experience in AR and VR to facilitate interactive, real-time design reviews.
  • Design visualization – Explore more design options and iterate faster, allowing for design issues to be identified and resolved more quickly.
  • On-site AR – Overlay models in 1:1 AR at scale on the jobsite to effectively communicate design intent to stakeholders.

Real-life example: SHoP Architects uses real-time 3D digital twins to envision skyscrapers before they're built. The award-winning architecture firm and JDS Development Group, a real estate development, construction, and acquisition firm, are utilizing real-time data with Unity to make decisions faster with every project stakeholder. A digital twin of The Brooklyn Tower, a 93-story, 1,073-foot skyscraper in New York City, is also saving time and money and reducing the project's carbon footprint.

______

-4. Digital twins in Automotive industry:

In the automotive industry, digital twins are used to simulate and test new design concepts before they are built, optimize production processes, and even predict how a vehicle will perform in different conditions. The top benefit of using digital twins for automotive OEMs is the ability to save time and money by identifying and addressing potential issues before they occur. As the industry continues to embrace this technology, it plays an increasingly important role across every workflow in the automotive lifecycle, from design and manufacturing to marketing and maintenance.

Top use cases of digital twins in automotive

  • 3D car design and product development – 3D visualization enables global collaboration and avoids delays common with traditional 3D automotive rendering software.
  • Human-machine interfaces (HMI) – Create interactive 2D and 3D user experiences for in-vehicle infotainment (IVI) systems and digital cockpits.
  • Autonomous driving simulation – Simulate scenarios and visualize results in real-time, all within the safety of virtual worlds.
  • Training and guidance – Empower frontline workers with interactive, immersive experiences proven to increase knowledge retention and productivity.
  • Sales and marketing – Create photorealistic renders and interactive 3D configurators using your existing 3D data.

_

Figure below visualizes an overview of the applications of DT in the automotive industry.

From design through operation, digital twin technology opens avenues for cost-effective and efficient development of sustainable electric vehicle technologies. Digital twin technology is relatively recent to the automotive industry because its sister technologies, such as IoT, widespread wireless connectivity, and artificial intelligence, were underdeveloped at the time of its conception. However, the scientific world has now entered a prime era for the development of digital twin technology and other smart development methodologies. Furthermore, the coming decade will be a major turning point in human history due to unprecedented environmental challenges. Digital twin technology can be adapted for smart electric vehicle use cases such as predictive mobility, autonomous motion control, driver assistance systems, vehicle health management, battery management, intelligent charging, vehicle power electronic converters, and electric power drive systems.

_

Real-life example: Volvo Cars revolutionizes the vehicle production lifecycle. Volvo Cars embraced digital twin technology to improve design-engineering communication and collaboration, reduce reliance on physical prototype vehicles, and create more immersive and effective buying experiences.  BMW Group, which has 31 factories around the world, is collaborating with NVIDIA on digital twins. The German automaker is relying on NVIDIA Omniverse Enterprise to run factory simulations to optimize its operations. Mercedes will use NVIDIA Omniverse to design, plan and optimize its factories. Specifically, Mercedes is preparing to manufacture its new electric vehicle platform at its plant in Rastatt, Germany.

_____

-5. Digital twins in Construction:

The construction industry is one of the largest in the world and by 2030, it is expected that the volume of construction output will grow by 85% to $15.5 trillion. China, India and the US will all lead the way accounting for 57% of that growth.

What challenges are facing the construction sector?

(1. Poor productivity and profitability

(2. Project performance (timing and budget issues)

(3. Skilled labour shortages

(4. Sustainability concerns

Digital twin technology could be the answer to these issues. It provides the ability to create virtual replicas of potential and actual physical assets, processes, people, places, systems and devices that can be used for various purposes. Let’s get into the full capabilities of digital twins in the construction sector.

Developing new building construction models using digital twin:

The biggest difficulty hindering architects in coming up with new building designs and projects is their practicality. Construction design developers are forced to limit their creativity because any new building design or concept must be shown to meet all safety requirements. This presupposes that the new design has to be tested in the real world, which consumes an immense amount of capital, time, and other resources just to find out whether it can be done. If building developers could test their ideas in reality-based simulations involving all the necessary real-world factors (e.g., gravity, weather, wind), the time needed to share their ideas and get them approved would shrink dramatically. The safety, practicality, and sustainability of new building designs would be tested in a simulation, and the feedback would be comparably accurate to a real-life test because the simulation derives its data directly from the world.

Using virtual twin to minimize worker risk and pre-launching construction projects:

Construction is deemed one of the riskiest professions out there. But what if there were a new way of decreasing the number of life-threatening scenarios in the construction industry? If that were possible, the work environment would be more appealing for employees, future job positions in construction would be more attractive to young talent, the number of incidents would be greatly reduced, and overall efficiency and work quality would rise. All of these positive outcomes can be brought about by creating digital twin versions of the work environments in which construction projects take place.

The newly generated simulation would be connected directly to the workflow and structure of the workplace, giving engineers the ability to prevent specific life-threatening outcomes. For instance, a virtual twin of newly installed scaffolding would monitor every single piece of metal, measure the most important stress points, and show developers which places are under extreme stress. This isn't just about letting workers see which screw is loose in a door handle: a digital twin is capable of saving lives in the real world by utilizing a virtual one.

_

The construction industry is plagued with many difficult problems. As mentioned before, there are labor shortages, chief among them in-demand skilled trades, and supply chain impediments for key construction materials including lumber, concrete, and metal. Consider some of the key factors leading to project overruns: changes and rework; contractor financing issues; payment delays; poor cost estimation; poor tendering (bid) documents; poor material management.

Digital twins can help resolve some of construction's biggest problems that would otherwise lead to costly construction downtime. BIM can assist in the collaborative design process, but data captured in a digital twin during the operations phase of a project helps inform the planning and design of future projects. Tool managers can create an IoT of assets via tagged smart tools and equipment, asset ID tags, Bluetooth® tracking tags, GPS trackers, and other tracking hardware to better keep track of construction inventory. Construction drones, furthermore, can be used for fly-overs, with the recordings providing real-time status updates. Data from smart tools embedded with Bluetooth® tracking functionality can be fed into a digital twin for better visibility into where high-value assets are; the same data can be pushed into BIM or project management software to deliver interoperability and eliminate duplicate data, manual inputs, and mistakes; utilization data and reporting can also be pulled to glean insights about tools and their continued use.
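At its simplest, that kind of asset tracking is a "latest sighting wins" lookup. The sketch below is a hypothetical illustration: the asset IDs, zone names, and data model are invented, and a real deployment would sit behind BLE gateways and a BIM or project-management integration:

```python
# Hypothetical tool-tracking sketch: keep the most recent Bluetooth
# sighting per asset so a twin can answer "where is this tool now?"

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Sighting:
    asset_id: str
    zone: str          # e.g., the location of the gateway that heard the tag
    seen_at: datetime

class AssetRegistry:
    def __init__(self):
        self._last_seen = {}

    def record(self, sighting: Sighting):
        """Keep only the most recent sighting per asset."""
        prev = self._last_seen.get(sighting.asset_id)
        if prev is None or sighting.seen_at > prev.seen_at:
            self._last_seen[sighting.asset_id] = sighting

    def locate(self, asset_id: str):
        s = self._last_seen.get(asset_id)
        return (s.zone, s.seen_at) if s else None

registry = AssetRegistry()
registry.record(Sighting("drill-042", "level-2-east", datetime(2023, 5, 1, 9, 30)))
registry.record(Sighting("drill-042", "laydown-yard", datetime(2023, 5, 1, 14, 5)))
print(registry.locate("drill-042"))  # the latest zone wins
```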

_

Faced with rampant supply chain delays, labor shortages, and inflated material costs, the stakes for builders are at an all-time high. Bad data and poor decision-making can lead to expensive delays and rework. Digital twin and AR technology allow the construction industry to optimize project data, streamline collaboration, and better visualize projects from design through to operations and maintenance. By using AR to bring valuable BIM data to the field, contractors are able to capture and communicate design errors in just a few clicks, allowing stakeholders to resolve issues quickly and avoid costly rework.

Top use cases of digital twins in construction:

  • Visualize designs in real-time – Use any mobile device to overlay building information modeling (BIM) data in the field using AR to quickly inspect design and installed conditions and stop rework in its tracks.
  • Streamlined virtual design and construction (VDC) workflows – Issues captured in the field are marked directly within the model, reducing downtime and making office-to-field communication seamless.
  • Immersive safety training – Transform job-critical learning and training into interactive, immersive environments across key scenarios.
  • Improved project close-out and maintenance – Maintain critical model and project data that will improve digital twin operation and maintenance long after completion.

_

Digital Twins can improve Urban Planning: 

The world is fast urbanizing: more than half of the world's population (around 56%) already lives in cities, and that share continues to rise. Hence, cities need to be resourcefully prepared for a massive population influx. Given the haphazard growth of most urban areas in the world, digital twins can help streamline urban development with the help of data analytics. By creating digital twins of cities, one can virtually test policies, design principles, and construction methods before enacting them in the real world. The adoption of digital twins can revolutionize how cities are designed, operated, maintained, and sustained to enhance the quality of life of their citizens.

Suppose a city needs to redesign a major intersection:

A program can create a digital twin of the intersection and use it to study changing traffic patterns, economic activity, weather conditions, population movement, and so on. These data sets help city planners create systems that are efficient and meaningful before design and development take place. City planners can then take the data they glean from digital twins and approach different vendors with requests for proposals.
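As a toy illustration of the what-if principle (not a real traffic model), the sketch below compares two signal timings for a single approach to an intersection; the arrival rate and per-cycle capacities are invented numbers:

```python
# Toy what-if simulation: how does queue length respond if a longer
# green phase lets more cars clear per signal cycle? All figures are
# invented; real planning twins use far richer traffic models.

def avg_queue(arrivals_per_cycle, green_capacity, cycles=1000):
    """Average queue length when `green_capacity` cars clear per cycle."""
    queue, total = 0, 0
    for _ in range(cycles):
        queue = max(queue + arrivals_per_cycle - green_capacity, 0)
        total += queue
    return total / cycles

# What-if: extend the green phase so 12 cars clear per cycle instead of 10.
for capacity in (10, 12):
    print(capacity, "cars/cycle ->",
          avg_queue(arrivals_per_cycle=11, green_capacity=capacity))
```

With arrivals exceeding capacity, the queue grows without bound; two extra cars of green-phase capacity clear it entirely. Spotting that kind of threshold before touching real asphalt is exactly the point of testing in the twin first.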

_

Real-life example: DPR Construction leverages AR to empower field teams. DPR, an ENR Top 10 Contractor, is integrating AR and immersive tech into the project lifecycle to bring valuable BIM data to the field in real-time to improve team performance and reduce rework. Digital twins are emerging as an essential tool in construction and the built environment. In fact, Las Vegas unveiled a digital twin of a 7-square-kilometer section of its downtown in January 2022.

______

-6. Digital twins in Energy:

Energy companies generate a wealth of data, especially as operations are increasingly outfitted with Internet of Things (IoT) sensors, high-definition cameras with artificial intelligence (AI) capabilities, and more. Digital technologies like real-time 3D can visualize this data to provide right-time insights, better-informing decisions around production, maintenance, safety and security, and optimization.

Top use cases of digital twins in energy:

  • Design visualization and collaboration – Walk through sites before they are built by creating immersive experiences from BIM data.
  • Learning, training and safety – Transform job-critical learning and training into interactive, immersive environments across key scenarios.
  • Field service – Enhance quality, reliability, and speed of service while reducing error with AR-assisted maintenance and empower your employees by putting all the information they need right before their eyes.
  • Site operations and maintenance – Get insight into the design, management and performance of assets, while tracking your sites’ status remotely in real-time with a digital twin of any facility, pipeline, or physical asset.

Real-life example: Zutari improves design of large-scale renewable energy sites. Zutari, a South African engineering consultancy, is using Unity’s real-time 3D development platform to automate large-scale solar photovoltaics (PV) projects to reduce the time required to develop design-level insights and decrease costs.

_____

-7. Digital twins in Infrastructure:

Digital twin technology helps builders, planners, and operators across cities worldwide better understand and optimize public spaces and infrastructure. By using advanced, interactive models and live IoT data, stakeholders are able to simulate traffic flow, mobility patterns, and even the effects of climate change and shifting landscapes surrounding key infrastructure like airports, roads, and transportation hubs. From individual facilities to smart cities, digital twins are helping owners, operators, and policy-makers manage large volumes of valuable data that will allow them to better equip our infrastructure for future demands. Digital twins of smart cities can enable better planning of construction as well as constant improvements in municipalities. Smart cities are building 3D replicas of themselves to run simulations. These digital twins help optimize traffic flow, parking, street lighting, and many other aspects to make life better in cities, and these improvements can be implemented in the real world.

Top use cases of digital twins in infrastructure:

  • Combine and visualize key datasets – Digital twins bring geospatial, model, and sensor data together to visualize spaces in real-time, giving decision-makers the data they need to optimize operations and resource allocation.
  • Simulate mobility and usage patterns – With digital twins, planners and designers are able to simulate how people, vehicles and everything else moves through a space or even an entire city, and model how they will move in an updated or changed landscape, helping to inform key decisions around design and infrastructure maintenance.
  • Improve sustainability outcomes – When IoT sensors and energy usage information are combined with model data, teams can better understand where inefficiencies exist to reduce emissions, waste and water consumption.

Real-life example: Making cities smart with digital twins. According to ABI Research, more than 500 cities will deploy digital twins by 2025.

______

-8. Digital twins in Government:

The use of real-time 3D, extended reality (XR), and AI technologies is accelerating at a rapid pace in civilian, defense, and intelligence applications. New technologies are being deployed rapidly, challenging government agencies and contractors that need to stay at the forefront of cutting-edge development. Digital twins help reduce the risk, time, and cost of designing, developing, deploying, and supporting cutting-edge applications in simulation, training, and beyond.

Top use cases of digital twins in government:

  • Flight and vehicle modeling and simulation – Conduct immersive reviews and testing on aircraft, vehicles and other products to minimize costs from physical prototyping and testing.
  • Team-based situational training and machine and equipment controls training – Bring job-critical learning and training into immersive 3D and VR experiences. Accelerate time to train and knowledge retention across key skills.
  • Guided maintenance and repair – Enhance quality, reliability and speed of production and service by using extended reality to empower users with all the information they need before their eyes.

Real-life example:  Rebuilding Tyndall Air Force Base with digital twin technology. The reconstruction of Tyndall Air Force Base in Florida after Hurricane Michael provides an opportunity to imagine what modern installations require and to rapidly undergo digital transformation.

_____

-9. Digital twins in Luxury goods:

Luxury interactive shopping is on the rise, complementing premium in-store experiences. Many luxury brands have been preparing for the future of retail for many years by creating 3D marketing experiences. Investing in this new way of selling can reduce costs and increase revenue.

Top use cases of digital twins in luxury goods:

  • Real-time 3D product configurators – Create and deploy highly interactive 3D product configurators, allowing customers to explore their ideal combination.
  • Photorealistic marketing imagery – Virtualize marketing content creation pipelines to create photorealistic high-definition assets in record time and at scale.
  • Virtual try-on experiences – Give shoppers confidence in their purchase by letting them try on products virtually.
  • Virtual showrooms – Interactive 3D and VR showrooms immerse consumers in a new shopping experience.

Real-life example: Globe-Trotter takes luxury shopping to new heights. Knowing traditional ways of selling products like photographs or rendered images wouldn’t be enough to turn shoppers into buyers, Globe-Trotter, a luxury travel accessories brand, delivered a more immersive experience to help their customers feel confident in purchasing high-priced custom luggage sight unseen.

_____

-10. Digital twins in Supply Chain and Logistics:

Digital twin technology is drastically transforming the supply chain. As the global manufacturing and distribution network continues to expand, new risks arise for both manufacturers and consumers. Indeed, a more complex and far-reaching supply chain becomes vulnerable to inefficiencies, counterfeiting, and fraudulent activities. By creating a digital twin of the supply chain, manufacturers are able to gain visibility into each step of the product's journey to the end user. What's more, by combining digital twins of connected devices, manufacturers can deliver accurate product information to consumers, who then benefit from unparalleled transparency. Ultimately, end-to-end tracking systems based on digital twinning technologies, such as Authena L1VE™, are essential business tools to increase efficiency, boost profitability, and safeguard consumer trust.

Using digital twins in the supply chain offers many benefits. These include:

  • Improving the design of manufacturing systems and processes
  • Boosting the efficiency of the entire supply chain
  • Identifying weaknesses, areas of low productivity, and vulnerabilities
  • Increasing the visibility into complex and layered systems
  • Streamlining product testing
  • Monitoring the functioning of machinery and tracking maintenance schedules
  • Making asset lifecycle management more effective

____

Digital twins help improve overall logistics operations:

With the implementation of digital twins in logistics, the management of warehouses, products and global hubs will become more efficient as the replica will provide companies the opportunity to rectify errors in their logistics network without damaging the actual product, network or data.

(1. Managing transportation roadblocks with alerts

Creating a digital twin of the transport network provides critical warnings to drivers in advance, such as alerts related to traffic, accidents, hurricanes, storms, and fires, and suggests new routes. The technology can even prevent erratic driving behavior or imminent infractions by signalling drivers about them, saving fuel and time.

(2. Better freight handling at ports through digital models

Logistics hubs such as airports and container ports are complex systems to manage— imperfections in the system or human errors can create bottlenecks. Chances of such bottlenecks can be reduced with digital twins.

For example, Ericsson and the port of Livorno in Italy are working to create a digital twin that removes inefficiencies in freight handling and in the loading and unloading of shipments. This is being achieved by creating a real-time digital replica of the port area using a 5G network, smart sensors, LiDAR, and advanced cameras.

Digital twins can also use satellite and aerial photography and navigation systems to assess in real time the entire journey of shipments — be it on land, air, or sea.

(3. Bringing efficiency to warehouses and distribution centers

Digital twins can also create exact virtual layouts of warehouses and distribution centers. This allows companies to rethink how new designs could enhance activity without damaging current operations.

Further, with recent advancements in warehouse technologies such as automated robots, counting systems, goods-to-person picking, and automated storage and retrieval equipment, companies can combine data generated from these systems to improve physical warehouse layouts while increasing worker productivity.

For example, DHL partnered with food packaging manufacturer Tetra Pak to simulate all activities and machinery at one of their warehouses using a digital twin. It allowed them to study the movement of packages and the functionality of machinery to enhance productivity.

(4. Providing protection to shipments

Companies can combine product and packaging data to find out how different packaging conditions affect a product, even before the first delivery goes out. Digital twins continuously collect data and help identify potential weak points in the production-to-delivery process.

Sensors collect and transmit this data at several points during the transit of an actual shipment. Data from the previous six months can be saved so that periodic trends can be spotted and rectified, protecting and enhancing upcoming operations.
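A minimal sketch of that trend-spotting idea follows; the temperature log and the 8 °C handling limit are invented for illustration:

```python
# Toy periodic-trend check: bucket months of shipment temperature
# readings by month and compare monthly means against a handling limit.

from collections import defaultdict

def monthly_means(readings):
    """readings: iterable of (month_number, temperature_c) pairs."""
    buckets = defaultdict(list)
    for month, temp in readings:
        buckets[month].append(temp)
    return {m: sum(v) / len(v) for m, v in sorted(buckets.items())}

log = [(1, 4.1), (1, 4.6), (2, 5.0), (3, 7.9), (4, 8.4), (5, 9.2), (6, 11.3)]
for month, mean in monthly_means(log).items():
    flag = "  <- above the 8 C handling limit" if mean > 8.0 else ""
    print(f"month {month}: {mean:.1f} C{flag}")
```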

______

-11. Digital twins in Retail:

Spurred on by the pandemic, the need for retailers to leverage digital twins for design, planning, operations and more has increased exponentially. The importance of engaging customers online likewise increased overnight, and retailers looked to this technology to create immersive virtual experiences to continue connecting with shoppers. Savvy retailers are embracing digital twins to enhance processes, connect with their customers in new and profound ways, and deliver compelling digital and in-store user experiences.

Top use cases of digital twins in retail:

  • Design and planning – Create 3D virtual stores to visualize and simulate the optimal experience prior to construction; utilize planograms and store planning applications to maximize space, improve efficiency, and collaborate remotely.
  • Sales and marketing – From integrating 3D assets into e-commerce sites to creating virtual showrooms in VR, retailers can use digital twins to increase conversion and make purchase decisions more accurate, which limits returns and mitigates the environmental footprint of e-commerce.
  • Operations – Digital twins of product SKUs and stores can assist in the development of myriad applications to improve operational efficiency, from autonomous checkout to intelligent in-store navigation.

_

In the retail industry, digital twins may come in handy both in the supply chain and in store. To create supply chain simulations, retailers use real-time sensor and equipment data, as well as ERP and other business system data. The models give an overview of a supply chain’s performance, including assets, warehouses, material flows, inventory positions, and people. To create in-store digital replicas, retailers use data captured by RFID readers, motion sensors, and smart shelves. These models allow them to analyze customer movement and purchase behavior, as well as test the optimal placement of products.

_

Benefits:

With digital twins, retailers can:

  • Effectively manage product supplies. Digital twins help retailers identify bottlenecks, supply shortages, and demand curves in seconds. Based on these insights, they can replenish goods, readjust placements of products, and create targeted ads to minimize waste and promote sales.
  • Avoid supply chain disruptions. Retailers can combine their digital twin models with external real-time data like local traffic and weather. By doing so, they can respond to any kind of event that may disrupt their supply chain.
  • Optimize logistics costs. According to Boston Consulting Group, digital twins help retailers minimize capital expenditures by 10%, reduce excess inventory by 5%, and improve EBITDA (earnings before interest, taxes, depreciation and amortization) by 1-3%.

_

Real-life example: eBay launches AI-enabled 3D display feature for sneaker sellers. The global commerce leader is bringing interactivity to their platform with the launch of their 3D TrueView feature for sneakers. Also, French supermarket chain Intermarché created a digital twin of a brick-and-mortar store based on data from IoT-enabled shelves and sales systems. Now, store managers can easily manage inventory and test the effectiveness of different store layouts.

______

-12. Digital Twins in Utilities: Water Supply:

Water utility organizations use digital twins to ensure an uninterrupted water supply and be better prepared for emergency situations. With digital replicas, they can get an accurate assessment of how the current water system behaves, identify failures before they happen, and simulate what-if scenarios. Water utilities create virtual representations of water systems based on sensors and actuators capturing data on the physical system’s performance. Additionally, they use data from information systems in the water industry, such as CMMS (computer maintenance management systems), GIS (geographic information systems), and SCADA (supervisory control and data acquisition).
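One classic analysis such a twin supports is a district mass balance: compare metered inflow against the sum of customer consumption, and treat a persistent surplus as a possible leak. A minimal sketch, with invented volumes and an assumed 5% expected-loss allowance:

```python
# Toy leak check via district mass balance. Volumes, data sources,
# and the expected-loss fraction are illustrative assumptions.

def leak_suspected(inflow_m3, outflows_m3, expected_loss_fraction=0.05):
    """True if unaccounted-for water exceeds the expected loss."""
    unaccounted = inflow_m3 - sum(outflows_m3)
    return unaccounted > expected_loss_fraction * inflow_m3

district_inflow = 1_000.0               # m3/day, e.g., from SCADA
customer_meters = [620.0, 180.0, 90.0]  # m3/day, metered consumption
print(leak_suspected(district_inflow, customer_meters))  # True: 11% unaccounted
```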

_

Benefits:

Digital twins help utility companies quickly locate potential leaks and reduce water loss. With virtual simulations, they can test different methods of water system operation to improve emergency response, increase water supply reliability, and save energy. This is confirmed by Águas do Porto (AdP; see the example below): the utility increased operating gains by 23% and shortened the time required to repair pipe bursts by 8%. It also reduced water supply interruptions by 23% and the number of sewer collapses by 54%.

_

Real-Life Example: Aguas do Porto (AdP), a Portuguese utility organization, is responsible for the water supply in the city of Porto. AdP uses digital twins to forecast flooding and water quality issues, improve city services and responsiveness, and ensure resilience of water infrastructure. The solution creates virtual models based on sensor and telemetry data together with information from 20 other sources: customer service management, billing, maintenance, asset accounting, etc. Digital twins enable AdP to monitor the water supply systems in real time. They are also used to create forecasts on water consumption and simulate burst pipe scenarios along with valve and pump shutdowns.

_____

-13. Digital Twins in Healthcare:

In healthcare, a DT is a virtual copy of a physical object or process, such as a patient, their anatomical structure, or a hospital environment. Currently, DTs in healthcare propose to dynamically reflect data sources such as electronic health records (EHRs), disease registries, "-omics" data (e.g., genomics, proteomics, or metabolomics data), as well as physical markers and demographic and lifestyle data of an individual over time. Thanks to the evolution of underlying technologies (e.g., IoT, AI) and increasingly diverse, accurate, and accessible data (e.g., biometric, behavioral, emotional, cognitive, and psychological data), research and potential applications of DTs in healthcare have seen increasing interest. Digital twins enable healthcare companies to design and customize complex medical devices for individual patients, making them compatible with the individual's unique anatomy and physiology. They accelerate the design process and reduce costs by lowering the need for surgery and clinical trials, and they can greatly reduce animal testing, with the virtual experiments themselves posing no direct risk to patients.

_

Healthcare-related physical entities for digital twinning are grouped into three categories: device, patient, and facility as seen in the figure below.

The first group, devices, includes wearable fitness or health monitoring devices, medical devices, and other relevant smart devices used for healthcare-related purposes. The data collected and stored by these devices, along with the potential for analytics, make them important physical entities for twinning, and they continuously make a significant contribution to healthcare delivery. With the expanding IoT and its increasingly strong integration in healthcare, this first group of physical entities has the most straightforward path to the digital twinning process.

The second group, patients, presents a more challenging path toward complete digital twinning, owing to greater ethical considerations and the highly complex nature of human physiology. Nevertheless, several projects are underway to digitalize cell- and DNA-level data. Other applications have been identified for specific organs, i.e., the heart and liver, among others, as well as for organ systems like the cardiovascular system. Although there is still a long way to go before the human body can be fully digitalized, there is increasing public interest and research in this field.

All healthcare-related facilities are grouped into the last category, facilities, which includes hospitals, other healthcare institutions, surgery, the healthcare services and operations within these facilities and the professionals who carry them out, as well as labs, trials, and other research-relevant facilities.

Digital twinning of these physical entities creates a monumental IoE-like (Internet of Everything) architecture for providing improved healthcare services. The valuable information and models contained in digital twins, and their mixed functionalities, all contribute to advancing healthcare in this ever-changing and digitally expanding world. The critical benefits of digital twin technologies for healthcare can be summarized as improvements in the following: information sharing, education, monitoring, diagnosis, precision medicine/treatment, medical resource management, facility operation management, and research advancement.

_

In silico experiments:

At the very beginning of the confluence of medicine and IT, computers were regarded only as an instrument to facilitate the processing of large masses of data. Now the scientific community is thinking about (and working on) holding fully virtual research experiments. A virtual model of the human body would allow medics to create new drugs more easily and make personalized drugs more available.

The first mention of in silico (literally “in silicon”) experiments dates back to the 1990s. Today this term is used almost as often as in vivo (“within the living”, i.e., experiments on whole living organisms) and in vitro (“in the glass”, i.e., experiments held in a test-tube artificial environment).

By in silico experiments we mean entering certain parameters into a computer in order to recreate the results that would be obtained in a real experiment on the system under analysis. At present, in silico research also includes predicting the behavior of certain molecules, biochemical processes, and full physiological systems. Special attention should be given to in silico modelling of individual molecules, in which a computer models the reactions of various molecular systems, for example, amino acid or nucleotide sequences. It should not be forgotten that computers remain indispensable for studying large masses of data (bioinformatics), just as they were in the beginning.

Studies based on empirical force fields are yet another application: they model the spatial structure of protein molecules from structural templates of proteins with related amino acid sequences. A computer can also study paired interactions such as ligand-receptor and enzyme-substrate binding; this approach is widely used in designing new drugs and is referred to as molecular docking.

So far, we are not at the point where in silico research can replace other forms of human experiments. However, it is an auspicious direction for development. The most obvious reason is that in silico models can save the time and money spent on running in vitro and in vivo experiments.

Consider an example. In de novo drug design, a computer can design a molecule that binds precisely to the required site of a target molecule. It can also model the behavior of a new substance by analyzing its chemical structure in order to, for instance, predict the toxicity of reactive impurities. Currently, the majority of in silico models rely on existing databases of substances and their toxicity.
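As a taste of how structure-based screening looks in code, here is a crude sketch using the open-source RDKit toolkit (assuming it is installed): a Lipinski rule-of-five drug-likeness filter. This is a toy property check, not a docking or toxicity model, and the SMILES string shown is simply aspirin:

```python
# Crude in silico screen: Lipinski's rule-of-five drug-likeness check
# computed from a molecule's structure with RDKit. A real pipeline
# would follow this with docking, ADMET, and toxicity models.

from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def rule_of_five(smiles: str) -> bool:
    mol = Chem.MolFromSmiles(smiles)
    return (Descriptors.MolWt(mol) <= 500        # molecular weight
            and Descriptors.MolLogP(mol) <= 5    # lipophilicity
            and Lipinski.NumHDonors(mol) <= 5    # H-bond donors
            and Lipinski.NumHAcceptors(mol) <= 10)  # H-bond acceptors

print(rule_of_five("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin -> True
```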

Another promising direction of in silico research is the creation of virtual models of individual patients. This gives scientists an opportunity to run tests on rare phenotypes that are difficult to find for real-life trials. This approach also allows the efficacy of various treatments to be compared in a given patient without spending too much time.

Recently a joint UK and Netherlands research team led by Alejandro F. Frangi (University of Leeds) showed that a virtual model offered results closely matching those of conventional trials with real patients. The model in question analyzed the treatment of brain aneurysms with a flow-diverting stent applied to a sample of virtual anatomies available in clinical databases. Put simply, such a stent is a tube placed in the artery to divert blood flow away from the aneurysm, thus lowering the risk of rupture and stroke.

For this experiment, scientists selected 82 virtual patients whose age, sex, nationality, physiology, anatomy, and biochemical characteristics were similar to those of the real people taking part in a trial of the stent's efficacy. For these 82 virtual patients, they developed a program to analyze the influence of the stent on blood flow and then compared the results to those of three real medical trials.

According to the virtual model, the use of the stent was "virtually successful" in 82.9% of cases. The success rates in the real trials used for comparison were 86.8%, 74.8%, and 76.8%.

Another good example of the technology's application is HostSim, a virtual model of a Mycobacterium tuberculosis host developed by scientists at the University of Michigan Medical School. The model analyzes the response of the host's immune system to the pathogen and predicts how the disease will progress. A virtual model was used because it is very hard to collect samples from the infected lung granulomas of a patient. This virtual primate model had lungs, lymph nodes (LN), and blood: the systems most affected by the progressing disease.

Another two projects — The Living Brain and The Living Heart — were created by Dassault Systemes (France) on the 3DExperience platform. After studying available information from patients and various studies, the scientists developed a virtual model of blood circulation that allows them to study different treatments. With the help of the Living Heart, they can create new medical gadgets, test the safety of drugs, and develop personalized surgical treatments.

The other project, Living Brain, helps study epilepsy and determine which parts of the brain are associated with seizures. By 2019 the results of this project were so good that the FDA extended its work with Dassault for another five years.

In Germany, the Ebenbuild company developed a personalized virtual twin therapy for patients with ARDS (Acute Respiratory Distress Syndrome). To create a virtual twin, scientists take the patient's CT scan data and process it using AI and image analysis. As a result, they get patient-specific lung segmentations that, combined with lung ventilation parameters, are used to increase the patient's chances of survival and recovery. "Local mechanical overload of the lungs due to suboptimal ventilator settings is a major contributor to the high mortality in patients suffering from Acute Respiratory Distress Syndrome (ARDS)", state Ebenbuild scientists on the official website. "Our technology enables us to provide the best possible protective ventilation protocol for each patient, reducing ventilator-inflicted lung damage. Combining a CT scan of the patient's lungs with in-depth physiological knowledge, engineering, and physics-based algorithms, we create highly accurate digital twins of the human lungs".

_

If their potential is fully realized, DTs will unlock the as-yet-unrealized promise of connected care and alter the way lifestyle, health, wellness, and chronic disease are managed in the future. Accordingly, DTs could be "fed" with diverse and real-time information obtained from wearables and other sources of self-reported data, e.g., mobile health applications. Care providers could access a patient's DT to see personalized information far beyond what is currently available when making treatment decisions or providing recommendations, as seen in the figure below.

Figure above shows DT healthcare technologies: Working scheme.

_

Liu et al. (2019) describe the operating mechanism of DTs in healthcare in the following stages. First, DT models are built of a physical object using advanced modeling techniques and tools (e.g., SysML, Modelica, SolidWorks, 3DMAX, and AutoCAD). Second, real-time data connection and exchange between the physical and virtual objects is executed through health IoT and mobile internet technologies. Third, simulation models are tested and validated by rapid execution and calibration. Fourth, models are continuously adjusted to optimize and iterate the DT. Finally, following the behavior of the virtual twin, model results (e.g., diagnosis results) are sent back to the patient. With respect to DT application on patients, the ultimate vision is a lifelong, personalized patient model that is updated with each measurement, scan, or exam, including behavioral and genetic data. Personalized treatments, prevention of diseases, and patient-specific therapeutic methods based on genetic, biological, phenotypical, physical, and psychosocial peculiarities would become a new medical breakthrough.
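The five stages can be caricatured in a few lines of Python. Everything below is an illustrative stand-in: the toy baseline model, the 80-unit threshold, and the readings are invented, and a real healthcare DT would use validated physiological models and health-IoT middleware rather than these placeholders:

```python
# Schematic sketch of the five-stage DT loop described above,
# with a toy stand-in model. All values are illustrative only.

class ToyModel:
    """Stand-in 'model': tracks a running baseline of one vital sign."""
    def __init__(self):
        self.baseline = None

    def update_state(self, reading):
        # smooth each new reading into the running baseline
        self.baseline = (reading if self.baseline is None
                         else 0.9 * self.baseline + 0.1 * reading)

    def run(self):
        return "elevated" if self.baseline and self.baseline > 80 else "normal"

    def fit(self, observed):
        # stage 4 stand-in: recalibrate the model toward ground truth
        self.baseline = observed


class PatientDigitalTwin:
    def __init__(self, model):
        self.model = model               # stage 1: model built offline

    def sync(self, sensor_reading):      # stage 2: real-time data exchange
        self.model.update_state(sensor_reading)

    def simulate(self):                  # stage 3: execute the simulation
        return self.model.run()

    def calibrate(self, observed):       # stage 4: adjust and iterate
        self.model.fit(observed)

    def feedback(self):                  # stage 5: results back to patient
        print("simulation result:", self.simulate())


twin = PatientDigitalTwin(ToyModel())
for heart_rate in (72, 75, 118, 121, 119):  # e.g., from a wearable
    twin.sync(heart_rate)
twin.feedback()  # -> simulation result: elevated
```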

______

Current Applications of DTs in Healthcare:

(1. Precision Medicine and Support to Medical Decision Making:

DT applications in healthcare can contribute to the broad trend of precision medicine, maximizing the efficiency and efficacy of the healthcare system by shifting from current "one-size-fits-all" clinical practice toward treatment that takes inter-individual variability into greater account. "Precision medicine" (more generally referred to as "personalized medicine") is an emerging approach to disease treatment and prevention built around new diagnostics and therapeutics targeted to the needs of a patient based on their own genetic, biomarker, phenotypic, physical, or psychosocial characteristics. The aim is to deliver the right treatments, at the right time, to the right person. However, most current healthcare systems are not fully able to provide personalized treatment for diseases with multi-stage diagnosis and treatment processes and high variability in disease characteristics (as in cancer treatment). One of the most important barriers to precision medicine is that patients with the same disease do not respond equally to the same treatment. This is primarily due to the wide gap between the complexity of the focal condition, which may involve altered interactions among thousands of genes that vary across patients with the same diagnosis (in effect, multiple diseases behind the same diagnosis), and modern healthcare, in which diagnostics often rely on a growing but still relatively small number of biomarkers of limited sensitivity or specificity. To address these limitations, DTs may help create a human model defined by all the structural, physical, biological, and historical characteristics of an individual that can be matched against thousands or millions of comparable data points from other individuals, facilitating the search for and identification of relevant genetic characteristics. DTs may therefore ease the prediction of an illness by analyzing the real twin's personal history and current context, such as location, time, and activity. Furthermore, DTs may simulate the impact of a treatment on these patients and provide decision support to physicians and other healthcare professionals such as hospital pharmacists.

_

Although DT use cases in precision medicine are still limited in scope, some DTs of organs (e.g., the heart) or parts of the human body have already been developed and used as prototypes or pilots. One precursor in the organ-DT environment is Dassault Systèmes's Living Heart project, the first functioning computer model of the complete heart, which takes into consideration all aspects of its functionality, including blood flow, mechanics, and electrical impulses. The functioning Living Heart has been developed and is now available to users worldwide. It is used for designing new medical devices, analyzing drug safety, and designing personalized surgical treatments, and it is also used in biomedical education.

_

Some of the current applications of DTs in healthcare, particularly in precision medicine, are summarized in Table below.

Current DT Application in Precision Medicine and Medical Decision-Making Support:

Target Organ/Disease: Heart
Reference: Living Heart Project, Dassault Systèmes
Description: The Living Heart Project is the first DT organ considering all aspects of the heart's functionality, such as blood flow, mechanics, and electrical impulses. It is an international research collaboration dedicated to developing and validating highly accurate personalized digital human heart models based on MRI images and ECG data; the 3D model of the organ was built from 2D scans of the heart. The Living Heart Model on the 3DEXPERIENCE platform can be used to create new ways to design and test devices and drug treatments. The research team leverages the digital twin heart to simulate in vivo (in a living organism) conditions, visualize anatomy that cannot otherwise be seen, and refine the designs of cardiological devices faster. For instance, physicians can run hypothetical scenarios, like adding a pacemaker or reversing the heart chambers, to predict the outcome of treatment on the patient.

Target Organ/Disease: Heart
Reference: CardioInsight, Medtronic
Description: The CardioInsight Noninvasive 3D Mapping System collects chest electrocardiogram (ECG) signals and combines them with computed tomography (CT) scan data to produce and display simultaneous 3D cardiac maps. The mapping system enables physicians to characterize abnormal heart rhythms through a personalized heart model.

Target Organ/Disease: Heart
Reference: Siemens Healthineers
Description: Another heart DT has been developed by Siemens Healthineers and is used for research purposes by cardiologists at Heidelberg University Hospital (HUH) in Germany. Although the first study is still in the data evaluation process, preliminary results are promising. Siemens Healthineers developed the DT model by exploiting a massive database containing more than 250 million annotated images, reports, and operational data. The AI-based DT model enables the design of a digital heart matching the given patient's parameters (size, ejection fraction, and muscle contraction).

Target Organ/Disease: Brain
Reference: Blue Brain Project, EPFL and Hewlett Packard Enterprise
Description: Hewlett Packard Enterprise, partnering with the École Polytechnique Fédérale de Lausanne (EPFL), is building a DT of the brain called the Blue Brain Project. The project is a sub-project of the Human Brain Project and aims to build biologically detailed digital reconstructions (computer models) and simulations of the mouse brain. In 2018, researchers published the first 3D cell atlas of the entire mouse brain.

Target Organ/Disease: Human airway system
Reference: Oklahoma State University's Computational Biofluidics and Biomechanics Laboratory
Description: Researchers developed a human DT prototype, named "virtual human V1.0", with a high-resolution human respiratory system covering the entire conducting and respiratory zones, lung lobes, and body shell. The project aims to study and increase the success rate of cancer-destroying drugs in targeting tumor-only locations.

Target Organ/Disease: Brain aneurysm and surrounding blood vessels
Reference: Sim&Cure
Description: Sim&Cure developed a DT to treat aneurysms, which are enlarged blood vessels that can result in clots or strokes. A DT of the aneurysm and the surrounding blood vessels (represented by a 3D model) allows brain surgeons to run simulations and understand the interactive relationship between the implant and the aneurysm. Although preliminary trials have shown promising results, further evaluation is required.

Target Organ/Disease: Multiple Sclerosis (MS)
Reference: Frontiers in Immunology (journal)
Description: Multiple sclerosis, also called the "disease of a thousand faces", shows high complexity, multidimensionality, and heterogeneity in disease progression and treatment options among patients, which generates extensive data for studying the disease. Human DTs are promising for precision medicine for people with MS (pwMS), allowing healthcare professionals to handle this big data, monitor the patient effectively, and provide more personalized treatment and care.

Target Organ/Disease: Viral infection
Reference: Science (journal)
Description: Human DTs can predict the viral infection or immune response of a patient infected with a virus by integrating known human physiology and immunology with population and individual clinical data into AI-based models.

Target Organ/Disease: Trauma management
Reference: Journal of Medical Systems (journal)
Description: Trauma management is highly time-critical among pathologies. DTs can participate from the pre-hospital phase, where the physician provides the patient first aid and transfers them to the hospital emergency department, to the operative phase, where the trauma team assists the patient in the hospital emergency room. Although there is no real implementation yet, a system prototype has been developed.

Target Organ/Disease: Diabetes
Reference: Diabetes (journal)
Description: Human DTs can also participate in diabetes management. California-based start-up Twin Health has applied DTs by modeling patient metabolism. The DT model tracks nutrition, sleep, and step changes and monitors patients' blood sugar levels, liver function, weight, and more. Ongoing clinical trials show that daily precision nutrition guidance based on a continuous glucose monitoring system (CGM), food intake data, and machine learning algorithms can benefit patients with type 2 diabetes.

_____

(2. Clinical Trials Design: 

Beyond their application in supporting diagnosis and treatment, DTs might also be useful in the development phase of new treatments, particularly in the conduct of clinical trials. Approximately 80% of clinical studies face delays in the enrolment phase, and 20% of trials fail to meet overall enrolment goals. The problems occurring at the enrolment stage (e.g., finding participants who fit the criteria and are willing and able to participate), coupled with personalized medicine's trend toward smaller target populations, make clinical trials increasingly expensive, time-consuming, and inefficient. DTs may allow the creation of unlimited copies of an actual patient that are treated computationally with a large variety of drug combinations and could act as the control group. This way, DTs of real patients could be used to test early-stage drugs to accelerate clinical research, minimize hazardous impact on patients, and reduce the number of expensive trials required to approve new therapies.

The current use of DTs in clinical trial support is, however, very limited. Some studies show that DTs are promising for addressing the main challenges of clinical trials, such as designing smaller trials with higher statistical power or recovering power in ongoing trials affected by low enrolment or high dropout rates. In the short term, DTs are expected to contribute to randomized controlled trials to improve power and efficiency without introducing bias.

At present, UnlearnAI, a leading company in the field, is working with DTs to accelerate clinical research in Alzheimer’s Disease and Multiple Sclerosis.

Addressing Placebos and Dummy Drugs in Clinical Trials:

The control group in comparative clinical trials sometimes generates ethical issues when the treatment is potentially lifesaving (and the standard of care/placebo is not) or if there are important differences in the treatment’s characteristics (e.g., safety issues, invasive procedures compared to non-invasive ones, etc.). DTs can replace placebo (or standard-of-care) patients and simulate the evolution of health states based on patients’ characteristics, providing a representative view of an intervention’s impact on the virtual twin. The DT will, therefore, create a synthetic control group.   
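One statistically careful way to exploit twin predictions without biasing a randomized trial is to use the twin's predicted control outcome as a prognostic covariate in the analysis, the idea behind prognostic-covariate-adjustment methods used in this space. The toy simulation below, with invented numbers, shows how the adjustment shrinks the standard error of the estimated treatment effect, which is where the extra power comes from.

```python
# Toy simulation: a digital-twin prediction used as a prognostic covariate.
# All numbers are invented; this only illustrates the variance-reduction idea.
import numpy as np

rng = np.random.default_rng(1)
n, effect = 200, 2.0

treat = rng.integers(0, 2, n)                  # randomized 1:1 assignment
prognosis = rng.normal(0, 3, n)                # true patient-level prognosis
twin_pred = prognosis + rng.normal(0, 1, n)    # twin's predicted control outcome
y = effect * treat + prognosis + rng.normal(0, 1, n)

def ols(X, y):
    """OLS estimates and standard errors."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se

ones = np.ones(n)
b1, se1 = ols(np.column_stack([ones, treat]), y)
b2, se2 = ols(np.column_stack([ones, treat, twin_pred]), y)
print(f"unadjusted effect {b1[1]:.2f} (SE {se1[1]:.2f})")
print(f"twin-adjusted effect {b2[1]:.2f} (SE {se2[1]:.2f})  # smaller SE = more power")
```

Because treatment is randomized, adding the twin prediction does not bias the effect estimate; it only explains away prognostic noise, so the same trial delivers tighter confidence intervals.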

_______ 

(3. Optimizing Hospital Operations: 

Another potential DT application in healthcare is the optimization of hospital operations and management. Large companies such as GE Healthcare and Siemens Healthineers have already developed DTs and are currently tailoring their DT services for hospitals to respond to challenges such as growing patient demand, increasing clinical complexity, aging infrastructure, lack of space, increasing waiting times, and rapid advances in medical technology requiring additional equipment. Using DTs, different possible solutions can be tested in virtual environments before scheduling and implementation in the real setting (e.g., bed planning, staff schedules, surgical simulation, and virtual drug experiments). For instance, GE Healthcare developed the Capacity Command Center to build DTs of patient pathways at Johns Hopkins Hospital in Baltimore. By applying simulations and analytics, the hospital can predict patient activity and plan capacity according to demand, significantly improving patient service, safety, experience, and activity volume.
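To make the idea concrete, here is a minimal discrete-event sketch of one such question (how often do arriving patients find no bed free?), written with the open-source SimPy library. The arrival and length-of-stay figures are invented, and this is not GE's Capacity Command Center; it only shows the kind of what-if a hospital twin answers.

```python
# Minimal bed-capacity simulation with SimPy (invented figures; illustrative only).
import random
import simpy

random.seed(0)
BEDS = 20
MEAN_ARRIVAL_GAP_H = 2.0   # average hours between admissions
MEAN_LOS_H = 36.0          # mean length of stay, hours
HORIZON_H = 24 * 90        # simulate 90 days
waits = []

def patient(env, beds):
    arrived = env.now
    with beds.request() as req:
        yield req                         # queue until a bed frees up
        waits.append(env.now - arrived)   # hours spent waiting for a bed
        yield env.timeout(random.expovariate(1 / MEAN_LOS_H))

def arrivals(env, beds):
    while True:
        yield env.timeout(random.expovariate(1 / MEAN_ARRIVAL_GAP_H))
        env.process(patient(env, beds))

env = simpy.Environment()
beds = simpy.Resource(env, capacity=BEDS)
env.process(arrivals(env, beds))
env.run(until=HORIZON_H)

print(f"admissions: {len(waits)}, mean wait for a bed: {sum(waits)/len(waits):.1f} h, "
      f"share who waited: {sum(w > 0 for w in waits)/len(waits):.0%}")
```

Rerunning with a different `BEDS` value answers the planning question directly: the twin lets managers trade bed count against waiting risk before committing to a real ward reconfiguration.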

_____

Digital twins improve healthcare:

In the healthcare industry, the list of digital twin advantages is already remarkable and still growing: creating a virtual model of a person’s body, even down to the molecular level; deciding which surgical procedure would be most effective; or anticipating a patient’s response to experimental therapies without putting their life at risk. Digital twins can offer a safe setting for testing how modifications will affect a system’s performance. They allow for evaluation and monitoring from a distance. Problems can be foreseen before they occur, giving time to make the necessary adjustments or follow the proper procedures.

(1. More personalised medicine:

Personalised medicine, also called precision medicine, tailors medical treatments to individuals based on their particular genetic makeup, anatomy, behaviour, and other factors. It is gaining popularity because it matches medical decisions and therapies to a specific individual’s needs, and digital twins help make it possible.

(2. Virtual Organs:

Digital twins are being used in medicine to replicate the real body and create virtual, customisable organ models. To assess the evolution of diseases over time, or the response to new medications, therapies, or surgical interventions, several companies have been working on virtual hearts that can be tailored to individual patients and updated.

For example, Philips created the Dynamic HeartModel, a clinical tool that enables cardiologists to evaluate various heart functions important for the diagnosis and care of patients with cardiovascular disease (CVD).

Tommaso Mansi and his research team at Siemens Healthineers have been developing a computational model of a digital heart that is accessible on a computer, a prototype that could help revolutionise precision medicine.

The objective is to make it possible for the doctor to employ the digital twin’s predictive guidance in real time while performing surgery, potentially boosting the number of patients who could benefit from the therapy.

Also, European firm FEops has already launched the Heartguide platform, where virtual copies of the heart or its substructures are created from cardiac scans using cutting-edge technology. Through the use of this technology, FEops hopes to enhance and broaden the tailored care for patients with structural heart disease.

(3. Body Scanning:

Jeff Kaditz, an American start-up CEO, designed Q Bio Gemini, the first digital twin platform to scan the whole body. According to the company’s website, its advanced computational physics models, which are more accurate than traditional MRI for many illnesses, can record a whole-body scan in 15 minutes without the use of radiation or breath holds.

(4. Patient Data:

A patient’s health information, including medical records, the results of diagnostic tests and examinations, prescription history, and data from wearables and other sources, can be gathered and stored to construct a digital twin of the patient. Accordingly, more and more apps help keep track of a person’s health changes over time. Wearable sensors and cutting-edge algorithms can anticipate the onset of infection, inflammation, and insulin resistance.
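A very simple form of such early-warning logic is a rolling baseline check on a wearable stream. The sketch below, on simulated resting-heart-rate data, flags days that deviate from the wearer's own recent norm; it illustrates the idea rather than any published detection algorithm.

```python
# Illustrative rolling z-score alert on simulated resting heart rate (bpm).
import numpy as np

rng = np.random.default_rng(2)
rhr = 60 + rng.normal(0, 1.5, 60)   # 60 days of a wearer's resting heart rate
rhr[50:] += 6                       # simulated pre-symptomatic elevation

WINDOW, THRESH = 28, 3.0
for day in range(WINDOW, len(rhr)):
    base = rhr[day - WINDOW:day]                       # the wearer's own recent norm
    z = (rhr[day] - base.mean()) / base.std(ddof=1)    # how unusual is today?
    if z > THRESH:
        print(f"day {day}: RHR {rhr[day]:.1f} bpm, z={z:.1f} -> prompt a check-in")
```

The key design choice is that each person is compared against their own baseline, which is exactly what distinguishes a personal digital twin from a population average.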

(5. Digital twin simulations: 

Digital twin simulation enables virtual representations of models that can generate data to support research and development. This technique is very important in healthcare because it allows medical personnel to predict outcomes before selecting a treatment or therapy. With the virtual representation, it is possible to find out which treatment or solution will work best for a patient by testing it first on a virtual replica of their body or organ rather than on the patient themselves.

(6. Digital twin in surgery:

The idea behind a surgical digital twin is that a patient model is built, and surgery may be planned in a multidisciplinary team conference, practised beforehand in a simulator, and referred to during the operation to check anatomy and minimise unintended structural damage. Such methods can help prevent the errors that could occur during an operation and allow doctors to prepare better for what will happen.

(7. Patient’s monitoring:

Predictive analysis through simulation allows symptoms to be detected at an early stage; with some illnesses, preventing and recognising disease at the right time can save a life. Thanks to digital twin assistance, doctors have real-time access to information.

_____

Weaknesses of digital twins in healthcare:

Currently, the level of technological advancement does not allow us to predict all possible effects of a newly created drug, so for the moment computer-based research cannot completely take over from real-life clinical trials. Thus, unfortunately, the number of animal tests, and the time and investment consumed, stay the same. Part of the problem is also a lack of trust in computer-based tests compared with more traditional approaches.

Technological optimists remain very hopeful about in silico research, but realists consider it very improbable that the technology will ever be able to model all the processes in a living cell; that would literally mean modelling life itself. For many people, such a reductionist approach to life still verges on blasphemy towards nature.

Let’s assume that one day our technology allows us to decipher all the enigmas of the human body. Even then, we would still need far more advanced methods of data collection and storage, not to mention affordable medical devices. Until those are available to a wide market, progress will be limited.

Another source of worry is data quality. A working computer model needs reliable data; at present, however, many databases are too skewed to be relied upon. Take racial and gender bias, for instance: data on white men strongly dominates medical research.

So, the first problem to tackle is data collection and storage. For reliable automatic data collection and analysis, we need a large amount of detailed data and unified electronic health records. So far, electronic health records are dissimilar and unstructured, and for the most part, cannot be used for reliable computer analysis. There is also an ethical problem: doctors need full patient consent for data collection and manipulation, and this is where we face the issue of confidentiality.

We also need to further develop the mathematics of how we create digital twins from patient data, how we measure uncertainty in patient data, and how we account for model uncertainty in predictions.
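Pending better theory, one workable starting point is resampling: refit the patient model on bootstrap resamples of the data and report the spread of its predictions. Below is a minimal sketch on synthetic data, with a simple linear patient-response model standing in for a real twin.

```python
# Bootstrap spread of a patient-model prediction (synthetic data; minimal sketch).
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 80)                  # e.g., a dose or biomarker level
y = 2.0 * x + 5 + rng.normal(0, 2, 80)      # noisy patient response

x_new, preds = 7.5, []
for _ in range(1000):
    idx = rng.integers(0, len(x), len(x))   # resample patients with replacement
    coef = np.polyfit(x[idx], y[idx], 1)    # refit the (here: linear) twin model
    preds.append(np.polyval(coef, x_new))

lo, hi = np.percentile(preds, [2.5, 97.5])
print(f"prediction at x={x_new}: {np.mean(preds):.1f} (95% interval {lo:.1f} to {hi:.1f})")
```

Reporting an interval rather than a point estimate is one concrete way a clinical twin can communicate how much its prediction should be trusted.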

Another challenge is getting better at predicting how the heart will operate under extreme conditions. We often want to predict when a heart will fail; however, we usually only have information obtained under normal operating conditions.

Last but not least: for all the seeming complexity of a digital twin system, it must be user-friendly enough for both medics and patients, and it must ensure good communication between doctor, patient, and computer.

There is also the ethical problem of eugenics that could arise with the introduction of virtual twins. Once computer models can reliably point out genetic profiles with high survivability, old ideas of “good” and “bad” genes would resurface. From there it is only a tiny step to selecting embryos according to their genetic profile, and a slightly bigger step to screening the genetic codes of job applicants.

Yet another issue is people’s skepticism towards computer-generated solutions, among both medics and patients. Studies of this problem reveal that doctors tend not to trust AI programs introduced into the medical environment. Their main worries are misdiagnoses, wrongly chosen treatments and, not least, the concern that AI would in the long run replace medical professionals.

______

-14. Digital twin in insurance:

Here are some ways insurers can address current challenges and benefit from digital twin technology:

Fraud detection:

Fraud detection helps reduce an insurer’s loss ratio. Fraud occurs in about 10% of property-casualty insurance losses in the US. Using machine learning algorithms on historical data, insurers can check for inflated claim amounts from policyholders. For example, when there is a claim for replacement of an automobile part, an ML model can indicate whether the part could be repaired rather than replaced, drawing on images of prior similar losses to reach that conclusion. In the wake of catastrophic events, AI/ML models can also flag fraudulent claims filed by policyholders who are not in the disaster zone.
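In practice this is usually framed as supervised classification over historical claims. The sketch below shows the shape of such a pipeline with scikit-learn; the feature set, fraud mechanism, and all numbers are invented for illustration and do not reflect any insurer's real model.

```python
# Sketch of a claims-fraud classifier on synthetic data (illustrative only).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 5000
# Hypothetical features: claim amount, days since policy inception,
# prior claim count, distance of claimant from the loss location (km).
X = np.column_stack([
    rng.lognormal(8, 1, n), rng.uniform(0, 2000, n),
    rng.poisson(0.5, n), rng.exponential(30, n),
])
# Synthetic labels: inflated amounts filed soon after inception, far from the loss.
logit = 0.4*(np.log(X[:, 0]) - 8) - 0.002*X[:, 1] + 0.3*X[:, 2] + 0.01*X[:, 3] - 2.5
fraud = rng.random(n) < 1 / (1 + np.exp(-logit))

Xtr, Xte, ytr, yte = train_test_split(X, fraud, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier().fit(Xtr, ytr)
print("held-out AUC:", round(roc_auc_score(yte, clf.predict_proba(Xte)[:, 1]), 3))
```

Flagged claims would then be routed to human investigators; the model prioritizes, it does not decide.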

Claims:

A digital twin can help reduce the overall claims settlement cycle time. ML models can identify delays in the process, and simulations can help fix the inefficiencies. Effective use of the claims data available within the insurer’s data ecosystem can also help process low-impact, high-frequency claims more quickly. Digital twins can further speed up claims processing by simulating the accident scenario and assessing the impact to the property, which reduces investigation time. All of this improves the insurer’s overall operational efficiency.

Customer retention:

Insurance products have become commodities. Existing policyholders can shop around and switch to a competitor in minutes based on pricing alone. In the US, the average retention rate in the insurance industry is 84%, and it costs insurers seven to nine times more to attract a new customer than to retain an existing one. Using ML models, historical retention data can help the digital twin identify the customers most likely to churn. A customer who had a pleasant claims experience has a higher propensity to renew than a dissatisfied customer. Factors such as premium increases during the policyholder’s tenure and customer lifetime value can be used by the model to predict whether the customer will renew. A digital twin can derive a propensity score for existing policyholders and recommend next best actions for retention, increasing the insurer’s overall renewal rate.
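A minimal version of such a propensity model might look like the sketch below; the data is synthetic, and logistic regression merely stands in for whatever model an insurer actually uses.

```python
# Sketch: churn propensity score plus a next-best-action threshold (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 4000
premium_increase = rng.uniform(0, 0.3, n)     # fractional premium change at renewal
claim_satisfaction = rng.uniform(1, 5, n)     # 1-5 claims-experience survey score
tenure_years = rng.exponential(4, n)

# Invented churn mechanism: price hikes push churn, satisfaction and tenure reduce it.
logit = 6*premium_increase - 0.8*claim_satisfaction - 0.1*tenure_years + 0.5
churned = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([premium_increase, claim_satisfaction, tenure_years])
model = LogisticRegression(max_iter=1000).fit(X, churned)

# Score the renewal book and decide who gets a retention offer.
scores = model.predict_proba(X)[:, 1]
at_risk = scores > 0.6
print(f"{at_risk.sum()} of {n} policyholders flagged for a retention action")
```

The threshold (0.6 here) is a business choice that trades retention-offer cost against the expected lifetime value saved.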

Customer satisfaction:

Fraud detection, customer retention, and improvements in claims processing all aim to improve the overall customer experience. Digital twin technology can help predict customer behavior and enable high-value customer interactions. Machine learning algorithms can support customer-centric product design through effective use of historical data. A product can be priced correctly, or analyzed further to understand why it did not perform well in the market. This enables successful launches of new products informed by insights into buyer habits and risk profiles.

Marketing and Sales:

Existing data in the insurer’s landscape can be leveraged to find cross-sell and upsell opportunities, increasing revenue for insurers. ML models can tap into existing customer datasets, buying patterns, and other touchpoints to identify a cross-sell or upsell opportunity. A major challenge for insurers is that the majority of quotes never convert into policies; based on historical data, twins can identify the gaps and provide insights to increase the quote-to-policy conversion ratio.

_____

-15. Digital Twins for Education:

In education, the main stakeholders, namely teachers and students, can leverage the benefits of digital twin (DT) technology to a large extent. The use of DTs starts with curriculum design and penetrates deeply into various facets of the teaching-learning process. In curriculum design it can be used to keep content up to date, and it helps create useful simulation models based on course requirements. According to various research studies, simulation-based learning has helped students in several ways: increased motivation, increased self-responsibility for learning, facilitated peer learning, and improved overall learning activity. From the teachers’ end, it has enhanced content delivery, improved the use of technology for teaching, eased demonstration, and helped with student assessment. Moreover, it helps re-create the communal classroom experience with richer classroom engagement.

_

Today, many international organizations are promoting the use of digital twins because of their inherent potential to reduce the time and expenditure associated with constructing and commissioning new systems. Importantly, industry is encouraging schools to incorporate pedagogical digital twins in automation education. Students who have actually used the technology say: “Physical equipment is expensive, and the learning process is slow. With digital twins we are very flexible because we can expand the number of machines with a mouse click.” Teachers and students feel that digital twin technology could bridge the digital and physical worlds, enabling timely access to physical system data and thereby preventing problems before they occur. The technology could also provide an alternative for uninterrupted education services and even help develop new directions and plans for transforming education. In the classroom, digital twin technology can be used for experimental and experiential learning: the structure of a system and its working modalities can be best understood using this technology. In the laboratory, a digital twin can be used to explore the behaviour and limitations of a system under various simulated conditions. Since the virtual representation employed in a digital twin is easier to manipulate than its physical counterpart, students can more rapidly learn and understand system behaviour in a controlled, simulation-driven environment. Digital twins also play a vital role in action research: they allow students to run simulations to explore system behaviour under a variety of what-if conditions, understand failure modes, and develop an understanding of system sensitivities to changes in parameters and external disruptions. This understanding helps to minimize downtime and enhance the throughput of the system. The fidelity of a system model can also be best understood with the help of a DT in simulation.

_

Digital Twins for Engineering Study: 

At educational institutions for engineering, such as universities, training centres, and schools, digital twins are software models of industrial plants, simulated and visualized like their industrial originals and synchronized with them. Practical exercises on real technological devices are a necessary part of the study of the engineering sciences at universities and of the training of operators for industry.

The benefits of digital twins for engineering study are:

  • Easy preparation for training or experiments.
  • No hardware is required to be installed or tested.
  • An unlimited number of people can work with digital twins and apply them.
  • Training with digital twins can be made as comfortable as in classic laboratory rooms.

The greatest advantage of digital twins for training purposes is their adaptability and expandability depending on the training goal.

_

Some universities have already incorporated digital twin technology into their teaching curricula. Stanford University has applied digital twin technology to architecture, construction, and engineering in several projects. The Copenhagen School of Marine Engineering and Technology Management has integrated digital twin content into its curricula, believing that digital twins are the reality of the industry. Overall, the use of digital twin technology in higher education increases student motivation, facilitates and accelerates understanding, and improves the overall learning experience. In addition, industry encourages universities to incorporate pedagogical digital twins in automation to give students an initial understanding of the tools and skills they will need in their future careers.

______

-16. Electric grid and digital twin:

It has been reported that approximately one-third of Europe’s electricity grid is over 40 years old, leaving many assets prone to vulnerabilities. Not just in Europe but across the world, transmission and distribution (T&D) utilities are striving to access the right data at the right time. To accurately understand their entire operation, a digitalised management system is needed. The crux of this system is the digital twin – a real-time digital reflection of real-world infrastructure. Digital twin technology can represent the grid in fine detail, displaying assets, surroundings, and connecting infrastructure, before adding the dimension of time. Linking a digital twin with real-time information produces a living digital twin that is an exact replica of a utility’s real-world grid at a given point in time. Europe’s utility grids can benefit from this because they can visualise and predict how and where infrastructure investments will provide the largest return, as well as estimate how new loads may strain supply or impact existing infrastructure.

In addition, a digital twin can rapidly draw attention to areas of concern and prioritize the urgency of response for the human user. Restructuring of the energy system has resulted in larger amounts of energy being transported across longer distances than Europe’s grids were designed for, so prioritization of problems is vital. Similarly, with electricity demand skyrocketing, the integration of renewables causing concern, and environmental challenges such as snow on vegetation and storm damage further straining the grid, technology that helps utility operators can provide deeply valuable insights.

Digital twins can empower utilities, giving them the necessary asset intelligence to respond to unexpected emergencies with urgency, without negatively impacting the grid’s effectiveness.

______

Third-generation AI-powered digital twins can save energy:  

Energy firms have long embraced basic digital twin technology; a virtual model of all operating parts in a system provides insight into how different systems work together and where there are potential problems, such as leakage or inefficient usage. Using those insights, staff can adjust operations to avoid problems or maximize production and efficiency, saving their customers money and decreasing the resources needed for production. Digital twins can also help energy firms save money by predicting potential problems due to equipment breakdowns. By closely examining the relationships between components, systems can determine if there is any fluctuation in power usage, production, or any other aspect of the system, and alert staff to potential problems.

Current digital twins are based on First Principles mathematical models, which apply the laws of physics — such as the properties of materials and the relationships between them — to understand and control a process. In energy production, for example, that entails bringing in data from all sources and evaluating how real-world changes would impact the process, essentially covering all aspects of production and enabling managers to determine how best to deploy resources. According to industry experts, energy firms that have deployed digital twins have increased operating reliability by as much as 99%, saved as much as 40% on maintenance, and cut expenses by $11 million by preventing failures.

Digital twins currently in use do indeed provide a great increase in efficiency and reliability, but they come at a price. Systems that provide models which update themselves based on data entail dozens of technologies, most of which must be licensed at great expense. And they must be operated by individuals with deep knowledge of AI systems, a resource that is itself in very short supply.

Despite all that, many utilities have begun using digital twins, and it’s likely that many more will do so in the coming years as the need to reduce resource use grows more acute. But while the digital twin technologies most utilities use will certainly reduce waste and maximize resource utilization, they won’t cut costs. The licensing fees and the high salaries AI experts command guarantee that, although more power is likely to be available, it is going to be more expensive. And smaller utilities that can’t afford those costs, or that serve jurisdictions where power prices are capped by regulators, may not be able to benefit from digital twin technology at any price.

The solution for those utilities — and the industry in general — lies in implementation of advanced third-generation digital twins, which automatically provide updates based on data as it comes in, even if that data does not fit the model. With these advanced systems, energy firms can map out all aspects of operations in a plant, a grid or a series of grids based on real-world data — with production or deployment of energy adjusted on the fly. And all data and controls reside in a standard interface that can be understood and controlled by staff, including those not trained in AI management.

The system can be trained to identify ways to optimize operations. If optimization is possible, the AI algorithms can be trained using generative algorithms and diffusion models, similar to those used in the natural language processing (NLP) and generative-AI space. Unlike consumer generative AI, however — which is generally used to create pictures, texts, music, and videos — this application of the technology helps solve problems in industrial plants, manufacturing systems, and power plants. And many of the real-world problems it addresses can be used to develop solutions essential to achieving zero-carbon goals.

Thus, if a power station is out of commission due to storm damage, a third-generation digital twin can automatically funnel power to connected substations to make up for the shortfall — temporarily reducing energy availability in areas where there is less usage or demand. The advanced technology provides clear models that will help ensure that the lights stay on and the heating systems remain online. Staff can respond to challenges and crises in real-time, using a standard interface, ensuring as steady and efficient a flow of power as possible.

Third-generation digital twins can also make maintenance more efficient. By collecting and analyzing data as it comes in and matching it to a constantly updated model, producers can tell right away if there is a problem and trace it to a specific piece of equipment — giving repair crews the opportunity to repair or replace it before it fails. These systems also make scaling much easier, providing clear data on how real-world changes, such as additional demand, immediately require the deployment of additional resources.
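The pattern underneath this kind of predictive maintenance is easy to state: compare live telemetry with what the constantly updated model expects, and alert when the residual stays large. The sketch below is a generic illustration with invented signals, not any vendor's product.

```python
# Residual-based fault alerting: live sensor vs. twin-model expectation (illustrative).
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(500)
expected = 100 + 10 * np.sin(2 * np.pi * t / 100)   # twin's modelled output (e.g., MW)
actual = expected + rng.normal(0, 1, t.size)         # live telemetry
actual[400:] -= 0.05 * (t[400:] - 400)               # slow degradation creeps in

residual = actual - expected
noise = residual[:200].std()      # noise level estimated from a known-healthy period
WINDOW = 20
for i in range(WINDOW, t.size):
    if abs(residual[i - WINDOW:i].mean()) > 3 * noise:
        print(f"t={i}: sustained deviation from the model -> flag unit for inspection")
        break
```

Averaging the residual over a window before alerting is what separates a genuine drift from ordinary sensor noise, and attaching the alert to a specific tagged unit is what lets crews intervene before failure.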

______

-17. Earth Simulation Twins:

Digital twins are even being applied to climate modeling. NVIDIA’s recently launched Earth-2 initiative aims to build digital twins of the Earth to address one of the most pressing challenges of our time: climate change. Earth-2 aims to improve our predictions of extreme weather and projections of climate change, and to accelerate the development of effective mitigation and adaptation strategies — all using the most advanced and scientifically principled machine learning methods at unprecedented scale. Combining accelerated computing with physics-informed machine learning at scale, on the largest supercomputing systems today, Earth-2 will provide actionable weather and climate information at regional scales.

Separately, the European Union has launched Destination Earth, an effort to build a digital simulation of the planet. The plan is to help scientists accurately map climate development as well as extreme weather. Supporting an EU mandate for achieving climate neutrality by 2050, the digital twin effort would be rendered at one-kilometer scale and based on continuously updated observational data from climate, atmospheric and meteorological sensors. It also plans to take into account measurements of the environmental impacts of human activities. It is predicted that the Destination Earth digital twin project would require a system with 20,000 GPUs to operate at full scale, according to a paper published in Nature Computational Science. Simulation insights can enable scientists to develop and test scenarios. This can help inform policy decisions and sustainable development planning. Such work can help assess drought risk, monitor rising sea levels and track changes in the polar regions. It can also be used for planning on food and water issues, and renewable energy such as wind farms and solar plants. The goal is for the main digital modeling platform to be operating by 2023, with the digital twin live by 2027.

________

-18. Digital twins repurpose sustainability, promote environmental protection and reduce carbon emissions:

Companies are gradually adopting digital twins to manage their most critical assets. In return, digital twins are helping them monitor operations, identify ways to become more efficient, prevent downtime, and even plan for the future. Digital twins can inform decisions on whether to reuse, recondition, recycle, or scrap plastic products, along with where to collect them from. Digital twins also offer the possibility of selecting better materials for manufacturing, targeting energy savings, or storing renewable energy more efficiently. Digital twins can indeed help companies renew their approach to sustainability and make it practical. In the end, development with a good sense of sustainability is key for corporations to thrive while limiting their environmental footprint.

_

Digital twin technology provides a tremendous opportunity to achieve sustainability targets. Digital twins enable us to do more with fewer resources: from designing a new product to planning the production and manufacturing it — even all the way to repairing and recycling. We need to apply ecological thinking when designing products. As a first step, it’s necessary to create transparency while tracking carbon emissions — not only during production, but from the initial design phase through the entire supply chain. Green Digital Twin application uses information from the entire supply chain, including sourced parts, tools, and devices. It creates transparency with regard to both the current and future carbon footprint while enabling easy and early calculation of emissions. In addition, this application allows maximum flexibility for designing parts in a way that enables low-carbon repair and remanufacturing while still meeting the requirements for materials and processes. This thorough approach results in efficient and durable products.
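Mechanically, much of such an application is an emissions roll-up over the product's bill of materials: each part carries its own process emissions plus those of everything sourced beneath it. Below is a minimal sketch of that aggregation with hypothetical part names and factors, not any vendor's actual application.

```python
# Rolling up a product carbon footprint over a bill of materials (hypothetical data).
bom = {
    "e-bike":  {"parts": ["frame", "battery", "motor"], "process_kgCO2e": 4.0},
    "frame":   {"parts": [], "process_kgCO2e": 18.0},   # aluminium, welded
    "battery": {"parts": ["cells"], "process_kgCO2e": 2.0},
    "cells":   {"parts": [], "process_kgCO2e": 35.0},
    "motor":   {"parts": [], "process_kgCO2e": 11.0},
}

def footprint(item: str) -> float:
    """Own-process emissions plus the footprint of every sourced sub-part."""
    node = bom[item]
    return node["process_kgCO2e"] + sum(footprint(p) for p in node["parts"])

print(f"e-bike cradle-to-gate footprint: {footprint('e-bike'):.1f} kg CO2e")
```

Because the roll-up is recursive over the supply chain, swapping one part's emission factor (say, recycled aluminium for the frame) immediately shows its effect on the whole product, which is exactly the early design-phase transparency the paragraph above describes.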

_

Plastics are essential to industries ranging from packaging and textiles to construction, transportation, healthcare, and electronics. In the last 50 years, global production of plastics has increased 20-fold, from 15 million tons to 311 million tons, with only 9% of all plastic produced being recycled. The other 91% represents USD 80–120 billion per year lost to the global economy, while producing major environmental damage. In response, BASF is developing its blockchain-enabled platform reciChain to improve traceability of recycled plastics and is running major pilot test centers in both South and North America. The project uses digital twins to enable the physical and digital tracking of recycled material. BASF has partnered with Optel and Security Matters, both of which have developed unique barcoding systems that can withstand manufacturing and recycling processes without any alteration of the recycled materials. Using digital tracers, these barcodes aim to capture all the information needed to authenticate sustainability claims and make the sorting process more accurate. With such technology in place, BASF expects to systematically unlock the value of recyclable materials that often end up in landfills.

_

British company Integrated Environmental Solutions (IES) has recently developed the Intelligent Communities Lifecycle (ICL). This environmental digital twin technology reduces the carbon emissions of buildings and cities worldwide. ICL bridges the gap between the real and virtual worlds to enable energy-efficient design and continuous optimization by incorporating real-world operational data, which keeps the digital twin accurate. Buildings can thus provide information on potential anomalies that previously were unlikely to be captured.

One of the current users of ICL technology is the Nanyang Technological University (NTU) in Singapore. The ICL technology covers the whole of NTU’s 250-hectare campus and the adjoining 50-hectare JTC Corporation CleanTech Business Park, with more than 200 buildings on site covering a 1.1 million square meter floor area.

The intelligent community design master-planning model is 91% accurate for total energy consumption and 97% accurate for chiller energy consumption. ICL has helped unlock 10% energy savings (USD 3.9 million per year) and avoid annual emissions of 8.2 kilotons of carbon. ICL also helped visualize potential savings from more efficient technologies related to the building envelope, lighting and occupancy sensors, plug-load management, and high-performing optimized chillers, which demonstrated a further 31% of average energy savings (USD 4.7 million).

______

-19. Role of Digital Twin in Telecom Industry:

Three main types of digital twins for communication service providers are listed below:

  • Network twin: This twin can help forecast breakdown points during periods of extraordinarily heavy network demand, such as during natural catastrophes. The network twin is particularly valuable because it can account for weather patterns and their potential impact on signal strength as well as overall network health.
  • Customer twin: Creating a virtual representation of a company’s customer persona can help a telco detect specific problems that may affect users. This allows the operator to keep those consumers informed, preventing future support concerns.
  • Process twin: Most of the core network business processes that keep the system running smoothly are modelled by these twins.

_

Networking is an area where digital twins are reducing downtime for data centers. Over time, networks have become more complicated. The scale of networks, the number of nodes and the interoperability between components add to their complexity, affecting preproduction and staging operations. Network digital twins speed up initial deployments by pretesting routing, security, automation and monitoring in simulation. They also enhance ongoing operations, including validating network change requests in simulation, which reduces maintenance times. Networking operations have also evolved to more advanced capabilities with the use of APIs and automation. And streaming telemetry — think IoT-connected sensors for devices and machines — allows for constant collection of data and analytics on the network for visibility into problems and issues. The NVIDIA Air infrastructure simulation platform enables network engineers to host digital twins of data center networks.
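At its simplest, pretesting a change request against a network twin means replaying the change on a graph model before touching production. The sketch below, using the open-source networkx library on a made-up leaf-spine topology (not NVIDIA Air), rejects a proposed link decommissioning that would partition the fabric.

```python
# Validate a proposed link removal against a graph twin of the network (illustrative).
import networkx as nx

twin = nx.Graph()
twin.add_edges_from([
    ("spine1", "leaf1"), ("spine1", "leaf2"),
    ("spine2", "leaf1"), ("spine2", "leaf2"),
    ("leaf2", "leaf3"),   # leaf3 hangs off a single uplink
])

def change_is_safe(g: nx.Graph, edge) -> bool:
    """Apply the change on a copy of the twin and check the fabric stays connected."""
    trial = g.copy()
    trial.remove_edge(*edge)
    return nx.is_connected(trial)

for proposal in [("spine1", "leaf1"), ("leaf2", "leaf3")]:
    verdict = "OK to apply" if change_is_safe(twin, proposal) else "REJECT: would partition network"
    print(proposal, "->", verdict)
```

Real network twins validate far richer properties (routing, ACLs, failover timing), but the workflow is the same: mutate the model, assert invariants, and only then schedule the maintenance window.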

_

The roles of digital twins in the communication industry are as follows:

  • Tower Management

Effective monitoring is critical for the proper functioning of cell towers. However, the growing complexity of expanding networks, increased operational expenses, and security concerns exacerbate the difficulties of monitoring cell towers remotely. Digital twins enable tower owners to remotely monitor and control their telecom assets, irrespective of location. Remote sensors can gather data on aspects including temperature, proximity, motion, and position; AI/ML algorithms evaluate these data and integrate them into the DT of the tower. Operations and management teams can then address any issues that arise by examining the tower’s digital twin.

  • Field Service Management

Field service staff often visit sites with very little information. No matter how well a site or tower is equipped to withstand failures, failures remain possible, and inefficiencies in any field service function can create significant delays. Field staff can benefit greatly from a DT enhanced with augmented reality: they can obtain key information and identify solutions before the site visit, and professionals at the command center can assist them in real time by viewing the DT that replicates what is happening on site.

  • Planning and design of network

Another area where a DT can tremendously assist providers is network capacity planning and design. Deployed network configurations can drift over time and whenever new equipment is introduced, and tracking configuration changes while maintaining an accurate inventory of network elements has always been difficult for operators. IoT-enabled digital twins carefully track the existing infrastructure, which speeds up expanding, upgrading, and modifying it. A digital twin’s machine-learning component also allows deeper analysis of usage patterns, network anomalies, and fault forecasts. Used together, these tools can dramatically improve network planning and design.

_

Real-life example: Rolling out 5G with Twins.

Ericsson, a provider of telecommunications equipment and services, is combining decades of radio network simulation expertise with NVIDIA Omniverse Enterprise. The global company is building city-scale digital twins to help accurately simulate the interplay between 5G micro cells and towers and their environment to maximize performance and coverage.

_____

-20. Digital twins in Hydrocarbon Exploration:

Oil companies face huge risks in seeking to tap new reservoirs or reassess fields in production with the least financial and environmental downside. Drilling can cost hundreds of millions of dollars. After locating hydrocarbons, these energy companies need to quickly figure out the most profitable strategies for new or ongoing production. Digital twins for reservoir simulation can save many millions of dollars and avoid environmental problems. Using technical software applications, these companies can model how water and hydrocarbons flow underground between wells. This allows them to evaluate potentially problematic situations and trial virtual production strategies on supercomputers. Having assessed the risks beforehand in the digital twin, exploration companies can minimize losses when committing to new projects. Real-world assets in production can also be optimized for better output based on analytics from their digital doppelganger.

______

-21. Digital twins in Airport Efficiencies:

Digital twins can enable airports to improve customer experiences. For instance, video cameras could monitor Transportation Security Administration (TSA) checkpoints, with AI analyzing bottlenecks at peak hours. Fixes could be tested in the digital model and then moved into production to reduce missed flights. Baggage-handling video can be assessed in the digital environment to find better ways to ensure luggage arrives on time. Airplane turnarounds can benefit, too. Many vendors service arriving planes to get them turned around and back on the runway for departure. Video can help airlines track these vendors to ensure timely turnarounds, and digital twins can analyze the coordination of services to optimize workflows before changing anything in the real world. Airlines can then hold their vendors accountable for carrying out services quickly. Caterers, cleaners, refueling, trash and waste removal, and other service providers all have service-level agreements with airlines to help keep the planes running on time. All of these activities can be run as simulations in the digital world and then applied to scheduling in production, helping reduce departure delays. NVIDIA Metropolis helps process massive amounts of video from the edge so that airports and other industries can analyze operations in real time and derive insights from analytics.
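Much of the bottleneck arithmetic behind such queue analytics is just Little's law, L = λW: video analytics supply the queue length L and arrival rate λ, and the expected wait W follows. A toy calculation with invented numbers:

```python
# Little's law, L = lambda * W, applied to a checkpoint queue (invented numbers).
queue_length = 120      # passengers counted in line by video analytics (L)
arrival_rate = 8.0      # passengers joining per minute at peak (lambda)

wait_minutes = queue_length / arrival_rate   # steady-state wait W = L / lambda
print(f"estimated wait: {wait_minutes:.0f} minutes")

# What-if for the twin: the service rate at which the passengers now in line
# would clear in about 10 minutes (ignoring new arrivals joining behind them).
target_minutes = 10.0
needed_rate = queue_length / target_minutes
print(f"throughput needed for a {target_minutes:.0f}-minute clear-out: {needed_rate:.0f} pax/min")
```

The twin's value is running exactly these what-ifs (open another lane, re-time staffing) against live counts rather than yesterday's averages.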

______

-22. Digital Twin of an Organization (DTO):

As the capabilities of digital twins have become more advanced, more and more industries are creating these models to help improve performance, leading to better outcomes. As digital twins have become more mainstream, it’s become increasingly clear that anything can have a digital twin. While the concept has traditionally been used with equipment or hardware, it’s no longer limited to these areas. Given that anything–from a building to a pair of shoes–can have a digital twin, it’s not surprising that the concept of a digital twin of an organization (DTO) is gaining traction. The idea of a DTO–developed by Gartner–was driven by the goal of using a digital representation of an organization to support the implementation of changes or new initiatives. A DTO provides a virtual model of a business that leaders can analyze and tweak as needed. When fully implemented, it provides a full twin in the context of operations. Notably, the data used in DTO models can be consistently updated, which gives businesses real-time information on how the organization is functioning, utilizing resources, responding to changes, and meeting customer needs. Obviously, access to such a model gives businesses a myriad of competitive advantages.

______

-23. Digital twins in agriculture:

The agricultural sector has begun to leverage digital twins in farming and other activities. Vertical farming, combined with other major cutting-edge technologies, has already started in some smart cities. Digital twins are popular for bringing smart farming that boosts productivity and sustainability. Farm management uses digital twins to decouple physical flows from their monitoring and control across the whole process, which proved especially valuable during the COVID-19 pandemic. Farmers can easily detect faults and errors and start operations remotely, relying on real-time digital information delivered through digital twin technology. The modern agricultural sector depends on real-time information from smart devices such as drones and smart equipment, while incorporating cloud computing, IoT, AR, and robotics. Digital twin technology enables the agricultural sector to plan, analyze, and enhance multiple crop cycles throughout the year while maximizing yields. Farming is famously driven by heavy workflows; the technology helps identify the most valuable and crucial actions in the field and keeps farmers at the top of the value chain. It also gives farmers a clear understanding of the content, quality, and capacity of their soil for different crops, boosting productivity. A digital twin in farming provides in-depth insights into expected annual yield, the required volumes of fertilizer and water, appropriate amounts of sunlight, weather predictions, and descriptions of potential crops. Multiple visual and data analytics are applied to the digital twin to measure patterns and traits across crop seasons (such as plant health, height, growing velocity, and other minute details) to maintain the best-quality crops. AgTech companies can use digital twins to research and drive new innovations and solutions for the sector. In short, the digital twin is thriving in agriculture because of its many useful applications throughout different crop seasons. Leveraging digital twin technology in farming can boost crop productivity and earn international recognition for high-quality crops, and it helps drive revenue in a country like India, where farming contributes 18% of GDP.

The Benefits of Digital Twins in Agriculture:

-Greater yields on the same acreage

-Profitability maximization

-More resilience to weather

-More sustainable operations

-Faster time to market

With a digital twin to analyze, agricultural operations can simulate how changes in weather patterns will affect their yields; work out whether they can save money and do less harm to the ecosystem by reducing agricultural inputs (without hurting crop quality); and spot warning signs like discolored foliage, the presence of blight or pests, over-irrigation, or falling soil carbon. And they can do it fast enough to respond before too much damage is done. Even better, they can use predictive modeling based on past and current data to predict, with a high degree of probability, that a problem will occur in the near term, and head it off.

The data that powers digital twins is already proving invaluable for agricultural operations around the globe. Belgian agricultural technology company 2Grow, for example, helps its partners measure variables like stem width with unique sensors. Stem thickness is a good proxy for the flow of water within a plant, and 2Grow claims its sensors enable tomato farmers to reduce the surface area and water needs of a plot of tomato plants by as much as 20 percent.

_

Leveraging Remote Sensing in Agricultural Settings:

Though the benefits are many, the process of implementing a digital twin system in agriculture can be burdensome, requiring a large number and wide variety of IoT sensors deployed across an entire farm. Farmers have discovered some smart shortcuts, however, like mounting imaging sensors on their irrigation hubs. Irrigation systems necessarily already run throughout most farms and hence provide a practical existing platform to piggyback on.

Agricultural Data Sources:

-Expected Yields

-Inputs (e.g. Fertilizer, Water, Sunlight)

-Soil Carbon Levels

-Field Imaging

-Weather History and Forecast

The digital revolution in farming came at just the right time. For generations, institutional knowledge was siloed in individual farmers. These new digital systems are democratizing access and accelerating the advancement of the sciences behind growing great crops — which is vital because there is still a lot left to learn.

“Light, plant nutrition, air, build-up of gasses, water recycling, weather conditions and rain – the combinations are endless and every stage of growth is different,” explains Dave Scott of Intelligent Growth Solutions, an agricultural IoT firm. “Plant science is massively misunderstood. We understand more about how a fish works than a plant.”

Furthermore, despite the advent of high tech tools like GPS monitoring, drone flyovers, satellite imaging, and laser land surveying, the goal of true precision agriculture and digital farm management is still out of reach for many.  

_

Siemens takes Digital Twins underwater:

German technology company Siemens has even taken digital twins underwater. Siemens and Nemo’s Garden announced an entirely new use case: a reliable method of cultivating herbs, fruit, and vegetables underwater. Founded in 2021 by the president of Italian scuba-diving equipment manufacturer Ocean Reef Group, Nemo’s Garden has an ambitious goal: to create an entirely new system of agriculture that doesn’t rely on traditional methods or the variability of typical ecosystems, and that uses fewer ecological resources. It is focused on “areas where environmental conditions, economical, or morphologic reasons make plants growth extremely difficult.” The team is growing plants in biospheres under the ocean, but the challenge has been monitoring the biospheres continuously.

The team has built a comprehensive digital twin of the current biosphere model. Using Siemens’ Simcenter™ STAR-CCM+™, the Nemo’s Garden team can simulate growing conditions inside of the biosphere and how the biosphere might affect the flow of water around it to minimize impact to the natural ecosystems where they’ll eventually be placed. With this digital twin, Nemo’s Garden doesn’t have to wait for real-world weather conditions or seasonality — or keep putting divers into risky situations — to quickly understand how they can refine their design.

The team uses digital twins and machine learning to monitor the health of their crops using automation, to eventually eliminate the need to send trained divers underwater. With its MindSphere service, Siemens used video of crop growing cycles and compared it to traditional farming to create an algorithm that monitors plant growth holistically. In the future, Siemens will deploy its Industrial Edge computing devices in each biosphere and connect them to actuators to automatically adjust air circulation, irrigation, and nutritional dosing throughout an entire growing season. That’s how Nemo’s Garden imagines scaling its business to a global agricultural service that can be deployed in any ocean.

_

Is Digital Twins in Agriculture a Hype?

The value of any technology ultimately lies in its ability to optimize costs and resources. The ability to pre-empt outcomes gives food growers the benefit of foresight, which can then be applied in real life. An example of real-life application and commercialization of digital twin technology is the mechanistic model developed by Tom De Swaef at Ghent University. Belgian company 2Grow leverages this model to measure variations in the flow of water and stem thickness in tomato plants, with the aim of reducing the surface area and water used for plant production by about 20%.

It is still unclear whether the community is making an effort to adopt digital twins in its operations. What’s more, it can be argued that in most cases digital twin technology is not actually necessary. Advances in machine learning have made it possible to predict key events without building a full model, which would require large amounts of high-quality data that is expensive to obtain. For a food grower wanting to predict certain properties, focusing on measuring and monitoring key changes might be all that is needed to build a successful predictive model. What’s more, this is dramatically more affordable, making it attainable for food growers that need to see immediate ROI on the implementation of predictive models.

For example, if you grow potatoes it is important to have indicators for pests and diseases such as late blight, caused by a fungus-like organism that can result in crop failure in a short period if appropriate control measures are not adopted. For this type of row crop on large acres of open field, cameras mounted on pivot irrigation systems can efficiently and effectively identify diseases or issues. The data needed to create a digital twin of an open field of potatoes would cost a fortune, and creating an entire model at such a scale to get insights that can be obtained with simpler, more affordable technology just doesn’t make sense.

There are specific use cases where it makes financial sense to build a digital twin, such as for plant breeding, where a model could allow you to predict early on if a specific variety is not commercially viable. But in many cases, there’s no need to crack a nut with a sledgehammer.

______  

______

Innovative applications of digital twins:

-1. How an athlete’s digital twin can prevent injury:

Combining several technologies to create a digital replica of an athlete’s body has produced a giant leap forward in understanding what goes on inside it. Australian basketballer Maddison Rocci is standing perfectly still in a film studio, more than 100 cameras trained on every part of her body. She stands, arms outstretched, as the shutters click away. Behind the lenses is a team of filmmakers who usually work with Hollywood directors. Today, however, they’re working with scientists. Once the cameras go quiet, a team of biomechanical engineers and software programmers from Griffith University’s Centre for Biomedical and Rehabilitation Engineering (GCORE) and movie animators from Myriad Studios and Naughty Monkey use the data to create a digital twin of Rocci that replicates her anatomy, inside and out.

This is the ‘digital athlete’. It uses 3D body scans, MRI, and motion-capture data to give a detailed representation of the body shape, bones, joints, muscles, and other soft tissues in Rocci’s daily performance environment. The scientists can now see inside her body as she runs, jumps, twists, turns, and shoots. The stress on her muscles and joints is captured, and the data helps coaches devise a better training routine or a tweak in technique. For example, coaches can examine in real time when Rocci (or any athlete) does a side-step, a common movement and the cause of many anterior cruciate ligament injuries in the knee. The information is both instant and personalised, which is critical, because each athlete experiences different stresses due to their unique physiology.

Maintaining the health of, and/or repairing, joint tissues requires “ideal” loading and strain of the tissue. Recent research has shown such ‘biofeedback’ can be achieved by integrating a patient’s personalised digital twin and motion capture with wireless wearable devices.

The personalised digital twins break down movements into smaller, predictable components, and work with neuromusculoskeletal rigid-body models, real-time code optimisation, and artificial intelligence (AI) or machine learning methods.

The ability to non-invasively and accurately predict internal tissue loading in real time, in the real world, has long been considered a holy grail for biomechanists. With the development of this technology, it is possible to imagine that training and rehabilitation may soon be guided by biofeedback systems based on a digital twin of any person’s musculoskeletal system. Real-time visual biofeedback has enabled people to adjust their knee and hip movements on demand using their innate solutions or trainer-directed instructions. Importantly, when patients effectively altered their movements, it was associated with clinically meaningful improvements in their hip pain and function.

Optimising performance for athletes is one thing; creating a digital twin this way has other potential applications, including for the military and for people with disabilities. It could be used to prevent the musculoskeletal injuries that are very common in the military, and for neurorehabilitation of people with spinal cord injuries. A complete integrated system, called BioSpine, is currently undergoing trials of augmented-reality-based training to enable people with spinal cord injuries to walk in a metaverse, or to do actual physical cycling with muscle electrical stimulation and motor assistance, all controlled by the person’s thoughts in an immersive augmented reality environment. With development, the technology has the potential to help quadriplegics and paraplegics “walk” again.

______

-2. New Biomarkers for Atherosclerosis found using Digital Twins:

Newly identified markers of atherosclerotic coronary artery disease (ASCAD) were pinpointed in a study using digital twins. The work was carried out by multi-omics specialist G3 Therapeutics and AI company Aitia, which has Gemini Digital Twin technology. These new findings suggest triglyceride-rich LDL particles could be a novel diagnostic marker for ASCAD and could also open up potential novel treatment targets for the condition.

“For decades, we have singularly focused on LDL-cholesterol as the sole treatment target in atherosclerotic coronary artery disease. Now, this new discovery of triglyceride-rich LDL particles opens a previously untapped set of opportunities to bring novel diagnostic and therapeutic approaches to our patients suffering from devastating cardiovascular disease,” said Szilard Voros, MD, CEO, and founder of G3 Therapeutics.

The team analyzed 665 patients from G3 Therapeutics’ GLOBAL clinical study. De novo Bayesian networks, built from 37,000 molecular measurements and 99 conventional biomarkers per patient, were used to examine the potential causality of specific biomarkers. The researchers found that the impact of triglyceride-rich LDL particles was independent of the cholesterol content of LDL particles: in the Bayesian analysis, LDL-TG was directly linked to atherosclerosis in over 95% of the ensembles. The particles’ potential causality was further confirmed by genetic validation based on the hepatic lipase gene. The analysis also revealed that atherogenic lipoproteins, inflammation, and endothelial dysfunction are involved in ASCAD, lending additional credence to the novel findings.
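The "95% of the ensembles" criterion reflects a standard stability idea: relearn the network many times on resampled patients and keep only the edges that persist. The toy sketch below illustrates that bootstrap edge-frequency logic with a plain correlation test on synthetic data; it is not Aitia's actual Bayesian machinery, and the variables are simulated stand-ins.

```python
# Toy edge-stability check: how often does an edge survive across bootstrap ensembles?
import numpy as np

rng = np.random.default_rng(7)
n = 665                                        # patient count, as in the GLOBAL analysis
ldl_tg = rng.normal(0, 1, n)                   # simulated triglyceride-rich LDL level
plaque = 0.4 * ldl_tg + rng.normal(0, 1, n)    # simulated atherosclerosis burden

def edge_present(x, y, thresh=0.15):
    """Crude edge test: absolute correlation above a threshold."""
    return abs(np.corrcoef(x, y)[0, 1]) > thresh

B, hits = 1000, 0
for _ in range(B):
    idx = rng.integers(0, n, n)                # bootstrap-resample the patients
    hits += edge_present(ldl_tg[idx], plaque[idx])
print(f"edge LDL-TG -> atherosclerosis present in {hits/B:.0%} of ensembles")
```

An edge that survives nearly all resamples is unlikely to be an artifact of any particular subset of patients, which is the intuition behind reporting ensemble frequencies rather than a single learned network.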

_____

-3. Digital Twins as Testbeds for Vision-Based Post-earthquake Inspections of Buildings:

Manual visual inspections typically conducted after an earthquake are high-risk, subjective, and time-consuming, and the resulting delays often exacerbate the social and economic impact of the disaster on affected communities. Rapid, autonomous inspection using images acquired from unmanned aerial vehicles (UAVs) offers the potential to reduce such delays. Indeed, a vast amount of research has been conducted toward developing automated vision-based methods to assess the health of infrastructure at the component and structure level. Most proposed methods rely on images of the damaged structure but seldom consider how the images were acquired; to achieve autonomous inspections, methods must be evaluated in a comprehensive end-to-end manner, incorporating both data acquisition and data processing. The authors leverage recent advances in computer-generated imagery (CGI) to construct a 3D synthetic environment with a digital twin for simulating post-earthquake inspections, allowing comprehensive evaluation and validation of autonomous inspection strategies. A critical issue is how to simulate, and subsequently render, the damage to the structure after an earthquake. To this end, a high-fidelity nonlinear finite element model is incorporated in the synthetic environment to represent earthquake-induced damage; this finite element model, combined with photo-realistic rendering of the damage, is termed a physics-based graphics model (PBGM). The 3D synthetic environment with a PBGM as a digital twin provides a comprehensive end-to-end approach for developing and validating autonomous post-earthquake inspection strategies using UAVs.

_____

-4. Reducing food waste with DT technology: 

Roughly one-third of the food produced in the world for human consumption every year—approximately 1.3 billion tonnes—gets lost or wasted. A large amount of food produced gets wasted in different parts of the supply chain such as farms, storehouses, logistics, and processing units. Along with this, a significant amount of food wastage is also experienced at the consumer end. Of the total global food wastage, 40–45% is fruits, vegetables, and root crops.

Food wastage and spoilage within the food supply chain can be attributed to uncertainties in demand and supply, time delays, and changes in environmental conditions at different stages of the chain. Globally, this wastage is equivalent to an annual economic loss of USD 1.2 trillion. It also causes a significant environmental footprint, amounting to about 8% of greenhouse gas emissions worldwide.

Food wastage is at the level of a global crisis. Producers, logistics companies, governments, food health enforcement bodies, sustainability advocates, and concerned citizens are trying multiple options to reduce wastage. Real-time monitoring of food quality has emerged as a viable solution that can benefit all stages of the food supply chain, from farmers to end consumers like us. It helps reduce the significant economic losses caused by food spoilage and wastage while retaining quality and nutritional value. A novel integrated ‘smart digital platform’ is being developed to estimate and predict food quality. A platform that can gather data from multiple types of sensors, with data processing capabilities, can provide much value in reducing wastage. It will enable all stakeholders of the food supply chain to make dynamic decisions about altering supply chain logistics and storage conditions, repurposing produce, and minimizing food spoilage and wastage.

A food freshness platform from Tata Consultancy Services, for example, monitors food quality along the entire product journey. The system takes inputs from connected sensors installed at every stage of the supply chain, from ‘farm to fork’, to assess food products’ freshness. This live data is also fed into a digital twin of the real-world supply chain to simulate different environmental conditions that can affect the lifespan of food. This can include changes in temperature, humidity, air quality and light intensity. For example, the platform can anticipate the shelf-life of potatoes for different applications – as chips, fries or potato starch – by considering aspects such as fungal growth or change in sugar content over time. Suppliers and retailers can then use this information to adapt how the produce is handled at each stage of the ripening process.
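As a rough illustration of how such a twin can turn cold-chain sensor logs into a shelf-life estimate, here is a minimal sketch assuming a simple Arrhenius-type quality-decay model. All constants and thresholds are invented for illustration; a production platform like the one described above would use far richer, proprietary models.

```python
# Minimal sketch of sensor-driven shelf-life estimation for a digital twin
# of a food shipment. Assumes a simple Arrhenius-type decay model; every
# constant below is hypothetical, not taken from any real platform.
import numpy as np

EA_OVER_R = 8000.0   # activation energy / gas constant (kelvin), hypothetical
K_REF = 0.01         # quality decay rate per hour at the reference temperature
T_REF = 283.15       # reference temperature: 10 degrees C in kelvin

def decay_rate(temp_c: float) -> float:
    """Quality decay rate at a given temperature (Arrhenius form)."""
    t_k = temp_c + 273.15
    return K_REF * np.exp(-EA_OVER_R * (1.0 / t_k - 1.0 / T_REF))

def remaining_shelf_life(temps_c, quality=1.0, spoiled_at=0.5, dt_hours=1.0):
    """Integrate decay over a logged temperature history, then report how
    many hours remain if held at the last observed temperature."""
    for t in temps_c:
        quality *= np.exp(-decay_rate(t) * dt_hours)
    if quality <= spoiled_at:
        return quality, 0.0
    hours_left = np.log(quality / spoiled_at) / decay_rate(temps_c[-1])
    return quality, hours_left

# Hourly cold-chain readings, including a temperature excursion in transit.
log = [4, 4, 5, 12, 14, 13, 6, 5, 4]
q, hrs = remaining_shelf_life(log)
print(f"quality index: {q:.2f}, estimated hours left: {hrs:.0f}")
```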

Digital twin of fruits:

A digital twin is a promising tool to minimize fruit waste by monitoring and predicting the status of fresh produce throughout its life. One new approach creates a machine learning-based digital twin of a banana to monitor its quality changes throughout storage. A thermal camera is used as the data acquisition tool because it can detect surface and physiological changes in fruit during storage. Infrared thermal imaging detects temperature differences between bruised and sound tissue caused by their different thermal diffusion coefficients. The technology senses the heat given off by an object in the form of infrared radiation; this radiation generates electrical signals that are converted into images. Fruit with bruises reflects less because damaged cells beneath the skin fill with water: infrared radiation penetrates the skin and reaches this water accumulation, where it is absorbed more strongly than in undamaged fruit flesh. This produces contrast in thermal images, where the bruises appear darker than undamaged parts of the same fruit. Usefully, thermal imaging works at all brightness levels, distinguishing targets from the background purely by radiation differences, and it is very efficient at detecting bruises before they become visible to the naked eye. Through this principle, thermal images captured in specific wavelength ranges make it possible to distinguish defective tissue from sound tissue.
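A minimal sketch of that detection principle in code: smooth a grayscale thermal frame (darker = cooler), flag pixels well below the fruit’s median apparent temperature, and keep only contiguous dark regions large enough to be bruises rather than noise. The file name, offset, and area threshold are hypothetical.

```python
# Bruise detection on a thermal image: bruised tissue absorbs more infrared
# and so appears darker (cooler) than sound tissue. Illustrative only.
import cv2
import numpy as np

thermal = cv2.imread("banana_thermal.png", cv2.IMREAD_GRAYSCALE)

# Smooth sensor noise, then flag pixels significantly darker than the
# fruit's median apparent temperature.
blurred = cv2.GaussianBlur(thermal, (5, 5), 0)
median = np.median(blurred)
_, bruises = cv2.threshold(blurred, median - 15, 255, cv2.THRESH_BINARY_INV)

# Keep only contiguous regions big enough to be real bruises, not noise.
n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(bruises)
for i in range(1, n_labels):
    if stats[i, cv2.CC_STAT_AREA] > 50:
        x, y, w, h = stats[i, :4]
        print(f"possible bruise at ({x},{y}), {w}x{h} px")
```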

_____

-5. Digital Twins for nuclear reactors and nuclear power plants:

Digital twins are currently being used and developed in the nuclear energy industry to improve the safety, operation, and design of nuclear reactors, among other things. These technologies have been applied by advanced nuclear reactor designers, nuclear utilities, national laboratories, and others to assist in numerous applications throughout a plant’s life cycle. The success of such digital twins could revolutionize the advanced nuclear industry…Autonomous control is a significant cost-savings and safety feature that will enable new microreactors to come online more readily and quickly meet regulatory standards.

Predictive Maintenance:

A key application for digital twins in nuclear is predictive maintenance. A digital twin has intelligent modules that continuously monitor the condition of individual components and of the system as a whole, which could be crucial for nuclear reactors. It could help predict that a certain component will fail before it actually does, meaning that an unexpected outage could be prevented. This also allows scientists to optimize the reactors’ maintenance schedules and tasks.

According to a recent report published by Idaho National Laboratory (INL) and the U.S. Nuclear Regulatory Commission, many advanced reactor developers, such as Kairos Power, X-energy, GE, and Westinghouse, are developing digital twin technologies aimed at predictive maintenance, downtime reduction, and aging and degradation management in future reactors.

The nuclear digital twins will serve as training simulators for the new generation of operators and as a simulation environment for engineering studies.

Oversight & Compliance Monitoring:

Another digital twin application in nuclear is the oversight and compliance monitoring of a nuclear plant’s operation. A digital twin can provide real-time information, integration, and analysis related to compliance with technical specifications. This could provide key information to operations, such as history, current performance, and technical judgments.

For example, Westinghouse currently uses digital twins for facility inspections and audits, such as automated analysis of inspection and monitoring data. This can be used in cases like detecting a concrete crack using drones, monitoring neutron noise of reactor structures, and managing a severe reactor accident in real time.

Nuclear Reactor Design and Simulation:

Digital twins can also help in the design of a new nuclear reactor, allowing engineers to try out different approaches to optimize performance and safety. The simulations digital twins can run save time and money in the design process because they can better inform more focused experimental tests.

For example, under a federal funding project supporting companies developing advanced nuclear reactors with digital twins, Kairos Power is currently working on a digital twin for its fluoride-salt-cooled, high-temperature reactor.

Emergency Response:

Digital twins could also help in the rare event of an emergency or accident. For example, a digital twin could avoid or reduce the severity of such events by increasing response efficiency, integrating live data and models, and coordinating requests for emergency response support from agencies and responders.

______

______

Digital Twin Market & Companies: 

According to Gartner, 13% of organizations implementing IoT projects already use digital twins, while 62% are either in the process of establishing digital twin use or plan to do so. The digital twin market is estimated to grow from $3.8 billion in 2019 to $35.8 billion by 2025, at a CAGR of 37.8%, according to the latest report from MarketsandMarkets.

North America is one of the early adopters of digital twin technology, and most digital twin providers are located in the region, including Microsoft (US), IBM (US), Amazon Web Services (AWS) (US), and General Electric (US). The National Aeronautics and Space Administration (NASA) was an early adopter of the digital twin in North America. The region is dense with aerospace companies, such as Bell Flight (US), Bye Aerospace (US), Lockheed Martin (US), and Boeing (US), which invest heavily in digital twin technology and carry out extensive R&D with it to optimize production and reduce downtime and operational costs.

World’s Leading Digital Twin Companies: Top 10 by Revenue:

-1. Microsoft Corporation

-2. Bosch

-3. General Electric Company

-4. IBM Corporation

-5. Siemens

-6. Oracle Corporation

-7. Cisco Systems

-8. Dassault Systemes

-9. Ansys

-10. PTC Inc.

______

______

Section-14

Digital twin and cybersecurity:

Digital twin and cybersecurity can be discussed in two segments.

-1. Security of digital twin and its physical twin

-2. Use of digital twin to help cybersecurity

The use of digital twin technology to breach cybersecurity was already discussed in the paragraph on evil twins.

_

Segment-1:

Digital twin security:

Cybersecurity a major concern with digital twins: 

The security of digital twins needs to be considered from two sides – the security of the digital twin and the system in which it is implemented, and the security of the real-world object, system, or process it represents.

Although digital twins can help secure IoT devices and processes, the twins themselves can be exposed to risk. Digital twins are software representations that are usually deployed on standard PCs using cloud services for data collection and processing. Software, PCs, and cloud computing have known risks. However, those risks can be mitigated with the proper measures.

_

The faster a new type of technology spreads, the less attention tends to be paid to security at the outset. This forces companies to scramble to put out metaphorical fires when vulnerabilities are exploited, leading to lost time and profits. Because digital twins are based in the cloud and don’t require physical infrastructure, the associated security risks are somewhat lower than with other types of systems. However, the massive amounts of data being collected and utilized are drawn from numerous endpoints, each of which represents a potential point of weakness. Every time a new connection is made and more data flows between devices and the cloud, the potential risk of compromise increases. Therefore, businesses considering digital twin technology must be careful not to rush into adoption without assessing and updating current security protocols. Areas of greatest importance include:

  • Data encryption
  • Access privileges, including clear definition of user roles
  • Principle of least privilege (illustrated in the sketch after this list)
  • Addressing known device vulnerabilities
  • Routine security audits
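As a minimal illustration of the access-control items above, here is a hedged sketch of role-based access control with least privilege for a digital twin API. The roles, permission names, and check function are all hypothetical; the point is that permissions are denied unless explicitly granted.

```python
# Role-based access control with least privilege (illustrative sketch).
from dataclasses import dataclass, field

# Each role carries only the permissions it genuinely needs (hypothetical names).
ROLE_PERMISSIONS = {
    "viewer": {"twin:read"},
    "engineer": {"twin:read", "twin:simulate"},
    "admin": {"twin:read", "twin:simulate", "twin:write", "twin:configure"},
}

@dataclass
class User:
    name: str
    roles: set = field(default_factory=set)

def is_allowed(user: User, permission: str) -> bool:
    """Grant only permissions explicitly attached to the user's roles."""
    return any(permission in ROLE_PERMISSIONS.get(r, set()) for r in user.roles)

operator = User("alice", {"viewer"})
assert is_allowed(operator, "twin:read")
assert not is_allowed(operator, "twin:write")  # least privilege: deny by default
```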

_

Because digital twins take in vast amounts of data from systems across an organization and its partners, cybersecurity is a serious concern. Because communication between digital twins and the systems they interact with is bi-directional, hackers who gain control of a digital twin can wreak havoc: taking over control of real-world systems, manipulating or stealing sensitive data, and/or introducing malware that can spread to other systems. The privacy and security risks associated with digital twin deployment are manifold. Reinforced data security and privacy measures are, therefore, indispensable prior to digital twin deployments.

_

With digital twins being such close and synchronised representations of the real world, unintended access could give hackers an opportunity not only to access the digital twin, but also to conduct penetration testing before hacking the physical system it represents. If hackers gain access, loss of intellectual property and data, loss of control, and potentially huge costs resulting from downtime, faulty outputs, or data being held to ransom are the most significant risks. Therefore, to minimise digital twin security risks, it is essential to have a security strategy baked in and implemented from the ground up. A systematic cybersecurity strategy will help eliminate gaps that can otherwise occur between physical and digital security mirroring, and ensure ongoing hardening of the software.

_

Despite the various digital twin use cases that can reinforce security in system engineering or during the operation phase of cyber-physical systems (CPS), the emergence of stealthy threats allows attackers to exploit digital twins to launch attacks on the CPS. Digital twins, being virtual (digital) replicas of their physical counterparts, share the functional requirements and operational behavior of the underlying systems. Digital twins may therefore act as a potential source of data breaches, leading to the abuse of digital twins themselves. Attackers may exploit the deep knowledge about the physical process and devices accessible through digital twins with a two-stage strategy: first steer the key input data source, namely the digital twin, into a malicious state, and then use that state to manipulate the underlying physical system’s behavior covertly. For example, manipulating the behavior of digital twins by modifying their defined states would correspond to a direct attack on field devices, particularly when automated feedback loops are enabled between the physical objects and their digital counterparts. It is necessary to ensure the trustworthiness of digital twins and correct them in time, because ignoring such pre-emptive measures may feed erroneous data back into the system, resulting in the Garbage In, Garbage Out (GIGO) problem. Moreover, in human-machine collaboration scenarios, even a slight system dysfunction caused by the mirroring of malicious replicas may pose a severe threat to human safety. The repercussions of exploiting digital twins may be especially severe within the digital thread that links data across multiple digital twin instances or CPS lifecycle phases. These links are an attractive target because the entire product lifecycle can be targeted in a data breach, such as when manipulating high-value design artifacts. Furthermore, an attack on the digital thread may affect next-generation CPS, where digital twin data is reused as historical data.

_

To understand the anatomy of a cyberattack on digital twins, we need to understand the adversary’s tactics from the attacker’s perspective. The table below pairs each attacker’s artifact with the corresponding goals:

Product lifecycle:

  • Manipulate benign behavior of digital twins to steer the CPS into an insecure state
  • Exploit the digital thread, as it links data throughout the entire product lifecycle

Replication mode:

  • Run direct cyclic state updates by replicating the virtual behavior of digital twins onto the corresponding program states of physical devices

Simulation mode:

  • Learn system behavior by re-running test simulations
  • Manipulate simulation parameters or system specification data during security tests

Design phase:

  • Exploit specification-based or machine learning-based process knowledge of digital twins

Decommissioning phase:

  • Reuse the system’s knowledge left behind by improper disposal of the digital twins
  • Use a data security breach, such as unauthorized access, to reach archived digital twin data

Lateral movement:

  • Gain control over high-value assets such as design artifacts
  • Manipulate sensor readings or simulation parameters at random intervals while ensuring that the new values do not deviate significantly from the real process values

_____

The countermeasures to thwart cyberattacks on digital twins:

-1. Blockchain-based digital twins:

Enhancing digital twin platform security using the blockchain:

Given that digital twin data is used as an input source to the CPS’s physical processes, the digital twin must be built on trusted data. In this context, empowering digital twins with blockchain allows industries to manage data on a distributed ledger while ensuring trusted coordination of digital twin data across multiple stakeholders. Security of the digital twin platform and its data is crucial. Blockchain technology helps defend against tampering thanks to its cryptographic features and provides transparency over data history. Digital twins benefit from these features because they can transmit data securely. Both SMEs and large companies are beginning to use blockchain technology to store records of physical objects, and connecting it to digital twin technology is a natural next step. For this to succeed, specialized distributed ledger platforms that allow information sharing among digital twins are needed. Digital twins on a blockchain will also help with brand protection and counterfeiting. Creating a digital twin of a product on a blockchain would see its transaction records saved, along with information about where it was made and its previous owners. In addition, companies that combine blockchain with virtual simulations will be able to track their products globally with great accuracy, which could sharply reduce the trade in stolen goods.

Companies such as IBM have invested heavily in blockchain, creating a dedicated department for digital twins. IBM Watson IoT, a cognitive system that learns from, and infuses intelligence into, the physical world, has begun creating digital twins using blockchain and adding blockchain to existing digital twins. Blockchain can also increase the cost-effectiveness of digital twins, allowing the technology to be adopted by more industries. Utilizing blockchain with digital twins provides a secure, cost-effective and scalable solution, bringing together digital identity and data tracking through blockchain traceability. On top of that, blockchain can help companies ensure that the large amounts of information being processed are held privately on their own internal servers. The improvement of digital twins through blockchain technology is set to transform industries such as manufacturing, healthcare and beyond.
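To show the core idea in miniature, here is a hedged sketch of a tamper-evident, hash-chained log for twin telemetry. It is not a distributed ledger (no peers, no consensus), just the chaining primitive that makes altered history detectable; the class and field names are hypothetical.

```python
# Tamper-evident log for digital twin telemetry: each record is chained to
# the previous one by a hash, so any later modification is detectable.
import hashlib
import json
import time

def _hash(record: dict) -> str:
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class TwinLedger:
    def __init__(self):
        self.chain = [{"index": 0, "data": "genesis", "prev": "0" * 64}]

    def append(self, data: dict) -> None:
        self.chain.append({
            "index": len(self.chain),
            "time": time.time(),
            "data": data,
            "prev": _hash(self.chain[-1]),  # link to the previous record
        })

    def verify(self) -> bool:
        """Recompute every link; False means the history was altered."""
        return all(
            self.chain[i]["prev"] == _hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )

ledger = TwinLedger()
ledger.append({"sensor": "temp_01", "value": 71.3})
ledger.append({"sensor": "temp_01", "value": 71.9})
print(ledger.verify())                   # True
ledger.chain[1]["data"]["value"] = 99.9  # tamper with history
print(ledger.verify())                   # False
```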

Why may blockchain fail?

Blockchain mechanisms do not guarantee the trustworthiness of data at its source. Thus, any weak link in the process chain, whether in the physical or the virtual environment, can let an attacker enter and carry out malicious activities in the system.

_

-2. Bringing gamification to digital twin’s security: 

Although digital twins operate virtually in an environment distinct from the live system, they are prone to attacks. To thwart attacks on digital twins, one potential solution is to assess their security level by launching attacks on them. However, such assessment must be performed in an isolated environment without negatively affecting the operation of the digital twin modes (especially replication). To this end, a gamification approach has been proposed that provides twin assessment and a learning environment for security analysts. Gamification is the process of incorporating game mechanics into non-game environments. In cybersecurity, the gamification approach aims to provide security analysts with a controlled, supportive virtual training environment. To investigate the resilience of physical processes against attack, determining the potential loss, such as disruption of services or machinery breakdown, can be gamified. For instance, red team-blue team cybersecurity exercises, penetration testing, Capture-The-Flag (CTF) challenges, and cyber ranges for hands-on cyber skills and security posture testing are well-known approaches in the existing literature. By simulating attack and defense scenarios without risking critical infrastructure, such solutions reap the following benefits: (i) security analysts can be trained, with defined context, environment, and learning objectives, to gain practical knowledge and skills during an exercise or challenge; and (ii) the security of the digital twin, and eventually of the physical asset, can be evaluated. Leveraging gamification for security-awareness training can further complement automated security testing of digital twins through incident response, for example in detecting lateral movement.

_____

The Industry IoT Consortium (IIC) and Digital Twin Consortium have developed a series of guidelines to address the security of digital twins. Many organizations are adopting digital twins to simulate activity and predict events across their physical infrastructure. However, as with many technology projects, security is often an afterthought; yet digital twins could be subject to massive forms of malicious activity or data leakage. “Risks must be considered to all aspects of the system, including various technologies, governance and operations,” the co-developers of the guidelines state. The result of their effort is the IIC’s “IoT Security Maturity Model (SMM),” intended to help organize and manage security concerns arising with digital twins. The SMM recognizes that not all IoT systems require the same strength of protection mechanisms and the same procedures to be deemed “secure enough.” Instead, it is intended to help “an organization decide what their security target state should be and what their current state is. Repeatedly comparing the target and current states identifies where further improvement can be made.”

______

Security fundamentals you need to have covered:

If you are building and deploying a digital twin, you need it to be fully dependable, as decisions based on faulty digital twin outputs can cost your organisation dearly. Here are some fundamentals you and your digital twin provider need to consider:

-1. Data security and cybersecurity:

The digital twin build and deployment needs to be both secure and trustworthy. Your provider should be securing the data, access to the digital twin, and the real object it’s connected to using robust governance practices, authentication, and encryption. Look for two-factor or even multi-factor authentication (2FA or MFA) of the person – not just their device – or use hardware keys. Plus, you need to encrypt data in transit. Increasingly, it is wise to encrypt data at rest and in use too.
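As a minimal sketch of encrypting twin data at rest, the example below uses the Python cryptography library’s Fernet recipe (symmetric, authenticated encryption). Key management, arguably the hard part, is out of scope here; in production the key would live in a key-management service or hardware security module, never in code.

```python
# Encrypting twin telemetry at rest with authenticated symmetric encryption.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # store in a key-management service, never in code
fernet = Fernet(key)

reading = b'{"sensor": "vibration_07", "rms": 0.42}'
token = fernet.encrypt(reading)      # safe to write to disk or object storage
restored = fernet.decrypt(token)     # raises InvalidToken if tampered with
assert restored == reading
```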

-2. Resilience and Dependability:

‘Dependability’ encompasses availability, reliability, safety, confidentiality, integrity, and maintainability. These are the attributes that digital twins need to deliver consistently so the outputs can be trusted for business decision-making. However, dependability researcher Jean-Claude Laprie suggests the higher need is resilience, which he defines as ‘the persistence of service delivery that can justifiably be trusted, when facing changes.’ This is fundamental to digital twins, which need to deliver a high level of service with correct and trustworthy outputs.

The key threats to dependability are faults, failures, and errors.

So, you need to make sure your digital twin provider implements and maintains the means for dependability – fault prevention, tolerance, removal, and forecasting – so you can be sure of consistent, reliable outputs you can trust.

-3. Privacy:

Information privacy is core to any data collection and processing. Where data is personally identifiable, such as location or medical information, its use needs to adhere to all relevant privacy laws, such as Europe’s GDPR and California’s CCPA. But even where data is not personally identifiable, it may need to be kept private to protect intellectual property. Masking, redaction, differential privacy, encryption, and lifecycle management are just a few of the methods that should be available to you in your digital twin deployment to keep data private.

-4. Safety:

Finally, you need to ensure the safety of the digital twin. Safety may sound basic, but it is a key part of protecting your digital twin and the real system it simulates. There need to be physical locks on the doors to the data centre and computer and server rooms, and IT equipment should not be at risk of being damaged by people, other equipment, or through means like overheating. Never underestimate the fundamentals.

_____

_____

Segment-2

Digital twins can help cybersecurity:

Digital twins aren’t just a security liability for companies. Some enterprises are using them to improve their cybersecurity—as an early-warning system of attack, a honey trap, and a testing sandbox. Digital twins can help organizations weed out vulnerabilities in systems by creating virtual clones to use for security testing. They help because they react to cyber vulnerabilities in a way that mirrors the actual system. And they can be used to test expensive physical systems for vulnerabilities before they go into production, such as in avionics: you can’t just walk up to an aircraft and apply some kind of threat, because you would invalidate the whole aircraft certification process. Instead, you attack the digital twin and discover any potential vulnerabilities.

_

A digital twin can also act as a kind of spider’s web or threat detection system—an incursion by an attacker will create ripples that can be felt by cybersecurity teams. One company using digital twins as a kind of highly sensitive sensor layer is GE, which is building something it calls “digital ghosts”. You are using a twin, but you don’t want the attacker to know about it, so it’s a digital ghost. For example, if an adversary attacks the controls of a key piece of critical infrastructure, even if they are able to fake the output of that particular sensor, the digital twin as a whole will recognize that something is wrong, because the entire system will no longer act as predicted or won’t match the information flowing from other sensors. In fact, the more complex the system, the better, because it will have more sensors and thus more observability. Critical infrastructure is a perfect example of where digital twins can be deployed to help with cybersecurity, and digital ghosts can be used to secure not just critical infrastructure but also operational technology in an organization’s data center. The vision of digital ghosts centers on the way the underlying physical assets operate: what the physics of ‘normal’ looks like, and how the controls normally operate those assets. With that knowledge, plus a lot of simulated or historical data, you can build a very good representation of how an asset should be operating. The digital ghost can then detect if something is wrong and tell you precisely which sensor is compromised. Pinpointing the problem alone typically takes operators days or weeks; the digital ghost does it within seconds.
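A minimal sketch of the residual-checking idea behind a digital ghost: compare live sensor readings against the twin’s physics-based predictions and flag any sensor whose residual is out of family. The model, readings, and thresholds below are all hypothetical.

```python
# Residual-based detection of a spoofed sensor against twin predictions.
import numpy as np

def twin_predictions(state):
    """Stand-in for the physics model: expected sensor values given the
    current operating state (hypothetical linear relations)."""
    load = state["load"]
    return np.array([2.0 * load, 0.5 * load + 30.0, 1.1 * load])

state = {"load": 50.0}
expected = twin_predictions(state)            # [100.0, 55.0, 55.0]
measured = np.array([100.4, 54.8, 72.0])      # sensor 2 is being spoofed

residuals = np.abs(measured - expected)
threshold = 3.0 * np.array([0.5, 0.4, 0.6])   # ~3 sigma of normal sensor noise
for i, (r, t) in enumerate(zip(residuals, threshold)):
    if r > t:
        print(f"sensor {i}: residual {r:.1f} exceeds limit: possible compromise")
```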

______

According to future projections, nearly everything in the physical world will have a digital twin, including infrastructure, supply chains, consumer products, and more. Metaverses are becoming major technology drivers by connecting virtual experiences and simulations to create new real-world experiences, and digital twins are key technologies connecting both realms. By simulating risks in an environment that mirrors the real world, companies can better predict where hackers will strike, how an attack might unfold, and how damaging it would be. This information is essential for organizations to anticipate cyber-attacks, manage the risks efficiently and respond with effective methods. Digital twins can also improve existing security approaches. In particular, combined with artificial intelligence, they allow organizations to scale up defenses and situational awareness. Digital twins can prevent cyber-attacks by providing modelling and prediction capability, or through increased visibility of the system behaviour of their physical counterpart. The use of digital twins in cybersecurity potentially empowers security teams to get ahead of sophisticated threat actors and reduce risks to cyber-physical systems in manufacturing, IoT devices and consumer smart devices.

Here are some exciting use cases for digital twins in cybersecurity.

-1. Improve security of the system design

The digital twin can simulate various types of system compromise and can be used to analyze how the system behaves under attack and to estimate potential damage. This facilitates designing security and safety mechanisms to create more robust and fault-tolerant designs. As a result, the virtual representation may help reduce the attack surface by revealing weak spots in the architecture, unnecessary functionality on the devices, or unprotected services. Whether the device you want to secure is a cyber-physical system used within a smart grid, a self-driving car or an IoT blood pressure monitor, digital twins let security professionals simulate a slew of cyberattacks on physical systems to see how they react while under attack. The results of these simulated attacks can be fed back into the systems’ design before these crucial devices ever leave the plant floor. Analyzing how the system reacts to different types of cyberattacks helps to create more robust designs with greater fault tolerance built in. Digital twins also improve the security of a system’s design by reducing its attack surface. Leaving aside attack simulations, thoroughly analyzing the system’s architecture, communication protocols and traffic flows during normal use can flag weak spots that malicious outsiders could feasibly exploit. Unneeded services can then be taken out of the design to reduce the system’s attack surface.

-2. Security testing and modelling cybercrime impact

Security tests in OT environments are a critical activity, because the tests can cause business interruption or even physical damage. In this context, digital twins can be used to understand the attacker’s mindset, potential attack patterns, and their impact. Ethical hackers can perform security tests on the digital twins instead of on the real systems. As a result, digital twins can be used to stress the system and evaluate vulnerabilities and security controls in a realistic environment without affecting production. The ability to attack a live copy of the production environment without putting the business at risk allows security teams to test the system more thoroughly without compromising operations.

-3. Assist in intrusion detection

Digital twins can help to create Intrusion Detection Systems based on system behavior. The virtual representation identifies any behavioral deviation at runtime between the virtual replica and the physical system.

The advantage of this detection approach is that it does not run on constrained devices such as Programmable Logic Controllers. As a result, it is possible to use methods that require more computing resources, such as machine learning and deep learning. Also, executing the intrusion detection algorithm on the digital twin helps preserve the computing resources of the real system and has no adverse impact on its efficiency. This point is important, since most OT environments have real-time constraints. Intrusion detection in OT environments is one of the more exciting ways to use digital twins in cybersecurity. As cyberattacks targeting these environments increase due to growing interconnectivity, industrial control systems (ICS) – including Supervisory Control and Data Acquisition (SCADA) systems and Distributed Control Systems (DCS) – need intrusion detection to precisely monitor for malicious activities or corporate security policy violations.
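Below is a minimal sketch of such twin-side intrusion detection, using scikit-learn’s Isolation Forest as one plausible anomaly detector (the literature uses many others). It trains on normal twin telemetry and flags live readings whose sensor values are mutually inconsistent; the data and features are synthetic.

```python
# Behavior-based intrusion detection running on the twin, not the PLC.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Normal operation: pressure and flow move together around a setpoint.
pressure = rng.normal(5.0, 0.1, 1000)
flow = 10.0 * pressure + rng.normal(0, 0.5, 1000)
X_normal = np.column_stack([pressure, flow])

detector = IsolationForest(contamination=0.01, random_state=0).fit(X_normal)

# Live window: the last sample has flow inconsistent with pressure (spoofed?).
X_live = np.array([[5.0, 50.2], [5.1, 51.0], [5.0, 35.0]])
print(detector.predict(X_live))   # 1 = normal, -1 = anomaly
```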

Note:

Operational technology (OT) keeps critical infrastructure and industrial environments functioning. OT comprises the software and hardware used to manage, secure and control industrial control systems (ICS), devices and processes in your OT environment.

-4. Standardizing and Harmonizing Firmware Variants

Even the simplest of smart machines is composed of numerous software components, each supplied by a different third-party vendor. Each vendor develops its software according to its own needs, using different operating systems, hardware architectures, frameworks and more. With so many different components and technologies thrown into the mix, gaining the security analysis expertise and sourcing the right mix of tools to cover all the bases becomes an expensive, time-consuming challenge. By providing standard, highly detailed representations of the firmware, cyber digital twins facilitate analysis by product security experts and via multiple analysis tools for multiple use cases, such as CVE scanning, zero-day analysis and detecting privacy violations.

-5. Eliminating IP Protection Concerns

Ensuring the integrity of the proprietary source code of the firmware embedded within each software component is of vital importance to the vendors who develop them. However, manufacturers still need access to the firmware code for security, testing, compliance and more. With cyber digital twins, there’s no need for vendors to share the actual firmware that contains their valuable IP. Instead, they can share the cyber digital twin, which includes all the critical information that cybersecurity professionals require to do their analysis. In this manner, vendors can safely share critical information about their products without having to worry about the integrity of their IP. This new reality empowers new levels of transparency throughout the entire supply chain.

-6. Empowering Security Teams

Every software component used in smart machines undergoes rigorous research and assessments for security and regulatory compliance purposes. As mentioned above, the extensive use of so many different technology variants by each supplier complicates matters. This means that the number of researchers needed to perform the required research and assessments grows in proportion to the number of components in use. Cyber digital twins do the heavy lifting for the researchers. Instead of spending precious — and expensive — time analyzing the binary code to get to the info they need, researchers can focus on running their tests and assessments. In this manner, cyber digital twins enable a streamlined research and assessment process and minimize time to discovery and compliance without the need for advanced security research skills.

-7. Improve incident response

A digital twin can assist in decision-making to mitigate security incidents. Using simulations of different scenarios and conditions, the virtual representation can create decision trees to determine the most appropriate response. All the collected data and testing activity in the digital twin can be registered in a database, to automatically learn and create more sophisticated decisions to protect the system efficiently. Besides, digital twins can evaluate system changes as if they were deployed in the real environment. To sum up, security teams can focus resources on the most important tasks, respond efficiently, and measure which configurations, technologies or actions are most effective at reducing the impact of an attack.

-8. Patch Management

Patch management in OT environments is challenging due to the difficulty of understanding the impact of applying a software update or configuration change on the overall system. In addition, testing in isolation is expensive, time-consuming, or simply does not cover system-wide impacts. Digital twins can help overcome these challenges: the impact of applying a patch can be tested without affecting the production environment, and the security tests can be automated as part of regression testing for new functionality in the development lifecycle.

-9. Training

Digital twins are a training tool for operators, engineers and cybersecurity practitioners to recognize compromise of the real system and practice response procedures accordingly. During training, operators can practice, for example, how to diagnose the system to identify the extent of a compromise, assess the availability and integrity of system control, decide when to declare an emergency, initiate a safe shutdown, or take appropriate response actions to secure the OT process according to the level of compromise. Besides, a digital twin cyber-range environment helps promote practical engagement and war games.

_____

_____

Section-15

Cognitive digital twin (CDT):

Some authors propose the notion of a Cognitive DT, inspired by advances in cognitive science. In a recent study, Al Faruque et al. (2021) proposed the CDT for manufacturing systems based on advances in cognitive science, artificial intelligence technologies and machine learning techniques. Following the fundamental aspects of cognition, they define a CDT as a digital twin with additional cognitive capabilities, including: perception (forming useful representations of data related to the physical twin and its physical environment); attention (focusing selectively on a task, a goal or certain sensory information, either by intent or driven by environmental signals and circumstances); memory (encoding, storing, maintaining and retrieving information); reasoning (drawing conclusions consistent with a starting point); problem-solving (finding a solution for a given problem or achieving a given goal); and learning (transforming experience of the physical twin into reusable knowledge for a new experience). Cognitive digital twins learn by themselves, foresee the future, and act accordingly. Mortlock et al. propose graph learning as the pathway towards creating cognitive functionalities in DTs: the twin’s data is modelled as a graph and processed with a Graph Neural Network (GNN).
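To ground the GNN idea, here is a minimal sketch using PyTorch Geometric: components of a machine become graph nodes with sensor-derived features, physical connections become edges, and a small graph convolutional network learns a per-component health label. The graph, features, and labels are all hypothetical.

```python
# Graph learning over a digital twin's component graph (illustrative only).
import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# 4 components, each with 3 sensor-derived features.
x = torch.randn(4, 3)
# Undirected connectivity: 0-1, 1-2, 2-3 (each edge listed in both directions).
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]], dtype=torch.long)
y = torch.tensor([0, 0, 1, 0])  # per-component health label (1 = degraded)
graph = Data(x=x, edge_index=edge_index, y=y)

class TwinGNN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(3, 16)
        self.conv2 = GCNConv(16, 2)

    def forward(self, data):
        h = torch.relu(self.conv1(data.x, data.edge_index))
        return self.conv2(h, data.edge_index)

model = TwinGNN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(100):  # tiny training loop on the toy graph
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(graph), graph.y)
    loss.backward()
    optimizer.step()
print(model(graph).argmax(dim=1))  # predicted health state per component
```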

_

The CDT concept is built on the basis of the DT. Therefore, a CDT should include all the essential characteristics of a DT. For example, both DT and CDT are digital representations of a physical system, meaning that they both contain the three basic elements: (1) physical entities in the real space; (2) virtual entities in the virtual space; and (3) the connections between physical and virtual entities. Moreover, the virtual entities of both DT and CDT can communicate with their corresponding physical systems and update dynamically. From this point of view, the CDT concept can be treated as a subset of DT, as depicted in the figure below. This means that all CDTs are a certain kind of DT with extended characteristics, such as cognitive capabilities, cross-lifecycle-phase coverage and multiple system levels.

The figure above shows the relation between the digital twin and the cognitive digital twin.

The main differences between them are their structural complexity and cognitive capability.

First, a CDT is usually more complex than a DT in terms of architecture and the number of lifecycle phases involved. Most DTs correspond to a single system (or product, subsystem, component etc.) and focus on one lifecycle phase. In contrast, a CDT should consist of multiple digital models corresponding to different subsystems and components of a complex system, and should cover multiple lifecycle phases of the system. In many cases, a CDT can be constructed by integrating multiple related DTs using ontology definition, semantic modelling and lifecycle management technologies. These DTs may correspond to different subsystems and components mapped to different lifecycle phases, and each of them evolves along with the system lifecycle.

Second, cognitive capabilities are essential for CDTs, whereas DTs do not necessarily possess such capabilities. Most existing DTs are applied to services of visibility, analytics and predictability, such as condition monitoring, functional simulation, dynamic scheduling, anomaly detection, predictive maintenance and so on. These services are usually enabled by data-based and model-based algorithms using the data collected from the physical entities. Cognitive capability is required to reach higher levels of automation and intelligence, for example, to sense complex and unpredicted behaviours and generate dynamic strategies autonomously. For this target, data-based and model-based algorithms alone are incapable of integrating the complex data and models from different systems and lifecycle phases with heterogeneous specifications and standards. Addressing this challenge requires further technologies such as semantic modelling, systems engineering and product lifecycle management. For instance, a unified ontology can represent physical entities, virtual entities and the topology between them, which is the basis for realising the cognitive capability. Top-level ontologies can be used to integrate different ontologies synchronised with virtual entities across the lifecycle in order to support reasoning for cognitive decision-making.

It is important to emphasise that the CDT concept is not proposed to replace the DT; instead, it is an extended and federated version of the current DT paradigm. A trade-off should be made according to the requirements of the application scenario and the stakeholders. CDTs aim at complex systems that consist of multiple interlinked subsystems and components and require interactions between different lifecycle phases. They can provide more advanced cognitive capabilities, but the implementation is more challenging in terms of technology readiness, risk level, and economic and time cost. In comparison, the enabling technologies of DT are more mature, and many successful cases are available as references.

_

The main difference between a CDT and a DT is that the cognitive entities of a CDT include multiple virtual models across the entire system lifecycle. Each of the models has its corresponding ontology description, as illustrated in the figure below. The ontology of virtual models describes the features of cross-domain models and identifies the interrelationships between different virtual models. Suppose the physical entity is an engine; then the virtual entities may include CAD models, performance models, information models, FEM models, CFD models and so on, used in the different phases of the engine’s lifecycle. An ontology is defined as a representational artifact, comprising a taxonomy as proper part, whose representations are intended to designate some combination of universals, defined classes, and certain relations between them. It is developed as the core that formalizes the meaning of the engine models and the interrelationships between all the models.

The figure above shows a comparison between digital twins and cognitive digital twins.

______

Key enabling technologies for CDT:

Advanced technologies, such as semantic technologies, the IIoT and artificial intelligence, are the pillars of modern industrial systems. Since the CDT concept is built upon the DT concept, all the enabling technologies of DT are essential for CDT as well, and some are particularly important for the CDT paradigm. Here are some of the key enabling technologies for CDT development.

-1. Semantic technologies:

Semantic technologies are considered the core of the CDT concept due to their advantages for improving data interoperability and constructing cognitive capabilities. CDT models usually involve heterogeneous data, information and knowledge, making alignment among different DTs and stakeholders difficult. Semantic technologies such as ontologies and knowledge graphs provide promising solutions to this challenge.

  • Ontology engineering:

Ontology is considered a promising core for cognitive systems owing to its capability of formalising the ontological features of physical entities in a way that is compatible with the perspective of human common sense (El Kadiri and Kiritsis 2015). Ontology engineering refers to a set of activities that concern the ontology development process and the ontology lifecycle, the methods and methodologies for building ontologies, and the tool suites and languages that support them (Gruber 1993; El Kadiri and Kiritsis 2015).

Nowadays there exist various mature methodologies, tools and languages to support ontology development. However, numerous ontologies have been created with different languages and tools for different application scenarios, and it becomes a challenging task to integrate them in a unified framework that assures their interoperability. This is a common problem when developing CDT models for complex systems. A possible solution is a hierarchical methodology that unifies the application ontologies under a common top-level ontology containing a set of general vocabularies used across all domains. These vocabularies are properly structured and formally defined according to a given methodology. Such top-level ontologies provide a common foundation for developing lower-level ontologies, such as domain-specific ontologies and more detailed application ontologies, and their adoption assures semantic interoperability among those lower-level ontologies. Currently, many top-level ontologies have been developed and widely applied by different communities, such as the Basic Formal Ontology (BFO) (Arp and Smith 2008) and the Descriptive Ontology for Linguistic and Cognitive Engineering (DOLCE) (Masolo et al. 2003).

Some recent efforts have been devoted to unifying and standardising existing domain ontologies on top of certain top-level ontologies. For example, the Industrial Ontologies Foundry (IOF) (IOF 2021) is an ongoing initiative that aims to co-create a set of open ontologies to support manufacturing and industrial needs and to promote data interoperability. IOF provides a multi-layer architecture to guide ontology development, consisting of four layers: a top-level foundation ontology, domain-level (domain-independent and domain-specific) reference ontologies, subdomain ontologies and application ontologies. It uses BFO as its foundation, with experts from different industrial domains working jointly to create open, principles-based ontologies.

A top-level ontology enables semantic interoperability among the lower-level ontologies in the same family. However, several top-level ontologies have been developed, and each has its own followers. It is necessary to develop unified specifications for these top-level ontologies through cross-domain collaborations involving the most relevant stakeholders. A set of standards and specifications is necessary to integrate existing and widely used top-level ontologies. Moreover, ontology alignment approaches are needed to maximise the reuse of existing domain ontologies developed under different top-level ontologies.

  • Knowledge graph:

The knowledge graph is considered the most promising enabler for realising the CDT vision. It uses a graph model composed of nodes and edges to describe the topology of both structured and unstructured data, enabling a formal, semantic and structured representation of knowledge (Nguyen, Vu, and Jung 2020). The nodes in a knowledge graph represent entities or raw values encoded as literals, and the connecting edges describe the semantic relations between nodes. A knowledge graph makes it possible to construct semantic connections among heterogeneous data sources and thus to capture the underlying knowledge in them. This is especially important for the CDT vision, where a large number of data sources and models are involved.
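A minimal sketch of such a knowledge graph fragment, built with the rdflib library: nodes for an engine and two of its virtual models, edges for the semantic relations between them, and a SPARQL query over the result. The namespace, classes, and relations are hypothetical.

```python
# A tiny knowledge graph for an engine digital twin (illustrative only).
from rdflib import Graph, Literal, Namespace, RDF

TWIN = Namespace("http://example.org/twin#")
g = Graph()
g.bind("twin", TWIN)

# Nodes: the physical engine and two of its virtual models.
g.add((TWIN.Engine42, RDF.type, TWIN.PhysicalEntity))
g.add((TWIN.CADModel, RDF.type, TWIN.VirtualModel))
g.add((TWIN.FEMModel, RDF.type, TWIN.VirtualModel))

# Edges: semantic relations between the entity and its models.
g.add((TWIN.CADModel, TWIN.represents, TWIN.Engine42))
g.add((TWIN.FEMModel, TWIN.represents, TWIN.Engine42))
g.add((TWIN.FEMModel, TWIN.derivedFrom, TWIN.CADModel))
g.add((TWIN.FEMModel, TWIN.usedInPhase, Literal("design")))

# Query: which virtual models represent the engine, and in which phase?
q = """SELECT ?model ?phase WHERE {
         ?model twin:represents twin:Engine42 .
         OPTIONAL { ?model twin:usedInPhase ?phase }
       }"""
for row in g.query(q, initNs={"twin": TWIN}):
    print(row.model, row.phase)
```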

Since Google released the knowledge graph behind its search engine in 2012 (Singhal 2012), knowledge graphs and related technologies have attracted significant attention from both academic researchers and industrial practitioners (Ehrlinger and Wöß 2016; Nguyen, Vu, and Jung 2020). Most of these studies come from the information technology domain, such as recommendation systems, question-answering systems, cybersecurity systems and semantic search systems (Nguyen, Vu, and Jung 2020). Applications of knowledge graphs in the production sector are comparatively rare. In some recent studies (Banerjee et al. 2017; Boschert, Heinrich, and Rosen 2018; Gómez-Berbís and de Amescua-Seco 2019; Lu, Zheng et al. 2020), the knowledge graph has been used to enhance digital twins, which is one of the main driving forces behind the CDT concept, as introduced in the related work. In many cases, a knowledge graph is applied together with ontologies to create a knowledge base or knowledge management system. A typical approach is to use the knowledge graph to acquire information from raw data and to integrate the information into an ontology, where a reasoner can be executed to derive new knowledge (Ehrlinger and Wöß 2016; Nguyen, Vu, and Jung 2020).

Within the CDT reference architecture, knowledge graph and ontologies are the backbone for the functional layers of data ingestion and processing, model management, service management and twin management. They are the core to integrate different models from different systems, subsystems and components across different lifecycle phases.

_

-2. Model-based systems engineering

Systems engineering provides a set of principles and concepts, as well as scientific, technological and management methods, to support the realisation, use and retirement of engineered systems (Sillitto et al. 2019). It is an efficient approach to managing highly complex industrial systems, which are the target of CDTs. More specifically, model-based systems engineering (MBSE) enables the formalism of system architectures to support tasks such as model-based requirement elicitation, specification, development and testing (Lu, Ma et al. 2020). It emerged around 1993, both in industry (Oliver 1993) and in academia (Wymore 2018), to support complex system development in fields such as aerospace. MBSE is a key enabling technology for the formalism of system hierarchy levels in the CDT reference architecture: it can be used to formalise the hierarchy of a complex industrial system and specify the relationships among digital models from different hierarchy levels.

Currently, MBSE is widely used to support digital twin development and integration, and cognition construction for digital twins. For example, Liu et al. (2021) developed a shop floor digital twin based on MBSE. Bachelor et al. (2020) proposed an MBSE approach for digital twin development and integration in the aerospace industry using SysML and linked data. Moreover, Lu, Zheng et al. (2020) proposed a CDT structure based on a systems engineering methodology and demonstrated how MBSE can facilitate the cognition construction during CDT development.

_

-3. Product lifecycle management

As presented in the reference architecture, a CDT should support the integration of digital models across all phases of the entire lifecycle, and Product Lifecycle Management (PLM) plays a critical role in that lifecycle management. PLM is a strategic approach for efficiently managing product-related information during the whole product lifecycle, including beginning-of-life (BOL), middle-of-life (MOL) and end-of-life (EOL) (Kiritsis 2011). Its main objectives include the universal and secure management of product information, maintaining the integrity of that information throughout the product lifecycle, and managing the business processes for creating, disseminating and using the information.

Nowadays, PLM is widely used in different industrial sectors, especially in manufacturing, as an important strategy for maintaining the sustainable and competitive advantages of enterprises (Liu and Liang 2015; Zhang et al. 2017). PLM has also been impacted by many emerging technologies, such as the IoT, data mining and semantic technologies. These technologies make it possible to capture hidden knowledge and patterns from massive lifecycle data and thus to improve various data-driven services (Denkena, Schmidt, and Krüger 2014), which aligns with the CDT vision. These applications and advancements provide a substantial basis for CDT development.

_

-4. Industrial data management technologies

The IIoT is the subset of the IoT applied in the industrial domain: large numbers of smart things are deployed in industrial systems to enable real-time sensing, collection and processing. IIoT systems usually require higher levels of security and reliable communications to assure high production performance (Liao, Loures, and Deschamps 2018; Khan et al. 2020). The rapid development of the IIoT and related technologies has been one of the main driving forces of Industry 4.0 and smart manufacturing, and it is also the foundation of CDT. The big data generated by IIoT devices provides the input for the data ingestion and processing layer of the CDT reference architecture, on which data-driven services are built.

During the past decade, a large amount of effort from both academia and industry has been spent on IIoT technologies and their applications, supported by substantial investment across the globe. The achievements of these efforts provide a starting point for CDT development; it is a matter of properly utilising and reusing these technologies when adopting them for the CDT vision. The wide deployment of IIoT devices in modern industrial systems generates large volumes of industrial data in different formats and structures. This big industrial data is the basis for all data-driven services of a CDT. On the other hand, the collection, storage, sharing and processing of these data remain challenging tasks that require multiple advanced technologies.

  • Cloud/Fog/Edge computing:

Cloud computing has been one of the most important tools for coping with the big data challenge in many sectors. It has become an essential digital infrastructure for many industrial enterprises, around which a mature market has formed: companies such as Amazon, Google, IBM and Microsoft provide flexible computing services and solutions on their own cloud platforms (Qi and Tao 2019). However, communication between the cloud server and local data sources requires high bandwidth and introduces heavy delays, which is not acceptable in many applications, especially for the CDT vision, where real-time processing is critical.

To tackle this problem, fog computing and edge computing have been proposed to bring cloud computing closer to data generators and consumers (Mohan and Kangasharju 2016). Fog and edge computing are considered extensions of cloud computing (Roman, Lopez, and Mambo 2018). Fog computing allows network devices to run cloud applications on their native architecture (Solutions, Cisco Fog Computing 2015), while edge computing uses edge devices such as tablets, smartphones, nano data centres and single-board computers as a cloud to perform basic computing tasks (Chandra, Weissman, and Heintz 2013; Ordieres-Meré, Villalba-Díez, and Zheng 2019).

Fog and edge computing enable transferring part of the computing, storage and networking capabilities of the cloud to the local network, making low latency, real-time response and reduced network traffic possible (Qi and Tao 2019; Wang et al. 2021). These characteristics make them essential components of intelligent industrial systems. They are key enabling tools for the physical entity layer and the data ingestion and processing layer of the proposed CDT reference architecture, bridging the physical and digital worlds.

  • Natural language processing:

Natural language processing (NLP) is a complementary tool to semantic technologies and machine learning during CDT development. NLP aims to gather knowledge in a way similar to how human beings understand and use language, based on a set of appropriate tools and techniques (Chowdhury 2003). A CDT system involves both structured and unstructured data, and a large part of the raw data is stored as natural language, such as technical documents, logs and orders. This information must be transformed into computer-understandable data so that it can be integrated with other data sources to create a comprehensive digital model. NLP has been a popular research topic for several decades as a subfield of linguistics, computer science and artificial intelligence, and numerous algorithms and techniques are available for tasks like text and speech processing, optical character recognition, morphological analysis and syntactic analysis. As with machine learning, it is a matter of selecting suitable solutions based on the specific requirements of the CDT application scenario.
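As a small, hedged example of NLP feeding a CDT’s knowledge layer, the sketch below uses spaCy to pull entities and a maintenance action out of a free-text log line. It assumes the small English model is installed (python -m spacy download en_core_web_sm); the log text and token pattern are hypothetical, and real logs would need domain-tuned patterns or models.

```python
# Extracting structured facts from a free-text maintenance log with spaCy.
import spacy
from spacy.matcher import Matcher

nlp = spacy.load("en_core_web_sm")
log = "Technician replaced the seal on pump P-101 after a vibration alarm on 12 March."

doc = nlp(log)

# Named entities found by the general-purpose model (dates, etc.).
print([(ent.text, ent.label_) for ent in doc.ents])

# Domain pattern: an action verb, an optional determiner, then a component noun.
matcher = Matcher(nlp.vocab)
matcher.add("MAINTENANCE_ACTION", [[{"LEMMA": {"IN": ["replace", "repair"]}},
                                    {"POS": "DET", "OP": "?"},
                                    {"POS": "NOUN"}]])
for _, start, end in matcher(doc):
    print("action:", doc[start:end].text)
```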

  • Distributed ledger technology:

A CDT system requires the integration of data from different stakeholders. The main concerns during data sharing are data security, privacy, and intellectual property (IP) protection. To address these concerns, reliable cybersecurity infrastructure and data encryption mechanisms are necessary. Boschert, Heinrich, and Rosen (2018) proposed two ways to control the degree of transparency of CDTs: encapsulated models to guarantee IP protection, and open models for realising integrated development processes. The recent advancement of Distributed Ledger Technologies (DLT), including blockchain, provides a decentralised solution for protecting data security and privacy during data sharing. Compared with traditional data sharing approaches, DLT removes the dominant administrator and central database, enabling secure data sharing in a trustless environment. It has attracted increasing attention from both researchers and practitioners. Various DLT systems and platforms have been developed, including private, permissioned, and public ledgers. As an example, Sun et al. (2020) developed an IIoT data handling architecture using a public ledger, the IOTA Tangle (Popov 2018; Chen et al. 2020), to ensure data privacy and protect data ownership on a distributed platform. Considering these advantages, DLT is a promising technology to accelerate the CDT vision and should be taken into consideration during future CDT development.
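
The tamper-evidence idea at the heart of DLT can be illustrated with a toy hash chain. Note that this single-node sketch is not the IOTA Tangle or any real distributed ledger: it has no consensus, networking, or digital signatures, and exists only to show why altering shared data is detectable:

```python
# Toy hash chain illustrating the tamper-evidence idea behind DLT.
# Not a real distributed ledger: no consensus, networking, or signing.
import hashlib
import json

def block_hash(block: dict) -> str:
    data = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(data).hexdigest()

def append_block(chain: list, payload: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "payload": payload}
    block["hash"] = block_hash({"prev": prev, "payload": payload})
    chain.append(block)

def verify(chain: list) -> bool:
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block["hash"] != block_hash({"prev": block["prev"],
                                        "payload": block["payload"]}):
            return False
        prev = block["hash"]
    return True

chain: list = []
append_block(chain, {"sensor": "pump_3", "temp_c": 71.2})
append_block(chain, {"sensor": "pump_3", "temp_c": 73.8})
print(verify(chain))                # True: chain is intact
chain[0]["payload"]["temp_c"] = 10  # tamper with shared data
print(verify(chain))                # False: tampering is detected
```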

______

______

Case Study: A Cognitive Digital Twin in a Steel Pipe Manufacturing Factory:  

A cognitive digital twin for a spirally welded steel pipe manufacturing machine can detect anomalies, provide predictive maintenance, or estimate the remaining useful life of the SWP machine components.

Problem:

Steel pipe manufacturing runs 24 hours a day, seven days a week. A single unexpected defect may bring the whole production line to an unwanted stop, and these unplanned stops result in high maintenance and production costs. They can also delay the fulfilment of customers' orders. Without a system that continuously monitors manufacturing in real time and predicts anomalies, there is no way to prevent unpredictable downtime. As a result, production efficiency stays low and maintaining the system is highly costly. Energy consumption and the associated carbon footprint are also high, and organizations must reduce them for competitiveness and revenue reasons.

Solution:

Around 60 new sensors were installed on the machine components, bringing the total to 125 together with the preinstalled ones. The organization designed and developed a sensor network using an IIoT platform through which data is acquired, transmitted, processed, and stored. A platform-independent solution detects anomalies in real time, and the accumulated big data is reported, analyzed, and visualized without delay. A real-time online monitoring system enables users at different levels (maintenance operators and managers) to make better, informed decisions. Machine learning and deep learning models were trained on the collected data, and the trained models are applied to streaming data for anomaly detection and predictive maintenance. 3D models of the machine components are displayed to users as part of the cognitive digital twin's visualization functionality.
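
The factory's actual models are trained ML/DL networks, but the streaming detection pattern itself can be illustrated with a dependency-free rolling z-score sketch. The sensor semantics, units, and threshold below are assumptions, not details from the case study:

```python
# Minimal streaming anomaly detector using a rolling z-score.
# The real CDT uses trained ML/DL models; this sketch only shows
# the detect-on-stream pattern with invented sensor values.
from collections import deque
import math
import random

class RollingZScoreDetector:
    def __init__(self, window: int = 100, threshold: float = 3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def update(self, x: float) -> bool:
        """Return True if x is anomalous relative to recent history."""
        is_anomaly = False
        if len(self.values) >= 10:  # need some history first
            mean = sum(self.values) / len(self.values)
            var = sum((v - mean) ** 2 for v in self.values) / len(self.values)
            std = math.sqrt(var) or 1e-9
            is_anomaly = abs(x - mean) / std > self.threshold
        self.values.append(x)
        return is_anomaly

detector = RollingZScoreDetector()
for t in range(500):
    reading = random.gauss(60.0, 2.0)   # e.g., weld current, assumed units
    if t == 400:
        reading = 95.0                  # injected fault for the demo
    if detector.update(reading):
        print(f"t={t}: anomaly, reading={reading:.1f}")
```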

Benefits:

Below is a list of the benefits realised at Turkey's second largest steel pipe manufacturing company:

-10% decrease in energy consumption in spirally welded steel pipe manufacturing

-Reduced carbon footprint for a more sustainable world

-Detection of machine failures before they occur

-Enabled predictive maintenance

-Decrease in SWP maintenance costs

-Increased Overall Equipment Effectiveness

-Increased production efficiency due to fewer unpredictable downtimes

-Decreased machine downtime due to unplanned stops

-High accuracy of machine learning models in estimating Remaining Useful Life

______

______

The potential danger of constructing cognitive digital twin entities:

Although the possibility of figuring out how to construct a conscious digital twin model of any physical object is compelling for many reasons, there are a few potential risks involved. Why is that? If we are trying to develop a technology that can perform cognitive tasks, we must realize that we are trying to bring life into technology, which makes everything more complex than it seems. A fully thinking digital twin would act like an AI that can make its own calculated decisions, process thoughts, and execute actions just like a real, functioning organism. This may involve the conscious entity developing itself beyond the limitations imposed by humans.

_____

_____

Section-16

Digital twin of human:    

With 3D models proliferating, digital twin technology is progressing rapidly towards the metaverse, which promises to create a virtual world where people can work, socialize and shop. Any digital world will need digital people, and just as a factory or power plant can have a virtual twin, so too can humans – and we’re talking much more than just an avatar. A digital twin of a healthcare patient could help that person to track a variety of healthcare indicators. Digital human twins could also help simulate the impact of new strategies in urban transportation or social services, to determine the most effective approaches.

_

In the LIFT Radar 2021 rating of popular digital technologies in healthcare, the human digital twin took second place, and the IEEE Computer Society's 2020 rating ranked it third. In 2020, the analyst firm Gartner declared that digital human models will be a highly valued technology from a social, healthcare, and business point of view within the next 10 years.

So far, virtual twin technology is only in its infancy. Some of its approaches are used in the creation of new drugs and certain types of personalized therapies. The same approaches are already widely applied in other spheres such as construction, manufacturing, and the automobile and aerospace industries. Engineers use them when designing new systems, managing operation, and calculating possible equipment wear in real time.

It is impossible to create a model of a human patient without constantly updating genomics, biomics, proteomics, and metabolomics data, as well as physical markers and demographic and lifestyle information, over time. However, as medics gain more and more devices for data collection, enthusiasm for the digital twin grows accordingly. Ideally, all this data should be taken in real time from the real patient and delivered to his or her virtual twin. This way, medics would have the most up-to-date status of the patient's health and be able to find the best treatment accordingly. In 2019, the Chinese scientists Liu, Zhang, and Zhou, together with their colleagues, formulated the conditions for the creation of a virtual human twin [vide supra].

_

Technology analyst Rob Enderle believes that we will have the first versions of thinking human digital twins "before the end of the decade". "The emergence of these will need a huge amount of thought and ethical consideration, because a thinking replica of us could be incredibly useful to employers," he told the BBC. In Meta's virtual reality platform, you may be able to give your avatar a face similar to your own, but you can't give it any legs because the technology is still evolving. Prof Sandra Wachter, a senior AI research fellow at Oxford University, told the BBC: "We have a long way to go until we can model a person's life from beginning to end, assuming that is ever possible."

_

When replicating a human or a patient, a digital twin is constructed using vital-sign monitoring in combination with anatomical and physiological data. In a world of ubiquitous wearable devices and biomedical sensors, these data can come from multiple sources. For example, a smart watch can collect real-time information about blood pressure, body temperature, pulse rate, sleep patterns, and overall physical activity levels. Similarly, when the patient visits a clinic or a hospital, the virtual patient model can be updated with data from the laboratory tests and diagnostic imaging studies conducted during the visit. Furthermore, genetic and behavioral data, as well as an individual's social determinants, could also be coded into the digital twin. When all these data are combined into a single virtual representation of a patient, a more complete picture of the patient's medical history is available to support decision making.
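
A minimal sketch of this data fusion might merge wearable and clinical updates into one twin record. The field names here are illustrative assumptions; a production system would follow a healthcare data standard such as HL7 FHIR rather than this ad-hoc structure:

```python
# Sketch of a patient twin updated from multiple sources. Field names
# are illustrative; a real system would follow a healthcare standard
# such as HL7 FHIR rather than this ad-hoc record.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PatientTwin:
    patient_id: str
    vitals: dict = field(default_factory=dict)   # wearable stream
    labs: dict = field(default_factory=dict)     # clinic visits
    history: list = field(default_factory=list)  # audit trail

    def update(self, source: str, data: dict) -> None:
        target = self.vitals if source == "wearable" else self.labs
        target.update(data)
        stamp = datetime.now(timezone.utc).isoformat()
        self.history.append((stamp, source, data))

twin = PatientTwin("patient-001")
twin.update("wearable", {"heart_rate": 72, "sleep_hours": 6.5})
twin.update("lab", {"hba1c_percent": 5.9})
print(twin.vitals, twin.labs)
print(len(twin.history), "updates recorded")
```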

_

Figure above shows a digital twin of a human.

For humans, we're not there yet. A human digital twin would incorporate a vast quantity of data about a person's preferences, biases, and behaviours, and would need information about the user's immediate physical and social environment to make predictions. These requirements mean that achieving a true digital twin remains a remote possibility for the near future.

_

A human digital twin makes it possible to expand the range of human activities from the real world into cyberspace. Interaction between oneself in the real world and all digital twins in cyberspace is done through one's own digital twin. Moreover, feeding the results of that interaction back to oneself in the real world makes it possible to use the experience gained from activities in cyberspace in the real world. The aim of human Digital Twin Computing (DTC) is to digitally represent not only a person's outer self, for example, physical and physiological characteristics, but also the person's inner self, for example, personality and thoughts. A human DTC model that reproduces the individuality and characteristics of humans (for example, a personality/thinking model of behavioral tendencies, personality, and values, or an ability model of perception, knowledge, language ability, physical ability, etc.) defines the behavior of a digital twin. By reacting to other people's actions in cyberspace as if they were our real-world selves, and by behaving autonomously in a virtual society, human digital twins can engage with others on our behalf. Human DTC is defined as the whole system that develops and enables human digital twins.

_

A human digital twin is composed not only of data showing your state and behavior but also of a model that expresses your individuality and emotions, such as your tendencies in judgment and behavior. As a result, it enables you to interact with others in a virtual society and carry out autonomous activities there as if it were yourself. This interaction has three main advantages. The first is that it can be used for various tasks that are beyond our physical constraints: in cyberspace, a huge amount of communication processing can be performed at high speed, beyond the real-time processing performed during communication between real people. The second is that having a model that expresses emotions makes it possible to transmit various emotions as they are; in this virtual society you can convey nuances, creating a world in which your thoughts are not misrepresented. The third is being able to interact with people on the basis of social aspects and diversity, which will make it possible to (i) precisely analyze those interactions while taking into account the individuality of humans and (ii) predict the future.

_

There are many potential applications for these virtual replicas of humans. For example, a digital twin of a patient together with AI models can be used in precision medicine to make proactive decisions about the right treatment options for the specific patient. A virtual human model could also be used in a simulation for testing novel medical therapies and drugs, as discussed by the FDA, that would otherwise be too risky or time-consuming if conducted on a real patient. For instance, a selection of chemotherapy drugs could be tested against the genetics and physiological processes of the patient in order to identify the best treatment response. Virtual models of individual organs can also be used in developing and testing new medical devices, such as heart models for designing pacemakers. These types of studies are commonly known as in silico medicine, which can be used to support or even potentially replace clinical trials in the future. From the perspective of a patient, a digital twin with vital sign monitoring allows proactive management of chronic diseases, fitness levels and overall health status. With all these data combined, people can make more informed decisions for personal health and thus achieve a healthier lifestyle.

______

Challenges Facing Human Digital Twin Computing and Its Future Prospects, a 2023 study:

Human Digital Twin Computing (DTC) aims to digitally express not only a person's outer self, such as physical characteristics, but also that person's inner self, such as personality and thoughts. The authors believe that digitizing information that includes the inner self can create unprecedented value. Collective consensus building, creating empathy and understanding of others, and future prediction and growth support for individuals and societies are characteristic use cases of human DTC. Research on human DTC has just begun, but the base technology, for example, the recognition and generation of speech, language, and images, has greatly evolved through artificial intelligence (AI). Building on these technologies, the authors want to accelerate research and development on human DTC by focusing on the individuality of people and delving further into people's inner self. Furthermore, from the perspectives of individuality and the inner self as well as permeation into society, it is necessary to expand human DTC to fields other than AI and engineering, namely biology and medicine (including brain science) as well as ethics, philosophy, and the other humanities (such as behavioral economics).

_____

_____

Section-17

Future of digital twins:   

Having considered the forces that are steering the evolution of the Digital Twin concept, we can now ask ourselves what kind of future can be expected for Digital Twins.

Capabilities of Digital Twins:

In the near future we can expect Digital Twins to proactively search for data, harvest data, and request that sensors capture certain types of data with customized sensitivity. In addition, as they become smarter, we can expect Digital Twins to develop their own model of the world, in other words, to become increasingly aware of their environment. As another major step forward, we can expect them to interact with other Digital Twins and with their Physical Twin at a semantic level, removing the need for predefined syntax (standards) to enable communication. Furthermore, we can expect them to become capable of playing the role of a proxy in cyberspace and, through actuators, in the physical space. Additionally, they will be able to replicate themselves in several instances as the need arises, essentially creating instances that can act in parallel.

Finally, we can expect Digital Twins to learn from their environment and experiences and be able to self-assess the quality of the lessons learned (to determine which ones to retain). Figure below presents the expected evolution of Digital Twin capabilities.

_

Digital Twins Communication Fabric:  

Digital Twins can be seen as data banks and, through mutual interactions, can create federated data spaces, such as the Gaia-X model being developed in Europe with the participation of all major world stakeholders. Additionally, the local embedded intelligence in Digital Twins will transform these data spaces into semantic hubs, and eventually 6G will leverage these hubs as communication nodes for the future 6G network fabrics.

6G is still on the drawing board, but the path towards creating mesh networks at the edges through the interaction/cooperation of autonomous systems is already visible. Digital Twins can be the agents having both the semantic awareness of what kind of connectivity is needed at a specific time, and the ability to coordinate with other DTs, which is required for the dynamic construction of a local mesh network.

_

Aggregation of Digital Twins – Meta Digital Twins: 

We are already seeing examples of aggregated Digital Twins. For example, the recent San Giorgio Bridge in Genoa has a Meta Digital Twin, in other words, a Digital Twin made up of hundreds of other Digital Twins that represent various aspects of the bridge. This Meta Digital Twin interacts with the Digital Twin of the harbour urban highway to manage traffic flow across the city of Genoa.

Buildings are other examples of DT clusters (ARUP is using this technology in its new buildings). Cities such as Singapore are being managed through a Digital Twin resulting from the aggregation of hundreds of DTs.

Even countries and continents may, in the future, have their own Digital Twin. Europe is funding a big initiative, Destination Earth, with the goal of creating a Meta Digital Twin to represent various aspects of European infrastructures as well as its social fabric. The Spatial Web, a 3D representation of cyberspace, will likely make use of meta DTs.

_____

Five trends of Digital Twins are:

-1. The digital twin can play a significant role in the evolution from 5G communication services to 6G networks by allowing users to explore and monitor the real world without any spatial constraint

-2. Industries have started adopting digital twin technology to enhance asset performance and ROI with the help of remote operations and production assurance

-3. Digital twin is required in the pharmaceutical industry to reduce the number of physical, real-world tests needed to discover new drugs for the welfare of society

-4. Digital twin technology will provide immense scope to 3D printing, metal printing and mapping in multiple industries across the world through its fast model-building capacity

-5. IoT and IIoT will drive the digital twin adoption rate in Industry 4.0, enhancing interconnected environments to develop the best management solutions

______

Twins become Triplets:

A digital twin is the digital representation of a physical object in a computer; a digital triplet extends the twin with an additional layer of information drawn from inspection of the real asset. In recent years, we have taken a three-tiered approach to leveraging inspection data in order to drive cross-generational improvements in product design and manufacturing. With the digital triplet, we go one step further and use inspections as an ongoing input to continuously bring the digital twin closer to the real asset. Digital twins, in particular, allow the digital simulation of a part to be combined with its "performance" data to predict its process and performance under various conditions throughout its design and life cycle. For example, an aircraft can be fully simulated on a computer to understand how a bird strike would affect its ability to land safely; combined with inspection data, this provides a much more accurate prediction of the engine's condition and, thus, its safety. Industrial inspections not only provide a new layer of information but lead to a new paradigm by connecting the digital twin with the real world. In addition to operational parameters and monitoring, inspection can provide deep insights into the actual condition of the monitored asset, forming the digital triplet. Adding this layer of information to your digital twin to create a digital triplet, while attractive and enabling, requires some changes in operations, systems and, most importantly, culture.

_

Some researchers see 'Digital Triplet' differently. In addition to the physical and cyber worlds, the Digital Triplet also includes the 'intelligent activity world', where humans solve problems by using the DT. Unlike the DT, where systems and processes are automated, the Digital Triplet also accounts for human interaction with the system and process, thus creating value from data using human intelligence and knowledge. The aim of the 'Digital Triplet' is to support engineering activities throughout a product's life cycle, including design, manufacturing, use, maintenance, remanufacturing, and circulation of resources, by integrating all three worlds, just like the DT.

_____

Digital Twins lead the way in future of semiconductor industry:

The digital twin, defined simply as a real-time simulation of the real world, has recently started to gain momentum in the semiconductor world. For example, AR is integral to Intel's worldwide manufacturing process, from maintenance to repair, enabling staff to communicate remotely, troubleshoot internationally, and prepare interactive training materials. Semiconductor manufacturing is a long and complex process. Setting up a wafer fabrication plant requires precision, clean environments, expensive equipment, and time. To illustrate, it typically takes three months for leading semiconductor manufacturer GlobalFoundries to etch and fabricate silicon wafers into multilayer semiconductors. It also becomes difficult to increase production during up cycles, when there is a chip shortage, since it takes years to get new factories up and running. Advancements in AI and digital twin technology offer the potential to accelerate the chip design and production process and help manufacturers close the supply-demand gap quickly.

DT will pave the way for collaboration and radically change worker training. Lithography tools are the most vital equipment in a semiconductor fab and cost up to $40 million each (for a 300mm wafer size); the cost will, of course, shoot up as we move towards smaller nodes. An important aspect of setting up any new fab is a technology transfer fee that covers specialists coming to the new facility and training the workforce on the proposed node technology. Creating a simulated environment, however, reduces the time needed to get production running at a new plant. A guidepost to digital twins by NVIDIA explains that workers can be trained on expensive systems before they're even installed using this novel technology. Once trained, workers can qualify, operate, and service these machines without having to set foot in the ultra-clean rooms where they're installed. Thus, a virtual fab allows specialists to design and test new processes faster and cheaper without disrupting operations at the physical plant. Along with creating a digital copy of an entire factory, manufacturers can use AI to process data from sensors inside actual factories and find new ways to route material that reduce waste and speed up operations.

_______

Digital Twin for disaster management:

Digital twins can be used to simulate and analyze different disaster scenarios, such as fires, floods, and earthquakes, to better understand the impacts on communities and infrastructure. They can help to identify potential risks and vulnerabilities and to develop and test response plans. They can monitor and track the progress of disasters in real time, providing critical information to disaster response teams and enabling them to make informed decisions about how to respond. Digital twins enable the optimization and allocation of resources, such as personnel, equipment, and supplies, to the areas where they are needed most, ensuring that resources are used efficiently and effectively during a disaster response. Digital twins can also be used to assess the damage caused by disasters and to develop repair and reconstruction plans, which helps prioritize repair and reconstruction efforts and ensures that resources are used most effectively.

______

______

Moral of the story:    

_

-1. The fourth industrial revolution, or Industry 4.0, is a concept that refers to the current trend of automation and data exchange in technology, which includes cyber-physical systems (CPSs), the Internet of Things (IoT), cloud computing, cognitive computing, and the development of smart businesses and smart factories. Digital Twin is at the core of this new industrial revolution, bringing unlimited possibilities. The traditional approach of building something and then tweaking it in new versions and releases is now obsolete. With a virtually based system of designing, the best possible efficiency of a product, process, or system can be identified and achieved simply by understanding its specific features, its performance abilities, and the potential issues that may arise.

_

-2. Digital Twin (DT) can be defined as a software representation of a physical asset, system, or process designed to detect, prevent, predict, and optimize through real-time analytics in order to deliver business value. There are many definitions of a digital twin, but the general consensus centres on this one: "a virtual representation of an object or system that spans its lifecycle, is updated from real-time data, and uses simulation, machine learning, and reasoning to help decision-making." A digital twin is a virtual instance of a physical system (twin) that is continually updated with the latter's performance, maintenance, and health status data throughout the physical system's life cycle. IBM defines it as "a virtual representation of a physical object or system across its lifecycle, using real-time data to enable understanding, learning and reasoning".

A digital twin leverages Internet of Things, big data & data analytics, artificial intelligence, blockchain, augmented & virtual reality technologies, collaborative platforms, APIs, and open standards.

Initially, digital twins were deployed in the cloud and usually built on IoT platforms. The platforms provided an IT infrastructure including data management, analytics, and a variety of services. As the adoption of performance twins advanced, limitations were found, and edge computing deployment alternatives evolved. Edge computing enables transferring part of the computing, storage, and networking capabilities of the cloud to the local network, which makes low latency and real-time response possible while reducing network traffic.

_

-3. It is a common misconception that Digital Twins only come into use once the physical system has been created; in fact, Digital Twins are digital replicas of both real and potential physical assets (i.e., "Physical Twins"). The digital twin can be any of the following three types as it evolves: 1) digital twin prototype (DTP); 2) digital twin instance (DTI); and 3) digital twin aggregate (DTA). A DTP is a constructed digital model of an object that has not yet been created in the physical world, e.g., the 3D modelling of a component. The primary purpose of a DTP is to build an ideal product, covering all the important requirements of the physical world. On the other hand, a DTI is a virtual twin of an already existing object, focusing on only one of its aspects. Finally, a DTA is an aggregate of multiple DTIs that may be an exact digital copy of the physical twin; for example, the digital twins of a spacecraft structure.

_

-4. Digital twins play the same role for complex machines and processes as food tasters for monarchs or stunt doubles for movie stars: They prevent harm that otherwise could be done to precious assets. A digital twin is the digital representation of a physical object in a computer. For example, an aircraft can be fully simulated on a computer to understand how a bird strike would affect its ability to land safely. This would provide a much more accurate prediction of the engine’s condition and thus its safety.

_

-5. Digital engineering provides the tools and techniques that are used to create, manage, and analyze digital twins, while digital twins provide a means by which to apply these techniques and tools to specific systems and applications. Digital engineering and digital twins are often used together in a variety of applications, such as engineering design, manufacturing, supply chain management, and maintenance.    

_

-6. From a pragmatic Systems Engineering perspective, Physical Twins represent “Systems-of-Systems” and Digital Twins represent “Simulations-of-Simulations” of “Systems-of-Systems”. Digital Twins are produced by systematically applying Digital Engineering technologies, including both dynamic (behavioral) and mathematical (parametric) modeling and simulation (M&S), in a synergistic manner that blurs the usual distinctions between real physical systems and virtual logical systems. From a Systems Integration & Testing perspective, a Digital Twin should be indistinguishable from its Physical Twin counterpart.

-7. Generally, there are two types of DTs — physics-based twins and data-based twins (machine learning based).

Physics-based twins rely on physical laws and expert knowledge. They can be built from CAD files and used to simulate the work of comparatively simple objects with predictable behavior, like a piece of machinery on a production line. Physics-based simulation uses analytical and numerical methods to model the behavior of products and systems. The key downside is that updating such twins takes hours rather than minutes or seconds, so the approach makes sense in areas where you don't need to make immediate decisions.

In contrast to the physics-based type, data-based twins (ML-based twins) don't require deep engineering expertise. Instead of relying on an understanding of the physical principles behind the system, they use machine learning algorithms (typically neural networks) to find hidden relationships between input and output. The data-based method offers more accurate and quicker results and is applicable to products or processes with complex interactions and a large number of impact factors. On the other hand, to produce valid results it needs a vast amount of data, not limited to live streams from sensors.

The key function of a digital twin implemented through physics-based models and data-driven analytics is to provide an accurate operational picture of the assets.

The hybrid twin is a simulation model working at the intersection of physics-based and data-based digital twins. A hybrid twin increases the accuracy of simulations by reducing errors to near zero. As a result, it enables you to study large systems or a system of systems, which would be completely impractical using physics-based simulation models alone. A physics-based simulation model often deviates from reality, especially when dealing with complex and large systems, and the computational resources and time required to model such systems are impractical from both a time and a cost perspective. Companies building complex and interconnected products, those adopting simulation-driven product development, and those building large systems stand to benefit from a hybrid twin approach. Hybrid digital twins can make more accurate, data-driven predictions that lead to more efficient operations, less prototyping and material waste, and lower energy usage.
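
To make the physics/data split concrete, here is a minimal, illustrative sketch of the hybrid idea, assuming a simple first-order cooling process as the "asset": a physics model supplies the baseline prediction, and a data-driven correction is fitted to its residuals. Everything here (constants, the linear residual model) is an assumption for illustration:

```python
# Sketch of the hybrid-twin idea: a physics model plus a data-driven
# correction fitted to its residuals. The "asset" is an assumed
# first-order cooling process; all constants are illustrative.
import numpy as np

def physics_model(t: np.ndarray) -> np.ndarray:
    """Newtonian cooling: T(t) = T_amb + (T0 - T_amb) * exp(-k t)."""
    T_amb, T0, k = 20.0, 90.0, 0.05
    return T_amb + (T0 - T_amb) * np.exp(-k * t)

def design(t: np.ndarray) -> np.ndarray:
    """Design matrix for a linear residual model (slope + offset)."""
    return np.vstack([t, np.ones_like(t)]).T

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 120)
# "Measured" data: physics plus an unmodelled linear drift and noise.
measured = physics_model(t) + 0.08 * t + rng.normal(0, 0.3, t.size)

# Data-driven part: least-squares fit to the physics residual.
residual = measured - physics_model(t)
coef, *_ = np.linalg.lstsq(design(t), residual, rcond=None)

def hybrid_twin(t_new: np.ndarray) -> np.ndarray:
    return physics_model(t_new) + design(t_new) @ coef

t_test = np.array([30.0, 45.0])
print("physics only:", physics_model(t_test))
print("hybrid twin :", hybrid_twin(t_test))
```

The pattern generalises: real hybrid twins replace the linear residual fit with richer ML models, but the division of labour between physics baseline and learned correction stays the same.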

_

-8. A digital twin system contains hardware and software components, with middleware for data management in between. Hardware includes Internet of Things (IoT) sensors and actuators, network devices like routers, edge servers, IoT gateways, etc. The middleware platform's bare-bones element is a centralized repository that accumulates data from different sources; it takes care of tasks such as connectivity, data integration, data processing, data quality control, data visualization, data modeling and governance, and more. The software takes the data from the sensors, gathered and stored by the middleware, and turns those observations into valuable insights about the process. In many cases, the software is powered by machine learning models. Other must-have pieces of the DT puzzle are dashboards for real-time monitoring, design tools for modeling, and simulation software.
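
As a toy illustration of this hardware/middleware/software split, the following sketch shows readings flowing from a sensor stub through a buffering middleware into analysis software. All names and the threshold are invented stand-ins, not a real IoT stack:

```python
# Skeleton of the hardware/middleware/software split described above.
# Names and the threshold are illustrative; a real stack would use an
# IoT gateway, a message broker, and an analytics service instead.
from queue import Queue

class Middleware:
    """Central repository: buffers and lightly validates sensor data."""
    def __init__(self):
        self.queue: Queue = Queue()

    def ingest(self, reading: dict) -> None:
        if "value" in reading and "sensor" in reading:  # quality check
            self.queue.put(reading)

class TwinSoftware:
    """Turns buffered observations into a simple insight."""
    def process(self, middleware: Middleware) -> None:
        while not middleware.queue.empty():
            r = middleware.queue.get()
            status = "OK" if r["value"] < 80 else "OVERHEAT"
            print(f"{r['sensor']}: {r['value']} -> {status}")

mw = Middleware()
mw.ingest({"sensor": "motor_temp", "value": 65})   # from IoT hardware
mw.ingest({"sensor": "motor_temp", "value": 91})
TwinSoftware().process(mw)
```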

_

-9. Digital twins are not standalone applications; they integrate into the organization's existing enterprise application suite to support the intended business outcomes. Digital Twin is not an off-the-shelf technology: each DT is as unique as the product or process it represents. Ready-to-use infrastructures, platforms, and models can facilitate development, but they won't do all the work. You will still need experts in data, machine learning, and cloud technologies and, of course, engineers capable of integrating the different pieces of the hardware and software puzzle.

_

-10. Several technologies come together to make DT a reality, such as 3D simulation, IoT/IIoT, AI and ML, big data analytics, cloud computing, edge computing, 5G connectivity, and augmented/virtual/mixed reality. To create a digital twin, engineers collect and collate all kinds of data, including physical, manufacturing, and operational data, which is then synthesized with the help of Artificial Intelligence (AI) algorithms and analytics software. The end result is a virtual mirror model of each physical model, capable of analyzing, evaluating, optimizing, and predicting. It is important to constantly maintain a synchronous state between the physical asset and the digital twin, so that the consistent flow of data helps engineers analyze and monitor it.

For simple applications, digital twin technology offers value without having to employ machine learning. Simple applications are characterized by a limited number of variables and an easily discoverable linear relationship between inputs and outputs. However, most real-world systems that contend with multiple data streams stand to benefit from machine learning and analytics to make sense of the data. Machine learning, in this context, implies any algorithm that is applied to a data stream to uncover/discover patterns that can be subsequently exploited in a variety of ways.

Oftentimes, cloud computing is the best platform for processing and analyzing big data. Additionally, an intelligent DT system can only be developed by applying advanced AI techniques on the collected data. To this end, intelligence is achieved by allowing the DT to detect (e.g., best process strategy, best resource allocation, safety detection, fault detection), predict (e.g., health status and early maintenance), optimize (e.g., planning, process control, scheduler, assembly line), and take decisions dynamically based on physical sensor data and/or virtual twin data. In short, IoT is used to harvest big data from the physical environment. Later, the data is fed to an AI model for the creation of a digital twin. Then, the developed DT can be employed to optimize other processes in the industry.

Integrating digital twins with AI helps to identify outcomes of complex virtual scenarios, thereby improving product quality and efficiency. AI also helps in understanding complex virtual data, creating multiple variables that would not be possible with real-world data. The real boom for digital twins comes from AI and its predictive capabilities.

_

-11. Digital twins involve ingesting large volumes of data in order to arrive at actionable insights. Visualization is equally important, so that managers and executives can fully understand the data, and drive actions based on the insights provided. Augmented Reality (AR) and Virtual Reality (VR) offer immersive experiences to visualize such insights. While AR and VR constitute the avenue of data ingestion and assimilation, they are not essential building blocks of the digital twin, but rather convenient technologies for obtaining a holistic, immersive understanding of the asset by leveraging the digital twin paradigm.

_

-12. While the digital twin is sometimes mistaken for IoT or Computer-Aided Design (CAD), it is fundamentally different: IoT is characterized solely by the physical implementation, whereas CAD focuses exclusively on stand-alone representation in the digital domain. Although CAD models, IoT dashboards, 3D renderings/immersive walkthroughs, and gaming environments are not Digital Twins in themselves, they are useful visualization tools and building blocks of Digital Twin solutions, and often represent the first steps in a customer's Digital Twin journey. The digital twin's dynamic, real-time, bi-directional data connection is key to distinguishing it from closely related technologies. These features bring various benefits: the physical product can adapt and modify its real-time behaviour in response to the feedback generated by the digital twin, and conversely, the connection allows the simulation to precisely mirror the real-world condition of the physical body.

_

-13. A digital twin continuously updates itself from multiple sources to represent its near real-time status, working condition, or position. The digital twin is a data-driven learning system: it learns from itself, using sensor data covering various aspects of its operating condition; from human experts, such as engineers with deep and relevant industry domain knowledge; from other similar machines; and from the larger systems and environment of which it may be a part. A digital twin also factors data from past machine usage into its digital model.

_

-14. The model is the core of the digital twin: a model that emulates the behavior of the physical system, such as a simulation, so that when you give it an input, the model returns a response output. Digital twin models comprise semantic data models and physical models. Semantic data models are trained on known inputs and outputs using artificial intelligence methods. Physical models require a comprehensive understanding of the physical properties and their mutual interactions, so multi-physics modelling is essential for high-fidelity digital twin modelling. Virtual models in a digital twin system ought to be faithful replicas of physical entities, reproducing their geometries, properties, behaviors, and rules.

_

-15. Digital twins can be made using 3D models, 3D scans, and even 2D documentation. While many digital twins utilize a 3D visual representation of some kind, this is not essential. The requirement to qualify as a digital twin is the link between the physical and virtual worlds, in which data is transmitted bi-directionally and automatically between the two. The data is essential, and many organizations choose to visualize that data through a 2D or 3D model of some kind, on PC displays as well as through augmented reality (AR) and virtual reality (VR) devices.

3D models and digital twins are easily confused because they look similar at first glance. In both cases, what you see on the screen is a detailed visualisation of your physical asset in three dimensions. The difference, and it's a big one, is the data that appears on the 3D model. 3D models populated by static data simply aren't that useful, particularly in operations. Quite often the data is out of date, and because the data isn't always updated, it isn't 100% trusted. The main difference between a 3D model and a digital twin can be expressed in two words: dynamic data. Unlike static data, dynamic data changes and updates in real time on the 3D model as soon as new information becomes available. Even though many organizations use the term 'Digital Twin' synonymously with '3D model', a 3D model is only a part of a DT.

_

-16. CAD (3D model) is used in both simulation technology and digital twin technology. Both simulations and digital twins use digital models to replicate a system's various processes. But an optional, one-time data exchange builds a simulation from the physical counterpart using the 3D model, while continuous bidirectional data exchange builds a digital twin from the physical counterpart using the 3D model.

A simulation is the imitation of the operation of a real-world process or system over time. Simulations require models: the model represents the key characteristics or behaviors of the selected system or process, whereas the simulation represents the evolution of the model over time, executed on a computer. A simulation mimics the operation of an existing or proposed system, providing evidence for decision-making by testing different scenarios or process changes. The terms simulation and digital twin are often used interchangeably, but they are different things. Although simulation technology and digital twins share the ability to execute virtual simulations, they are not the same. In both cases the simulation runs on a virtual model, but CAD simulations are theoretical, static, and limited by the imaginations of their designers, while digital twin simulations are live and use actual dynamic data. Simulation is an important aspect of the digital twin: digital twin simulation enables the virtual model to interact with the physical entity bi-directionally in real time. This interaction means information can flow in either direction to create a cyber-physical system, where a change in one affects the other. Multi-physics, multi-scale simulation is one of the most important visions of the digital twin; you can't have a digital twin without simulation. The advantages of a digital twin over a more basic, non-integrated CAD-based simulation are evident for monitoring valuable products such as wind turbines. However, digital twins can be costly, requiring the fitting of sensors and their integration with analytical software and a user interface. For this reason, digital twins are usually employed only for more critical assets or procedures, where the cost is justifiable to the business.

Example:
While a simulation can mirror a physical process, place, person, or product, it never once measures this counterpart. For example, it is entirely possible to simulate New York City's traffic map: routes, lights, signs, buildings, crosswalks, everything. If, however, a massive sinkhole opens up and traffic has to be re-routed, the simulation will not reflect this. It is an unanchored digital representation of a physical location, without the constant measuring and reflecting that goes on in a digital twin.
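
A tiny sketch of this difference (hypothetical road names; the "city" is a toy two-road map): the static simulation keeps its original view of the world, while the twin refreshes its state from field measurements before answering the same query:

```python
# Contrast sketch: a static simulation vs. a twin that keeps syncing
# with measurements. Road names and the "city" are assumed toys.
road_open = {"5th_ave": True, "main_st": True}   # view at build time

def usable_routes(state: dict) -> list:
    return [road for road, is_open in state.items() if is_open]

# Static simulation: built once, never measures again.
sim_state = dict(road_open)

# Digital twin: refreshed from field measurements each cycle.
twin_state = dict(road_open)

def field_measurement() -> dict:
    # Stand-in for live sensing: a sinkhole closes main_st.
    return {"main_st": False}

twin_state.update(field_measurement())

print("simulation routes:", usable_routes(sim_state))   # stale view
print("twin routes      :", usable_routes(twin_state))  # reflects reality
```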

_

-17. A digital twin fuses CAD (3D model), simulation, and sensor data to create actionable data, i.e., data that can be used to predict outcomes under certain scenarios and to help businesses make productive decisions. It works by taking the 3D geometry of a part or system and connecting it with sensor data coming from operation, so that we can see a 3D digital replica of the behavior of the part or system in service. With this, it is possible to run 2D and 3D simulations to predict behavior. 2D simulation uses machine learning that looks at historical data to make predictions, which is used for better process control and predictive maintenance. 3D simulation uses physics to run innovative virtual scenarios to find new ways of improving operational efficiency and reducing cost. Combining 2D and 3D gives the user complete remote control of their physical asset, as in a hybrid digital twin.

Note:

The performance of 2D machine learning far exceeds that of 3D machine learning, mainly because of the relatively small size of the available 3D datasets. 3D ML also requires significantly higher computational power than 2D ML, and the solution time of a 2D ML simulation is orders of magnitude less than that of 3D ML.

_   

-18. BIM stands for Building Information Modelling and as the name implies, the core of BIM is INFORMATION. BIM is an intelligent model-based process/methodology for creating 3D building design models that incorporate all of the information required for a building’s construction and help in optimizing the work of different disciplines such as architecture and construction. BIM is a fundamental part of constructing a digital twin of a building, but they should not be confused with digital twins, because they are different. The fundamental difference between digital twins and BIM is that the latter is a static digital model, while the former is a dynamic digital representation. In other words, BIM lacks spatial and temporal context elements that characterize digital twins. BIM builds static models which do not include the dynamics of a live digital twin. In comparison, digital twins harness live data to evolve and replicate the real world. Building maintenance and operations are best served by digital twins whereas construction and design are better addressed by BIM.

_  

-19. Data is the basis of digital twin. One thing that binds most definitions of DT other than being a virtual representation of a physical object is the bidirectional transfer or sharing of data between the physical counterpart and the digital one, including quantitative and qualitative data (related to material, manufacturing, process, etc.), historical data, environmental data, and most importantly, real-time data. DT deals with multi-temporal scale, multi-dimension, multi-source, and heterogeneous data. Some data is obtained from physical entities, including static attribute data and dynamic condition data. Some data is generated by virtual models, which reflects the simulation result. Some data is obtained from services, which describes the service invocation and execution. Some data is knowledge, which is provided by domain experts or extracted from existing data. Some data is fusion data, which is generated as a result of fusion of all the aforementioned data. The basic purpose of data processing is to extract and derive data that are valuable and meaningful from large, potentially cluttered, incomprehensible amounts of data.

_

-20. Data is currency when it comes to building digital twins. These ‘living’ and constantly evolving models need to have a high-level of validity, and up-to-date data is crucial to this. Drones have a role to play here, as they provide an effective data capture tool, collecting highly-accurate information, safely and efficiently. This is especially important for hard-to-reach assets, or in environments where asset downtime needs to be kept to the absolute minimum. Drones are capable of collecting complete, reliable and well-organised baseline datasets to flow into the complex and layered process of building a digital twin. The ease of deployment and cost effectiveness of drones enables organisations to conduct the regular surveys needed to help digital twins constantly evolve to stay relevant and up to date.  

_

-21. Digital thread is the most important basic technology for digital manufacturing. Digital thread is defined as a communication network that enables a connected flow of data as well as an integrated view of an asset’s data across its lifetime through various isolated functional perspectives. This concept promotes transmission of the correct information, to the correct place, at the correct time. The digital thread is the communication framework that links all information that belongs to a product instance across the development process and IT systems and ultimately enables manufacturers to re-purpose, reuse and trace product information throughout the product development lifecycle and supply chain. The digital thread increases the supply chain efficiency by 16% and enables delivery of new products to the market 20% faster.      

You can have digital thread without digital twin but you can’t have digital twin without digital thread. In the context of digital twins, the digital thread is the link between the real-world product, machine, or process and its digital twin. Digital threads comprise all digital twins’ temporal details, including design data, simulation, testing and performance data, software and hardware development strands, supply chain data, production data, and actual performance data after the product is used. Digital threads are complete records of all details of specific aspects of product definition, development, and deployment, from conception through end of life.
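
As a minimal illustration of a digital thread, the sketch below keeps an append-only, traceable record of one product instance across its lifecycle. The phases, fields, and serial number are assumptions for illustration, not any particular PLM system's schema:

```python
# Sketch of a digital thread as an append-only sequence of lifecycle
# records keyed by product instance. Phases and fields are assumed.
from collections import defaultdict

thread = defaultdict(list)  # product serial -> ordered lifecycle records

def record(serial: str, phase: str, data: dict) -> None:
    thread[serial].append({"phase": phase, **data})

record("SN-0042", "design",     {"cad_rev": "C"})
record("SN-0042", "production", {"line": 2, "torque_spec_nm": 35})
record("SN-0042", "in_service", {"hours": 1200, "last_fault": None})

# Traceability query: everything known about one product instance,
# in order, from conception onwards.
for entry in thread["SN-0042"]:
    print(entry)
```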

_

-22. IoT is the backbone of the Digital Twin; conversely, Digital Twins anchor the IoT. You need to deploy IoT devices to gather live data about the real object and transfer the data to the digital replica on a computer server. Since IoT devices can communicate among themselves and with a central server, it becomes easy to collect holistic data about the physical object you monitor. IoT devices constantly feed data so the digital twin model can analyze and present performance insights. Clearly, the availability of cheap, reliable IoT sensors is partly responsible for making digital twins possible, alongside the availability of low-cost data storage and computing power and robust, high-speed wired and wireless networks.
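
As a sketch of this IoT-to-twin data feed, the snippet below publishes telemetry over MQTT, a protocol commonly used for this purpose. It assumes the widely used paho-mqtt package (1.x client API); the broker address and topic layout are illustrative placeholders:

```python
# Sketch of an IoT device feeding its twin over MQTT. Assumes the
# paho-mqtt package (1.x client API); broker address and topic are
# illustrative placeholders, not a real deployment.
import json
import random
import time

import paho.mqtt.client as mqtt

BROKER = "broker.example.com"          # assumed broker address
TOPIC = "plant/pump_3/telemetry"       # assumed topic layout

client = mqtt.Client()
client.connect(BROKER, 1883)
client.loop_start()                    # background network loop

for _ in range(10):
    payload = {"ts": time.time(), "temp_c": random.gauss(70.0, 2.0)}
    client.publish(TOPIC, json.dumps(payload))  # one reading per second
    time.sleep(1)

client.loop_stop()
client.disconnect()
```

On the twin side, a subscriber to the same topic would apply each message to the virtual model, closing the loop described above.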

_

-23. The Metaverse is a term that refers to a virtual world that exists entirely in a digital form. Metaverse requires a digitalized copy of the real world as an entry point to provide fully connected, immersive, and engaging 3D experiences. Many businesses and enterprises are now exploring and building on the metaverse-based fundamentals to introduce new possibilities and experiences for digitally-driven consumers. By deploying digital twins, organizations can introduce dimensionally precise real-life spaces into the metaverse virtual mirror world. Digital twins are one of the metaverse’s core building blocks because of their intrinsic qualities. While the metaverse can help us create virtual worlds and experiences beyond our dreams, it will also be useful in constructing exact replicas of reality. With their inherent features and functionalities, digital twins can bring realism to the digital world.  

_

-24. The excellent report by ARUP classifies digital twins by their sophistication, expressed in terms of autonomy, intelligence, learning, and fidelity. The levels range from 1, a non-intelligent, non-autonomous digital twin, to 5, a digital twin that replaces human beings for certain non-trivial tasks. The framework moves through five levels, beginning with a simple digital model. As the model evolves, feedback and prediction increase in importance. At higher levels, machine learning capacity, domain-generality, and scaling potential all come into play. By the highest levels, the twin is able to reason and act autonomously, and to operate at a network scale (incorporating lower-level twins, for example).

_

-25. At first, the complexity and cost involved in building digital twins limited their use to the aerospace and defense sectors, where the physical objects were high-value, mission-critical assets operating in challenging environments that could benefit from simulation. Relatively few other applications shared the same combination of high-value assets and inaccessible operating conditions to justify the investment. That situation is changing rapidly. Digital twins are now proving invaluable across multiple industries, especially those that involve costly or scarce physical objects. While digital twins have great potential, they are not a necessity for every fabricated product. Not all objects are complicated enough to need the intense, constant flow of sensor data required by digital twins, nor is it always worth it from a financial point of view to invest significant resources in creating one. Therefore, the industrial sectors that need to employ digital twins are those that develop and produce niche products.

_

-26. North America is one of the early adopters of digital twin technology, and most digital twin providers are located in the region, including Microsoft (US), IBM (US), Amazon Web Services (AWS) (US), and General Electric (US). The National Aeronautics and Space Administration (NASA) was an early adopter of the digital twin in North America. The region is concentrated with aerospace companies such as Bell Flight (US), Bye Aerospace (US), Lockheed Martin (US), and Boeing (US), which constantly invest in digital twin technology and carry out extensive R&D with it to optimize their production and reduce downtime and operational costs.

_

-27. Digital twins are virtual depictions of the forces, interactions, and movements that assets may experience in the real world. This enables users to interact with three-dimensional, dynamic content that responds in real time to their actions. Users can accurately mimic real-world situations, what-if scenarios, and any situation imaginable in this virtual environment, and instantaneously view the results on any platform, including mobile devices, computers, and augmented, mixed, and virtual reality (AR/MR/VR) devices. The digital twin is the key to effective decision-making in this new world. Making better decisions, faster, that can be executed perfectly every time is vital for delivering superior results.

_

-28. Digital twins enable you to optimize, improve efficiencies, automate, and evaluate future performance. The twin harnesses the power of data and models to provide early warning detection, continuous prediction, and dynamic optimization capabilities. You can use the models for other purposes such as virtual commissioning or to influence next-generation designs. A digital twin can be used to save time and money whenever a product or process needs to be tested, whether in design, implementation, monitoring or improvement.

_

-29. Digital twins (DTs) in complex industrial and engineering applications have proven benefits that include increased operational efficiency, enhanced safety and reliability, reduced errors, faster information sharing, and better predictions. They can be a great tool for companies to increase their competitiveness, productivity, and efficiency.

_

-30. DT enables cost-effective prototyping. Because a DT involves mostly virtual resources, the overall cost of prototyping decreases over time. In traditional prototyping, redesigning a product is time-consuming as well as expensive because of the use of physical materials and labour, and on top of that, a destructive test means the end of that costly prototype. Using DT, products can be recreated and put through destructive tests without any additional material cost. Thus, even if the costs are equal at the start, physical costs keep increasing with inflation, while virtual costs decrease significantly over time. DT allows the testing of products under different operating scenarios, including destructive ones, without additional costs. Moreover, once implemented, DT can reduce operating costs and extend the life of equipment and assets.

_

-31. DT can predict problems and help System Planning. Using DT, we can predict the problems and errors for future states of its physical twin, providing us an opportunity to plan the systems accordingly. Due to the real-time data flowing between the physical asset and its DT, it can predict problems at different stages of the product lifecycle. This is beneficial especially for products that have multiple parts, complex structures, and are made up of multiple materials such as aircraft, vehicles, factory equipment, etc., because as the complexity of any product increases, it gets harder to predict component failures using conventional methods.

_

-32. DT enables remote accessibility. The physical device can be controlled and monitored remotely using its DT. Unlike physical systems, which are restricted by their geographical location, virtual systems such as DTs can be widely shared and remotely accessed. Remote monitoring and control of equipment and systems becomes a necessity when local access is limited, as during the COVID-19 pandemic, when lockdowns were enforced by governments and working remotely or contact-free was the only viable option. In industries such as oil and gas or mining, where working conditions are extreme and hazardous, the capability of a DT to remotely access its physical twin, as well as its predictive nature, can reduce the risk of accidents and hazardous failures.

_

-33. Manufacturing, retail, supply chain, logistics, healthcare, renewable energy, construction, business, aerospace, smart city, agriculture, cargo shipping, drilling platform, automobile, electricity, natural disaster detection, communication, security and climate change are all served by digital twins.  

_

-34. Unilever implemented digital twins of its manufacturing production line process to increase productivity, reduce waste, and make better operational decisions. Boeing achieved a 40% improvement in the first-time quality of parts using the digital twin concept. SHoP Architects use real-time 3D digital twins to envision skyscrapers before they're built. Using digital twin patterns, GE has realized these specific benefits: 93 to 99% increased reliability in less than 2 years; 40% reduced reactive maintenance in less than 1 year; 75% reduced time to achieve outcomes; and $11M in avoided lost production through detecting and preventing failures.

_

-35. Smart cities are building 3D replicas of themselves to run simulations. Digital twins of smart cities can enable better planning of construction as well as constant improvements in municipalities. These digital twins help optimize traffic flow, parking, street lighting and many other aspects to make life better in cities, and these improvements can be implemented in the real world.

_

-36. The biggest difficulty hindering architects in developing new building designs and projects is establishing their practicality. Construction design developers are forced to limit their creativity because any new building design or concept must be approved against all safety requirements, which presupposes testing in the real world, consuming an immense amount of capital, time, and other resources just to find out whether it can be done. If building developers could test their ideas in reality-based simulations involving all the necessary real-world factors (e.g., gravity, weather, wind), the total time needed to share their ideas and get them approved could be cut as much as a hundredfold. The safety, practicality, and sustainability of new building designs would be tested in a simulation, and the feedback would be just as accurate as a real-life test because the simulation derives its data directly from the world.

_

-37. Digital twins help retailers minimize capital expenditures by 10%, reduce excess inventory by 5%, and improve EBITDA (earnings before interest, taxes, depreciation and amortization) by 1-3%.

With digital twins, retailers can:

  • Effectively manage product supplies.
  • Avoid supply chain disruptions.
  • Optimize logistics costs.

_

-38. Digital twin technology is drastically transforming the supply chain. As the global manufacturing and distribution network continues to expand, new risks arise for both manufacturers and consumers. Indeed, a more complex and far-reaching supply chain becomes vulnerable to inefficiencies, counterfeiting, and fraud. By creating a digital twin of the supply chain, manufacturers gain visibility into each step of the product's journey to the end user.

_

-39. Aguas do Porto (AdP), a Portuguese utility organization, is responsible for the water supply in the city of Porto. AdP uses digital twins to forecast flooding and water quality issues, improve city services and responsiveness, and ensure the resilience of water infrastructure. Digital twins enable AdP to monitor the water supply systems in real time. They are also used to forecast water consumption and to simulate burst-pipe scenarios along with valve and pump shutdowns. AdP confirms the results: it increased operating gains by 23%, shortened the time required to repair pipe bursts by 8%, reduced water supply interruptions by 23%, and cut the number of sewer collapses by 54%.
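
A toy sketch of one burst-detection idea mentioned above: flag unaccounted-for water by comparing district inflow with metered consumption. The figures and tolerance are illustrative assumptions, not AdP's actual method.

    # A minimal sketch: a persistent gap between inflow and metered
    # consumption suggests a leak or burst somewhere in the district.

    def water_balance_alarm(inflow_m3h, meter_readings_m3h, tolerance=0.10):
        """Return True if unaccounted-for water exceeds `tolerance`
        as a fraction of the measured inflow."""
        consumed = sum(meter_readings_m3h)
        unaccounted = inflow_m3h - consumed
        return unaccounted / inflow_m3h > tolerance

    # Example: 120 m3/h flows in, but meters only account for 95 m3/h.
    if water_balance_alarm(120.0, [40.0, 30.0, 25.0]):
        print("Possible burst: investigate district mains")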

_

-40. If a power station is out of commission due to storm damage, a digital twin can automatically funnel power to connected substations to make up for the shortfall, temporarily reducing energy availability in areas where there is less usage or demand. The technology provides clear models that help ensure the lights stay on and the heating systems remain online. Staff can respond to challenges and crises in real time, using a standard interface, ensuring as steady and efficient a flow of power as possible. Digital twins can empower utilities with the asset intelligence needed to respond to unexpected emergencies with urgency, without negatively impacting the grid's effectiveness.
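
As a rough illustration of such rerouting logic, the sketch below redistributes a failed station's load across the remaining substations in proportion to their spare capacity. The names, capacities, and the proportional rule are made-up assumptions, not a real grid-control algorithm.

    # A minimal sketch: split the lost load across substations by headroom.

    def reroute(load_mw, substations):
        """`substations` maps name -> (capacity_mw, current_load_mw).
        Returns the extra megawatts to assign to each substation."""
        headroom = {n: cap - cur for n, (cap, cur) in substations.items()}
        total = sum(h for h in headroom.values() if h > 0)
        if total < load_mw:
            raise RuntimeError("Insufficient spare capacity; shed load")
        return {n: load_mw * max(h, 0.0) / total for n, h in headroom.items()}

    # Storm takes a station offline; 25 MW must go somewhere else.
    grid = {"north": (100, 60), "east": (80, 70), "south": (120, 80)}
    print(reroute(25.0, grid))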

_

-41. Digital twins are being applied to climate modeling. Such work can help assess drought risk, monitor rising sea levels, and track changes in the polar regions. It can also be used for planning around food and water issues and for renewable energy such as wind farms and solar plants.

_

-42. Network digital twins speed up initial deployments by pretesting routing, security, automation and monitoring in simulation. They also enhance ongoing operations, including validating network change requests in simulation, which reduces maintenance times.
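
A minimal sketch of that pretesting idea: model the topology as a graph, apply the proposed change to a copy, and verify that critical reachability survives before touching the live network. The node names are hypothetical, and the networkx package is assumed.

    # Validate a proposed link removal in simulation before rollout.
    import networkx as nx

    net = nx.Graph()
    net.add_edges_from([("core1", "dist1"), ("core1", "dist2"),
                        ("dist1", "access1"), ("dist2", "access1")])

    def validate_change(graph, link_to_remove, critical_pairs):
        """Simulate the change on a copy and check critical paths."""
        twin = graph.copy()               # never mutate the live model
        twin.remove_edge(*link_to_remove)
        return all(nx.has_path(twin, a, b) for a, b in critical_pairs)

    ok = validate_change(net, ("core1", "dist1"), [("core1", "access1")])
    print("Safe to roll out" if ok else "Change breaks reachability")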

_

-43. From design through operation, digital twin technology opens avenues for the cost-effective and efficient development of sustainable electric vehicle technologies.

_

-44. The critical beneficial aspects for healthcare from digital twin technologies are summarized as improvements in the following: information sharing, education, monitoring, diagnosis, precision medicine/treatment, medical resource management, facility operation management, and research advancement.

_

-45. Precision medicine (more generally referred to as personalized medicine) is an emerging approach to disease treatment and prevention built around new diagnostics and therapeutics targeted to a patient's own genetic, biomarker, phenotypic, physical, or psychosocial characteristics. The aim is to deliver the right treatments, at the right time, to the right person. DT applications in healthcare can contribute to the broad trend of precision medicine, maximizing the efficiency and efficacy of the healthcare system by shifting from current 'one-size-fits-all' clinical practice to treatments that take inter-individual variability into greater account.

_

-46. Approximately 80% of clinical trials face delays in the enrolment phase, and 20% of trials fail to meet overall enrolment goals. The problems of the enrolment stage (e.g., finding participants who fit the criteria and are willing and able to participate), coupled with personalized medicine's trend toward smaller target populations, make clinical trials increasingly expensive, time-consuming, and inefficient. DTs may allow the creation of unlimited copies of an actual patient that can be treated computationally with a large variety of drug combinations and can act as the control group. In this way, DTs of real patients could be used to test early-stage drugs, accelerating clinical research, minimizing hazardous impacts, and reducing the number of expensive trials required to approve new therapies. However, the current level of technological advancement does not allow us to predict all possible effects of a newly created drug, so for the moment computer-based research cannot completely replace real-life clinical trials.
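
A toy sketch of the 'virtual control arm' idea: spawn many simulated copies of a patient, perturb their parameters, and compare outcomes across dose levels. The dose-response model below is a made-up illustration, not a validated pharmacological model.

    # Monte Carlo over virtual patient copies at several dose levels.
    import random

    def simulate_patient(baseline, dose, noise=0.1):
        # Hypothetical model: benefit saturates, side effects grow.
        sensitivity = baseline * random.gauss(1.0, noise)
        benefit = sensitivity * dose / (dose + 1.0)
        side_effect = 0.05 * dose ** 2
        return benefit - side_effect

    random.seed(42)
    for dose in (0.5, 1.0, 2.0, 4.0):
        outcomes = [simulate_patient(1.0, dose) for _ in range(10_000)]
        print(f"dose {dose}: mean outcome {sum(outcomes) / len(outcomes):.3f}")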

_

-47. Predictive maintenance is, quite simply, the ability to fix what isn't broken... yet. A digital twin can predict, with considerable accuracy, when and where the next breakdown will occur. Even if an organization chooses not to pursue predictive maintenance capabilities, the digital twin's readouts will still be very useful for corrective maintenance operations.
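
A minimal sketch of the predictive-maintenance idea: fit a trend line to a degradation signal (here, vibration level) and estimate when it will cross a failure threshold. The readings and threshold are illustrative, the numpy package is assumed, and real systems use far richer models.

    # Extrapolate a degradation trend to estimate time of failure.
    import numpy as np

    hours = np.array([0, 100, 200, 300, 400, 500], dtype=float)
    vibration = np.array([2.0, 2.3, 2.7, 3.1, 3.4, 3.8])  # mm/s RMS
    FAILURE_THRESHOLD = 6.0  # illustrative limit

    slope, intercept = np.polyfit(hours, vibration, 1)  # linear fit
    predicted_failure_at = (FAILURE_THRESHOLD - intercept) / slope
    print(f"Schedule maintenance before ~{predicted_failure_at:.0f} h")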

_

-48. In education, the main stakeholders, namely teachers and students, can leverage the benefits of digital twin technology to a great extent. The use of DTs starts with curriculum design and penetrates deeply into various facets of the teaching-learning process. At engineering institutions, digital twins are software models of industrial plants that are simulated and visualized like their industrial originals and synchronized with them for training purposes.

_

-49. Digital twin technology provides a tremendous opportunity to achieve sustainability targets. Digital twins enable us to do more with fewer resources: from designing a new product to planning the production and manufacturing it — even all the way to repairing and recycling.

_

-50. Environmental digital twin technology can reduce the carbon emissions of buildings and cities worldwide.

_

-51. Digital twins bring smart farming, boosting farm productivity and sustainability.

The benefits of Digital Twins in agriculture include:

-Greater yields on the same acreage

-Profitability maximization

-More resilience to weather

-More sustainable operations

-Faster time to market

_

-52. Reducing food waste with DT technology: 

Roughly one-third of the food produced in the world for human consumption every year—approximately 1.3 billion tonnes—gets lost or wasted. A large amount of food produced gets wasted in different parts of the supply chain such as farms, storehouses, logistics, and processing units. Along with this, a significant amount of food wastage is also experienced at the consumer end. Of the total global food wastage, 40–45% is fruits, vegetables, and root crops.

Food wastage and spoilage within the food supply chain can be attributed to uncertainties in demand and supply, time delays, and changes in environmental conditions at different stages of the chain. Globally, this wastage is equivalent to an annual economic loss of USD 1.2 trillion. It also causes a significant environmental footprint, amounting to about 8% of worldwide greenhouse gas emissions.

Food wastage has reached the level of a global crisis.

Real-time monitoring of food quality has emerged as a viable solution that can benefit all stages of the food supply chain, from farmers to end consumers. It helps reduce the significant economic losses caused by food spoilage and wastage while retaining quality and nutritional value.
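
A toy sketch of that monitoring idea: consume shelf life faster whenever the logged temperature runs above the reference, using a simple Q10 rule of thumb (spoilage rate roughly doubling per 10 degC). All parameters are illustrative assumptions, not food-science constants.

    # Estimate remaining shelf life from a cold-chain temperature log.

    def remaining_shelf_life(temps_c, hours_per_reading,
                             shelf_life_h=240.0, ref_c=4.0, q10=2.0):
        used = 0.0
        for t in temps_c:
            rate = q10 ** ((t - ref_c) / 10.0)   # relative spoilage rate
            used += hours_per_reading * rate
        return shelf_life_h - used

    # Mostly 4 degC, with a 6-hour excursion to 14 degC.
    log = [4.0] * 48 + [14.0] * 6
    print(f"{remaining_shelf_life(log, 1.0):.0f} h of shelf life left")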

_

-53. Two of the most promising and near-term applications of digital twins in the nuclear industry are to help the transition (1) from hands-on to autonomous control and (2) from planned to predictive, risk-informed maintenance. Autonomous control is a significant cost-saving and safety feature that will enable new microreactors to come online more readily and quickly meet regulatory standards. A DT could predict that a certain component will fail before it actually does, so an unexpected outage can be prevented. This also allows scientists to optimize the reactors' maintenance schedules and tasks.

_

-54. There is a need to build faster and more efficient communication interfaces such as 5G to enable real-time data connectivity and operational efficiency for the DT. On the other hand, 5G deployment can be accelerated with the network digital twin as network digital twins create a replica of 5G networks, devices and user activity to validate 5G network functions in conditions that closely mimic real-world environments.

_

-55. Once the real-world version is in place, the digital twin will likely need to be updated to reflect any changes made in the real-world that weren’t captured by the digital version. Otherwise, the digital twin becomes an inaccurate representation of the real-world situation, and using it for future updates, modifications, or maintenance could lead to wasted effort, time, and cost.
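
A minimal sketch of such reconciliation: diff the twin's recorded state against fresh field data and report any drift. The attribute names and values are hypothetical.

    # Report attributes where the twin no longer matches reality.

    def find_drift(twin_state, field_state):
        return {k: (twin_state.get(k), v)
                for k, v in field_state.items()
                if twin_state.get(k) != v}

    twin = {"pump_model": "PX-200", "impeller": "original", "firmware": "1.2"}
    field = {"pump_model": "PX-200", "impeller": "replaced-2023", "firmware": "1.4"}
    for attr, (old, new) in find_drift(twin, field).items():
        print(f"Twin out of date: {attr} is '{new}', twin says '{old}'")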

_

-56. One of the biggest challenges DT must overcome to reach its full potential is the high cost of implementation. Developing ultra-high-fidelity computer models and simulating their processes to create a DT is a time-consuming and labour-intensive exercise that also requires a huge amount of computational power to run, making DT an expensive investment. A paper by West and Blackburn gives a glimpse of the scale of cost and time involved: the authors claim that it could cost trillions of dollars and take hundreds of years to completely implement digital threads/digital twins of weapons systems for the U.S. Air Force, making it impractical to fully realize the technology. It is therefore crucial for industries to perform a cost-benefit analysis before implementing DT. Because DT implementations are so expensive, their accessibility is limited in developing countries.

_

-57. An important concern with DT technology relates to products with long life cycles, such as buildings, aircraft, ships, machinery, or even cities. The life cycles of such products far exceed the validity of the software used to design or simulate the DT, and of the software used to store and analyze its data. This creates a high risk that, at some point in the future, the formats used by the software will become obsolete, or the owner will be locked in to a single vendor for new versions of the software or other authoring tools.

_

-58. Despite all promises and even proven examples of success, digital twins still don’t see wide adoption. To some extent, the complexity of their creation is to blame. Another reason is a scarcity of industry standards that restrains communication and data exchange across platforms and apps from different vendors.

_

-59. Bad data that is inconsistent, inaccurate, or incomplete cannot be trusted as a source for creating simulations. It is a risk, and one that needs addressing urgently to avoid wasted budgets and poor decisions. Don't allow bad data to ruin your digital twin plans. For digital twins, poor data quality is the Achilles heel. Imagine the underlying machine learning and AI algorithms being designed using poor-quality data, and then running within digital twins built from the same poor-quality data. The problem is compounded, leading to inaccurate anomaly alerts and predictions that cannot be trusted. In short, a waste of time and money from what is otherwise an excellent technology for saving time and money.
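
A minimal sketch of guarding a twin against bad data: validate each record for completeness, plausibility, and ordering before it is allowed to update the model. The field names and limits are assumptions.

    # Reject incomplete, implausible, or out-of-order records.

    REQUIRED = {"sensor_id", "timestamp", "temp_c"}

    def is_valid(record, prev_timestamp=None):
        if not REQUIRED <= record.keys():               # completeness
            return False
        if not -40.0 <= record["temp_c"] <= 150.0:      # plausibility
            return False
        if prev_timestamp is not None and record["timestamp"] <= prev_timestamp:
            return False                                # ordering
        return True

    good = {"sensor_id": "s1", "timestamp": 100, "temp_c": 21.5}
    bad = {"sensor_id": "s1", "timestamp": 99, "temp_c": 999.0}
    print(is_valid(good), is_valid(bad, prev_timestamp=100))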

_

-60. At the most advanced level, a digital twin could be something far more complex and multi-layered, incorporating virtual projections of almost anything. This is the scale of digital twin deployed to inform decisions and test solutions to population growth, congestion, climate change, you name it. A model of this complexity would be very challenging for a contractual framework to govern. Issues such as data ownership, causation and liability could all potentially be unclear, difficult to unravel, and contentious. The range of potential legal issues will no doubt expand as the use of digital twins evolves.

_

-61. Most digital twins exist in public clouds, for the obvious reason that they are much cheaper to run there and can draw on the cloud's storage and processing, as well as special services such as AI and analytics, to support the twin. Moreover, the cloud offers purpose-built services for creating and running twins. The ease of building, provisioning, and deploying twins has led to cases where the digital twin becomes the evil twin and does more harm than good. Attaching simulations to real devices, machines, and even humans leaves a great deal of room for error. Most of these errors can be traced back to people creating twins for the first time, who don't find their mistakes until shortly after deployment. The problem is that you could crash an airplane, scare the hell out of a patient, or have a factory robot catch fire. DT technology could also be used by adversaries for ransomware, phishing, and even cyber warfare.

_

-62. Digital twin and cybersecurity can be portrayed in two segments:

(1) Security of the digital twin and its physical twin

(2) Use of the digital twin to help cybersecurity

(1) The security of digital twins needs to be considered from two sides: the security of the digital twin and the system in which it is implemented, and the security of the real-world object, system, or process it represents. Because digital twins take in vast amounts of data from systems across an organization and its partners, cybersecurity is a serious concern. Given the multitude of data collected by digital twins, a breach can prove catastrophic for an organization unless it is well prepared and addresses all security concerns beforehand. Because communication between digital twins and the systems they interact with is bi-directional, hackers who gain control of a digital twin can wreak havoc, taking control of real-world systems, manipulating or stealing sensitive data, and/or introducing malware that can spread to other systems. The privacy and security risks associated with digital twin deployment are manifold: if hackers gain access, the most significant risks are loss of intellectual property and data, loss of control, and potentially huge costs resulting from downtime, faulty outputs, or data being held to ransom.

Blockchain provides a decentralised solution for protecting data security and privacy during data sharing. Blockchain technology offers security against hackers thanks to its encryption features and provides transparency of data history. Digital twins will benefit from these features, as they can transmit data securely.
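
A bare-bones sketch of the tamper-evidence idea behind that point: chain each telemetry record to the hash of the record before it, so any later alteration of history is detectable. This is a simple hash chain, not a full distributed ledger.

    # Append-only, tamper-evident log of twin telemetry.
    import hashlib, json

    def add_record(chain, payload):
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        body = {"payload": payload, "prev": prev_hash}
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        chain.append(body)

    def verify(chain):
        prev = "0" * 64
        for rec in chain:
            expected = hashlib.sha256(json.dumps(
                {"payload": rec["payload"], "prev": rec["prev"]},
                sort_keys=True).encode()).hexdigest()
            if rec["prev"] != prev or rec["hash"] != expected:
                return False
            prev = rec["hash"]
        return True

    chain = []
    add_record(chain, {"temp_c": 21.5})
    add_record(chain, {"temp_c": 21.7})
    chain[0]["payload"]["temp_c"] = 99.0   # tamper with history
    print(verify(chain))                    # False: tampering detected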

(2) Digital twins can prevent cyber-attacks by providing modelling and prediction capability, or through increased visibility of the behaviour of their physical counterparts. By simulating risks in an environment that mirrors the real world, companies can better predict where hackers will strike, how an attack may be carried out, and how damaging it will be. This information is essential for organizations to anticipate cyber-attacks, manage the risks efficiently, and respond with effective methods. Digital twins can also help improve existing security approaches.

_

-63. The main differences between a digital twin (DT) and a cognitive digital twin (CDT) are structural complexity and cognitive capability. A cognitive entity is one that can process thought, acquire knowledge, and use it to solve problems by itself. In other words, it has a developed set of algorithms that allow it to acquire, process, and store information in order to think and make decisions. An artificial intelligence with this ability will be able to solve problems, simulate the future by itself using information acquired from previous data, and control its physical twin using this cognitive function. Cognitive digital twins learn by themselves, foresee the future, and act accordingly. The range of benefits humans could receive by creating cognitive digital twins might change the world of business.

_

-64. It is highly implausible that digital twin technology will ever be able to model all the processes in a living cell, let alone a human.

______

______

Dr. Rajiv Desai. MD.

March 12, 2023

_____

Postscript:

In a lighter vein, if I could construct digital twins of my adversaries in India, I could predict when, where, and how they would persecute me, since digital twins offer us access to new knowledge.

_____ 
