Why composability is key to scaling digital twins



Digital twins enable enterprises to model and simulate buildings, products, manufacturing lines, facilities and processes. This can improve performance, quickly flag quality errors and support better decision-making. Today, however, most digital twin projects are one-off efforts. A team may create a digital twin for a new gearbox, then start all over when modeling the wind turbine that contains that gearbox or the business process that repairs it.

Ideally, engineers would like to quickly assemble more complex digital twins to represent turbines, wind farms, power grids and energy businesses. This is complicated by the different components that go into digital twins beyond the physical models, such as data management, semantic labels, security and the user interface (UI). New approaches for composing digital elements into larger assemblies and models could help simplify this process. 
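To make the idea concrete, here is a minimal sketch of what such composition could look like in code. Everything in it is hypothetical rather than any vendor’s API: a composite twin aggregates component twins, so the gearbox model is reused inside the turbine rather than rebuilt, and cross-cutting concerns such as data feeds travel with the assembly.

    from dataclasses import dataclass, field

    @dataclass
    class DigitalTwin:
        """Hypothetical digital twin: a physics model plus supporting services."""
        name: str
        physics_model: str                                       # simulation reference
        data_sources: list[str] = field(default_factory=list)    # telemetry feeds
        semantic_labels: dict[str, str] = field(default_factory=dict)
        children: list["DigitalTwin"] = field(default_factory=list)

        def compose(self, part: "DigitalTwin") -> None:
            # Reuse an existing twin as a component instead of remodeling it.
            self.children.append(part)

        def all_data_sources(self) -> list[str]:
            # Aggregate telemetry requirements across the whole assembly.
            sources = list(self.data_sources)
            for child in self.children:
                sources.extend(child.all_data_sources())
            return sources

    # Reuse the gearbox twin inside a turbine, and the turbine inside a wind farm.
    gearbox = DigitalTwin("gearbox", "gear_mesh_model", ["vibration_sensor_feed"])
    turbine = DigitalTwin("turbine", "rotor_dynamics_model", ["scada_feed"])
    turbine.compose(gearbox)
    wind_farm = DigitalTwin("wind_farm", "farm_flow_model")
    wind_farm.compose(turbine)
    print(wind_farm.all_data_sources())  # ['scada_feed', 'vibration_sensor_feed']

The hard part in practice is that each component also carries its own security, semantics and UI expectations, which is exactly the integration burden the approaches below try to standardize.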

Gartner has predicted that the digital twin market will cross the chasm in 2026 to reach $183 billion by 2031, with composite digital twins presenting the largest opportunity. It recommends that product leaders build ecosystems and libraries of prebuilt functions and vertical market templates to drive competitiveness in the digital twin market. The industry is starting to take note.

The Digital Twin Consortium recently released the Capabilities Periodic Table (CPT), a framework to help organizations develop composable digital twins. It organizes the landscape of supporting technologies to help teams create the foundation for integrating individual digital twins. 


A new kind of model

Significant similarities and differences exist between the modeling used to build digital twins and other analytics and artificial intelligence (AI) models. All of these efforts start with appropriate, timely historical data to inform the model design and to calibrate model results against the current state.

However, digital twin simulations are unique compared to traditional statistical learning approaches in that the model structures are not directly learned from the data, Bret Greenstein, data, analytics and AI partner at PwC, told VentureBeat. Instead, a model structure is surfaced by modelers through interviews, research and design sessions with domain experts to align with the strategic or operational questions that are defined upfront.

As a result, domain experts need to be involved in informing and validating the model structure. This time investment can limit the scope of simulations to applications where ongoing scenario analysis is required. Greenstein also finds that developing a digital twin model is an ongoing exercise. Model granularity and systems boundaries must be carefully considered and defined to balance time investment and model appropriateness to the questions they are intended to support. 
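The distinction is easier to see in a toy example. In the sketch below, which is hypothetical and far simpler than a production simulation, the model structure, a repair backlog worked off by a crew, is specified by hand from domain knowledge; only the two rate parameters would be calibrated from historical data.

    import random

    # Model structure chosen with domain experts: failed units arrive, wait in
    # a backlog, and a repair crew works them off. The structure is not learned
    # from data; only the two rates below are calibrated against historical logs.
    ARRIVALS_PER_DAY = 0.8    # hypothetical value calibrated from failure logs
    REPAIRS_PER_DAY = 0.67    # hypothetical value calibrated from work orders

    def simulate_backlog(days: int, seed: int = 0) -> float:
        """Average repair backlog over the horizon, for what-if scenario runs."""
        rng = random.Random(seed)
        backlog, total = 0.0, 0.0
        for _ in range(days):
            backlog += rng.expovariate(1.0) * ARRIVALS_PER_DAY  # stochastic arrivals
            backlog = max(0.0, backlog - REPAIRS_PER_DAY)       # crew works off units
            total += backlog
        return total / days

    # Scenario analysis: how does the backlog respond if failures rise 25%?
    baseline = simulate_backlog(365)
    ARRIVALS_PER_DAY *= 1.25
    stressed = simulate_backlog(365)
    print(f"baseline backlog: {baseline:.2f}, stressed: {stressed:.2f}")

A statistical learning model, by contrast, would infer the relationship between failures and backlog directly from data, without this hand-built structure.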

“If organizations are not able to effectively draw boundaries around the details that a simulation model captures, ROI will be extremely difficult to achieve,” Greenstein said.

For example, an organization may create a network digital twin at the millisecond timescale to model network resiliency and capacity. It may also have a customer adoption model to understand demand at the scale of months. This exploration of customer demand and usage behavior at a macro level can serve as input into a micro simulation of the network infrastructure. 
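Wired together, that macro-to-micro handoff might look something like the following sketch, in which every model, name and number is invented for illustration: the monthly adoption model produces a demand figure that parameterizes the millisecond-scale network simulation, collapsed here to a single service-level calculation.

    def monthly_adoption(month: int, market_size: int = 100_000,
                         growth: float = 0.25) -> int:
        """Macro model (monthly timescale): logistic-style customer adoption."""
        return int(market_size / (1 + (market_size - 1) * (1 + growth) ** -month))

    def peak_concurrent_sessions(subscribers: int, peak_ratio: float = 0.15) -> int:
        """Bridge between timescales: turn a subscriber count into peak load."""
        return int(subscribers * peak_ratio)

    def network_service_level(sessions: int, capacity: int = 12_000) -> float:
        """Micro model stand-in (normally a millisecond-scale simulation):
        fraction of peak sessions served without degradation."""
        return min(1.0, capacity / max(1, sessions))

    # The macro model's output becomes the micro simulation's input.
    for month in (12, 36, 60):
        subscribers = monthly_adoption(month)
        load = peak_concurrent_sessions(subscribers)
        print(month, subscribers, f"service level: {network_service_level(load):.0%}")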

Composable digital twins

This is where the DTC’s new CPT framework comes in. Pieter van Schalkwyk, CEO at XMPRO and co-chair of the Natural Resources Work Group at the Digital Twin Consortium, said the CPT provides a common approach for multidisciplinary teams to collaborate earlier in the development cycle. A key element is a reference framework built around six capability categories: data services, integration, intelligence, UX, management and trustworthiness.

This can help enterprises identify composability gaps they need to address in-house or from external tools. The framework also helps to identify specific integrations at a capabilities level. The result is that organizations can think about building a portfolio of reusable capabilities. This reduces duplication of services and effort.
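That gap analysis can be as simple as a set difference over a shared capability vocabulary. The sketch below uses the CPT’s six category names, but the specific capabilities and the in-house portfolio are invented for the example:

    # The six CPT capability categories, used as a shared vocabulary.
    CPT_CATEGORIES = {"data services", "integration", "intelligence",
                      "ux", "management", "trustworthiness"}

    # Hypothetical capabilities a new use case requires, tagged by CPT category.
    required = {
        ("data services", "time-series ingestion"),
        ("intelligence", "anomaly detection"),
        ("trustworthiness", "role-based access control"),
        ("ux", "3d visualization"),
    }

    # Hypothetical portfolio of reusable capabilities the organization already has.
    portfolio = {
        ("data services", "time-series ingestion"),
        ("intelligence", "anomaly detection"),
    }

    gaps = required - portfolio
    for category, capability in sorted(gaps):
        print(f"gap in {category}: source '{capability}' in-house or externally")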

This approach goes beyond how engineers currently integrate multiple components into larger structures in computer-aided design tools. Van Schalkwyk said, “Design tools enable engineering teams to combine models such as CAD, 3D and BIM into design assemblies but are not typically suited to instantiating multi-use-case digital twins and synchronizing data at a required twinning rate.”

Packaging capabilities

In contrast, a composable digital twin draws from six clusters of capabilities that help manage the integrated model and other digital twin instances based on the model. It can also combine IoT and other data services to provide an up-to-date representation of the entity the digital twin represents. The CPT represents these different capabilities as a periodic table to make it agnostic to any particular technology or architecture. 

“The objective is to describe a business requirement or a use case in capability terms only,” van Schalkwyk explained. 

Describing the digital twin in terms of capabilities helps match a specific implementation to the technologies that provide the appropriate capability. This mirrors the broader industry trend towards composable business applications. This approach allows different roles, such as engineers, scientists and other subject-matter experts, to compose and recompose digital twins for different business requirements. 

It also creates an opportunity for new packaged business capabilities that could be used across industries. For example, a “leak detection” packaged business capability could combine data integration and engineering analytics into a reusable component that fits a multitude of digital twin use cases, van Schalkwyk explained. It could be used in digital twins for oil & gas, process manufacturing, mining, agriculture and water utilities.
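As a sketch of the idea, such a packaged capability might hide the analytics behind one interface and accept an industry-specific data adapter, so the same component drops into a pipeline, plant or water-network twin. All names and numbers below are hypothetical:

    from typing import Callable, Iterable

    class LeakDetection:
        """Hypothetical packaged business capability: data integration plus
        engineering analytics behind a single reusable interface."""

        def __init__(self, read_flow: Callable[[], Iterable[tuple[float, float]]],
                     loss_threshold: float = 0.05):
            self.read_flow = read_flow            # industry-specific data adapter
            self.loss_threshold = loss_threshold  # acceptable inflow/outflow loss

        def check(self) -> list[int]:
            """Flag segments where outflow lags inflow beyond the threshold."""
            alerts = []
            for segment, (inflow, outflow) in enumerate(self.read_flow()):
                if inflow > 0 and (inflow - outflow) / inflow > self.loss_threshold:
                    alerts.append(segment)
            return alerts

    # The same capability plugs into different twins via different adapters.
    water_utility_adapter = lambda: [(100.0, 99.0), (80.0, 70.0), (50.0, 49.5)]
    detector = LeakDetection(water_utility_adapter)
    print(detector.check())  # [1] -> segment 1 loses 12.5% of its inflow

Only the adapter changes across industries; the engineering analytics, a simple mass-balance check in this toy version, stays the same.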

Composability challenges

Alisha Mittal, practice director at Everest Group, said, “Many digital twin projects today are in pilot stages or are focused on very singular assets or processes.”

Everest research has found that only about 15% of enterprises have successfully implemented digital twins across multiple entities. 

“While digital twins offer immense potential for operational efficiency and cost reduction, the key reason for this sluggish scaled adoption is the composability challenges,” Mittal said. 

Engineers struggle to integrate the different ways equipment and sensors collect, process and format data. This complexity gets further compounded due to the lack of common standards and reference frameworks to enable easy data exchange. 

Suseel Menon, senior analyst at Everest Group, said the critical challenges the firm has heard from companies trying to scale digital twins include:

  • Nascent data landscape: Getting data architectures and data flows in order is often one of the biggest hurdles to clear before scaling digital twins to a factory or enterprise level.
  • System complexity: It is rare for any two physical assets within a large operation to be exactly alike, which complicates integration and scalability. 
  • Talent availability: Enterprises struggle to find talent with the appropriate engineering and IT skills. 
  • Limited verticalization in off-the-shelf platforms and solutions: Solutions that work for assets or processes in one industry may not work in another. 

Threading the pieces together

Van Schalkwyk said the next step is to develop a second layer of the composability framework with more granular capability descriptions. A separate effort on a ‘digital-twin-capabilities-as-a-service’ model will explore how digital twin capabilities could be described and provisioned in a zero-touch fashion from a capabilities marketplace. 
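The consortium has not yet published that model, so any code here is speculative, but zero-touch provisioning could plausibly reduce to resolving a declarative manifest against a marketplace catalog, roughly like this sketch:

    # Hypothetical marketplace catalog mapping capability names to providers.
    CATALOG = {
        "time-series ingestion": "vendor-a/ingest:2.1",
        "anomaly detection": "vendor-b/anomaly:1.4",
        "3d visualization": "vendor-c/viz3d:0.9",
    }

    # A twin declares what it needs in capability terms only.
    manifest = {
        "twin": "pump-station-7",
        "capabilities": ["time-series ingestion", "anomaly detection"],
    }

    def provision(manifest: dict) -> dict:
        """Resolve each declared capability to a concrete marketplace offering."""
        resolved = {}
        for capability in manifest["capabilities"]:
            if capability not in CATALOG:
                raise LookupError(f"no marketplace provider for '{capability}'")
            resolved[capability] = CATALOG[capability]
        return resolved

    print(provision(manifest))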

Eventually, these efforts could also lay the foundation for digital threads that help connect processes that span multiple digital twins. 

“In the near future, we believe a digital thread-centric approach will take center stage to enable integration both at a data platform silo level as well as the organizational level,” Mittal said. “DataOps-as-a-service for data transformation, harmonization and integration across platforms will be a critical capability to enable composable and scalable digital twin initiatives.”
