
Digital twins: needs, challenges and understanding

Tue Oct 22, 2019

Find out about our user research, which explores people’s understanding of the concept and definition of digital twins, and gathers views about how a national digital twin might be implemented.

A ‘digital twin’ is a digital model of a physical asset that can be used to simulate impacts, and monitor the physical asset, sometimes in real time. Its functionality can better support the design, operation and maintenance of our built environment, making it safer and more efficient.
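The definition above can be sketched in code. The example below is a hypothetical illustration (the class, names and threshold are our own assumptions, not part of the research): a toy twin of a pump that mirrors incoming sensor readings and flags when an intervention on the physical asset may be needed.

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Hypothetical digital twin of a pump (illustrative only).

    Mirrors sensor readings from the physical asset and flags
    when a maintenance intervention may be needed.
    """
    temperature_limit_c: float = 80.0          # assumed safe operating limit
    readings_c: list = field(default_factory=list)

    def ingest(self, temperature_c: float) -> None:
        """Update the twin's state from a (possibly real-time) sensor reading."""
        self.readings_c.append(temperature_c)

    def needs_maintenance(self) -> bool:
        """A simple condition check that could trigger action on the real pump."""
        return bool(self.readings_c) and self.readings_c[-1] > self.temperature_limit_c
```

In use, the twin would be fed live sensor data (`twin.ingest(85.0)`) and queried (`twin.needs_maintenance()`) to support operation and maintenance decisions; a real twin would of course carry a far richer model than a single threshold.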

As well as the value of digital twins for individual assets, there is a vision for a ‘national digital twin’ (NDT) – an ecosystem of connected digital twins across all sectors and at a national scale.

To support the creation of an NDT, the Digital Framework Task Group (DFTG) – run by the Centre for Digital Built Britain – is leading the development of a ‘digital framework for secure sharing of infrastructure data’. The framework also includes nine principles – the Gemini Principles – to guide the creation of an NDT.

User research

Our R&D project on digital twins aims to understand how digital twins can connect and interact with each other. As a part of the discovery phase, we carried out some user research to better understand:

  1. What’s new about the concept of digital twins? Is there agreement on what a digital twin is?
  2. What pain points arise when implementing digital twins – and what best practices are recommended?
  3. What do people think of the idea of an NDT? What would be necessary for an ecosystem where twins are connected?
  4. How would an NDT be established under the Gemini Principles?

To help us in our research we interviewed eight people working in relevant areas:

  • Four researchers working on projects related to modelling and simulation, in areas such as energy, sensor development, models for live application, and simulation modelling in the built environment.
  • Two representatives from the private sector: one from the engineering sector and one from the nuclear fuel reprocessing industry.
  • Two members of the DFTG, to find out more about their expectations and vision for the creation of an NDT.

The main insights

Here we outline the key insights from our research:

About the concept of digital twins

There is still a need for clarification

Very few people outside those working on digital twins understand the concept and what it represents.

  • There is still a big gap in understanding in the industry around the difference between a model and a digital twin. This identifies the first requirement: to use standard language and to understand what a digital twin is and what it is not.

Cheap sensors and updated function

  • There was also the question of what makes a digital twin new and how it differs from things such as building information modelling (BIM). Most interviewees mentioned BIM, but what makes digital twins new and different from BIM is: updated function; cheap sensors; and cloud computing.

Digital twins need to be used to intervene on the physical asset

  • There are many things similar to digital twins, roughly under the ‘model’ terminology. The consensus we heard is that the specificity of digital twins is in how they are used to affect/intervene on their physical counterpart. This means that digital twins could be used to improve efficiency and inform better decisions in sectors such as building, energy or cities. Other domains, such as weather or policy, can be modelled but are not at this point considered in scope for digital twins.

About the needs and challenges when implementing twins

Digital twins as a way to support better decisions

  • One of the most widely agreed points was that the model used to gain insights from sensor data has to be created in a way that supports decisions, which implies the need for validation and metrics to generate a meaningful model. Along the same lines, practitioners noted that those models need to be maintainable, trackable, reproducible, flexible and as human-centred as possible.
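The validation point can be made concrete with a minimal sketch (the function names, metric choice and tolerance are our own assumptions, not from the interviews): a model is accepted only if its predictions stay within a tolerance of real sensor observations, here measured by root-mean-square error.

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error between model predictions and sensor observations."""
    assert len(predicted) == len(observed) and predicted
    return math.sqrt(
        sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(predicted)
    )

def model_is_valid(predicted, observed, tolerance):
    """Accept the model only if its error against real measurements is within tolerance."""
    return rmse(predicted, observed) <= tolerance
```

A check like this, run routinely as new sensor data arrives, is one simple way to make a twin’s model trackable and reproducible rather than a one-off calibration.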

The challenges of standards, efficient implementation and old systems

  • Interviewees recognised the challenge of creating a common language for different stakeholders with different domain expertise. Other challenges are: whether the digital twin can be built fast enough; how to deal with complex systems; and how to deal with old physical systems.

About the national digital twin and The Gemini Principles

Opportunities and challenges

  • Most interviewees believe an NDT would enable new players to be involved, sharing information and insights and avoiding a single big player entirely taking over the digital twin sector. A key challenge for the DFTG is how to strike the right balance between organic growth with little intervention, and mandatory regulation and standards.
  • This opportunity to implement digital twins across different sectors and share knowledge also creates the challenge of creating and using a standard language and frameworks across sectors.

Given the rate of change in technology, the DFTG expects the creation of an NDT to be a multi-decade journey, where value-based principles are designed to pass from one generation to the next. This highlights the importance of iterating and learning from others, and why, although the ‘commons’ piece of the NDT roadmap is challenging, it is critical.

Interviewees also mentioned things that should be considered when planning an NDT:

  • The stakeholders: Digital twins involve different actors, all of which need to be considered when creating the national digital twin: the modeller, sensor deployers, facility managers, model validators, data scientists, customers and consumers, and suppliers.
  • Translation costs: in response to the different domains and languages – again, a reference to creating standards.
  • Levels of sharing data: for an NDT, it is not about integrating the twins together directly, but more about sharing data and visualisations at the right levels.
  • Data access processes, including the technical aspects
  • Decision-making processes and governance of the NDT
  • Unification of tech across disciplines
  • Regulations – and invested time needed to develop them
  • Is connecting twins always necessary?
  • Finding good examples that others could rely on and learn from
  • Building trust in an NDT

The Gemini Principles and the potential internal tensions

As noted above, the timescale for implementing an NDT is so long that the technology will evolve alongside the NDT’s development. That’s why the DFTG is focusing on creating high-level principles. Practitioners and the DFTG recognise that the Gemini Principles offer a guide to test individual twins against, but at the same time – and as we identified – the Gemini Principles have some internal tensions, making them hard to apply consistently. The practitioners also mentioned some of these tensions:

  • Security and openness: How to create an open model, but also ensure security?
  • Value creation and public good: How to assess whether an NDT is creating value and impact while ensuring it is working for the ‘public good’.
  • Quality: If there are many models from different domains, how would they be validated?

Get in touch

These interviews have helped us dig into the state of digital twins and the expectations that the DFTG has, but there are still questions in terms of how to address those concerns and what technical aspects will be required when networking digital twins.

The next phase of our R&D project will aim to address those questions and offer guidance/recommendations – for example, by working on proposals for vocabulary to be adopted across the practice.

Please get in touch if you want to help us test out this project’s outputs.