Securing The Digital Twin


A guest blog, Securing The Digital Twin, by our CTO Ian Bailey was featured by TechUK as part of their Digital Twin campaign.

Digital Twins, by their nature, bring together data from many sources. The data varies in importance: some of it is fast-changing, low-value data such as IoT sensor readings, and some is slow-moving, high-value data such as engineering designs, systems connectivity information, and operating parameters. The requirements to protect that data are just as varied, from highly confidential proprietary information right down to widely available public-domain information. Digital twins are expensive and time-consuming to build, and to get a return on that investment they need to be widely used. So how do we maximise the amount of information that can be shared while restricting who can see the confidential information? This can be quite a challenging problem in a Digital Twin where all the data has been tightly integrated into a single, complex model.

The first and most obvious thing to ensure is that the architecture of the digital twin is secure by design; NCSC's zero trust principles are a great place to start. Network security and policy are just a part of the solution though: how we manage access controls on data is where digital twins present the biggest challenge. It's easy to build a strict access control system, but then no one gets to see the data. Equally, it's easy to build a wide-open system where everyone can see everything. Neither approach is acceptable in most scenarios, doubly so in the case of digital twins. Because digital twins integrate data from multiple sources, often at very fine granularity, the challenge of selective data access is about as hard as it gets: it is vital to record where each data element originated and how the data owner requires it to be protected.
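To make the provenance requirement concrete, one way to carry origin and handling information alongside each data element can be sketched as follows. The field names and values here are illustrative assumptions, not any particular specification:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataElement:
    value: object   # the data itself, e.g. a sensor reading
    source: str     # where the element originated
    handling: str   # the owner's required protection, e.g. "proprietary"

# Every element carries its origin and handling requirement with it,
# so downstream access decisions can be made per element.
reading = DataElement(value=21.5, source="acme-iot-feed", handling="public")
design = DataElement(value="pump-schematic-v3", source="acme-engineering",
                     handling="proprietary")
```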

The UK National Digital Twin (NDT) programme, run by the Department for Business and Trade, is creating a technical and policy framework to facilitate the creation of connected digital twins. Much of the data required to produce a useful digital twin will either be proprietary, often provided by companies that don't want that information falling into the hands of their competitors, or it will be sensitive, meaning it cannot be shared with anyone who does not have a justifiable and verified reason to see it. The NDT integration architecture requires that data is labelled before entering the secure platform. The labels hold metadata, including the required security handling approach. The labels can be applied at a very fine grain, right down to individual fields and relationships. The labels stay with the data wherever it goes in the digital twin platform, and conform to the UK Government Enterprise Data Headers specification. When a user accesses data from the digital twin platform, their attributes (e.g. the company they work for, their nationality, their security clearance, etc.) are compared with the data labels, and only the data they're allowed to see gets through the API. The platform is secure by design, which allows digital twin app developers to work confidently with the APIs and security endpoints knowing that access control is taken care of.
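A minimal sketch of this kind of attribute-based filtering, using a deliberately simplified label model rather than the actual Enterprise Data Headers format. All names, the clearance ordering, and the attribute set are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SecurityLabel:
    classification: str                  # e.g. "OFFICIAL", "SECRET"
    permitted_orgs: frozenset            # organisations allowed to read
    permitted_nationalities: frozenset   # nationalities allowed to read

@dataclass
class UserAttributes:
    organisation: str
    nationality: str
    clearance: int   # higher number = higher clearance (assumed ordering)

CLEARANCE_RANK = {"OFFICIAL": 1, "SECRET": 2}

def can_read(user: UserAttributes, label: SecurityLabel) -> bool:
    """Deny by default: every attribute check must pass."""
    return (
        user.clearance >= CLEARANCE_RANK[label.classification]
        and user.organisation in label.permitted_orgs
        and user.nationality in label.permitted_nationalities
    )

def redact(record: dict, labels: dict, user: UserAttributes) -> dict:
    """Strip any field the user's attributes do not entitle them to see."""
    return {k: v for k, v in record.items() if can_read(user, labels[k])}

# Labels applied per field, at the same fine grain described above.
labels = {
    "location":    SecurityLabel("OFFICIAL", frozenset({"AcmeCo", "GovDept"}),
                                 frozenset({"GBR", "USA"})),
    "design_spec": SecurityLabel("SECRET", frozenset({"AcmeCo"}),
                                 frozenset({"GBR"})),
}
record = {"location": "Site B", "design_spec": "pump-schematic-v3"}
user = UserAttributes(organisation="GovDept", nationality="GBR", clearance=2)
visible = redact(record, labels, user)   # "design_spec" is filtered out
```

The key design point is that the check runs per field, not per record, so a single query can return a partially redacted view rather than failing outright.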

The whole platform is audited for data access: data owners can see the processes their information has gone through, and who has accessed it. All the data is indexed for search, as a knowledge graph, and as a geo index. Each index applies access controls at a fine-grained level to ensure that the user sees all they're allowed to, but no more.
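An audit trail of this kind might look like the following sketch, where every access appends an entry that data owners can later query. The function and field names are hypothetical, and a real platform would persist the log rather than hold it in memory:

```python
import time

# Hypothetical in-memory audit log: one entry per data access.
audit_log = []

def record_access(user_id: str, data_id: str, action: str) -> None:
    """Append one entry describing who accessed which data element."""
    audit_log.append({
        "timestamp": time.time(),
        "user": user_id,
        "data": data_id,
        "action": action,
    })

def accesses_of(data_id: str) -> list:
    """The data-owner view: every recorded access of one data element."""
    return [entry for entry in audit_log if entry["data"] == data_id]

record_access("analyst-7", "pump-schematic-v3", "read")
record_access("analyst-7", "site-b-location", "read")
owner_view = accesses_of("pump-schematic-v3")
```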

Telicent’s CORE platform is being built to meet these security requirements and those of a number of other government customers.  CORE is free, open-source software (Apache 2.0 license) and is due for public release in Q2 2024.