Digital twin technology is one of the leading industry trends of 2021 so far. Digital Bulletin speaks with Philip Farah, AVP Head of Digital Transformation Services, Global Accounts at World Wide Technology, to find out about the evolution of digital twins and their future role in enterprise digitalisation.
Digital twins are big business right now – what is driving this trend?
The idea of modelling the physical world has been around for centuries. This is what applied mathematics is about. With the emergence of electronics, scientists started applying various wave functions to an electronic black box to derive the architecture and composition of electronic components inside the box. In 1982 Richard Feynman proposed the idea of simulating how atoms ‘physically interact’ using quantum computers.
Today, advances in computing, data management, machine learning, automation and IoT (sensors connecting the physical and virtual fabrics) are coming together to enable us to replicate physical reality in the virtual world. The driver behind this trend is the omnipresent human thirst to accelerate and de-risk – one can run many more simulations at little to no risk in the virtual world.
How much has the emergence of hyper-automation furthered digital twin technology?
Significantly. Automation is the result of adding a software layer (and intelligence) on top of physical items (cars, networks, cities, etc.). The problem with automation is that the productivity savings it enables come at a risk. The 'nuclear option' is the scenario in which an automation engine brings an entire infrastructure down at large scale because of a bug or an unforeseen reinforcement loop.
For example, think about trading bots creating flash crashes in the capital markets. As such, the need to test automation through extensive simulation, at a scale never required before, is driving demand for virtual environments designed with that purpose in mind.
How has WWT seen digital twin technology evolving over the last 12-24 months?
For the longest time, the financial services industry has used simulation to try to model market changes, credit risk or business risk. Five years ago the concept of omnichannel came to life: it meant bringing physical and virtual interactions together – for instance, by recognising a customer within a bank branch (based on their mobile phone or facial recognition) and showing relevant content on digital displays based on that customer's prior interactions online or via mobile.
More recently advances in IoT, cloud computing and AI have contributed to taking the digital twin concept to the next level and 2021 could be the inflection point.
To what extent is 2021 the year it moves beyond early adopters and becomes a must-have?
Today, AI/machine learning and IoT are coming together to take these concepts to new heights, with the ability to replicate the physical world and simulate multiple scenarios and responses from financial services organisations.
We see this trend across conventional physical assets (e.g., connected cars and homes in insurance), technology assets (e.g., IT infrastructure, with traffic generators in networking, and security, with simulated cyber attacks on physical IT infrastructure), mobile, wearables, and even humans – i.e., creating a digital twin of a person, which can be used for credit decisions or to train conversational bots.
How can digital twins be used to accelerate and augment decision-making?
The recipe for success is to connect data harvesting to AI/machine learning to derive insights, and then generate a response based on those insights (whether acted upon by humans or automated).
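That harvest-insight-response loop can be sketched in a few lines of Python. This is a minimal, hypothetical illustration – the sensor readings, the threshold "model" and the maintenance action are all invented for the example, not drawn from any WWT implementation:

```python
from statistics import mean

# Hypothetical sensor feed harvested from a physical asset
# (e.g., vibration readings from a machine or vehicle part).
readings = [0.31, 0.35, 0.33, 0.82, 0.88, 0.85]

def derive_insight(window, threshold=0.5):
    """Toy stand-in for an ML model: flag an anomaly when the
    average of the most recent readings exceeds a threshold."""
    return mean(window[-3:]) > threshold

def generate_response(anomalous):
    """Turn the insight into a response - here a recommendation a
    human could act on, or that an automation engine could execute."""
    return "schedule maintenance" if anomalous else "no action"

insight = derive_insight(readings)
action = generate_response(insight)
print(action)  # -> schedule maintenance
```

In practice the threshold check would be replaced by a trained model and the string by an automated workflow, but the shape of the pipeline – data in, insight derived, response generated – stays the same.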
Examples in the insurance industry include predictive and prescriptive maintenance on your car, your building (and, in the future, possibly your body). Within IT infrastructure we're seeing the same opportunity in AIOps. WWT is engaged with clients in understanding and modelling key IT infrastructure events and potential responses to them, all while helping those same clients accelerate their infrastructure automation capabilities. Ultimately these two ideas will merge so that key infrastructure actions are automated.
A few years ago, digital twins of self-driving cars were created in virtual environments to test the vehicle’s AI systems over millions of miles of simulated roads. The learnings from these trials resulted in autonomous vehicles driving on our real-world roads today. Very soon, sensors connected to our infrastructure will enable us to replicate physical environments and people, test multiple scenarios in the virtual world and only replicate the successful ones in the real world.
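The pattern described here – run many virtual trials, then promote only the configurations that prove safe – can be sketched as a simple Monte Carlo loop. Everything below is an assumed toy model (the `simulate` function, the "aggressiveness" parameter and the safety bar are illustrative placeholders, not a real driving simulator):

```python
import random

def simulate(aggressiveness, seed):
    """Toy stand-in for one digital-twin trial: returns True if the
    virtual run completes without incident under the given parameter."""
    rng = random.Random(seed)  # seeded so each trial is reproducible
    # Assumed relationship: more aggressive control -> more incidents.
    incident_prob = 0.05 + 0.4 * aggressiveness
    return rng.random() > incident_prob

candidate_policies = [0.1, 0.3, 0.5, 0.9]
TRIALS = 1_000

# Test every candidate over many simulated runs in the virtual world...
results = {
    p: sum(simulate(p, seed) for seed in range(TRIALS)) / TRIALS
    for p in candidate_policies
}

# ...and only replicate the successful ones in the real world.
approved = [p for p, success_rate in results.items() if success_rate >= 0.9]
print(results, approved)
```

The real systems behind autonomous-vehicle testing are vastly more complex, but the economics are the same: failures are cheap in simulation, so the search for a safe configuration happens there first.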
Recent studies have shown that analysing 300 likes on Facebook enables an AI model to forecast a person's preferences better than their spouse can. It's easy to see how our digital twins will rapidly move from limited imitations to high-resolution replicas. From there, they could even start representing us more effectively than we can ourselves.
How are digital twins being used to advance autonomous vehicles and smart cities?
Modelling cities, roads, trains and vehicles through digital twins is already in full motion. A painful example I lived through this year is the failure of the power generation and distribution grid in Texas: simulating the impact of changing load conditions due to increased population density, weather conditions or cyber attacks is entirely feasible today. Emergency and public services will be key beneficiaries of this technology, as will all the commercial functions that power cities.
How are digital twins being leveraged by enterprises to enable digital/tech transformations?
Use cases vary widely by industry. However, the process and the underlying infrastructure (sensors, AI, data management, skills, etc.) are quite similar, so the current focus is on building the human and technical infrastructure required to take advantage of digital twins and generate competitive advantage.
Compliance is a key consideration for enterprises – can digital twins be used to better oversee compliance issues?
Compliance is closely linked to risk management, so simulation and digital twins are key enablers of improved scenario analysis and risk management policies and strategies. Digital twins have the potential to ensure organisations are more resilient to technical (including cybersecurity), nature-made or man-made risks.
What work is there to do in terms of interoperability, integration and industry standards when it comes to digital twins?
Interoperability will only become more critical as we go forward – system-level optimisation across domains depends on it. I anticipate standardisation will be driven by the major innovators (leaders in gaming, transportation, etc.) and the hyperscalers, in a model similar to the one we have seen evolve in AI and automation.
How do you see digital twins developing and evolving over the next three to five years?
As IoT expands, and the ability to automate advances in parallel, the scale and complexity will increase and with it the need to optimise at a system level. This elevates the need for simulation and intelligence – one cannot optimise traffic flow in a city without understanding (and simulating) the impact on related services, from charging station locations to emergency services, road design, and commercial services like lodging, food or entertainment. All of these factors will ultimately impact investment flows and banking.
Our ability to optimise larger interconnected systems is still in its infancy. This is about to change, fuelled by the expansion of the cloud footprint and advances in machine learning. A key technology to monitor carefully on that front is quantum computing.
Is there anything else you’d like to add?
Digital twins, modelling and simulation are often designed to optimise an outcome defined by those who own the model. As a civilisation, we need to be very conscious of the side effects of optimising a limited set of outcomes – for instance, what will happen to the environment if we over-optimise our ability to extract non-replenishable resources? There is a reason why nature (and biology) has so much redundancy built in. We should be very conscious of the limits of over-optimisation, and of the tradeoff between reducing the frequency of negative events and significantly increasing the severity of those we miss.