In the age of Industry 4.0, managing asset health increasingly requires managing digital health to ensure that prescriptive recommendations are unbiased and verifiable
by Jim Stuart, Lloyd’s Register
21 October 2019
Recently, at Maximo World, I delivered a presentation to more than 1,000 asset management professionals on digital transformation. Specifically, I discussed things to consider when adding advanced capabilities such as digital twins, machine learning, and artificial intelligence to a comprehensive asset performance management (APM) program in the pursuit of enhancing failure elimination, identifying and mitigating risk, and improving safety.
In this new digital world, advanced methods are being used to better understand asset health in real time, predict failures, and (using AI) prescribe required corrective actions. By using new risk and reliability methodologies, failure data libraries, modeling tools, and advanced analytics to process vast amounts of inspection and maintenance data, modern facilities are obtaining actionable insights that pinpoint risk and enhance asset and plant performance and reliability.
At the end of the day, it’s about results. Through industry collaborations with their digital partners, operators I have observed have realized gains of up to 20% in overall production using these technologies, along with failure risk reduced by 80% and cost savings of up to 50%. More importantly, all of these projects across multiple industries continually add to the ever-growing knowledge base and library of failure and risk data.
So what’s next? How do we move our organizations to the next level? The next frontier of failure elimination is maturing your asset management program from the advanced analytics of predictive APM methodologies to the artificial intelligence/Industry 4.0 world of prescriptive recommendations. But the speed of change in the digital age has brought with it a new set of challenges for business leaders: namely, how to ensure that AI recommendations and model simulations aren’t biased and that the results are verifiable.
Governance requirements and decision assurance in the digital age
The concept of digital twins is predicated on a real-time data connection; without this connection, digital twin technology would not exist. That connectivity comes from sensors on the physical asset that capture data and communicate it back to the system. Digital twin technology depends entirely on monitoring the physical twin and how the environment and people interact with it – in other words, the twin can be a faithful reflection of its asset from the moment it is built, but only if the data feeding it has gone through a diligent integrity validation process.
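What that validation step might look like in practice can be sketched in a few lines. The example below is a minimal, hypothetical illustration, not any vendor's actual implementation: every sensor reading is screened for missing values, physically impossible values, and staleness before the twin is allowed to consume it. All names (`Reading`, `validate_reading`, the `PT-101` tag) are invented for illustration.

```python
from dataclasses import dataclass
import math
import time

@dataclass
class Reading:
    """One timestamped measurement from a field sensor (hypothetical schema)."""
    sensor_id: str
    value: float
    timestamp: float  # Unix seconds

def validate_reading(r: Reading, lo: float, hi: float,
                     now: float, max_age_s: float = 60.0) -> list[str]:
    """Return a list of integrity issues; an empty list means the
    reading may be passed on to the digital twin."""
    issues = []
    if math.isnan(r.value):
        issues.append("missing value (NaN)")
    elif not (lo <= r.value <= hi):
        # Range check against the sensor's physical limits
        issues.append(f"out of physical range [{lo}, {hi}]")
    if now - r.timestamp > max_age_s:
        # Staleness check: a silent sensor is as dangerous as a wrong one
        issues.append("stale reading (possible lost connection)")
    return issues

if __name__ == "__main__":
    now = time.time()
    bad = Reading("PT-101", 150.0, now - 120)  # out of range and stale
    print(validate_reading(bad, 0.0, 100.0, now))
```

In a real deployment these checks would be one layer of a broader data-governance protocol, with rejected readings logged and escalated rather than silently dropped.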
New governance requirements are emerging around data integrity and decision validation with these new technologies. As artificial intelligence, machine learning, and digital twin modeling are introduced in Industry 4.0 applications, we now must have a process for assuring that these advanced technologies, and their technical integrity, deliver the right answers. This is not a tomorrow concern. Digital governance is a current and pressing problem that demands immediate and substantive effort.
Operating a plant, especially in the oil and gas and chemical processing industries, is a dynamic and continuous endeavor in which operating conditions continually change. For a digital twin model to be a true reflection of the physical asset, an entirely new set of processes is required, with each new process delivering new data, insights, and actions.
Asset health requires digital health management
All of these new activities require validation to confirm the accuracy of the models, and this new area of governance is earning a name: digital health management, or DHM. The term encompasses all of the digital technologies and systems used to gather data and insights on an asset’s health, including digital twin technology.
Furthermore, a digital twin can be defined as a “multiphysics, data-driven representation of a physical asset, often residing in a cloud-based environment using data streamed from the physical asset,” with applications ranging from design and operation to autonomy. In other words, a digital twin is a dynamic digital representation of a physical piece of equipment or asset. This understanding of the digital landscape and its complexity drives the requirement for a comprehensive governance program to assure accurate output.
On the shop floor, the result is that operators improve aspects of their operational performance and maintenance regimes through insights generated by the twins as part of DHM. Key elements of DHM are ensuring digital-model accuracy and answering questions critical to the success of any digital twin initiative:
• What standards are applied, and what is the human role, if any, in the validation of models and digital twins?
• Do the digital simulation, data structure, and technology take into consideration safety and license-to-operate standards?
• How is sensitive company data used for digital twins kept secure?
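The model-accuracy question in the list above lends itself to a simple automated guardrail. The sketch below is a hypothetical illustration of one common approach (it is not a prescribed DHM standard): periodically compare the twin's predictions against field measurements and flag the model for human revalidation when the error exceeds an agreed tolerance. The function name and the 5% default are assumptions for the example.

```python
def twin_drift_alarm(predicted: list[float], measured: list[float],
                     tolerance_pct: float = 5.0) -> bool:
    """Flag the digital twin for revalidation when the mean absolute
    percentage error (MAPE) between its predictions and field
    measurements exceeds the agreed tolerance."""
    if len(predicted) != len(measured) or not measured:
        raise ValueError("series must be non-empty and of equal length")
    mape = 100.0 * sum(abs(p - m) / abs(m)
                       for p, m in zip(predicted, measured)) / len(measured)
    return mape > tolerance_pct

if __name__ == "__main__":
    # Twin tracking within tolerance -> no alarm
    print(twin_drift_alarm([100.0, 102.0], [100.0, 100.0]))  # 1% MAPE
    # Twin has drifted from the physical asset -> alarm, revalidate
    print(twin_drift_alarm([100.0, 120.0], [100.0, 100.0]))  # 10% MAPE
```

Whether the response to an alarm is automated retraining or mandatory human sign-off is exactly the kind of decision a DHM governance program must make explicit.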
Alarmingly, in most facilities the data security question isn’t getting the focus its risk ranking demands. Best-in-class facilities take this risk seriously and dedicate significant resources to addressing vital digital security questions. Beyond security, operators face challenges in ensuring accurate data streams from sensors and IIoT devices, as well as accuracy from remote inspection technologies. The unintended consequences of a digital program without a rigorous governance protocol in place to ensure digital integrity could be bad decisions, lost profits, and, in the worst case, a catastrophic event.
Beyond the fear of Skynet
If you’re a fan of the Terminator franchise, you know that August 29 was the day that Skynet became self-aware. Could this happen in real life? Could artificial intelligence technology become self-aware and determine that humans are a threat? Lucky for us, technology isn’t this advanced yet, and Skynet isn’t real. While we can poke fun at AI becoming “self-aware” like in a science-fiction movie, the reality is that a digital governance program provides assurance that your digital program has the procedures and protocols in place to deliver what it was designed to do.
It’s important that digital solutions be well balanced between the technical solution and its business application. Practitioners and their partners also need to understand the challenges around data sharing and data ethics in this collaborative digital ecosystem and address the value of the data and its contributions. In this way, owner-operators embarking on their digitalization journey will be better able to realize new value and, more importantly, build confidence in these technologies so they can be trusted to support better, more-informed decisions safely.
Jim Stuart is SVP digital and software for Lloyd’s Register (www.lr.org) and has been leading technology companies for more than 20 years. His experience covers software product innovation, digital strategy, business transformation and operational excellence. He co-leads LR’s digital products and software business unit, which focuses on helping asset-intensive industries realize productivity improvements and reduce operating costs while maintaining asset health, reliability, and uptime.