
What's the connection with AI & preventative maintenance?

And how is that connection shifting the edge / cloud balance with data analytics?

April 3, 2020

Artificial intelligence. Preventative maintenance.

Two buzzy phrases common to this world of digital transformation. Two concepts that business owners understand, and capitalize on, to varying degrees.

Stratus Technologies' John Fryer

But using them in tandem, with AI powering preventative maintenance, takes things a step further. And according to John Fryer, Stratus Technologies senior director of industry solutions, it's a pairing that is critical to gaining a comprehensive understanding of not just how your machines are running, but how they're going to run.

Smart Industry: How is AI changing our approach to preventative maintenance?

John: A large change in the preventative-maintenance area is in how data is collected and transmitted to the cloud, where most AI/machine-learning models run. Many early implementations assumed that all data, from newly deployed sensors, for example, would be sent directly to the cloud. While this may be acceptable in pilot projects, it can be problematic at scale from cost and bandwidth perspectives (consider remote locations with low or expensive bandwidth) and from a security perspective (lots of new attack surfaces).
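
To make the bandwidth point concrete, here is a minimal sketch of edge-side aggregation before upload; the window size, field names, and the idea of sending only per-window summaries are illustrative assumptions, not a description of any specific product:

```python
import statistics
import time

def downsample(readings, window_s=60):
    """Aggregate raw (timestamp, value) samples into one summary per window,
    so only a fraction of the raw data has to cross a costly link."""
    buckets = {}
    for ts, value in readings:
        buckets.setdefault(int(ts // window_s), []).append(value)
    return [
        {"window_start": k * window_s,
         "mean": statistics.mean(v),
         "min": min(v),
         "max": max(v),
         "count": len(v)}
        for k, v in sorted(buckets.items())
    ]

# Example: five minutes of 1 Hz vibration samples reduced to per-minute records
raw = [(time.time() + i, 0.02 * (i % 7)) for i in range(300)]
summaries = downsample(raw)
print(len(raw), "raw samples ->", len(summaries), "records to upload")
```

Even this simple windowing shrinks the upload volume dramatically, which can matter a great deal on remote or metered links.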

Smart Industry: What new approaches are enabling AI on edge devices?

John: There is an increasing recognition that while AI models need to run in the cloud, the algorithms they generate (particularly for applications like preventative maintenance) need to operate at the edge. We are starting to see collaboration between the edge and the cloud rather than an antagonistic approach. Indeed, the hyper-scalers (Azure, AWS, and Google) are all offering edge strategies and solutions as an adjunct to their cloud offerings. There is also recognition that some real-time analytics applications, for areas such as product-quality feedback and hard-failure prevention, can run at the edge. For those, the round-trip latency to the cloud just won't meet the real-time criteria.
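
A minimal sketch of that split, assuming a scikit-learn model trained on historical data (the "cloud" side) and shipped to the edge, where only inference runs with no round trip; the file name, feature layout, and data are illustrative:

```python
import pickle
import numpy as np
from sklearn.ensemble import IsolationForest

# --- cloud side: train on historical "healthy" data and serialize the model ---
history = np.random.normal(loc=50.0, scale=2.0, size=(10_000, 1))  # e.g. bearing temps (deg C)
model = IsolationForest(random_state=0).fit(history)
with open("bearing_temp_model.pkl", "wb") as f:
    pickle.dump(model, f)

# --- edge side: load the shipped model and score readings locally ---
with open("bearing_temp_model.pkl", "rb") as f:
    edge_model = pickle.load(f)

reading = np.array([[78.4]])                 # latest sensor reading
if edge_model.predict(reading)[0] == -1:     # -1 flags an anomaly
    print("Anomaly detected locally; flag for maintenance before failure")
```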

Many of the associated hardware offerings don't meet the requirements of industrial environments, which demand ruggedization, simplicity of deployment and maintenance, self-protection from failure, physical security and cybersecurity, autonomous operation (supported by the edge platform's own preventative-maintenance capabilities), and automated administration and remote management.

We are also starting to see companies recognize that traditional OT-automation applications deployed on an edge platform need to live alongside more IT-centric analytics applications that connect to the cloud and to machine-learning models. This separation of the OT and IT applications is easily accomplished with virtualization. The IT applications themselves often deploy containers in their virtual machines to host micro-services, which are commonly used for edge analytics and machine-learning/AI algorithms.
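
As a rough illustration of the IT-side pattern, a containerized analytics micro-service might look like the sketch below; the route, payload shape, and threshold are hypothetical, not a particular vendor's API:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
VIBRATION_LIMIT_MM_S = 4.5   # assumed alert threshold, for illustration only

@app.route("/score", methods=["POST"])
def score():
    # Expects a JSON payload such as {"machine": "pump-7", "vibration": 5.1}
    sample = request.get_json()
    anomaly = sample["vibration"] > VIBRATION_LIMIT_MM_S
    return jsonify({"machine": sample["machine"], "anomaly": anomaly})

if __name__ == "__main__":
    # Runs inside its own container on the IT side of the virtualized edge platform
    app.run(host="0.0.0.0", port=8080)
```

The point of the separation is that a service like this can be updated, restarted, or replaced without touching the OT automation applications running in their own virtual machine.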

Smart Industry: Who is leading the charge here?

John: Today it is mainly companies with deep pockets who can afford some level of experimentation. That said, the current environment with COVID-19 is driving many companies to reassess what they will do in the future. Although budgets will be tight for many companies after a period of downtime, there is an increasing realization that edge computing can play an important role in keeping production going with reduced resources and remote monitoring and operation. Preventative maintenance is just one key area for this application. Edge-platform deployments are supplemental and relatively inexpensive compared with major automation upgrades, and the benefits they offer can deliver significant ROI in a short timeframe.

Smart Industry: How does the trend of data recording factor into this?

John: At its most basic level, data recording is the collection of data from sensors connected to machinery used in industrial-automation applications. This data can be recorded via traditional automation and control applications and is often stored in a historian. It can also be recorded directly from sensors deployed on equipment to accurately collect data for analytics purposes.

The data itself is almost always time-series data, collected on a consistent periodic basis, and there are several options for storage: at the edge, in traditional historians within control rooms, in modern hierarchical historians that integrate with edge-computing platforms, within databases hosted by enterprise data centers, or in the cloud.
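
A minimal sketch of the recording step at the edge, using SQLite purely as a stand-in for an edge historian; the table layout and tag names are illustrative assumptions:

```python
import sqlite3
import time

# Local, durable store on the edge device; a forwarder can sync it upstream later.
conn = sqlite3.connect("edge_historian.db")
conn.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, tag TEXT, value REAL)")

def record(tag, value):
    """Store one timestamped sample in the local time-series table."""
    conn.execute("INSERT INTO readings VALUES (?, ?, ?)", (time.time(), tag, value))
    conn.commit()

record("motor_3/bearing_temp_c", 61.7)
record("motor_3/vibration_mm_s", 2.3)
print(conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0], "samples stored locally")
```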

Smart Industry: And how does this enable modern approaches to preventative maintenance?

John: Sensor data has been used for maintenance applications for many years. Historically, it has been used for post-mortem analysis following equipment failure, which has led to the adoption of some form of Asset Performance Management (APM) or Condition-Based Monitoring (CBM). CBM has been around for many years and can serve very well to identify growing performance anomalies through basic trend analysis. The most modern approaches to preventative maintenance use machine learning and AI to uncover insights in large and disparate data sets. In doing so, they identify potential failures and predict how much time a machine or process may have before it fails.
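
The contrast can be made concrete with a small sketch: basic trend analysis fits a line to a single degradation signal and extrapolates it to an assumed failure threshold, which is the simplest form of the time-to-failure prediction described above. The signal, threshold, and linear model here are illustrative; ML/AI approaches combine many disparate signals and far richer models.

```python
import numpy as np

hours = np.arange(0, 500, 10)                                        # operating hours so far
wear = 0.8 + 0.004 * hours + np.random.normal(0, 0.05, hours.size)   # e.g. vibration trend (mm/s)
FAILURE_THRESHOLD = 4.5                                              # assumed failure limit

# Fit a straight line to the trend and extrapolate to the threshold.
slope, intercept = np.polyfit(hours, wear, 1)
hours_at_failure = (FAILURE_THRESHOLD - intercept) / slope
print(f"Estimated remaining life: {hours_at_failure - hours[-1]:.0f} hours")
```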

In terms of data recording, there are a couple of critical issues here. Data loss can have a significant impact on many of these advanced models, so it is essential that the edge devices used to collect much of this data are protected and have secure storage capabilities. For many applications, the data source and the edge location are remote, meaning physical access may be difficult and network connectivity may be intermittent. Thus, the ability of an edge platform to operate autonomously is critical. The remote nature of edge devices also raises security considerations (both physical and cyber), so edge platforms with the ability to protect themselves are also essential.
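
One common way to handle intermittent connectivity without losing data is store-and-forward: persist every sample locally first, then drain the buffer when the uplink returns. The sketch below is illustrative, and send_to_cloud is a placeholder for whatever transport an actual deployment uses:

```python
import json
import os

BUFFER_PATH = "pending_uploads.jsonl"

def record_locally(sample: dict):
    """Append the sample to durable local storage before any upload attempt."""
    with open(BUFFER_PATH, "a") as f:
        f.write(json.dumps(sample) + "\n")

def send_to_cloud(sample: dict) -> bool:
    return False  # placeholder: pretend the link is currently down

def drain_buffer():
    """Try to upload buffered samples; keep anything that still fails."""
    if not os.path.exists(BUFFER_PATH):
        return
    with open(BUFFER_PATH) as f:
        pending = [json.loads(line) for line in f]
    remaining = [s for s in pending if not send_to_cloud(s)]
    with open(BUFFER_PATH, "w") as f:
        f.writelines(json.dumps(s) + "\n" for s in remaining)

record_locally({"ts": 1712345678.0, "tag": "pump_7/flow", "value": 13.2})
drain_buffer()  # nothing is lost while the connection is unavailable
```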

Another issue that is, perhaps, just coming to the forefront is timing. As data collection gets faster and comes from increasingly disparate sources, maintaining the correct sequencing of time-series data will be increasingly critical. Consolidating data at edge platforms allows the synchronization to occur more easily.
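
Consolidation for correct sequencing can be as simple as a timestamp-ordered merge of the incoming streams at the edge platform; the stream names and values below are illustrative:

```python
import heapq

# Two already-sorted streams of (timestamp_s, source, value) records
plc_stream    = [(1.0, "plc", 20.1), (2.0, "plc", 20.3), (3.0, "plc", 20.2)]
sensor_stream = [(0.5, "vib", 1.1), (2.5, "vib", 1.4), (2.9, "vib", 1.9)]

# Merge preserves global time order across the disparate sources.
merged = heapq.merge(plc_stream, sensor_stream, key=lambda r: r[0])
for ts, source, value in merged:
    print(f"{ts:5.1f}s  {source:>4}  {value}")
```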
