Capitalizing on the manufacturing data platform

May 7, 2021
Data is the biggest asset. You should treat it that way.

By Francisco Almada Lobo, chief executive officer and co-founder of Critical Manufacturing 

 Data is the biggest asset.

With only a small percentage of companies succeeding in digital transformation, however, how can manufacturers take advantage of the massive amounts of data they are creating?

Like oil, data has a great deal of value, but only if it is refined to release the information it holds. Building a transformational data platform requires a combination of the Industrial Internet of Things (IIoT), a future-ready manufacturing execution system (MES), and advanced analytical tools.

How smart is your factory?

The ultimate smart factory has action-oriented data and artificial intelligence (AI)-controlled production lines. But how do we get there? What solutions are needed? Some predict that the answer lies solely in the IIoT and that MES is no longer required, but this is simply not true. The MES needs to evolve to include data platforms.

Building a manufacturing data platform

The main elements for a successful manufacturing data platform are a natural extension of the modern MES.

On the edge

A manufacturing data platform combines solutions for processing, storing and analyzing data from huge numbers of resources. Edge solutions run close to where the data is generated, performing some local processing and analysis before sending results to a central system. As such, they reduce latency, enable faster responses to changing process conditions, and can reduce the costs of central processing and analysis.
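As a purely illustrative sketch of that pattern, the following Python loop aggregates raw readings locally and forwards only compact summaries; the sensor read, the forwarding target, and the one-minute window are assumptions, not any specific product's implementation.

```python
# Hypothetical edge pre-processing loop: raw readings are aggregated locally
# and only compact summaries are forwarded to the central platform.
import random
import statistics
import time


def read_temperature() -> float:
    # Placeholder for a real sensor read on the edge device.
    return random.uniform(178.0, 184.0)


def forward_to_central(summary: dict) -> None:
    # Placeholder for publishing the summary to the central data platform.
    print("forwarding", summary)


def run_edge_loop(window_seconds: int = 60, sample_period: float = 1.0) -> None:
    while True:
        samples = []
        window_end = time.time() + window_seconds
        while time.time() < window_end:
            samples.append(read_temperature())
            time.sleep(sample_period)
        # Only the aggregate leaves the edge, cutting bandwidth and central
        # processing cost while local rules can still react quickly.
        forward_to_central({
            "timestamp": time.time(),
            "mean_c": round(statistics.mean(samples), 2),
            "max_c": round(max(samples), 2),
            "min_c": round(min(samples), 2),
        })


if __name__ == "__main__":
    run_edge_loop(window_seconds=5, sample_period=0.5)
```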

Data ingestion

One of the most critical functions of data ingestion is the metadata registry, which enables the platform to understand what data is being sent. The metadata registry refers to a schema: based on the schema ID, the corresponding schema is associated with each message before it is sent on, so that a consuming application can later read the schema and know which data the message contains.
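A toy Python sketch of this idea is shown below; the registry contents, schema IDs, and field names are invented for illustration only.

```python
# Toy metadata registry: each message carries a schema ID, and the consumer
# looks the schema up to know what the payload contains.
import json

SCHEMA_REGISTRY = {
    1: {"name": "equipment_temperature", "fields": ["equipment_id", "value_c", "timestamp"]},
    2: {"name": "equipment_pressure", "fields": ["equipment_id", "value_bar", "timestamp"]},
}


def encode(schema_id: int, payload: dict) -> bytes:
    # The schema ID travels with the message, so any downstream application
    # can interpret the data without prior knowledge of the producer.
    return json.dumps({"schema_id": schema_id, "payload": payload}).encode("utf-8")


def decode(raw: bytes) -> tuple[dict, dict]:
    message = json.loads(raw)
    schema = SCHEMA_REGISTRY[message["schema_id"]]
    return schema, message["payload"]


raw = encode(1, {"equipment_id": "oven-07", "value_c": 182.4, "timestamp": "2021-05-07T10:00:00Z"})
schema, payload = decode(raw)
print(schema["name"], payload)
```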

Data brokering

The data platform needs to deliver data from numerous data sources, including equipment, process data, MES and ERP, to applications such as historians, dashboards, alarms, analytics, and data reporting. Data warehouses and subsequently data lakes were used for this, but, for data platforms, Apache Kafka was a game-changer. It decouples data streams from the systems that produce and consume them; it is distributed, fault-tolerant, and extremely high-performing, with very lightweight consumers, and it scales horizontally easily with the addition of hardware.

Kafka acts as a nervous system, managing streams of information from various applications, processing each piece of data, and sending it where it needs to go. It has the dual capability to process data in real time and to ‘replay’ data from any given point in time.
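For illustration only, a minimal producer/consumer pair using the kafka-python client might look like the sketch below; the broker address, topic name, and message fields are assumptions rather than part of any specific platform.

```python
# Hypothetical Kafka producer and consumer for an equipment-events topic.
import json

from kafka import KafkaConsumer, KafkaProducer

# Producer side: equipment or edge gateways publish readings as JSON.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(
    "equipment-events",
    value={"equipment_id": "oven-07", "value_c": 182.4, "timestamp": "2021-05-07T10:00:00Z"},
)
producer.flush()

# Consumer side: any application (historian, dashboard, analytics job) can
# subscribe independently; 'earliest' lets it replay the stream from the start.
consumer = KafkaConsumer(
    "equipment-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    consumer_timeout_ms=5000,
)
for message in consumer:
    print(message.value)
```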

Data processing

Data processing includes batch and stream processing. Batch processing processes large groups of transactions in a single run, involving multiple operations and handling heavy data loads. This may be used to run a report or aggregate data on a data warehouse. Stream processing deals with transformations that require extremely fast handling, usually involving fewer data.

Higher stream processing speeds and configurable, automatic rule-based actions (e.g. ‘if this then that’) reduce latency between an event and the subsequent action, thereby adding value. For exceptional processing speed with in-memory processing, the Critical Manufacturing platform uses the powerful and feature-rich Apache Spark to handle both batch and stream processing.
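As a rough sketch of what such a stream-processing rule could look like (not the platform's actual implementation; the topic name, message layout, and 200 °C threshold are assumptions), PySpark Structured Streaming can read the Kafka stream and apply an ‘if this then that’ rule:

```python
# Illustrative PySpark Structured Streaming job: read equipment events from
# Kafka, parse the JSON payload, and flag readings that break a simple rule.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("stream-rules").getOrCreate()

# Assumed message layout: {"equipment_id": ..., "value_c": ..., "timestamp": ...}
event_schema = StructType([
    StructField("equipment_id", StringType()),
    StructField("value_c", DoubleType()),
    StructField("timestamp", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "equipment-events")              # assumed topic
    .load()
)

parsed = (
    events.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# 'If this then that': temperatures above 200 C become alert records.
alerts = parsed.filter(F.col("value_c") > 200.0)

query = (
    alerts.writeStream
    .format("console")  # a real deployment would publish to an alerts topic
    .outputMode("append")
    .start()
)
query.awaitTermination()
```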

Data enrichment

Data enrichment is invaluable for manufacturing. In general terms, it merges data from an external, authoritative source with an existing set of first-party data; in the factory, that means combining raw readings with the contextual information held by systems such as the MES.

Take the example of the temperature profile of a machine. On its own, there is little analysis that can be done. However, if the system also understands the processes being carried out, historical temperature profiles, maintenance activities, and so on, far more can be understood from the readings.

The MES provides data for the enrichment and contains all the necessary contextual information. An event received into the data platform has a name, value, timestamp, and MES object. It is written into a raw topic, stored in a data lake, and sent into stream processing. An MES data enricher then appends contextual data to the message. This new, event-enriched topic is written back into Kafka, where it can be consumed again by stream or batch processing.
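A minimal sketch of such an enricher, assuming the kafka-python client, illustrative topic names (‘equipment-events-raw’, ‘equipment-events-enriched’), and a stand-in MES lookup, could look like this:

```python
# Hypothetical enrichment step: consume raw events, append MES context,
# and publish the enriched events back to Kafka for downstream consumers.
import json

from kafka import KafkaConsumer, KafkaProducer


def mes_context(equipment_id: str) -> dict:
    # Stand-in for a real lookup against the MES (process step, product, recipe).
    return {"process_step": "reflow", "product": "board-A12", "recipe": "R-44"}


consumer = KafkaConsumer(
    "equipment-events-raw",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    event = message.value
    # Append MES context so downstream analytics see the reading together
    # with what the machine was doing at the time.
    event["context"] = mes_context(event["equipment_id"])
    producer.send("equipment-events-enriched", value=event)
```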

Advanced analytics

Descriptive, diagnostic, predictive, and prescriptive analytics help us understand what has happened, why it happened, what will happen, and what actions should be taken.

One of the most common uses of predictive analytics is machine maintenance. Data is collected over time from sensors and machine actions and merged with records of previous maintenance activities. Correlations between variables and outcomes can then be identified to determine the causes of machine failures. Predictive analysis then creates a data-driven model to calculate the probability of machine failure or the remaining useful life, thereby anticipating maintenance needs or postponing routine maintenance when it is not required.
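A highly simplified sketch of such a data-driven model, using scikit-learn with invented feature names and toy data (a real model would be trained on historical platform data), is shown below:

```python
# Toy predictive-maintenance model: sensor features merged with past
# maintenance outcomes train a classifier that outputs a failure probability.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Columns: mean vibration, mean temperature (C), hours since last maintenance.
X_train = np.array([
    [0.21, 61.0, 120.0],
    [0.85, 74.0, 900.0],
    [0.30, 65.0, 300.0],
    [0.95, 80.0, 1100.0],
])
y_train = np.array([0, 1, 0, 1])  # 1 = failure followed within 7 days

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# The failure probability for a machine's latest readings drives the decision
# to pull maintenance forward or postpone it.
latest = np.array([[0.72, 71.0, 760.0]])
print("failure probability:", model.predict_proba(latest)[0, 1])
```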

Machine learning (ML)

ML is used to analyze large data sets and learn patterns that help make predictions about new data sets. Using big data requires a data platform that scales accordingly, and ML is the most promising technique for extracting hidden insights and value from that data.

ML comprises several levels of analysis. Anomaly detection identifies faulty products, predicts machine maintenance needs, and flags possible safety issues. Classification then organizes information into categories and identifies correlations between them. Probability functions test how changes to specific variables will impact outcomes, and optimization can then be achieved by calculating the probability of various outcomes and adjusting parameters accordingly.
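To make the anomaly-detection level concrete, here is a small sketch using scikit-learn's IsolationForest on synthetic readings; the variables, values, and contamination setting are all assumptions for illustration.

```python
# Toy anomaly detection: an isolation forest learns what 'normal' readings
# look like and flags outliers that may indicate faulty products or
# emerging machine problems.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Normal operation: temperature around 180 C, pressure around 2.0 bar.
normal = np.column_stack([rng.normal(180, 2, 500), rng.normal(2.0, 0.05, 500)])

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

new_readings = np.array([
    [181.0, 2.02],  # looks normal
    [212.0, 3.10],  # far outside the learned distribution
])
print(detector.predict(new_readings))  # 1 = normal, -1 = anomaly
```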

Given enough relevant data, learning algorithms can approximate almost any function. Correlation, however, does not imply causation. Initial hypotheses need to be tested for significance, and the most statistically relevant ones investigated further.

Serving and output applications layers

The final block of the manufacturing data platform makes outputs available, through a serving layer, to the applications layer: third-party solutions, alarms, visualization tools, and more.

Conclusion

It is the combination of MES, IIoT, equipment integration, and data platform elements that distinguishes a manufacturing data platform from a generic one. Using a data platform designed specifically for the manufacturing environment is a massive accelerator for insight into manufacturing processes, continuous improvement, and competitive advantage. It is the combination of MES and manufacturing data platform that will enable manufacturers to seize the huge advantages Industry 4.0 has to offer.