
Why data standards can’t solve your interoperability problems

June 17, 2022

"Most industry standards only go so far." 

HighByte's John Harrington

The efforts of standards organizations like OPC Foundation, Eclipse Foundation (Sparkplug), ISA, CESMII, and MTConnect represent a significant step forward for the advancement of Industry 4.0 in manufacturing.

But industry standards only go so far. Businesses need data to tell the story of what is happening, why it is happening, and how to fix it. Multiple pieces of information must be assembled with other pieces of information from other sources to tell the use-case story—just like words must be combined into sentences and sentences combined to form stories. Data standards can’t tell the use case story—they can only provide a dictionary.

Standardizing device-level data into structures is key, but it is only the beginning. Data standards alone will not solve your interoperability problems because they don’t provide the use-case-related context you need to make strategic decisions. Here are four key reasons why you still need an industrial DataOps solution, even with the introduction or evolution of new standards.

1. You’re dealing with machine and vendor variability.

Standards bodies are made up of vendors and users in an industry. As the standard is being defined, variances are allowed for vendor machines with unique capabilities or limitations and unique use cases. While the intent is flexibility, the result is often ambiguity. It’s typical for vendors to implement the same standard slightly differently. Historically, vendors have refined their systems and changed data models over time to suit their needs.

As a result, even minor variations in data sets require human interaction to link these machines to other systems in the network and automate dashboards or analytics.

An industrial DataOps solution enables codeless connections to a wide range of sources, including equipment, controllers, smart devices, sensors, and systems. If the input data is standardized, it can easily be passed through or combined with other data with no additional effort. If it is not, it can be modeled and transformed to the governed data standard for the use case.
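As a rough illustration (the field names and mapping function below are hypothetical, not HighByte's API or any standard's schema), normalizing two vendors' payloads that expose the same measurements under different tags and units onto one governed model might look like this:

```python
# Minimal sketch: mapping vendor-specific payloads onto a shared, governed
# model before publishing downstream. Field names are invented for illustration.

def normalize(vendor: str, payload: dict) -> dict:
    """Map vendor-specific fields onto a governed data structure."""
    if vendor == "vendor_a":
        return {
            "machine_id": payload["MachineID"],
            "spindle_speed_rpm": payload["SpindleSpeed"],
            "temperature_c": payload["Temp_C"],
        }
    if vendor == "vendor_b":
        return {
            "machine_id": payload["id"],
            "spindle_speed_rpm": payload["speed"],
            "temperature_c": (payload["temp_f"] - 32) * 5 / 9,  # unit conversion
        }
    raise ValueError(f"No mapping defined for {vendor}")

# Already-standardized input passes straight through; the rest is transformed.
print(normalize("vendor_a", {"MachineID": "M-101", "SpindleSpeed": 1200, "Temp_C": 41.5}))
print(normalize("vendor_b", {"id": "M-102", "speed": 1180, "temp_f": 105.8}))
```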

2. You’re viewing individual data with no relationship context.

Think about a manufacturing line with multiple machines. The machine standards address the data for each machine independently, not the combination of the machines or any custom automation connecting the machines. When analyzing operational metrics, bottlenecks, or quality root cause for a production line, specific information from each machine, test stand, and sensor should ideally be assembled into a single payload for that line. 

An industrial DataOps solution can assemble large payloads of data from multiple machines, consolidate them, and then publish to the target system. Models in the DataOps hub can correlate the data by logical use case. In a factory, these are typically machinery, process, and product, but they could also cover sustainability or energy consumption. This systematic approach of building data models for the use case greatly accelerates the use of this information by line-of-business users who are less familiar with the machines and line layouts.
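As a hedged sketch of what such a line-level model might contain (the line, machine names, and metrics below are invented for illustration, not a product schema), assembling per-machine readings into one payload could look like this:

```python
# Minimal sketch: consolidating per-machine readings into a single line-level
# payload that a line-of-business user can consume without knowing the layout.

from datetime import datetime, timezone

machine_readings = {
    "filler":  {"state": "RUNNING", "rate_upm": 240, "reject_count": 3},
    "capper":  {"state": "RUNNING", "rate_upm": 238, "reject_count": 1},
    "labeler": {"state": "BLOCKED", "rate_upm": 0,   "reject_count": 0},
}

line_payload = {
    "line_id": "LINE-07",
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "machines": machine_readings,
    # Line-level context derived from the combined data, e.g. the slowest
    # asset as a naive bottleneck indicator and a total reject count.
    "bottleneck": min(machine_readings, key=lambda m: machine_readings[m]["rate_upm"]),
    "total_rejects": sum(m["reject_count"] for m in machine_readings.values()),
}

print(line_payload)
```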

3. You’re looking at more than just device data.

You can’t make strategic decisions if you’re not linking your machine data to other systems across your organization. This includes your enterprise applications, such as your ERP system, and your manufacturing databases (e.g., SCADA, MES, Historian, QMS, LIMS, and CMMS).

An industrial DataOps solution can connect to virtually any system in your organization and combine information from these systems with machine data. For example, your MES provides context on a particular batch. In a DataOps hub, you can combine that information with device data and quality-system data into a single payload and send it to the cloud for analysis or dashboarding. 
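For illustration only (the batch, asset, and quality fields below are assumptions, not a specific MES or QMS schema), combining those sources into one contextualized payload might look like this:

```python
# Minimal sketch: enriching device data with MES batch context and a
# quality-system result before sending a single payload to the cloud.

device_data = {"asset": "MIXER-3", "agitator_rpm": 85, "temperature_c": 72.4}
mes_context = {"batch_id": "B-20220617-04", "product": "SKU-1138", "work_order": "WO-5521"}
qms_result  = {"batch_id": "B-20220617-04", "spec": "viscosity", "in_spec": True}

payload = {
    **mes_context,                 # what is being produced, and for which order
    "device": device_data,         # how the equipment is behaving right now
    "quality": {"spec": qms_result["spec"], "in_spec": qms_result["in_spec"]},
}

# In practice this payload would be published to a data lake or analytics
# endpoint; here we simply print it.
print(payload)
```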

4. You’re not getting the data you need, when you need it.

Information overload is a real problem in the Industry 4.0 world. Industrial data is nearly infinite in both the volume of data values and the frequency at which they can be acquired. Modern PLCs can have hundreds of thousands of data points and can collect data from sensors at sub-millisecond frequencies.

Understanding what is needed—and when—is critical.

Sometimes data is needed at a cyclic rate, such as once per second. Other times, you may need an event-based feed to identify defects or machine performance issues. By defining the desired data payload and its event or frequency, you make decision-making more efficient and minimize cloud costs, because you only store and process the data you need.

An industrial DataOps solution can assemble data from multiple sources into a single payload, perform any aggregations and transformations, and send it to a cloud data lake or other system at the desired rate or on the desired event.
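A minimal sketch of that cyclic-plus-event publishing pattern, using simulated readings and a stand-in publish function rather than any particular broker or cloud SDK, might look like the following:

```python
# Minimal sketch: buffer high-frequency readings, then publish an aggregated
# payload either on a cyclic interval or when an event (an out-of-range value)
# occurs. All values and thresholds are invented for the demo.

import random
import time

THRESHOLD_C = 80.0      # event trigger: publish immediately above this value
CYCLE_SECONDS = 1.0     # cyclic rate: publish a summary once per second

def publish(reason: str, readings: list) -> None:
    # Stand-in for sending to a data lake, broker, or dashboard.
    print({
        "reason": reason,
        "count": len(readings),
        "min_c": round(min(readings), 2),
        "max_c": round(max(readings), 2),
        "avg_c": round(sum(readings) / len(readings), 2),
    })

buffer = []
last_publish = time.monotonic()

for _ in range(3000):                      # simulate a fast acquisition loop
    reading = random.gauss(70.0, 3.0)      # temperature sample in Celsius
    buffer.append(reading)

    if reading > THRESHOLD_C:              # event-based publish
        publish("threshold_exceeded", buffer)
        buffer, last_publish = [], time.monotonic()
    elif time.monotonic() - last_publish >= CYCLE_SECONDS:
        publish("cyclic", buffer)          # cyclic publish of the aggregate
        buffer, last_publish = [], time.monotonic()

    time.sleep(0.001)                      # roughly 1 kHz sampling for the demo
```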

Standardized models are important to our industry because they provide a baseline data set to work with, i.e., the dictionary. While data standards are no substitute for telling the use-case-driven story with contextualized, intelligent insights that drive strategic decision-making, they do help expedite data modeling when paired with a DataOps hub.

This approach allows data standards to deliver the value that’s been elusive since these standards bodies began to form in the 1990s. As you digitally transform to Industry 4.0, use your data to tell stories that solve business problems.

John Harrington is the chief product officer at HighByte.