Finding the silver lining in cloud analytics

April 7, 2020
The questions you ask to uncover your problems should inform your reliance on the cloud.

By Bob Sperber

It’s increasingly obvious that the cloud is no longer the centralized IT repository it was imagined to be, but a profusion of multiple, varied clouds optimized for different uses. And since the information needed for an industrial application can come from any combination of local and remote sources, corralling the right data set can be a daunting task.

Sound familiar?

“It’s a rare company today that talks about moving ‘to the cloud,’” says David Cope, senior director of cloud-business development for Cisco. Instead, companies typically talk about adding the cloud to a mixed landscape of clouds. Cope has witnessed some “very interesting phases” since the birth of cloud computing. First, there was initial skepticism by users, which gave way to experimentation with non-core applications. Next, companies adopted a multi-cloud scenario involving disparate public clouds such as Microsoft Azure and Amazon AWS, as well as private clouds. The latest step in this evolution is the transition from multi-cloud to multi-domain: users now expect seamless access and service across multiple locations, spanning all kinds of clouds as well as edge environments, from data centers to the manufacturing edge.

How and where analytics is performed varies by use case. One user might be concerned with a single site. Another might want to optimize processes across multiple sites. And at the enterprise level, the challenge may be knowing how to bring manufacturing data into a broader analysis of data from multiple business systems.

In all cases, the solution must fit the problem, and the problem should be driven by business (not technology).


Before getting “hung up on the widgets” of solutions and architectures, leaders must ask questions, and rethink the kinds of questions they ask, according to Mike Guilfoyle, vice president and analytics leader with ARC Advisory Group. They may start investigating issues in the realm of basic operational reliability, asking “What needs to be done?”

At a higher level, they can ask questions like “Can I do it?” and “Why did it happen?” These questions tend to lead to one another, like opening a matryoshka—a Russian nesting doll: Am I capable of doing what I think I want to do? If not, what do I need to change to be able to do that? How do I view my organization in that light? And how can I identify three or four things that are of strategic importance to the business performance, which relate to our ability to compete? Finally, how does this inform how things need to change?

Neglecting to drill that far down, Guilfoyle says, is “an age-old failing of strategic planning” that affects even leading organizations when working in the cloud. Companies can attain operational excellence but still fail to excel. The reason: competitors have access to the same technology solutions, which “will become commoditized over time.” The best questions to ask, therefore, are externally driven by market signals and unique customer needs.

To illustrate, Guilfoyle says a leading company recently encouraged operations to focus externally. As a result, one plant did something that in years past might have gotten people fired: it increased the cost of production to satisfy a unique customer need in a way it never had before, and at a speed it had never been able to reach before. “And that’s a fundamentally different way of thinking; that’s transformation,” he says.


Once the right questions are raised and a project has its scope, one of the first steps is to create a model, or digital twin, of the universe under study and infuse it with data from the relevant assets and sources.

To Bill Boswell, vice president of cloud marketing for Siemens Digital Industries Software, the sky seems to be the limit of what’s possible, virtually: “You can take the results of the real-time analytics you’re doing at the edge, or multiple edges, and maybe even combine that in context with other data that’s coming from, say, your ERP or CRM system.”

Adds Marcia Walker, global industry principal for manufacturing for SAS, “Cloud computing can be an organization’s secret weapon to quickly and dramatically improve quality, service, and supply chain performance—at very little cost.”

She cites a customer in medical-device manufacturing that wanted to implement a video-analytics-based pass/no-pass quality-control routine on the line. In the past, operations would have been hobbled by months of IT-department paperwork, as well as the high cost of new GPU-equipped hardware to create and run models. But recently the team moved its production data set to a cloud server to build and tune its model.

Once the model was built, the team downloaded it to run at the factory edge with live data, removing the data sample from the cloud. When the model needs updating, the process can be repeated, Walker explains, adding: “It’s faster, it’s cheaper, and it’s more nimble for users than having to work with IT as they did in the past. That’s a very big flip in the old way of thinking that most operations people are completely unfamiliar with, because they don’t hang out with analytics geeks all day like I do!”
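The train-in-the-cloud, run-at-the-edge pattern Walker describes can be sketched in miniature. This is a hypothetical illustration, not the SAS implementation: the "model" here is just a statistical pass/no-pass threshold fitted on a cloud-hosted data set, serialized as a small artifact, and then scored against live readings at the factory edge.

```python
import json
import statistics

# --- Cloud side (hypothetical): fit a simple pass/no-pass threshold ---
# Historical defect scores from the production data set uploaded to the cloud.
historical_scores = [0.12, 0.15, 0.11, 0.14, 0.13, 0.45, 0.12, 0.16]

mean = statistics.mean(historical_scores)
stdev = statistics.stdev(historical_scores)
# Flag parts scoring more than two standard deviations above the mean.
model = {"threshold": mean + 2 * stdev}

# "Download" the trained model to the factory edge as a small artifact.
artifact = json.dumps(model)

# --- Edge side: score live readings locally, no round trip to the cloud ---
edge_model = json.loads(artifact)

def inspect(score: float) -> str:
    """Classify a live defect score against the downloaded threshold."""
    return "no-pass" if score > edge_model["threshold"] else "pass"

print(inspect(0.13))  # typical part -> pass
print(inspect(0.60))  # defective part -> no-pass
```

Retraining, as Walker notes, just repeats the cycle: refresh the cloud data set, refit, and ship a new artifact to the edge.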


Walker cites a hypothetical use case that exemplifies how IoT data can deliver surprising new benefits in curious ways to, and from, manufacturing. Consider this made-up scenario: Everybody in Chicago has just started using the second rinse cycle on their washing machines, but nobody’s doing that in New York. An analyst is prompted to investigate—perhaps by the rise in warranty claims, or some other automated trigger—and discovers the two markets are served by different factories. Further investigation reveals a manufacturing flaw in the Midwest, leading to a fix in the factory. Product quality is restored, warranty claims are reduced, and the manufacturer just gained supply chain and consumer insights its competitors lack.
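The automated trigger in that scenario amounts to a simple aggregation over fleet telemetry. The sketch below is illustrative only: the regions, the events, and the 0.5 usage-rate trigger are invented assumptions, not any vendor's logic.

```python
from collections import defaultdict

# Hypothetical IoT telemetry: (region, used_second_rinse) per wash cycle.
events = [
    ("Chicago", True), ("Chicago", True), ("Chicago", False), ("Chicago", True),
    ("New York", False), ("New York", False), ("New York", True), ("New York", False),
]

# Tally second-rinse usage per region: region -> [second_rinse_count, total].
counts = defaultdict(lambda: [0, 0])
for region, second_rinse in events:
    counts[region][0] += int(second_rinse)
    counts[region][1] += 1

rates = {region: used / total for region, (used, total) in counts.items()}

# Flag regions where most cycles use the second rinse -- the kind of
# automated trigger that would prompt an analyst to investigate.
flagged = [region for region, rate in rates.items() if rate > 0.5]
print(flagged)  # -> ['Chicago']
```

From there the human work begins: tracing the flagged market back to the factory that serves it.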

The possibilities of capitalizing on having such deep “tentacles” in the supply chain could lead to new diagnostic/maintenance service offerings from dealers and service providers, better consumer research, perhaps even a subscription soap tie-in, Walker ruminates.

Industrial firms are similarly tapping their digital supply chains. Machinery and robotics OEM Festo recently expanded its own service capabilities with a cloud-based condition-monitoring dashboard that optimizes its customers’ compressed-air consumption for energy savings. And Rittal developed a remote predictive-maintenance management system for IT enclosures and related offerings. Both companies host these analytic revenue generators on Siemens’ IoT platform.


At Dow Chemical, there’s a grand plan for “total data domination” among its manufacturing IT group—a sort of DMZ between OT and IT where remote tools and platforms “are starting to pervade our workforce,” says Lloyd Colegrove, fundamental problem-solving director for Dow’s global manufacturing and engineering organizations. His group uses its internal network instead of the cloud for its advanced, multidimensional chemometric analyses. One of the key tools is Northwest Analytics’ Focus EMI, which enables access (without duplication) to disparate data from plant historians, lab, quality and automation systems.

This is a real-time product-quality analysis that Louis Halvorsen, CTO of Northwest Analytics, says can’t afford the latency of the cloud, because it requires instant creation “of data sets for very specific chemical materials being analyzed in very specific places, where a group of variables needs to be analyzed together, and each one of them has a different data set.” (Another IT-infrastructure aspect pertinent to chemical plants is explained below.)
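The access-without-duplication pattern Halvorsen describes (assembling an analysis data set on demand from sources that stay in place) might look like the following sketch. The source names, keys, and fields here are invented for illustration and are not Focus EMI's actual interface; in a real deployment each lookup would be a live query against the historian, lab, or automation system rather than a copy.

```python
# Hypothetical in-place views of three disparate sources, keyed by
# (material, batch). The data is accessed where it lives, not duplicated.
historian = {("polyol-A", 101): {"reactor_temp_C": 182.4}}
lims = {("polyol-A", 101): {"viscosity_cP": 310.0}}
automation = {("polyol-A", 101): {"feed_rate_kgh": 54.2}}

def assemble(material: str, batch: int) -> dict:
    """Build the per-material analysis data set on demand from each source."""
    key = (material, batch)
    record = {"material": material, "batch": batch}
    for source in (historian, lims, automation):
        record.update(source.get(key, {}))
    return record

row = assemble("polyol-A", 101)
print(row["reactor_temp_C"], row["viscosity_cP"], row["feed_rate_kgh"])
```

Because the record is built at query time, each group of variables can be analyzed together without maintaining a merged copy of the underlying systems.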

The buildout is still early days, but as Colegrove says, digital transformation is “a marathon, not a sprint.” Initial ROI projections, including cost savings of $2 million per plant annually, have been far surpassed since the first plant implementation. Areas of improvement include increased yields; longer catalyst lifecycles; improved relations throughout the supply chain; and enhanced process knowledge.

Dow plants share models they can adapt for their own use and get access to aggregated dashboards. The resulting healthy internal competition, Colegrove says, is a “side benefit from making the data available…and more artfully, to the engineers and operators that need to make improved real-time decisions.”

When another chemical processor needed a centralized monitoring and diagnostic center to perform predictive analytics across all of its plants, it accordingly used its own internal enterprise network and resources. “If you have your own data center, and the enterprise-wide architecture and IT support, you basically have your own cloud, and you can do it yourself,” says Mani Janardhanan, vice president of product management for operational-certainty solutions with solution provider Emerson. “It wasn’t just training and simulation, it was real-time,” he says, which, for this application, meant roughly 30-minute intervals.

While cloud use is less pervasive in the risk-averse process industries, it can sometimes be the only practical way to go. When an Asian upstream oil-and-gas company needed to grow operations from 2,000 to 4,000 wells in 24 months, it couldn’t get engineers to work on-site across its broad, remote geographic territory. So the company turned to cloud-based performance monitoring and analytics, run from a central location, to compare and improve performance, reduce costs, improve safety and cut emissions.

That example comes by way of Megan Buntain, director of cloud partnerships with process-industry-analytics firm Seeq. She adds that, more broadly, companies are increasingly taking operational analytics to the cloud for three reasons: “They want that value chain efficiency. They want global operational performance. And they want to make that data available for machine-learning capabilities.”

“We run into company after company that has moved or aggregated data to the cloud only to ask ‘Now what?’” And, consistent with the perspective of ARC’s Guilfoyle, Buntain adds: “It’s critical that operational teams, the subject-matter experts closest to the factory or the process, are collaborators in driving the use case—and then work backward to what the data strategy needs to be.”

Achieving that end will require IT and OT leaders to put their heads together on defining their desired business outcomes. And to do that, they need to understand one another. Dow’s Colegrove is working “to turn chemical engineers into data scientists.” It’s harder to reach over the IT/OT divide, he believes, so he recruits new talent from chemical-engineering programs and works to teach them the data-science tools they need to do their jobs better. He’s also working with schools on how they can better prepare graduating engineers for the IT/OT challenges ahead.


IT keeps evolving. Standards of all sorts keep emerging, and analytics is becoming more database-agnostic, according to SAS’ Walker. A decade ago, while working for an industrial-automation firm, she saw customers spend roughly 80 percent of their time consolidating, cleansing and prepping data for analysis—data-management efforts that are now largely automated. Today, she says, it’s “very practical” to work across multiple systems and databases. “I would like to shout this from the rooftops!”

No matter the role of manufacturing data in an analytics project—cloud or no cloud—Seeq’s Buntain encourages manufacturing professionals to keep performing analytics and adding value wherever their data is stored.

The practical challenge may be more human than technical, more about bridging the IT/OT divide, Siemens’ Boswell says, “But we’re starting to see a shift in attitudes.”

As both sides of that divide gain experience and confidence working together, operational teams will not only start moving and aggregating more broadly sourced data to the IT cloud...they’ll learn to build better business cases. In turn, IT organizations will better understand how to get their heads around the terabytes per day of data streaming from operations across the hybrid IoT environment.