For many years, process-manufacturing companies have been gathering sensor-generated, time-series data in their historian of choice. As a result, massive amounts of process data are available for analysis. Each company may have a different goal for analyzing operational data, but turning that data into knowledge is another matter entirely.
Ideally, analyzing big data should be simple, easy and fast, providing knowledge with context. By putting the power of analytics in the hands of each operational stakeholder, data-driven decisions can be made instantly, improving operational performance and overall profitability.
Let’s look at Covestro, a world-leading supplier of high-tech polymer materials that was facing two problems. First, its big-data analytics was done in MS Excel, a slow process that kept engineers from using all of their data. Second, Covestro relied on its analytics experts for data modeling, which consumed engineering time and created potential knowledge loss in the plant.
To leverage their historian to the max, Covestro selected a state-of-the-art, self-service, industrial-analytics platform, which from day one provided tangible benefits:
- No data modeling was needed to analyze many years of production data
- No Excel was needed to analyze or monitor production performance
- It was simple and easy to use, making engineers more efficient at analyzing and monitoring operational performance
- It provided direct answers to process issues with advanced tools to find root causes
- It provided graphical visualization of data
- It was fast, with no supercomputer needed
This new way of analyzing manufacturing data has led to better control, and even a reduction, of energy consumption at the Covestro site.
Meet organizational goals
Process engineers are eager to improve the production process in line with their organizational goals. Whether it is improving product quality or reducing waste, engineers are looking for ways to contribute to organizational KPIs. To be as effective as possible in that pursuit, they need adequate tools and the ability to leverage the data available.
Let’s look at Ashland, a provider of specialty chemical solutions that has recently shifted focus from the construction market to the pharmaceutical market. Self-service industrial analytics proved to be the solution that helped Ashland analyze and understand their data better and leverage it to profit from digitalization.
The platform’s monitoring capabilities enable process and production engineers to raise timely red flags and prevent future incidents. By implementing the self-service industrial-analytics platform, Ashland was able to solve previously unsolvable production issues and enhance quality. More importantly, Ashland increased its on-target production of GMP products from 70% to 95% within the first year of using self-service advanced analytics.
Expand the data analytics success to the entire organization
With data at their fingertips, engineers can continuously improve operational performance and contribute to organizational goals. Root-cause analysis improves performance by identifying golden-batch fingerprints or best operating zones for equipment. These, in turn, can drive early warnings and notifications that enable energy management, waste reduction, increased uptime and delivery of product quality that meets customer demand.
Edwin van Dijk is vice president of marketing with TrendMiner.