Four steps to success with data analytics

July 9, 2020
Your project is done when stakeholders tell you how well their new tool is working on the day shift and night shift.

From the “Launching your IIoT Program” presentation delivered by Jolene Baker, Logical Systems senior manufacturing intelligence specialist; and James Bronstein, Ghost Solutions, principal engineer

Data analytics can be generally defined as the science of analyzing raw data to draw conclusions from that information. There are four main types, each of which addresses a different question: descriptive analytics (What happened?); diagnostic analytics (Why did it happen?); predictive analytics (What’s likely to happen?); and prescriptive analytics (What action should we take to prevent a problem or seize an opportunity?).
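By way of a minimal, invented illustration (not drawn from the presentation), the first three types can be applied to a single quality metric; the data, names and thresholds below are assumptions:

```python
# Hypothetical illustration: three of the four analytics types applied to a
# daily reject-rate series. Data, names and thresholds are invented.
from statistics import mean, stdev

reject_rate = [1.2, 1.1, 1.4, 1.3, 2.8, 1.2, 1.5]  # % rejects per day (made up)

# Descriptive: what happened?
print(f"mean={mean(reject_rate):.2f}%  max={max(reject_rate):.2f}%")

# Diagnostic: why did it happen? Flag days that deviate strongly from the mean.
mu, sigma = mean(reject_rate), stdev(reject_rate)
outliers = [day for day, r in enumerate(reject_rate) if abs(r - mu) > 2 * sigma]
print(f"days to investigate: {outliers}")

# Predictive: what's likely to happen? A naive linear trend over the window.
slope = (reject_rate[-1] - reject_rate[0]) / (len(reject_rate) - 1)
print(f"naive next-day forecast: {reject_rate[-1] + slope:.2f}%")
```

Prescriptive analytics would go one step further and recommend the corrective action itself.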

Our practice has spanned that full range of analytics. Projects have included:

       Identifying material quality using advanced pattern recognition

       Tracking plants’ utility-consumption rates to improve facility performance

       Preventing piston-pump failures with deviation-over-time analytics and alarming (a minimal sketch follows this list)

       Providing target feed rates to prevent continuous-process failures
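To make the piston-pump item concrete, here is a minimal sketch of deviation-over-time alarming; the tag, window size and alarm threshold are invented for illustration, not taken from the project:

```python
# Minimal sketch of deviation-over-time alarming (piston-pump example).
# Tag name, baseline window and threshold are assumptions for illustration.
from collections import deque

WINDOW = 60          # samples in the rolling baseline
ALARM_PCT = 15.0     # alarm when deviation from baseline exceeds 15%

baseline = deque(maxlen=WINDOW)

def check_sample(discharge_pressure: float) -> str | None:
    """Return an alarm message when the new sample drifts too far from the
    rolling baseline; otherwise fold the sample into the baseline."""
    if len(baseline) < WINDOW:
        baseline.append(discharge_pressure)
        return None  # still learning the baseline
    avg = sum(baseline) / WINDOW
    deviation_pct = abs(discharge_pressure - avg) / avg * 100.0
    if deviation_pct > ALARM_PCT:
        return f"ALARM: discharge pressure {deviation_pct:.1f}% off baseline"
    baseline.append(discharge_pressure)  # only healthy samples update baseline
    return None
```

Holding alarming samples out of the baseline keeps a developing failure from quietly becoming the new normal.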

These solutions weren’t based on textbook definitions, but on real-time industrial-process challenges. Every project has unique aspects, and no single commercial product provides a solution out of the box. Additionally, along with the technical aspects of any project, there are important management elements to achieving analytics success.

So where should you start?

Analytics prerequisites

When approaching an analytics project, the first prerequisite is setting expectations to establish your organization’s goals. This is critical to gaining top-down support. In the same vein, you must foster understanding among stakeholders throughout the organization so they know exactly why the project is important to their success.

Such non-technical communication skills will continue as the project proceeds; they’re critical to creating a solution that brings lasting change.

In approaching technical-deployment issues, assess at the IT/OT integration level what, exactly, will be required to achieve the desired dataset. It’s rare to find a fully connected IIoT infrastructure that puts all the data at your fingertips; you’ll more likely be working in a hybrid environment.

You’ll also need to assess the resources of the IT/OT technical staff; they’ll be planning, building, implementing and supporting these efforts.

Four steps to success

As you approach project execution, remember that success in analytics doesn’t come from the math or the solution itself, but from the quality of the effort you put into getting it adopted. Here’s an overview of four steps that can provide structure to managing the details of your project:

Conception & initiation

Start with stakeholders to determine who will use the system, and address ease-of-use and appearance issues. Secure the in-house or outside talent who will build the system. Conduct a technical assessment to secure details on data, from its source(s) to the location at which the application will run (e.g., your HMI, DCS, business system, or custom platform). Keep IT/OT access and security requirements in mind from the start.

Build & iterate

It’s time to design the analytics solution. Whether it’s an all-new design/development project or, more commonly, an update/expansion of a legacy solution, you’ll need to adapt the solution to the platform(s) it will run on and refine it over time. To arrive at the complete solution, test the build through iterations as you incorporate feedback, customize and repeat, until all users are satisfied.

Launch & execution

Once the build is ready, you’ll conduct introductory training for everyone from operators to management. It’s critical to nail the process of hands-on training, shift after shift. And even though the build is nominally complete, keep incorporating the feedback you receive to foster buy-in from operators. Also critical: manually testing any advanced, real-time analytics intended to run in closed feedback loops, stopping, pausing and fine-tuning before flipping to “auto.”
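One way to picture that manual-first pattern is an advisory wrapper that only recommends until operators flip it to “auto”; this is a sketch under assumed names, not the presenters’ implementation:

```python
# Hypothetical sketch of the manual-before-auto pattern for a closed-loop
# analytic. The model call and setpoint writer are placeholders.
class FeedRateAdvisor:
    def __init__(self):
        self.mode = "manual"   # start in manual: recommend, never write

    def step(self, process_data: dict) -> float:
        target = self._recommend(process_data)   # placeholder model call
        if self.mode == "auto":
            self._write_setpoint(target)         # closed loop: act directly
        else:
            print(f"RECOMMEND feed rate {target:.1f} (operator confirms)")
        return target

    def _recommend(self, process_data: dict) -> float:
        # Stand-in for the real analytic, e.g. a regression on recent samples.
        return 0.9 * process_data["current_feed"]

    def _write_setpoint(self, value: float) -> None:
        ...  # would write to the DCS/PLC via its own interface; omitted here
```

Starting in manual keeps operators in the loop, shift after shift, while trust in the analytic is built.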

Performance & control

It’s important to realize that analytics tend to drift in a dynamic, live environment. Continue to track reliability and accuracy to ensure analytics and process performance. As with the launch phase, don’t be afraid to pause as needed. Communicate in follow-ups with operators, foremen, supervisors, managers and engineers that this learning is an ongoing process.
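One simple way to track that drift, assuming you log predictions against measured outcomes, is to compare recent error to the error observed at commissioning; the degradation tolerance below is an example value, not a rule:

```python
# Minimal drift check, assuming predictions vs. measured outcomes are logged.
# The 25% degradation tolerance is an invented example value.
from statistics import mean

def drift_check(errors_commissioning: list[float],
                errors_recent: list[float],
                tolerance: float = 1.25) -> bool:
    """Flag drift when the recent mean absolute error has grown more than
    `tolerance` times the error observed at commissioning."""
    baseline = mean(abs(e) for e in errors_commissioning)
    recent = mean(abs(e) for e in errors_recent)
    return recent > tolerance * baseline

# e.g. drift_check(logged_errors_week1, logged_errors_this_week)
```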

So how do you know your project is done? When you walk into the control room and stakeholders tell you how well their new tool is working on the day shift and night shift. That’s when it becomes their new normal.