Case Study: Quick-rising smart manufacturing tools

Oct. 24, 2017

Better cake from big data. 

Factora's Michael Chang

Just a few years ago I was working with a customer who makes baked snacks: apple pies, cakes, chocolate swirls, you get the idea. Factora had been called in because of a waste issue. To give you an opportunity to see how tools have evolved, I'm going to tell you how we solved it then, and how we might solve it today.

Too tall is wasted

The problem was over-yield. Too many of a certain type of cake were rising too high, too often, leading to high waste numbers.

Now, going in, a production-line baker knows that if they create cakes smaller than the required minimum, the entire batch goes to waste. So their tendency is to aim a little high, to ensure they clear that regulatory hurdle. It's an ongoing issue in food production, and one smart manufacturing was born to remedy: by reducing variability, you can meet regulations with less waste.

Still, identifying the source of the too-tall cakes was no easy task. For those of you with no experience in food production, imagine vats of flour, hundreds of gallons of water flowing through industrial hoses, bags of sugar: in all, 300 meters of mixing and baking machinery. Is it the percentage of flour? The heat of the oven? The amount of water? The size of the eggs? Or some combination?

And did we mention that every line baker, over time, has developed their own way of managing the settings and producing the cakes? That the whole process is regarded as a type of expert black magic, known only to a skilled handful?

A data scientist and a trending tool

How did we solve the problem? With a data scientist, a trending tool, and many test batches, we arrived at a new set of norms for production. From there, we mounted overhead electronic displays which showed line bakers when any given key success factor was out of spec and needed adjustment.
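If you're curious what those displays were doing under the hood, here's a minimal sketch of the logic in Python. The factor names, targets, and tolerance bands are all invented for illustration; the real norms came out of those test batches.

```python
# Hypothetical spec bands derived from test batches; all numbers invented.
SPECS = {
    "oven_temp_c":    {"target": 180.0, "warn": 5.0, "alarm": 10.0},
    "batter_water_l": {"target": 120.0, "warn": 4.0, "alarm": 8.0},
}

def display_status(factor: str, reading: float) -> str:
    """Map a live sensor reading to the green/yellow/red shown overhead."""
    spec = SPECS[factor]
    deviation = abs(reading - spec["target"])
    if deviation > spec["alarm"]:
        return "red"     # fix this now!
    if deviation > spec["warn"]:
        return "yellow"  # check this
    return "green"       # in spec

print(display_status("oven_temp_c", 187.0))  # -> "yellow", 7 degrees off target
```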

The new system was a textbook success. Within a week, the frequency of yellows (check this) and reds (fix this now!) showing up on the displays fell precipitously. As variability fell, quality rose and waste dropped. Our client achieved 100% ROI on the project in well under a year; the savings after that were all gravy (or frosting).

Now, how would we do it today?

Even in just the past few years, many new big data tools have appeared in our industry. At Factora, we’re now working with a tool called Analytics, by ThingWorx.

Were we addressing the cake case today, we might dump all the data into Analytics: ambient temperatures, raw materials, and so on, with each row of test data tied to one critical factor, cake success or failure.

Analytics would ‘do the math’ for us, generating the optimal set of factors to repeatedly make successful cakes.
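I can't show the ThingWorx Analytics API itself here, but the framing carries over to any off-the-shelf classifier. Here's a rough stand-in using scikit-learn, with invented column names and data, just to show the shape of the input: one row per test batch, one outcome column.

```python
# Stand-in for the Analytics workflow using scikit-learn;
# column names and values are invented for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Each row is one test batch; "success" is the critical outcome factor.
batches = pd.DataFrame({
    "flour_pct":      [54.0, 55.5, 53.2, 56.1, 54.8],
    "water_l":        [118, 122, 117, 125, 120],
    "oven_temp_c":    [178, 183, 176, 188, 181],
    "ambient_temp_c": [21, 24, 20, 26, 22],
    "success":        [1, 1, 1, 0, 1],  # 1 = in-spec cake, 0 = over-yield
})

X = batches.drop(columns="success")
y = batches["success"]

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Ask whether a proposed set of factors should bake in spec.
proposed = pd.DataFrame([{"flour_pct": 55.0, "water_l": 121,
                          "oven_temp_c": 182, "ambient_temp_c": 23}])
print(model.predict(proposed))  # 1 = predicted success
```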

What's the difference? Real-time, speed, and new insights

Real-time: The electronic display would generate predictions in real time, based on an ever-growing database of cake failures and successes. Rather than a data expert developing a set of static test criteria from a fixed dataset, the predictions would be machine-generated, becoming ever more sophisticated over time, backed by ever more data.
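To picture the difference from the static approach, here's a sketch of incremental (online) learning, with scikit-learn's SGDClassifier standing in for the real product's retraining mechanism. The model folds in each new batch outcome as it arrives; factor values are invented.

```python
# Sketch of online learning as new batch outcomes arrive; a stand-in
# for the tool's actual retraining mechanism, which may differ.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier(loss="log_loss", random_state=0)
classes = np.array([0, 1])  # 0 = failed batch, 1 = in-spec batch

def on_batch_complete(features: np.ndarray, outcome: int) -> None:
    """Called once per finished batch: fold the result into the model."""
    model.partial_fit(features.reshape(1, -1), [outcome], classes=classes)

# Simulated stream of batch results: (flour %, water L, oven temp C).
stream = [
    (np.array([55.0, 120.0, 181.0]), 1),
    (np.array([56.5, 126.0, 189.0]), 0),
    (np.array([54.2, 119.0, 179.0]), 1),
]
for features, outcome in stream:
    on_batch_complete(features, outcome)

# Real-time prediction for the batch currently on the line.
print(model.predict(np.array([[55.3, 121.0, 182.0]])))
```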

Speed: Analytics can take five years of data and, in a few hours, generate an algorithm that offers 70% accuracy. True, that's not 100%. But surprisingly often, 70% is enough. Not to mention, the speed and low operational cost of the tool allow far more performance issues to be addressed.

New insights: An additional benefit of Analytics is that you can throw as many inputs (factors) into the system as you wish, thereby discovering new or hidden insights.
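One simple way to surface those hidden insights, again with scikit-learn standing in for Analytics and invented data: rank every input you threw at the model by how strongly it predicts success.

```python
# Ranking candidate inputs by predictive weight; a stand-in for the
# hidden-insight discovery described above. All data invented.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# In practice this would be the full export of line data, with as many
# candidate factors as you care to throw in.
data = pd.DataFrame({
    "flour_pct":      [54.0, 55.5, 53.2, 56.1, 54.8, 55.2],
    "egg_weight_g":   [58, 61, 57, 63, 59, 60],
    "ambient_temp_c": [21, 24, 20, 26, 22, 23],
    "shift":          [0, 1, 0, 1, 0, 1],  # day = 0, night = 1
    "success":        [1, 1, 1, 0, 1, 1],
})

X = data.drop(columns="success")
y = data["success"]
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Sort inputs by importance; an unexpected leader is a new insight.
for factor, importance in sorted(zip(X.columns, model.feature_importances_),
                                 key=lambda p: p[1], reverse=True):
    print(f"{factor:15s} {importance:.3f}")
```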

In summary

As these tools proliferate and grow more flexible, costs are coming down. What dataset would you like to dump into an analytical, predictive tool? What savings could you generate?