Use manufacturing intelligence to find value in process-control data
It’s better when you can see the forest, the trees of greatest interest and the relevant woodland conditions
By Steve Wise
If you manufacture anything process-based, whether food, beverages, chemicals or pharmaceuticals, you are almost certainly collecting and managing large amounts of data. As ingredients are mixed and blended, sensors placed on the various pieces of equipment provide a constant stream of measurements, recording and transmitting flow rates, temperatures, pressures and other parameters.
Sure, it’s great that your control system can stream data at millisecond intervals, but what happens when it comes time to analyze these data, especially when they come from disparate machines or systems? Herein lies the problem: Many process engineers are looking at each individual tree, rather than the forest as a whole.
Consider a food manufacturer mixing materials in large vessels. Operators take samples from these vessels in real time to proactively monitor and respond to irregularities. If a problem occurs, an alarm sounds, the operator quickly remedies the situation, and processing continues. But what good are these real-time alerts if they never go beyond the shop floor, the data’s “first life”?
What to collect
Before gathering data into a centralized repository or hub to uncover manufacturing intelligence, you must first understand which data should be collected. This is best determined by seeking input from staff members at each level of your organization. What type of data would be most useful to each of them?
Most likely, C-level executives will be interested in more advanced data summaries that can help lead to overall cost savings. Percentages and predictions, such as in scrap reduction, are highly valued. For quality professionals and plant managers, defect rates and overall equipment effectiveness (OEE) metrics are key. Plant operators are more focused on real-time alerts for out-of-spec events and other product-specific information.
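To make the OEE metric mentioned above concrete, here is a minimal sketch of the standard calculation (availability × performance × quality). The shift figures are hypothetical, chosen only for illustration:

```python
# Hypothetical figures for one production shift (illustrative only)
planned_time = 480        # minutes scheduled for production
downtime = 45             # minutes of unplanned stops
ideal_cycle_time = 0.5    # minutes per unit at rated speed
units_produced = 800
good_units = 776          # units that passed inspection

# The three standard OEE factors
availability = (planned_time - downtime) / planned_time
performance = (ideal_cycle_time * units_produced) / (planned_time - downtime)
quality = good_units / units_produced

oee = availability * performance * quality
print(f"OEE: {oee:.1%}")
```

Each factor isolates a different kind of loss (stops, slow cycles, defects), which is why OEE resonates with plant managers: it shows not just that effectiveness dropped, but where.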
By taking into consideration the needs of your data’s audience when implementing process control, you can produce ROI, a major factor in more readily affirming the value of your quality programs to upper management.
Just because something can be measured doesn’t mean it should be measured. So, what makes a good metric? First, a good metric must be objective, measurable and bi-directional, or able to serve customers both upstream and downstream. With readily available, bi-directional data, downstream operations can compensate for upstream anomalies, and upstream operations can be more responsive to downstream issues.
A metric must also be customer-specific, to easily help its respective audience — whether a CIO or plant operator — make better business decisions using either real-time or historical information. Focus first on collecting data that drive tier-1 decisions, such as profitability and customer satisfaction, and then tier-2 data, such as inventory turns and warranty costs. The top-tier goals will naturally drive and link to those of the lower level.
When to collect
The rapid collection of vast amounts of data is an innate tendency within the process industry, and there is a fine line between the need to collect data in real time and collecting data too fast. From a statistical perspective, there must be enough time between data points for variation to occur; otherwise, control charts aren’t meaningful.
The secret to finding a middle ground is relying on rational subgrouping. Rather than collecting data every minute, get a snapshot of your processes every hour, for example. Try taking samples in different, yet reasonable, time intervals and see what you can uncover. In the processing industry, you cannot see the full picture of your operations without backing away and looking at it from different levels.
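As a sketch of what rational subgrouping feeds into, the snippet below builds X-bar and R control limits from hourly subgroups of four samples each. The flow-rate readings and subgroup size are hypothetical; the constants A2, D3 and D4 are the standard Shewhart chart factors for a subgroup size of four:

```python
import statistics

# Hypothetical flow-rate readings: one subgroup of 4 samples per hour
subgroups = [
    [5.1, 5.0, 5.2, 4.9],
    [5.0, 5.3, 5.1, 5.2],
    [4.8, 5.0, 4.9, 5.1],
    [5.2, 5.1, 5.0, 5.3],
]

# Standard Shewhart control-chart constants for subgroup size n = 4
A2, D3, D4 = 0.729, 0.0, 2.282

xbars = [statistics.mean(s) for s in subgroups]    # subgroup means
ranges = [max(s) - min(s) for s in subgroups]      # subgroup ranges
grand_mean = statistics.mean(xbars)
r_bar = statistics.mean(ranges)

# X-bar chart limits: grand mean +/- A2 * average range
ucl_x = grand_mean + A2 * r_bar
lcl_x = grand_mean - A2 * r_bar
# R chart limits
ucl_r = D4 * r_bar
lcl_r = D3 * r_bar

print(f"X-bar limits: {lcl_x:.3f} .. {ucl_x:.3f}")
print(f"R limits:     {lcl_r:.3f} .. {ucl_r:.3f}")
```

The point of subgrouping is visible in the math: the limits are driven by the within-subgroup range, so if samples are taken too fast for real variation to occur, r_bar collapses and the limits become uselessly tight.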
After collecting meaningful data — in rational intervals — and assimilating them into your hub, you can take the next step in revealing the forest. A robust manufacturing intelligence hub powered by statistical process control (SPC) allows you to slice and dice those data in millions of ways for easy analysis and reporting, giving them a second life. Utilizing data analysis tools — such as root cause analysis, Pareto charts and box-and-whisker charts — offers the opportunity to find the true meaning and value in the data collected, while presenting it in the best format for each audience.
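One of the tools named above, the Pareto chart, can be sketched in a few lines: tally defect causes from a quality log, sort by frequency, and track the cumulative percentage to surface the “vital few” causes. The defect names here are hypothetical:

```python
from collections import Counter

# Hypothetical defect log pulled from a centralized quality hub
defects = ["seal leak", "underfill", "seal leak", "label skew",
           "seal leak", "underfill", "cap torque", "seal leak"]

counts = Counter(defects).most_common()   # causes sorted by frequency, descending
total = sum(n for _, n in counts)

# Pareto table: cumulative percentage highlights the vital few causes
cumulative = 0
for cause, n in counts:
    cumulative += n
    print(f"{cause:<12} {n:>3}  {100 * cumulative / total:5.1f}%")
```

The same tally can be re-sliced by shift, line or plant, which is the “second life” the hub provides: one data set, many audience-specific views.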
Putting key players into place
In embarking on your quest for manufacturing intelligence, it is imperative to invest the time and resources, namely the right people and technology, needed for optimal results. A good quality team will include a champion from within the upper echelon of the company and skilled hands-on team members. The champion provides an authoritative voice to reinforce the total value and benefits of a new quality system.
Remember that quality initiatives don’t come full circle overnight. Therefore, initially, keep the scope of your implementation manageable. Home in on an area that will produce quick and significant results before expanding to other areas. Soon enough, you will find your organization identifying new ways to improve product quality, maintain compliance, boost profitability and enhance customer satisfaction.
Steve Wise is vice president of statistical methods at InfinityQS International, the global authority on manufacturing intelligence and enterprise quality. A Six Sigma Black Belt, Wise focuses on ensuring proper use of statistical techniques within InfinityQS’ software offerings and the application of these techniques for the customer base.
InfinityQS provides quality management software and services to manufacturers worldwide. Its mission is to help manufacturers of all sizes monitor and control product quality from a single facility to an entire enterprise and supply chain.