Data-driven manufacturing: Using operational analytics tools to improve consistency and drive profitability
One of the biggest challenges facing process manufacturers today is achieving consistent performance across different production lines, facilities and geographies. Inconsistent performance can lead to quality issues, higher costs and missed delivery targets, all of which can undermine profitability and customer satisfaction. The good news is that operational analytics tools can help address this challenge.
By combining analytics and operational functions, manufacturing operations management (MOM) tools enable engineering teams to identify optimal operating conditions and create guardrails that ensure processes run within those parameters. Doing so can help improve consistency, reduce variability and enhance product quality. A good MOM toolkit typically includes features such as control charts, limits, centerlines, dashboards, reports and smart alarms, all designed to help operators maintain optimal process conditions and quickly identify and correct deviations.
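To make the control-chart idea concrete, the sketch below computes a centerline and Shewhart-style ±3-sigma control limits from baseline data and flags readings that fall outside them. This is an illustrative example, not dataPARC's implementation; the data and function names are hypothetical.

```python
# Illustrative sketch: deriving a centerline and +/- 3 sigma control limits
# from in-control baseline data, then flagging out-of-control readings.
# Values and names are hypothetical, not from any specific MOM product.
from statistics import mean, stdev

def control_limits(baseline):
    """Return (lower limit, centerline, upper limit) from baseline data."""
    center = mean(baseline)
    sigma = stdev(baseline)
    return center - 3 * sigma, center, center + 3 * sigma

def flag_deviations(readings, lcl, ucl):
    """Return (index, value) pairs that fall outside the control limits."""
    return [(i, x) for i, x in enumerate(readings) if not lcl <= x <= ucl]

baseline = [101.2, 99.8, 100.5, 100.1, 99.6, 100.9, 100.3, 99.9]
lcl, center, ucl = control_limits(baseline)
print(flag_deviations([100.2, 104.9, 99.7], lcl, ucl))  # flags the 104.9 reading
```

In practice a MOM platform layers features like smart alarms and run rules on top of this basic idea, but the centerline-and-limits calculation is the foundation.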
Distinguishing between operations and analytics tools in MOM platforms
To fully optimize industrial processes, both the operational and analytics functions of MOM platforms are necessary. While they work together to achieve the common goal of process optimization, they serve different purposes.
Analytics tools primarily focus on trending, modeling, statistics and correlative parts of MOM platforms. These tools help identify optimal operating conditions and investigate common or recurring causes of unplanned downtime or product loss. By reducing variability, analytics tools lay the foundation for process optimization.
The insights gained from the analytical phase inform the development of operational tools, which serve as guardrails to ensure that the plant operates consistently and efficiently. Once optimal conditions have been identified, operational tools like control charts, centerlines, alarms, reports and dashboards are developed to keep the plant running consistently within a target operating range.
Creating an operation and analytics feedback loop
While visually identifying issues during a plant walkthrough can be useful, analytics provide a more comprehensive and data-driven understanding of problems. Without applying analytical insights, you end up repeatedly fighting the same fires. Manufacturing operations toolkits allow you to look forward and implement a better operating paradigm for continuous improvement.
The process of passing diagnostic or predictive analytical data along to operational systems essentially results in a feedback loop that empowers subject-matter experts (SMEs) to solve problems and incorporate the solutions into operations.
To achieve consistency with MOM tools, they need to be able to do the following:
- Collect all the available essential data from historians, DCSs, SCADA systems, etc.
- Integrate all that data and make it accessible in one location.
- Provide descriptive, diagnostic and possibly even predictive analytics tools that SMEs can easily use in troubleshooting and root cause analysis.
- Incorporate the solution into operations to prevent the problem from recurring.
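The four steps above can be sketched as a simple loop: gather tag readings from several systems, merge them into one view, run a descriptive check, and fold the finding back into the operating guardrails. The source names, tags and limits below are purely illustrative assumptions, not a real MOM API.

```python
# Hypothetical sketch of the collect -> integrate -> diagnose -> operationalize
# feedback loop. Tag names, values and limits are illustrative only.

def collect():
    """Steps 1-2: gather readings from multiple systems into one record set."""
    historian = {"reactor_temp": 182.0}   # e.g. from a data historian
    scada = {"feed_rate": 41.5}           # e.g. from a SCADA system
    return {**historian, **scada}         # one integrated view

def diagnose(data, limits):
    """Step 3: a simple descriptive check an SME might run while troubleshooting."""
    return {tag: val for tag, val in data.items()
            if tag in limits and not limits[tag][0] <= val <= limits[tag][1]}

def operationalize(limits, tag, new_range):
    """Step 4: fold the SME's finding back into the operating guardrails."""
    limits[tag] = new_range
    return limits

limits = {"reactor_temp": (150.0, 180.0)}
issues = diagnose(collect(), limits)                    # reactor_temp is out of range
limits = operationalize(limits, "feed_rate", (35.0, 45.0))  # new guardrail added
```

The point of the loop is the last step: once a root cause is understood, the fix becomes a standing guardrail rather than a one-off correction.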
Ideally, you want both operational and analytics insight simultaneously. Data analysis empowers users to solve problems, and institutionalizing those insights leads to consistent decision-making, i.e., consistent data should yield consistent results.
Making operational data actionable
To make operational data actionable, you must develop a “single source of truth.” In many facilities, personnel have spent decades building out the data layer of their infrastructure. Those data investments could include data historians, lab quality data, MES (manufacturing execution system) data, ERP (enterprise resource planning) data, maintenance management records, etc. The existing investments should be leveraged to form a data foundation, inform processes and create value.
Integrating all this data provides a full view of your plant’s performance — a single source of truth. Once a single source of truth is established, data can be transformed in two ways. First, data can be transformed for performance by analyzing trends over time and retrieving other performance data as needed. Second, data can be transformed for operational context by analyzing data from the perspective of work shifts or product types.
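A minimal sketch of the second transformation, operational context, is shown below: timestamped readings (assumed already merged from historian and lab sources) are labeled by work shift and then averaged per shift. The timestamps, values and shift boundaries are hypothetical.

```python
# Minimal sketch: transforming merged, timestamped readings into an
# operational-context view (per-shift averages). All data is hypothetical.
from datetime import datetime
from statistics import mean

readings = [
    (datetime(2024, 5, 1, 6, 30), 98.1),   # day shift
    (datetime(2024, 5, 1, 14, 0), 97.4),   # day shift
    (datetime(2024, 5, 1, 22, 15), 95.2),  # night shift
]

def shift_of(ts):
    """Operational context: label each reading with its work shift."""
    return "day" if 6 <= ts.hour < 18 else "night"

def by_shift(rows):
    """Performance view: average the metric within each shift."""
    groups = {}
    for ts, val in rows:
        groups.setdefault(shift_of(ts), []).append(val)
    return {shift: mean(vals) for shift, vals in groups.items()}

print(by_shift(readings))  # {'day': 97.75, 'night': 95.2}
```

The same grouping logic applies to product types or production lines; the value comes from asking performance questions against one integrated dataset rather than per-system extracts.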
To empower the operations team to make quick decisions, a self-service data analysis layer can be implemented. This layer gives the SMEs who know the processes and understand the plant direct access to the data they need to solve problems.
Integrating external data and advanced analytics can further inform operations. For example, manufacturers may have additional analytics tools for specific equipment. Incorporating third-party data sources can add value to that single source of truth.
By making operational data actionable, manufacturers can improve their process performance and achieve their operational goals.
Promoting value from consistency
Access to real-time operational data for making data-driven decisions yields significant value by increasing production efficiency, improving throughput, reducing variable costs and more. However, consistency in data is equally important. In fact, reducing variability in the system is often where the greatest value lies.
To achieve consistency, you need access to all the necessary data from a single operations toolkit. Eliminating data silos and connecting different systems is the only way to get a holistic view of operations. It is also essential to balance corporate uniformity with local factors and the flexibility individual sites require. A unified perspective makes it easier to analyze the data meaningfully, make decisions about plant-specific assets, and invest in promoting consistency to drive value.
The key to promoting operational consistency and enabling value is mining the data with the proper operations toolkit. Start by gathering the correct data for analysis. Creating a single source of real-time data and analytics can identify issues before they escalate and add to institutional knowledge to prevent recurring problems. The goal is to establish a baseline to gauge performance improvement and promote operational consistency, which ultimately drives value.
Kevin Jones serves as the Director of Sales and Marketing for dataPARC. Kevin has been with the company since 2001 and has over 22 years of experience in process industries and using data to drive decisions. Kevin holds an undergraduate degree in Chemical Engineering from the University of Idaho. dataPARC is a leading provider of industrial analytics and data visualization tools for process optimization and decision support.