Powering up batch and process control analytics with industrial DataOps

May 22, 2024
Industrial DataOps supports the goal of providing process manufacturers with smarter insights, guiding them toward optimized process parameters and improved batch outcomes.

Within the process manufacturing industries, batch/process control analytics offer a powerful toolbox to improve efficiency, productivity, quality, consistency and competitiveness.

By leveraging data to optimize processes, process manufacturers can reduce costs, minimize waste and ultimately enhance their bottom line. However, fully optimizing batch and process control analytics depends on the veracity, quality and amount of data available for analysis. As the saying goes, “garbage in, garbage out” — the quality of output depends on the quality of input. Therefore, a robust analytics strategy requires the seamless connection and integration of diverse sources, ensuring accurate data collection, contextualization and utilization by process manufacturers. Industrial data operations (DataOps) aims to dismantle data silos, making industrial data widely accessible and usable to propel operational improvements.

Highlighting incumbent challenges

Process manufacturers lacking a cohesive batch/process control analytics strategy face many challenges. To leverage industrial DataOps and harness the full value of data, it is important to understand these challenges, the most notable of which map directly to key performance indicators (KPIs). These are some of the key metrics involved in batch performance analytics:

  1. Product quality — Variations in product quality are a common problem, due to a lack of insight into process conditions. With limited data available, it is hard to detect and correct variations in quality, as data can be siloed and not collected cohesively in real time. Without that foundation, there can be no “Golden Batch” or statistical control limit guidance as batch production takes place. Golden Batch manufacturing was defined by Control Global back in 2008 as “the time-based profile of the measurement values that were recorded for a particular batch that met product quality targets.” Simply put, it is a system for identifying an ideal output and optimizing the manufacturing process to replicate the conditions that produced it. Without timely information available, late process corrections can lead to batch losses, with the cost of scrap estimated at anywhere from $100k to $1 million-plus annually.
  2. Asset utilization — The inability to increase throughput and shorten cycle time due to a lack of analytics on batch performance is another key challenge for process manufacturers. Without cohesive cross-site performance comparisons to identify and share best practices, improvement stalls and batches must be analyzed manually, which is both resource-intensive and time-consuming.
  3. Manufacturing cost — The third main challenge is a lack of insight into raw material consumption and energy usage across the five WAGES sources (water, air, gas, electricity and steam) on a batch-to-batch basis. This lack of visibility into the process makes it impossible to correct deviations and produce more consistent, repeatable batches. Off-spec material or product, usually the result of sub-par material delivery or bad ingredients, must be reprocessed or additional material must be added, lowering asset utilization and increasing manufacturing costs. The estimated cost of rework, excess raw material and WAGES is around $1 million.

The common theme across all three is that creating these metrics requires trusted, timely and meaningful data.
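
As a rough illustration of what those metrics look like in practice, the sketch below derives a simple measure from each family using hypothetical per-batch records; the column names, targets and values are invented for the example, not drawn from any real system.

```python
# Rough illustration of the three KPI families from hypothetical
# per-batch records; column names and targets are invented.
import pandas as pd

batches = pd.DataFrame({
    "batch_id": ["B-101", "B-102", "B-103"],
    "assay_pct": [98.2, 94.7, 97.9],   # product quality measurement
    "cycle_time_h": [6.1, 7.4, 6.3],   # asset utilization
    "energy_kwh": [820, 990, 840],     # one of the WAGES inputs
})

# Product quality: absolute deviation from a target specification.
quality_dev = (batches["assay_pct"] - 98.0).abs()

# Asset utilization: cycle time versus the best batch observed.
cycle_excess_h = batches["cycle_time_h"] - batches["cycle_time_h"].min()

# Manufacturing cost: per-batch energy use against the batch mean.
energy_vs_mean = batches["energy_kwh"] - batches["energy_kwh"].mean()

print(pd.DataFrame({"batch_id": batches["batch_id"],
                    "quality_dev": quality_dev,
                    "cycle_excess_h": cycle_excess_h,
                    "energy_vs_mean": energy_vs_mean}))
```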

The value of industrial DataOps

Introducing industrial DataOps into the batch/process control analytics process standardizes, centralizes and contextualizes complex industrial data, which can help solve data architecture and integration challenges. Having this single source of truth with automated data pipelines offers process manufacturers a common, streamlined way to discover, understand and analyze data. The capability to intelligently uncover relationships and identify gaps enables granular parsing of data at a product level.

Leveraging existing information technology (IT) and operational technology (OT) data connections, industrial DataOps feeds into a defined manufacturing production data platform that enhances data value. This accelerates the development and scaling of applications, such as batch performance, energy management and asset intelligence, addressing incumbent challenges caused by fragmented visibility and the need for manual workflows due to distributed and siloed data sources. This convergence of IT and OT data sources eliminates historical limitations, allowing for rapid deployment and scalability of solutions.

In terms of batch performance analytics, industrial DataOps supports control of process variability through a “Golden Batch” framework, as shown in Figure 1. Baseline measures are established for batch and routine variations. Process behavior control charts and “out of trend” event analysis then detect and quantify any variations from the established baseline, driving root cause analysis through correlation analysis and predictive modeling. The loop is then closed through corrective actions to manufacturing processes.
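
A minimal sketch of the detection step in that loop, assuming a history of golden-run KPI values; the three-sigma limits and the consecutive-increase trend rule here are illustrative stand-ins, not the actual algorithm behind the framework.

```python
# Illustrative process behavior chart: baseline limits from golden
# runs, then flag out-of-limit and out-of-trend (OOT) behavior.
# The numbers and the simple trend rule are examples only.
import pandas as pd

golden = pd.Series([71.8, 72.1, 71.9, 72.0, 72.2])  # golden-run KPI values
center = golden.mean()
sigma = golden.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma   # three-sigma limits

recent = pd.Series([72.0, 72.3, 72.6, 72.9, 73.3], name="kpi")
out_of_limits = (recent > ucl) | (recent < lcl)

# Stand-in trend rule: three consecutive increases suggest drift
# even before a control limit is breached.
drifting = recent.diff().gt(0).rolling(3).sum().eq(3)

print(pd.DataFrame({"kpi": recent, "out_of_limits": out_of_limits,
                    "drifting": drifting}))
```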

This Golden Batch framework is just one example of how industrial DataOps can be integrated to improve batch/process control analytics and solve tangible problems on a single process manufacturing line. Because industrial DataOps connects at the enterprise level, the approach can easily be scaled across an entire organization through, for example, a cloud-native Software-as-a-Service (SaaS) platform spanning multiple tenants and manufacturing sites, resulting in operational performance improvements that drive elements such as:

  • Multi-level KPI tracking and comparison to the Golden Batch KPIs.
  • KPI trend monitoring over time via a variety of chart displays.
  • Comparison of the performance of multiple batches at one time (sketched after this list).
  • Root-cause analysis for batches not following the Golden Batch profile.
  • KPI trend analysis and forecasting.
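
A minimal sketch of that multi-batch comparison, assuming per-batch KPI values and a set of Golden Batch targets; all names and numbers are hypothetical.

```python
# Illustrative comparison of several batches against Golden Batch
# KPI targets; all names and numbers are hypothetical.
import pandas as pd

golden_kpis = pd.Series({"yield_pct": 96.0, "cycle_time_h": 6.0})

kpis = pd.DataFrame({
    "batch_id": ["B-201", "B-202", "B-203"],
    "yield_pct": [95.1, 96.4, 93.8],
    "cycle_time_h": [6.2, 5.9, 7.1],
}).set_index("batch_id")

# Percentage gap to the golden target for every batch and KPI at once.
gap_pct = (kpis - golden_kpis) / golden_kpis * 100
print(gap_pct.round(1))
```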

Unlocking the Golden Batch

Understanding the process necessary to realize these benefits of scale is key to implementing a robust, optimized batch performance analytics strategy that leverages DataOps and unlocks the Golden Batch. Data collection, contextualization, monitoring and analysis deliver valuable insights, visualized through an interactive dashboard, that enable process manufacturers to constantly refine and improve operations by following a logical industrial DataOps process:

Data collection

There are three types of data that need to be integrated: event data, batch data and time series data. These components, when seamlessly integrated, provide a comprehensive view of industrial processes, enabling efficient analysis and decision-making.
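
A minimal sketch of how those three data types might sit side by side, linked by a shared batch identifier; the shapes and field names below are illustrative only.

```python
# Illustrative shapes for the three data types, linked by batch_id.
# All identifiers and values below are invented for the example.
import pandas as pd

# Batch data: one row per batch (product, recipe, start time).
batch = pd.DataFrame({"batch_id": ["B-301"], "product": ["P-42"],
                      "start": pd.to_datetime(["2024-05-01 06:00"])})

# Event data: discrete occurrences such as alarms and operator actions.
events = pd.DataFrame({"batch_id": ["B-301", "B-301"],
                       "ts": pd.to_datetime(["2024-05-01 06:45",
                                             "2024-05-01 07:10"]),
                       "event": ["HIGH_TEMP_ALARM", "OPERATOR_ACK"]})

# Time series data: regularly sampled process measurements.
ts = pd.DataFrame({"batch_id": "B-301",
                   "ts": pd.date_range("2024-05-01 06:00", periods=4,
                                       freq="30min"),
                   "temp_c": [20.5, 45.2, 71.9, 72.1]})

print(batch, events, ts, sep="\n\n")
```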

Data is collected from disparate sources through standard extractors such as Microsoft SQL or FT Historian, as well as from formatted flat files, Excel spreadsheets and similar formats. This data is extracted from the source systems, with tabular data stored in its original format as a copy in what is called a staging area. This avoids requesting data multiple times, allows changes to be made to the copy to fit business needs without manipulating the source, and prevents data loss.
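
A minimal sketch of that staging step, using an in-memory SQLite database as a stand-in for the real source system; the table, columns and staging path are placeholders.

```python
# Illustrative staging copy: pull tabular data once from the source
# system and land an unmodified copy, so downstream work re-uses the
# copy and never mutates the source. All names here are placeholders.
import sqlite3
from pathlib import Path

import pandas as pd

# Stand-in for the real source (Microsoft SQL, FT Historian, etc.).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE batch_results (batch_id TEXT, yield_pct REAL)")
conn.execute("INSERT INTO batch_results VALUES ('B-401', 95.3)")

# Extract once, keeping the tabular data in its original shape.
raw = pd.read_sql_query("SELECT * FROM batch_results", conn)

# Persist the copy to the staging area (parquet here needs pyarrow).
Path("staging").mkdir(exist_ok=True)
raw.to_parquet("staging/batch_results.parquet", index=False)
```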

Establishing a central repository that aggregates various types of data and merges disparate datasets enables users to gain a consolidated view that goes beyond the isolated utility of individual repositories. This centralized approach allows for meaningful correlations, offering a more profound analysis of industrial operations. The integration and overlay of diverse datasets in a central repository unlock valuable insights for enhancing operational efficiency.
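
A minimal sketch of the consolidation idea: joining staged time series samples to batch metadata so each measurement carries context that neither dataset holds alone. Column names are illustrative.

```python
# Illustrative merge of two staged datasets into one consolidated view.
import pandas as pd

batch = pd.DataFrame({"batch_id": ["B-501", "B-502"],
                      "product": ["P-42", "P-42"],
                      "recipe_rev": ["R3", "R4"]})
ts = pd.DataFrame({"batch_id": ["B-501", "B-501", "B-502"],
                   "temp_c": [71.8, 72.4, 74.0]})

# Each time series sample now carries its product and recipe revision,
# enabling correlations that neither dataset supports on its own.
consolidated = ts.merge(batch, on="batch_id", how="left")
print(consolidated)
```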

Data contextualization

The way data is collected is often dissociated from the way it needs to be analyzed, so adding context to batch data, time series data, and quality alarms and events makes it more meaningful, relevant and usable. Linking data to a flexible data model is the next stage, and it is crucial for interpreting data accurately, uncovering insights and making informed decisions. Data is transformed from one state to another using integrated data transformation tools, depending on the requirements and technology preferences. For example, data may need to be reshaped, enriched and contextualized by matching it with other data objects, or checked for quality, such as verifying that all the required information is present in the data object.
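
A minimal sketch of those transformation steps on hypothetical staged records: reshaping long-form samples, enriching them against a simple data model, and running a basic completeness check.

```python
# Illustrative transformation pass: reshape, enrich, then validate.
import pandas as pd

# Staged samples arrive "long"; reshape to one column per tag.
long_form = pd.DataFrame({"batch_id": ["B-601"] * 4,
                          "tag": ["temp_c", "temp_c",
                                  "pressure_bar", "pressure_bar"],
                          "sample": [0, 1, 0, 1],
                          "value": [71.9, 72.2, 2.1, 2.0]})
wide = long_form.pivot(index=["batch_id", "sample"], columns="tag",
                       values="value").reset_index()

# Enrich: attach context from the data model (here, a simple lookup).
model = pd.DataFrame({"batch_id": ["B-601"], "unit": ["Reactor-2"]})
enriched = wide.merge(model, on="batch_id", how="left")

# Data quality: confirm every required field is present and populated.
required = ["batch_id", "temp_c", "pressure_bar", "unit"]
missing = [c for c in required
           if c not in enriched or enriched[c].isna().any()]
assert not missing, f"incomplete data objects: {missing}"
```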

Unveiling temporal patterns for enhanced monitoring

Overlaying data opens the door to temporal analysis, enabling users to track and compare crucial parameters over time. For instance, in batch processing, one can scrutinize temperature profiles across different runs to identify patterns and anomalies. This real-time monitoring, extending beyond the immediate batch, provides a comprehensive perspective on the overall process quality. 
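
A minimal sketch of that overlay, aligning each run's temperature profile on a common time-into-batch axis and comparing it to a golden profile; the values are invented for illustration.

```python
# Illustrative temporal overlay: compare temperature profiles across
# runs on a shared "minutes into batch" axis. Values are invented.
import pandas as pd

runs = pd.DataFrame({
    "batch_id": ["B-701"] * 3 + ["B-702"] * 3,
    "min_into_batch": [0, 30, 60, 0, 30, 60],
    "temp_c": [20.1, 55.4, 72.0, 20.3, 49.8, 68.2],
})

# One column per run, indexed by elapsed time: ready to plot or diff.
overlay = runs.pivot(index="min_into_batch", columns="batch_id",
                     values="temp_c")
golden = pd.Series([20.0, 55.0, 72.0], index=[0, 30, 60], name="golden")

# Per-run deviation from the golden profile at each time point;
# B-702 running cold at 30 minutes stands out immediately.
print(overlay.sub(golden, axis=0))
```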

Contextualization is key in identifying the root causes of variations in batch quality. By associating events and process data, users can trace back to specific conditions or operations that may have affected the final product quality. This capability allows for proactive adjustments and improvements to prevent the recurrence of quality issues.

Tracking manual operations and raw material utilization

Where manual operations are still necessary, data contextualization extends to monitoring those operations and raw material utilization. For instance, tracking the efficiency of manual operators in adhering to predetermined timelines can significantly impact batch cycle times. Additionally, precise control over raw material set points ensures optimal usage, preventing waste and contributing to the desired product quality.
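
A minimal sketch of both checks, with hypothetical targets: manual step durations against their planned timelines, and raw material charges against their set points.

```python
# Illustrative checks on manual operations and raw material utilization.
# Planned times and set points are hypothetical.
import pandas as pd

steps = pd.DataFrame({"step": ["charge_A", "manual_stir", "sample_pull"],
                      "planned_min": [10, 15, 5],
                      "actual_min": [11, 24, 5]})
steps["overrun_min"] = steps["actual_min"] - steps["planned_min"]

materials = pd.DataFrame({"material": ["resin", "catalyst"],
                          "setpoint_kg": [100.0, 2.50],
                          "charged_kg": [103.2, 2.48]})
materials["deviation_pct"] = ((materials["charged_kg"]
                               - materials["setpoint_kg"])
                              / materials["setpoint_kg"] * 100)

print(steps, materials.round(2), sep="\n\n")
```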

Continuous improvement and scenario analysis

The integration of a centralized data repository facilitates continuous improvement initiatives. Process engineers can compare batches, analyze historical performance and make data-driven decisions to enhance efficiency. This iterative approach enables scenario analysis, empowering users to assess the impact of changes and innovations on batch outcomes.

Site managers can determine how efficiently each product is being manufactured and investigate process idle time, allowing them to make more batches with the existing process equipment.
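
A minimal sketch of that idle-time investigation, assuming recorded batch start and end timestamps: the gap between one batch's end and the next batch's start is time the equipment could be producing.

```python
# Illustrative idle-time calculation between consecutive batches
# on the same unit. Timestamps are invented.
import pandas as pd

log = pd.DataFrame({
    "batch_id": ["B-801", "B-802", "B-803"],
    "start": pd.to_datetime(["2024-05-01 06:00", "2024-05-01 14:30",
                             "2024-05-02 01:00"]),
    "end": pd.to_datetime(["2024-05-01 12:10", "2024-05-01 21:00",
                           "2024-05-02 07:05"]),
}).sort_values("start")

# Gap between each batch's start and the previous batch's end is
# time the equipment sat idle instead of producing.
log["idle_h"] = ((log["start"] - log["end"].shift())
                 .dt.total_seconds() / 3600)
print(log[["batch_id", "idle_h"]])
```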

The visualization of insights

The insights stage is where the analyzed data is output in pre-defined formats, for example, visualizing site and batch performance. Typical measures include batch performance index (BPI) score, individual KPI scores, BPI/KPI distribution, BPI/KPI control and trend charts, the number of out of specification (OOS) and out of trend (OOT) events, OOT/OOS status, and quality, production summary and detailed production reporting. From this, process manufacturers have the intelligence to compare actual and Golden Batch KPIs and KPI parameters, identify critical quality attributes and process parameters, and refine and apply those on an ongoing basis.
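
The article does not specify how a BPI score is calculated; purely as an illustration, one simple construction is a weighted roll-up of normalized KPI scores with penalties for OOS and OOT events.

```python
# Purely illustrative BPI roll-up; the real scoring method is not
# specified here, so the weights and penalties are invented.
kpi_scores = {"yield": 0.94, "cycle_time": 0.88, "energy": 0.91}  # 0..1
weights = {"yield": 0.5, "cycle_time": 0.3, "energy": 0.2}
oos_events, oot_events = 1, 2

base = sum(kpi_scores[k] * weights[k] for k in weights)
bpi = max(0.0, base - 0.05 * oos_events - 0.02 * oot_events) * 100
print(f"BPI score: {bpi:.1f}")   # 82.6 for these example values
```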

Conclusion: Holistic visibility for informed decision-making

Industrial DataOps supports the goal of providing process manufacturers with smarter insights, guiding them toward optimized process parameters and improved batch outcomes. Furthermore, the aspiration to extend this analytical capability to an enterprise level promises a unique approach to managing and optimizing batch processes across different sites.

The ability to overlay diverse datasets, analyze temporal patterns and trace the root causes of variations empowers organizations to make informed decisions, optimize operational efficiency and continually enhance product quality. As technology advances, the integration of automation and enterprise-level analysis holds the promise of revolutionizing how industries approach and manage batch processes. By leveraging data to optimize processes, process manufacturers can reduce costs, minimize waste and enhance their bottom line.

Scott Thomas is a batch performance analytics product manager at Rockwell Automation. He has over 38 years of experience delivering customer automation projects, mainly on batch processes in the food and beverage, consumer goods and life sciences industries.

Rockwell Automation

www.rockwellautomation.com
