Toward Zero Defects: Using Analytics to Reshape Quality

Manostaxx

Advanced analytical techniques can help companies identify and eliminate hidden sources of quality problems

Most companies already make use of an array of well-known methodologies to keep quality under control. Those approaches, which include the application of lean-management techniques and Six Sigma tools, have been instrumental in significantly reducing quality deviations. Yet difficult-to-diagnose quality issues still drive up product costs and put reputations at risk.
Today’s heavily instrumented, highly automated production environments present both an opportunity and a challenge for quality teams. The opportunity lies in the sheer size of the “data lake,” which may have grown by orders of magnitude in recent years. The challenge lies in knowing how to make use of all that data. Teams may lack a full picture of the data available to them, or an understanding of its relevance to quality outcomes.
That’s where the machines come in. Advanced analytics approaches are transforming the way companies do many things, from retail operations and procurement to process-yield improvement and maintenance planning. Success in these areas has fueled interest in what analytics can do in others, and has spurred growth in the number of off-the-shelf applications and tools.

But on their own, even the smartest analytical tools aren’t enough to produce meaningful quality improvement. That requires the right industry knowledge and careful change management as well. In this article, we’ll look at how one company in the pharma sector applied such a combined approach to its own quality challenges. The results were impressive. The site, which already performed strongly in comparative benchmarks, identified the likely root causes of 75 to 90 percent of its remaining process-related quality deviations, and revealed opportunities to improve overall process stability and control.
ANALYZING THE DATA
The first step in the company’s advanced analytics effort was to capture, store, structure and clean its data. Like any modern pharmaceutical operation, the site had a data-rich manufacturing environment. Production machinery was highly automated and heavily instrumented. Intermediate and finished products were regularly sampled and analyzed in its on-site quality assurance laboratory. And production staff kept meticulous records.
Bringing all that data together into a form suitable for automated analysis required significant effort, however. The site stored its data in eight separate databases, using a variety of structured and unstructured formats, running on different computer systems. The objective was to combine data from more than two and a half years of production into a single data lake containing more than 2,000 parameters (Exhibit 1). Achieving that required close cooperation with the company’s IT and compliance functions — especially to uphold corporate data security and regulatory standards.
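The article doesn’t say which tools the team used for this consolidation, but the shape of the task is common. Below is a minimal sketch in Python with pandas, assuming hypothetical export files and a shared batch identifier as the join key; real historian, LIMS and MES extracts would need considerably more alignment and cleaning work.

```python
import pandas as pd

# Hypothetical extracts from the site's separate source systems; in practice
# each would come from its own database (process historian, LIMS, MES, etc.).
process_data = pd.read_csv("historian_export.csv")   # sensor readings per batch
lab_results = pd.read_csv("lims_export.csv")         # QA lab measurements per batch
batch_records = pd.read_csv("mes_export.csv")        # operator batch records

# Align everything on a shared batch identifier so each row of the
# combined table describes one production batch.
data_lake = (
    process_data
    .merge(lab_results, on="batch_id", how="left")
    .merge(batch_records, on="batch_id", how="left")
)

# Basic cleaning: drop duplicate batch records and constant parameters,
# which carry no information for the deviation analysis.
data_lake = data_lake.drop_duplicates(subset="batch_id")
data_lake = data_lake.loc[:, data_lake.nunique(dropna=True) > 1]
```

Joining on a batch identifier is what turns scattered records into the one-row-per-batch table that the subsequent risk-factor analysis needs.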
The next challenge was to identify relationships between process parameters and production deviations. A custom-coded decision-tree model iteratively searched the data lake to find factors associated with an increased probability of quality deviations. To identify the most important relationships, the team looked for specific risk factors, especially process parameter thresholds at which deviations were at least twice as likely to occur (Exhibit 2).
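The article describes the model only as a custom-coded decision tree, so its details are not public. As an illustration of the general technique, the sketch below uses an off-the-shelf decision tree from scikit-learn and flags leaves where the deviation rate is at least double the baseline, mirroring the “twice as likely” criterion; the column names, depth and sample-size settings are assumptions.

```python
from sklearn.tree import DecisionTreeClassifier

# Assumes the data lake built above: one row per batch, numeric process
# parameters plus a 0/1 'deviation' label. All names are illustrative,
# and fillna(0) is a deliberately crude stand-in for real imputation.
X = data_lake.drop(columns=["batch_id", "deviation"]).fillna(0)
y = data_lake["deviation"]

# A shallow tree: each split is a candidate threshold on one parameter.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X, y)

# Flag leaves where deviations are at least twice as likely as the baseline.
baseline = y.mean()
leaf_ids = tree.apply(X)
for leaf in sorted(set(leaf_ids)):
    mask = leaf_ids == leaf
    rate = y[mask].mean()
    if rate >= 2 * baseline:
        print(f"leaf {leaf}: {mask.sum()} batches, "
              f"deviation rate {rate:.1%} vs baseline {baseline:.1%}")
```

In practice a team would iterate on such a model, varying tree depth and retraining on subsets, and read each qualifying split as a candidate parameter threshold to investigate rather than as a confirmed cause.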
APPLYING HUMAN INSIGHT TO IDENTIFY ROOT CAUSES
The risk-factor analysis identified more than two dozen points of potential interest. But since correlation does not necessarily mean causation, the company had its manufacturing-process specialists review the results. They helped weed out meaningless relationships, such as downstream process parameters that correlated with upstream deviations but could not have caused them, to identify the relationships worthy of further exploration.
The company then conducted deep dives into specific areas. Some of this work took place in front of the analysis screen: looking in detail at the evolution of specific parameters over time, along with the associated incidence of quality deviations, for example. A lot of it happened on the shop floor. The teams studied the way processes were actually run and managed. They looked at factors that could potentially influence variability in process parameters, using well established lean and root-cause analysis tools, such as the 5-whys approach.
That detailed, hands-on work revealed numerous opportunities for improvement. For example, one key process involved an exothermic reaction, with a water cooling system to remove excess heat from the reaction chamber. The analysis showed that higher temperatures during the reaction were associated with a greater number of deviations. Operators had assumed that the cooling system would hold the reaction temperature within an acceptable range, but the time series data showed that the maximum temperature was not well controlled. It had been allowed to rise gradually over a number of months, leading to more deviations.
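A drift like this is straightforward to confirm once the data is consolidated. The sketch below, again in Python with pandas and hypothetical column names, tracks the monthly maximum reaction temperature and compares deviation rates above and below a candidate threshold; the 75th-percentile cutoff is purely illustrative.

```python
import pandas as pd

# Hypothetical batch log: start date, maximum reaction temperature reached,
# and a 0/1 flag for whether the batch produced a quality deviation.
batches = pd.read_csv("batch_log.csv", parse_dates=["start_date"])

# Monthly trend of the maximum reaction temperature: a steady upward
# drift here is the pattern the team observed in the time series.
monthly = batches.groupby(batches["start_date"].dt.to_period("M")).agg(
    max_temp=("max_reaction_temp", "max"),
    deviation_rate=("deviation", "mean"),
)
print(monthly)

# Deviation rate above vs. below a candidate temperature threshold.
threshold = batches["max_reaction_temp"].quantile(0.75)  # illustrative cutoff
hot = batches["max_reaction_temp"] >= threshold
print(f"deviation rate above {threshold:.1f}: "
      f"{batches.loc[hot, 'deviation'].mean():.1%}")
print(f"deviation rate below {threshold:.1f}: "
      f"{batches.loc[~hot, 'deviation'].mean():.1%}")
```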
In response, the team proposed a two-stage action plan. First, the cooling system’s target temperature would be reduced. Second, if the maximum reaction temperature were still too high, the pure water used in the cooling system would be replaced with a lower-temperature solution containing antifreeze.
Continue at:
https://www.pharmamanufacturing.com/articles/2017/toward-zero-defects-using-analytics-to-reshape-quality/

The text above is owned by the site linked above. Here is only a small part of the article; for more, please follow the link.
Also see:

Kaizen – Toyota Production System guide
