Tag Archives: QbD
Dr. Amrendra Kumar Roy
We would like to share our ~13 years of practical experience in product development using statistical tools. But first, what compelled us to pursue six sigma? Most of us started our careers as process chemists after completing a PhD, and it was during those initial days that we realized the importance of getting it "first time right" during commercialization. This enabled not only a first-mover advantage but also a timely and uninterrupted supply of our products to the market. Another aspect of process development is robustness, which ensures sustainable margins in whatever products we manufacture. These achievements were possible only because of the six-sigma tools we learned and applied at the R&D stage. Later we shifted our focus to the legacy products running in the plants, which we again studied using six-sigma tools to beat eroding margins; this became possible after a few chemical engineers with six-sigma black belts joined the team.
As chemists we were never trained in statistical tools, so understanding them was a truly Herculean task for us. Another problem was the statistical software: we confess we were never comfortable using it, being well aware of the "garbage in, garbage out" principle. We were never confident in the calculations produced by the software because we were not acquainted with the underlying statistics. To cite some examples:
We were using regression analysis on five variables and obtained an R² of ~0.99 by including all five. We were happy with the result, but we failed to notice that the adjusted R² had decreased when we added the 4th and 5th variables, leaving us with a regression equation containing unnecessary terms. As a result, we needlessly proposed control strategies, and the associated investment, for those insignificant variables.
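The trap above is visible directly in the adjusted-R² formula, which penalizes every extra predictor. A minimal sketch, using illustrative R² values (not data from our actual study):

```python
def adjusted_r2(r2, n, p):
    # Adjusted R-squared penalizes each additional predictor:
    # 1 - (1 - R^2) * (n - 1) / (n - p - 1)
    # n = number of observations, p = number of predictors
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Illustrative values for n = 12 experimental runs: adding a 5th variable
# nudges plain R^2 up (0.985 -> 0.986) but pushes adjusted R^2 DOWN,
# signalling that the extra term is not earning its keep.
adj_4_vars = adjusted_r2(0.985, n=12, p=4)   # ~0.9764
adj_5_vars = adjusted_r2(0.986, n=12, p=5)   # ~0.9743
```

Whenever R² rises but adjusted R² falls on adding a term, that term is a candidate for removal.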
Another mistake we often made was ignoring outliers during Design of Experiments (DOE)! But why were we ignoring them? Just to obtain a good regression equation? Were we not doubting our own experimental data? Later we learned that if we keep discarding outliers just to get a good model, we end up modeling the system noise rather than the effect. It is better to investigate the cause of an outlier than to ignore it.
The above examples made us realize that theoretical knowledge of six sigma is not enough; it is practical experience that really matters. The real challenge is the correct analysis of experimental data so that the product can be scaled up without problems. Learning to do the correct statistical analysis with any software was the mantra of the game. We should be confident that whatever output we get from the software is correct, and that is possible only with a good understanding of the statistical concepts. We are not saying one must master statistics, but one must clearly understand the concepts before using any software. It took us too long to grasp these fundamental aspects of applied statistics, the main reason being the absence of a statistical guru with adequate industrial experience. The major hurdle was finding a good tutor, or at least a good book, that could explain the concepts without involving too much statistics. We started looking for applied-statistics courses and found some solace in the "research methodology" module of MBA courses. Going through it gave us the confidence that six-sigma tools can be learned without in-depth knowledge of statistics.
Over the last 7-8 years we developed our own way of learning applied statistics with the help of diagrams and figures. During this journey we also found that each statistical topic has connections with other topics, so no topic can be studied in isolation.
For example, how the normal distribution and hypothesis testing work behind the scenes in ANOVA, DoE, regression analysis and control charts.
Having gone through these hardships, we decided to share our experience with everyone who would like to understand six-sigma tools but is reluctant because of the statistics involved. Our website will help six-sigma aspirants understand the statistical concepts with the help of figures and diagrams. We will also help you understand the relationship between seemingly unrelated topics, such as hypothesis testing and control charts.
Another feature that will help you is solved examples from industry. This makes the site ideal if you wish to appear for a green/black belt exam from a reputed institute. We say this because we ourselves are ASQ-certified six-sigma black belts, and we want to share one important thing we experienced about the exam: you cannot clear it unless you have understood the statistical concepts behind every six-sigma tool. By understanding the statistical concepts, we do not mean learning pure statistics, but only the concepts behind each tool, its advantages and its limitations. This matters because ASQ never asks direct questions; its questions are applied in nature. For example:
A bulb production process is found to follow a normal distribution. A sample of 100 bulbs is drawn at random from a batch of 1,000,000 and found to have a mean lifetime of 1525 hrs. The historical mean lifetime is 1548 hrs with a standard deviation of 200 hrs. What percentage of bulbs from the current batch has a life span of exactly 1548 hrs?
A manufacturing process was under optimization in a plant, and a sample of 10 bags was selected at random from each batch. There were 5 batches in total, and the mean weights (in kg) of the 10-bag samples were 100.5, 101.1, 99.8, 100.2 and 99.95. The ranges (in kg) for these 5 consecutive batches were 0.7, 0.9, 0.8, 0.9 and 1. Calculate the control limits for the chart.
The problems look simple: in the first case, just calculate the z-value to get the percentage; the second appears to be a direct question where we can easily calculate the control limits. But there is a catch. In the first case, the probability that a continuous variable equals any single number is zero! For a continuous probability distribution, it is always about finding the probability between two numbers. In the second case, if you missed the opening statement "under optimization", you are wasting your time calculating control limits, as control charts are only valid for a stable process. If you start the calculation on either question, we can assure you that you will not finish the exam in time!
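The first catch can be checked numerically with nothing more than the standard library. A minimal sketch (the 1500-1600 hr interval is our own illustrative choice, not part of the exam question):

```python
from math import erf, sqrt

def norm_cdf(x, mu, sigma):
    # CDF of a normal distribution N(mu, sigma^2) via the error function
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

mu, sigma = 1548, 200  # historical mean and standard deviation (hrs)

# Probability of a lifetime of EXACTLY 1548 hrs: zero for a
# continuous distribution, since the CDF difference over a point vanishes.
p_exact = norm_cdf(1548, mu, sigma) - norm_cdf(1548, mu, sigma)

# The meaningful question is always about an interval, e.g. 1500-1600 hrs:
p_interval = norm_cdf(1600, mu, sigma) - norm_cdf(1500, mu, sigma)
```

The interval probability comes out to roughly 20%, while the "exact value" probability is identically zero, which is the whole point of the catch.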
Another major issue in applying six sigma is using the right tool in the right place. Hence our focus will be not only on understanding the concepts behind each statistical tool but also on selecting the appropriate tool for a given situation.
This website will start posting the six-sigma topics (mainly the statistical portion) from the first week of January 2017, so register for the course as soon as possible. We plan to run the course by posting one topic every week, so that each topic is well understood before we take up the next. We are keeping the pace slow because once we reach an advanced topic, say the normal distribution, we should not still be struggling with basics like variance, mean and the z-transformation. Each topic will be followed by real-life examples, so that one understands not only the concepts but also the use of the appropriate tools. At the end of each topic we will also demonstrate the use of an Excel sheet for solving statistical problems. We emphasize Excel because it is available to everyone; this will be our main USP during the course.
Organizational Initiatives Towards Developing Greener Processes for Generic Active Pharmaceutical Ingredients
– Dr. Vilas H. Dahanukar, Chief Scientist-Process R&D, Integrated Product Development Organization, Dr. Reddy’s Laboratories Ltd., India
4th Industrial Green Chemistry World Convention & Ecosystem (IGCW-2015) on 4th – 5th December 2015
Innovative Techniques to Synthesize Breakthrough Molecules (see DOE on page 4 onwards)
WHAT YOU NEED TO KNOW
Quality by Design in Drug Product Development
Introduction to drug product development – setting the scene
- Drug product development at a glance – from first in man to marketing authorization
- Pharmaceutical QbD: Quo vadis?
- Application of QbD principles to drug product development
Expectations from regulatory agencies
- Regulatory initiatives and approaches for supporting emerging technologies
- Concepts of Real Time Release Testing (Draft Annex 17 EU GMP Guideline)
- Harmonization of regulatory requirements (QbD parallel-assessment FDA-EMA, ICH Q8 -> Draft Q12?)
- Regulatory expectations: Lessons learned from applications so far
- Knowledge Management (KM) System – Definition and Reason
- Knowledge Management Cycle
- Explicit and Tacit Knowledge – The Knowledge Spiral
- Correlation between KM and other Processes
- Enabling Knowledge Management
- Knowledge Review – integral part of the Management Review (ICH Q10)
Quality Risk Assessment and Control Strategy
- Objectives of Quality Risk Assessment (QRA) as part of development
- Overview to risk assessment tools
- Introduction of Process Risk Map
- Introduction of risk based control strategy development
QbD Toolbox: Case studies DoE, PAT, and Basic Statistics
- Value-added use of QbD tools – generic approaches and tailored solutions
- Case studies and examples for different unit operations and variable problems
Reports and Documentation
- Development Reports
- Transfer protocols and reports
- Control Strategy and link to the submission dossier
Wrap-up & Final Discussion
The concepts and tools used over the two days will be summarized and future implications and opportunities of applying QbD principles to process development will be discussed. Delegates will be given time to ask questions on how they can apply what they have learned to their own drug product development and manufacturing.
Workshop Process Risk Map & link to Control Strategy
Based on a risk assessment tool tailored to cover development needs, delegates will work on case studies of process development for a solid oral dosage form.
From QTPP and CQA to relationship analysis of process parameters and material attributes
Process mapping for integrated documentation of the development work
Process Risk Map as a tool for development-focussed risk assessment
Quality by Design in API Manufacturing
General framework and key elements of QbD for APIs – background and potential strategies
- What is it all about?
- What are the benefits?
- When and how should you use it?
- Practical examples with typical points of discussion
How to identify and control Critical Quality Attributes (CQAs) in API synthesis – a risk-based approach to developing a control strategy
- Severity assessment of quality attributes
- Impact levels for critical process parameters (CPPs) and critical material attributes (CMAs)
- Considerations for the API Starting material
- Design of an effective risk-based control strategy
How to provide information on the development of the API manufacturing process – dossier requirements
- What should be done at which stage?
- Which information is relevant for the dossier?
- What are the key points to be considered for APIs (NCE/Biotech) and their formulations?
- Typical questions from Authorities
Process Evaluation and Design Space
- Changing Validation Approach
- Validation Life Cycle
- Design Space Concept
Application of PAT in the API industry
- PAT at development stages of a QbD-based development
- PAT as part of the Control Strategy in a GMP environment
- Practical examples of PAT implementations at a commercial scale in a GMP environment
Control strategies – Case studies and examples
- HA definitions
- Why and when is a control strategy needed?
- Different types/elements of a control strategy
- Practical examples
QUALITY BY DESIGN (QBD) IN API
Identifying target product profile (TPP). TPP has been defined as a “prospective and dynamic summary of the quality characteristics of a drug product that ideally will be achieved to ensure that the desired quality, and thus the safety and efficacy, of a drug product is realized”. This includes dosage form and route of administration, dosage form strength(s), therapeutic moiety release or delivery and pharmacokinetic characteristics (e.g., dissolution and aerodynamic performance) appropriate to the drug product dosage form being developed and drug product-quality criteria (e.g., sterility and purity) appropriate for the intended marketed product. The concept of TPP in this form and its application is novel in the QbD paradigm.
Identifying CQAs. Once the TPP has been identified, the next step is to identify the relevant CQAs. A CQA has been defined as “a physical, chemical, biological, or microbiological property or characteristic that should be within an appropriate limit, range, or distribution to ensure the desired product quality”. Identification of CQAs is done through risk assessment as per ICH guidance Q9. Prior product knowledge, such as accumulated laboratory, nonclinical and clinical experience with a specific product-quality attribute, is key in making these risk assessments. Such knowledge may also include relevant data from similar molecules and from literature references. Taken together, this information provides a rationale for relating the CQA to product safety and efficacy. The outcome of the risk assessment is a list of CQAs ranked in order of importance. The use of robust risk assessment methods for identifying CQAs is novel to the QbD paradigm.
Defining product design space. After CQAs for a product have been identified, the next step is to define the product design space (that is, specifications for in-process, drug substance and drug product attributes). These specifications are established based on several sources of information that link the attributes to the safety and efficacy of the product, including, but not limited to, the following:
- Clinical design space
- Nonclinical studies with the product, such as binding assays, in vivo assays and in vitro cell-based assays
- Clinical and nonclinical studies with similar platform products
- Published literature on other similar products
- Process capability with respect to the variability observed in the manufactured lots
The difference between the actual experience in the clinic and the specifications set for the product would depend on our level of understanding of the impact that the CQA under consideration can have on the safety and efficacy of the product. For example, taking host cell proteins as a CQA, it is common to propose a specification that is considerably broader than the clinical experience. This is possible because of a greater ability to use data from other platform molecules to justify the broader specifications. On the other hand, in the case of an impurity that is unique to the product, the specifications would rely solely on clinical and nonclinical studies.
In QbD, an improved understanding of the linkages between the CQA and safety and efficacy of the product is required. QbD has brought a realization of the importance of the analytical, nonclinical and animal studies in establishing these linkages and has led to the creation of novel approaches.
Defining process design space. The overall approach toward process characterization involves three key steps. First, risk analysis is performed to identify parameters for process characterization. Second, studies are designed using design of experiments (DOE), such that the data are amenable for use in understanding and defining the design space. And third, the studies are executed and the results analyzed to determine the importance of the parameters as well as their role in establishing design space.
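The second of these steps, designing the studies, often begins with a simple 2-level full-factorial layout. A minimal sketch, where the factor names and ranges are hypothetical and chosen only for illustration:

```python
from itertools import product

# Hypothetical process parameters with assumed low/high levels
# (illustrative only; not taken from the case study in the text).
levels = {
    "temperature_C": (20, 30),
    "pH":            (6.5, 7.5),
    "loading_g_L":   (10, 30),
}

# Full-factorial design: every combination of low/high levels,
# giving 2**3 = 8 experimental runs for three 2-level factors.
runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
```

Each entry in `runs` is one experiment to execute; the resulting data can then be fitted to estimate main effects and interactions when defining the design space.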
Failure mode and effects analysis (FMEA) is commonly used to assess the potential degree of risk for every operating parameter in a systematic manner and to prioritize the activities, such as experiments, necessary to understand the impact of these parameters on overall process performance. A team consisting of representatives from process development, manufacturing and other relevant disciplines performs an assessment to determine severity, occurrence and detection. The severity score measures the seriousness of a particular failure, based on an estimate of the potential failure effect at the local or process level and at the end-product-use or patient level. The occurrence and detection scores are based on an excursion (manufacturing deviation) outside the operating range that results in the identified failure: the occurrence score measures how frequently the failure might occur, while the detection score indicates the probability of timely detection and correction of the excursion, or of detection before end-product use. The three scores are multiplied to give a risk priority number (RPN), and the RPN scores are then ranked to identify the parameters with high enough risk to merit process characterization. Consider the FMEA outcome for a process chromatography step in a biotech process: RPN scores are calculated, and operating parameters with an RPN score >50 are characterized using a qualified scaled-down model. For the case study presented here, these include gradient slope, temperature, flow rate, product loading, end of pool collection, buffer A pH, start of pool collection, volume of wash 1, buffer B pH, buffer C pH and bed height. Process characterization focused on parameters such as temperature, which have a high impact on the process (severity = 6), occur frequently in the manufacturing plant (occurrence = 6) and are difficult to correct quickly if detected (detection = 7).
In contrast, parameters such as equilibration volume, with a low impact on the process (severity = 3), low occurrence (occurrence = 2) and a limited ability to detect and correct (detection = 5), were not examined in process characterization.
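The RPN arithmetic above can be sketched in a few lines; the scores for the two parameters are the example values quoted in the text, and the >50 cutoff is the threshold named in the case study:

```python
# (severity, occurrence, detection) scores from the two examples in the text
scores = {
    "temperature":          (6, 6, 7),
    "equilibration volume": (3, 2, 5),
}

def rpn(severity, occurrence, detection):
    # Risk Priority Number: the product of the three FMEA scores
    return severity * occurrence * detection

# Parameters whose RPN exceeds the threshold are flagged for
# process characterization; the rest are not examined further.
THRESHOLD = 50
flagged = [name for name, s in scores.items() if rpn(*s) > THRESHOLD]
```

Temperature (RPN = 6 x 6 x 7 = 252) clears the threshold and is characterized, whereas equilibration volume (RPN = 3 x 2 x 5 = 30) does not, matching the outcome described above.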
By Frederic L. “Rick” Sax, M.D., global head for the Center for Integrated Drug Development, Quintiles.
The biopharmaceutical manufacturing industry has used quality by design (QbD) principles for decades. The essence of QbD is designing with the end in mind (in this case, the efficient manufacture of a high-quality drug product). This approach emphasizes that the operative word in QbD is not quality, but design.