Presentation on theme: "Method Validation-where do I start?"— Presentation transcript:

1 Method Validation-where do I start?
Dr Geraldine O’Donnell, Director of DNA

2 Overview - Where do I start?
Why, when, when not and how to validate
Instrumental methods (qualitative): performance parameters
Instrumental methods (quantitative): performance parameters
Template for validation plan/report
Touch on human-based method validation

3 Why Validate?
During the introduction or implementation of a new method, a specific step must be taken to prove in an objective way that the method is suitable for its intended use. This step is called validation. The demand for validated methods has been driven by customers, accreditation bodies (e.g. as part of ISO accreditation requirements) and the forensic community. Validation is an integral part of the development and implementation process of a new method.

4 When to Validate
ISO/IEC 17025 requires validation of:
non-standard methods
laboratory-designed/developed methods
standard methods used outside their intended scope
amplifications and modifications of standard methods

5 When not to validate
For standardised methods such as ISO or ASTM methods, a full validation is not necessary, but the laboratory still needs to verify the in-house performance of the method. As detailed in ISO/IEC 17025, the laboratory shall confirm that it can properly operate standard methods before introducing the tests or calibrations. This is called verification according to the VIM.

6 How to validate
Make the decision that initial method development is finished. It is often not possible to determine exactly where method development finishes and validation begins; many of the performance parameters associated with method validation are in fact usually evaluated, at least in part, as part of method development. Document the measurement procedure (SOP).

7

8 How to validate: Part A - Initiate the project
A designated person is appointed to draw up the validation plan. The plan should contain the following four elements:
1. The laboratory and customer requirements
2. The performance parameters that will be used to ensure that the outputs meet the laboratory and customer requirements. The parameters will depend on the technique or process under consideration, but should in general address, as appropriate:

9 Sampling
Precision (repeatability, within-lab reproducibility)
Bias
Matrix/substrate effects
Specificity
Working range
Limit of detection/sensitivity
Linearity
Robustness/environmental susceptibility
Competency of personnel
Eurachem guide: Terminology in Analytical Measurement - Introduction to VIM 3 (2011)

10 How to validate: Part A - Initiate the project
3. The acceptance criteria to be used to assess whether the performance parameters have been met. Note: it is critical to the success of the validation that the acceptance criteria are set as specifically as possible before the validation work commences.
4. The design of the validation tests should also be considered at this stage, to ensure that they are as objective as possible.
The validation plan will be reviewed by the operational manager/quality manager.

11 How to validate: Part B - Start the validation work
The tests performed should be those specified in the validation plan. New or additional tests should not be introduced, and planned tests should not be omitted, unless authorised by the operational manager for the designated area.
Prepare the implementation plan: the designated person has to consider what needs to be in place before the new technique or process can be implemented and how the implementation will be carried out. These considerations should be included as part of the final validation report.

12 How to validate: Part B
Where appropriate, the following should be addressed:
The staff training plan and the arrangements for competence assessment and proficiency testing
The protocols for calibration, monitoring and maintenance of any equipment
The supply and traceability of any standards/reference materials
The supply and quality control of key materials and reagents

13 How to validate: Part B
The SOP documents for the process and the interpretation/reporting of results
Anti-contamination protocols
Any special requirements associated with health and safety
Review progress on the validation work

14 How to validate: Part C - Complete and prepare the validation report
It is important that the report includes all the information needed to facilitate independent assessment of the fitness for purpose of the technique or process. A summary of the raw experimental data will normally suffice, but the raw data must be available. A statement of fitness for purpose regarding the method is added to the report.

15 How to validate: Part C - Review the validation report and implementation plan
The final validation report and implementation plan will be reviewed and approved at least by the operational manager for the designated area or the quality manager, and must be signed off formally as deemed fit for use. Consideration should be given to an executive summary; its aim is to provide those making decisions on the use of the results with a summary of the validation steps performed and the key issues surrounding the validation.

16 Instrumental Methods (Qualitative): Performance parameters
Qualitative analysis can be defined as “classification according to specified criteria”. In analytical chemistry and related disciplines, the ‘criteria’ are understood to relate, in general, to information about the presence, composition and/or structure of materials. A qualitative method would normally give three possible results: positive, negative or inconclusive. For single-laboratory validation of qualitative methods, the Metrology of Qualitative Chemical Analysis (MEQUALAN) report, published by the European Commission, is recommended.

17 Instrumental Methods (Qualitative): Performance parameters
Precision: the most useful estimate is the within-lab reproducibility, which is the precision measured with different analysts, over extended timescales, within a single laboratory. Precision for a qualitative measurement can be stated in terms of the percentage of similar results obtained for test samples. Precision is generally dependent on analyte concentration, and so should be determined at different levels.
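A minimal sketch of how such a percentage-agreement figure could be computed; the replicate outcomes below are hypothetical, not taken from the presentation:

```python
from collections import Counter

def qualitative_precision(results):
    """Percentage of replicates that agree with the most frequent outcome."""
    counts = Counter(results)
    modal_count = counts.most_common(1)[0][1]
    return 100.0 * modal_count / len(results)

# Hypothetical replicate outcomes at two analyte levels
near_threshold = ["positive", "negative", "positive", "positive", "negative", "positive"]
well_above = ["positive"] * 10
print(f"Near threshold: {qualitative_precision(near_threshold):.1f}% agreement")
print(f"Well above threshold: {qualitative_precision(well_above):.1f}% agreement")
```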

18 Instrumental Methods (Qualitative): Performance parameters
Trueness/Bias: bias is an expression of how close the mean of a set of results is to the reference value. The false positive and false negative rates can be measured when we have prior information about the presence or absence of an analyte in a test sample. For a given test method, the basic properties that need to be measured are the numbers of true positive (TP) and true negative (TN) results and the numbers of false positive (FP) and false negative (FN) results obtained on a range of samples. From these numbers, the fundamental measures of reliability, viz. the false positive and false negative rates, can be calculated.
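A minimal sketch of that calculation, using the usual definitions (false positive rate = FP / (FP + TN), false negative rate = FN / (FN + TP)); the counts are hypothetical:

```python
def false_rates(tp: int, tn: int, fp: int, fn: int):
    """Return (false positive rate, false negative rate) from raw validation counts."""
    fp_rate = fp / (fp + tn)  # fraction of truly negative samples reported positive
    fn_rate = fn / (fn + tp)  # fraction of truly positive samples reported negative
    return fp_rate, fn_rate

# Hypothetical counts from testing samples of known composition
fp_rate, fn_rate = false_rates(tp=48, tn=45, fp=5, fn=2)
print(f"False positive rate: {fp_rate:.1%}")
print(f"False negative rate: {fn_rate:.1%}")
```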

19 Instrumental Methods (Qualitative): Performance parameters
The false rates vary with the level of the analyte. At high analyte levels the likelihood of false negatives will be very low, but at levels slightly above the threshold it will be relatively high.

20 Instrumental Methods (Qualitative): Performance parameters
Measurement range: for a qualitative test there is an LOD or a threshold. For further guidance see The Fitness for Purpose of Analytical Methods: A Laboratory Guide to Method Validation and Related Topics, 2nd Edition, Eurachem (2014).

21 Instrumental Methods (Qualitative): Performance parameters
Ruggedness: control of uncertainties in test parameters, such as times, temperatures, lengths, etc., is vital for reliable qualitative testing. The laboratory is expected either to control the factors affecting the test result to within specified tolerances or, in a validation, to demonstrate that the possible variation in individual test parameters has no significant influence on the outcome of the test.
Measurement uncertainty: not directly applicable for a qualitative test.

22 Instrumental Methods (Quantitative): Performance parameters
Precision: the most useful estimate is the within-laboratory reproducibility. It is usually stated in terms of a standard deviation or relative standard deviation (RSD), is generally dependent on analyte concentration, and so should be determined at a number of concentrations.
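A minimal sketch of summarising replicate results as a standard deviation and RSD at each concentration level; the data and units below are hypothetical:

```python
import statistics

def precision_summary(replicates):
    """Mean, sample standard deviation and relative standard deviation (%) of replicates."""
    mean = statistics.mean(replicates)
    sd = statistics.stdev(replicates)
    rsd = 100.0 * sd / mean
    return mean, sd, rsd

# Hypothetical within-lab reproducibility data at two concentration levels (ng/mL)
levels = {"low": [4.8, 5.1, 4.9, 5.3, 5.0], "high": [49.6, 50.4, 50.1, 49.9, 50.2]}
for name, data in levels.items():
    mean, sd, rsd = precision_summary(data)
    print(f"{name}: mean={mean:.2f}, sd={sd:.2f}, RSD={rsd:.1f}%")
```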

23 Instrumental Methods (Quantitative): Performance parameters
Trueness/Bias: trueness is normally expressed in terms of measurement bias. Bias is estimated from the difference between the mean of several measurement results, preferably obtained under within-laboratory reproducibility conditions, and a reference value. Trueness can also be investigated as selectivity, e.g. by measuring bias at different levels of known interferences and in different matrices.
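A minimal sketch of that bias estimate; the replicate results and reference value are hypothetical (e.g. from a certified reference material):

```python
import statistics

def bias_estimate(results, reference_value):
    """Absolute and relative (%) bias of a set of results against a reference value."""
    mean = statistics.mean(results)
    bias = mean - reference_value
    return bias, 100.0 * bias / reference_value

# Hypothetical replicate results on a reference material with assigned value 10.0 units
bias, rel_bias = bias_estimate([10.3, 9.9, 10.4, 10.2, 10.1, 10.3], reference_value=10.0)
print(f"Bias: {bias:+.2f} units ({rel_bias:+.1f}%)")
```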

24 Instrumental Methods (Quantitative): Performance parameters
Measurement range: the measuring range is the interval of concentration that can be measured with a specified uncertainty using the method. The lower limit of the measuring range is often considered to be the limit of quantification (LOQ). Eurachem guide: Terminology in Analytical Measurement - Introduction to VIM 3 (2011).
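The presentation does not prescribe how to estimate the LOQ; a common convention (an assumption here, not stated in the slides) takes LOD as roughly 3 times, and LOQ as roughly 10 times, the standard deviation of replicate blank or low-level measurements. A sketch under that assumption:

```python
import statistics

def lod_loq_from_blanks(blank_results, k_lod=3.0, k_loq=10.0):
    """Estimate LOD and LOQ as multiples of the blank standard deviation (assumed convention)."""
    s0 = statistics.stdev(blank_results)
    return k_lod * s0, k_loq * s0

# Hypothetical blank replicates, already expressed in concentration units
lod, loq = lod_loq_from_blanks([0.12, 0.08, 0.15, 0.10, 0.09, 0.13, 0.11, 0.10, 0.14, 0.12])
print(f"Estimated LOD = {lod:.2f}, LOQ = {loq:.2f} (same units as the input data)")
```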

25 Instrumental Methods (Quantitative): Performance parameters
Ruggedness/Robustness: in any method there will be certain steps which, if not carried out sufficiently carefully, will have a severe effect on method performance. These steps should be identified and their influence on method performance evaluated. This involves making deliberate variations to the method and investigating the subsequent effect on performance.
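One simple way to organise such a check (a sketch only, not the design used in the presentation) is to run the method at its nominal setting and at a deliberately varied setting of one parameter, then compare the shift in the mean result with the method's within-run spread:

```python
import math
import statistics

def robustness_effect(nominal, varied):
    """Shift in mean caused by a deliberate variation, in units and in pooled SDs."""
    shift = statistics.mean(varied) - statistics.mean(nominal)
    # pooled within-group standard deviation (equal group sizes assumed)
    pooled_sd = math.sqrt((statistics.variance(nominal) + statistics.variance(varied)) / 2)
    return shift, shift / pooled_sd

# Hypothetical results at the nominal incubation temperature and at a +2 °C variation
nominal = [10.1, 10.0, 10.3, 10.2, 10.1]
varied = [10.4, 10.6, 10.5, 10.3, 10.6]
shift, effect = robustness_effect(nominal, varied)
print(f"Mean shift: {shift:+.2f} units ({effect:+.1f} pooled standard deviations)")
```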

26 Instrumental Methods (Quantitative): Performance parameters
Measurement uncertainty: the expanded measurement uncertainty is normally what the laboratory reports to the customer. It provides an interval within which the value of the measurand is believed to lie with a high level of confidence (normally 95%). In proficiency testing, the claimed uncertainty can be verified by comparing the difference between the laboratory's result and the assigned value.
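A minimal sketch of both ideas; the coverage factor k = 2 for roughly 95% confidence and the En score often used in proficiency testing are standard conventions rather than anything specified in the slides, and all numbers are hypothetical:

```python
import math

def expanded_uncertainty(u_combined, k=2.0):
    """Expanded uncertainty U = k * u_c (k = 2 gives roughly 95% coverage)."""
    return k * u_combined

def en_score(lab_result, lab_U, assigned_value, assigned_U):
    """En number from proficiency testing; |En| <= 1 supports the claimed uncertainty."""
    return (lab_result - assigned_value) / math.sqrt(lab_U**2 + assigned_U**2)

# Hypothetical PT round: combined standard uncertainty 0.25, assigned value 10.0 with U = 0.4
U_lab = expanded_uncertainty(0.25)
en = en_score(lab_result=10.6, lab_U=U_lab, assigned_value=10.0, assigned_U=0.4)
print(f"Expanded uncertainty U = {U_lab:.2f}; En = {en:.2f}")
```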

27 Template of the validation plan and report
IN-HOUSE VALIDATION OF MEASUREMENT PROCEDURES VALIDATION PLAN

28 Template of the validation plan and report
IN-HOUSE VALIDATION OF MEASUREMENT PROCEDURES VALIDATION PLAN

29 Template of the validation plan and report

30 Template of the validation plan and report

31 Template of the validation plan and report

32 Template of the validation plan and report

33 The key to understanding how to validate human-based methods is understanding what changes arise from substituting the INSTRUMENT with the HUMAN

34 Human-Based Aspects
Human-based methods are not as self-determining as instrumental methods (i.e. in the selection of critical features), and no standards are available. Hence, as method development is more demanding, two requirements arise:
documentation of the feature set, to demonstrate that the method is fit for purpose
documentation of the decisions/choices made in the selection of the critical features
The basis of these decisions needs to be checked ahead of performance testing (a method development check).

35 Performance testing
Instrumental methods: parameters to be validated, e.g. precision, trueness, measurement range, etc.
Human-based methods: Specimen, Analysis, Comparison, Evaluation, Verification
Performance testing involves validating the Analysis, Comparison and Verification steps. Samples where the ground truth is known for the features must be used in performance testing.

36 THANK YOU!!

