1 MARLAP SCIENCE ADVISORY BOARD CHAPTERS 6 & 7 Stan Morton DOE-RESL April 2002

2 Multi-Agency Radiological Laboratory Analytical Protocols Manual
Purpose: Provide guidance and a framework for project planners, managers, and laboratory personnel to ensure that radioanalytical laboratory data will meet a project's or program's data requirements.

3 Chapter 6 - Selection and Application of an Analytical Method
MARLAP recommends the performance-based approach to method selection
– The lab selects and proposes a method in response to the APSs and the method validation level specified in the SOW
Method selection is a complex process that must consider:
– APSs (MQOs), method validation status, availability of qualified staff, production schedule, radiological and sample TATs, equipment calibration and availability, etc.
The project manager approves use of the selected method by
– evaluating the submitted method validation documentation, or
– evaluating the lab's performance on analysis of method validation PT samples

4 Measurement Quality Objectives
MQOs - the analytical portion of the DQOs
– a statement of performance or a requirement for a particular method performance characteristic; can be quantitative or qualitative:
   method uncertainty, u_MR, at a specified concentration (expressed as an estimated standard deviation)
   quantification capability (expressed as the minimum quantifiable concentration - MQC)
   detection capability (expressed as the minimum detectable concentration - MDC)
   applicable analyte concentration range - the method's ability to measure the analyte over some specified range

5 Measurement Quality Objectives
MQOs - the analytical portion of the DQOs
– a statement of performance or a requirement for a particular method performance characteristic; can be quantitative or qualitative:
   method specificity - the ability of the method to measure the analyte in the presence of interferences
   method ruggedness - the relative stability of method performance for small variations in method parameter values

6 Chapter Six: Selection and Application of an Analytical Method
6.1 Introduction
6.2 Method Definition
6.2.1 Performance-Based Approach and Application
6.3 Life Cycle of Method Application
6.4 Generic Considerations for Method Development/Selection
6.5 Project-Specific Considerations for Method Selection
6.5.1 Matrix and Analyte Identification
6.5.2 Process Knowledge
6.5.3 Radiological Holding and Turnaround Times
6.5.4 Unique Processing Specifications

7 Measurement Quality Objectives
Method Uncertainty
Quantification Capability
Detection Capability
Applicable Analyte Concentration Range
Method Specificity
Method Ruggedness
Method Bias Considerations

8 6.6 Method Validation
6.6.1 Purpose of Method Validation
6.6.2 Laboratory's Method Validation Protocol
6.6.3 Tiered Approach to Validation
   Existing Methods
   Validation for Similar Matrices
   New Application of a Validated Method
   Newly Developed or Adapted Methods
6.6.4 Method Validation Documentation

9 6.7 Analyst Qualifications and Proficiency
6.8 Method Control
6.9 Continued Performance Assessment
6.10 Documentation Sent to Project Manager
Summary of Recommendations
6.11 References

10

11

12 MQO - Method Uncertainty Requirement, u_MR
The recommended value of u_MR is based on the assumption that any known bias in the measurement process has been corrected and any remaining bias is much smaller than the shift, Δ, when a concentration near the gray region is measured.
– MAPEP & QAP PE programs should measure laboratory bias as a testing parameter - bias is determined through multiple analyses, not a single measurement
If decisions are to be made about the mean of a sampled population, estimate the u_MR of the analytical method at the UBGR (action level) and require that u_MR be ≤ the width of the gray region divided by 10. If this requirement cannot be met, then require at least that u_MR be ≤ the width of the gray region divided by 3.

13

14 Example
The project planners have identified that the action level for the analyte is 0.10 Bq/g and the lower boundary of the gray region is 0.02 Bq/g. If decisions are to be made about the contaminated areas based on samples, then the u_MR at 0.10 Bq/g would be:
u_MR = Δ / 10 = (0.10 - 0.02) / 10 = 0.008 Bq/g (8%)
If this u_MR cannot be achieved, then a method uncertainty as large as Δ / 3 = 0.027 Bq/g (27%) may be allowed if more samples are taken per contaminated survey area.
In terms of method selection, this MQO calls for a method that can ordinarily produce measured results with an expected combined standard uncertainty of 8% at the action level.
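
A minimal sketch of this calculation in Python (the function name and structure are mine, not MARLAP's; it simply applies the Δ/10 and Δ/3 rules quoted above):

def required_method_uncertainty(action_level, lbgr):
    """Return the preferred (delta/10) and fallback (delta/3) values of u_MR
    for decisions about the mean of a sampled population."""
    delta = action_level - lbgr          # width of the gray region
    return delta / 10.0, delta / 3.0

u_pref, u_fallback = required_method_uncertainty(action_level=0.10, lbgr=0.02)
print(u_pref, u_fallback)                # 0.008 Bq/g (8%) and about 0.027 Bq/g (27%)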

15 Δ is defined as DCGL - LBGR
σ is the standard deviation of the measured analyte distribution for the area being remediated
σ² is mathematically defined as (σ_S² + σ_M²), where σ_S² is the variance of the sampled population (analyte) and σ_M² is the average analytical method (measurement) variance
σ_M is affected by laboratory sample preparation, subsampling, and radiochemical and radiometric processes
Δ/σ is defined as the relative shift - the number of standard deviations separating the DCGL and the LBGR
Non-parametric tests (the Sign and Wilcoxon Rank Sum tests) are used to determine whether the null hypothesis (the survey unit does not meet the release criterion) can be rejected.
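
As a small illustration of how the measurement variance enters the planning calculation, the sketch below (function name and example values are mine, not from the slides) combines σ_S and σ_M and returns the relative shift Δ/σ:

import math

def relative_shift(dcgl, lbgr, sigma_s, sigma_m):
    """Combine sampling and measurement standard deviations
    (sigma^2 = sigma_S^2 + sigma_M^2) and return delta/sigma, the relative
    shift used when planning the Sign / Wilcoxon Rank Sum tests."""
    delta = dcgl - lbgr
    sigma = math.sqrt(sigma_s ** 2 + sigma_m ** 2)
    return delta / sigma

print(relative_shift(dcgl=1.0, lbgr=0.5, sigma_s=0.12, sigma_m=0.17))  # about 2.4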

16

17 MQO - Method Uncertainty Requirement
If decisions are to be made about individual samples in a survey, MARLAP recommends the use of:
u_MR = Δ / (z_(1-α) + z_(1-β))
Example: UBGR = 1 Bq/L; LBGR = 0.5 Bq/L; α = 0.05; β = 0.10
u_MR = 0.5 / (1.645 + 1.282) = 0.17 Bq/L, or 17% at the action level.
If α = 0.05 and β = 0.05, then the following simplification can be used: u_MR = 0.3 Δ
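
A short sketch of this formula using standard normal quantiles (the function name is mine; it uses only the Python standard library):

from statistics import NormalDist

def u_mr_for_individual_samples(ubgr, lbgr, alpha, beta):
    """u_MR = delta / (z_(1-alpha) + z_(1-beta)) for decisions about individual samples."""
    z = NormalDist().inv_cdf             # standard normal quantile function
    delta = ubgr - lbgr
    return delta / (z(1 - alpha) + z(1 - beta))

print(u_mr_for_individual_samples(1.0, 0.5, alpha=0.05, beta=0.10))  # about 0.17 Bq/L
print(u_mr_for_individual_samples(1.0, 0.5, alpha=0.05, beta=0.05))  # about 0.3 * delta = 0.15 Bq/L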

18 MQO - Method Uncertainty Requirement: Flexibility in Data Review
When project planners establish the MQO for method uncertainty for method selection and development, the maximum allowable standard deviation, u_MR, at the UBGR is specified.
During data evaluation, the measurement uncertainty of any sample result at an analyte concentration ≤ UBGR should not exceed u_MR.
During data evaluation, the relative measurement uncertainty of any sample result at an analyte concentration > UBGR should not exceed u_MR / UBGR, denoted φ_MR.

19 MQO - Detection Capability
If the lower bound of the gray region is at or near zero and decisions are to be made about individual samples, choose an analytical method whose MDC is no greater than the upper bound of the gray region (action level).
– LBGR = 0; Δ = UBGR
– u_MR ≤ UBGR / (z_(1-α) + z_(1-β))
– u_MR * (z_(1-α) + z_(1-β)) ≤ UBGR
– Form of: MDC ≤ UBGR

20

21 Method Validation Process
Parameters specified or ascertained from the analytical results generated:
– Defined Method Validation Level (Table 6.1)
– Analytes
– Defined matrix for testing, including chemical and physical characteristics that approximate project samples, or
– Selected project-specific or appropriate alternative-matrix PT samples, including known chemical or radionuclide interferences at appropriate levels
– Defined sample preservation
– Stated additional data testing criteria
– Established acceptable chemical / radiotracer yield values

22 Method Validation Process
Parameters specified or ascertained from the analytical results generated:
APSs, including MQOs, for each analyte / matrix:
– Chemical or physical characteristics of the analyte, when appropriate
– Action level, when applicable
– Applicable analyte concentration range, including zero analyte (blanks)
– Method uncertainty at a specified concentration
– MDC or MQC
– Bias, if applicable
– Other qualitative parameters to measure the degree of method ruggedness or specificity

23 Chapter 7 - Evaluating Analytical Methods and Laboratories
The first part of the chapter discusses the evaluation of the documentation that the lab submits on the proposed method, method validation, and performance in PE programs
– Follow-up to Chapter 6, Selection and Application of an Analytical Method
The second part of the chapter discusses the initial and ongoing evaluation of lab services
– Follow-up to Chapter 5 and Appendix E, Obtaining Laboratory Services

24 CHAPTER 7 EVALUATING ANALYTICAL METHODS AND LABORATORIES
7.1 Introduction
7.2 Evaluation of Proposed Analytical Methods
– 7.2.1 Documentation of Required Method Performance
   Method Validation Documentation
   Method Experience, Previous Projects, and Clients
   Internal and External Quality Assurance Assessments
– 7.2.2 Performance Requirements of the SOW - Analytical Protocol
   Matrix and Analyte Identification

25 CHAPTER 7 CONTINUED EVALUATING ANALYTICAL METHODS AND LABORATORIES
   Process Knowledge
   Radiological Holding and Turnaround Times
   Unique Processing Specifications
   Measurement Quality Objectives
   Bias Considerations
7.3 Initial Evaluation of the Laboratory
– …
– 7.3.4 Review of Performance Indicators
   Review of Internal QC Results
   External PE Program Results

26

27

28

29 MQO - Method Uncertainty Requirement: Flexibility in Data Review
When project planners establish the MQO for method uncertainty for method selection and development, the maximum allowable standard deviation, u_MR, at the UBGR is specified.
During data evaluation, the measurement uncertainty of any sample result at an analyte concentration ≤ UBGR should not exceed u_MR.
During data evaluation, the relative measurement uncertainty of any sample result at an analyte concentration > UBGR should not exceed u_MR / UBGR, denoted φ_MR.

30 Monitoring a Lab's Quantitative Performance
Premise:
– Ensuring the lab meets the method uncertainty requirement
– u_MR = Δ / 10
– φ_MR = u_MR / UBGR
– No method bias is assumed

31 Use of Internal and External QC Samples
Laboratory Control Samples
%D = 100 * (SSR - SA) / SA, where SSR is the measured result and SA is the spike value. Assumes the uncertainty in SA is negligible compared to the uncertainty in SSR.
Warning Limits: (±2 φ_MR) * 100
Control Limits: (±3 φ_MR) * 100
Plot on a control chart for trending - no action based on a single measurement

32 Laboratory Control Samples - Example
UBGR = 5 Bq/kg, u_MR = 0.35 Bq/kg, φ_MR = 0.07
An LCS is prepared with SA = 10.0 Bq/kg and gives an analytical result of SSR = 11.61 ± 0.75 Bq/kg.
%D = 100 * (SSR - SA) / SA = 100 * (11.61 - 10.0) / 10 = 16.1%
Warning Limits: (±2 φ_MR) * 100, or ±14%
Control Limits: (±3 φ_MR) * 100, or ±21%
%D is above the warning limit but below the control limit
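
A minimal sketch of the LCS check, assuming the values of this example (the function name is mine):

def lcs_percent_difference(ssr, sa, phi_mr):
    """Return %D for a laboratory control sample together with the
    warning and control limits (+/-2 and +/-3 phi_MR, in percent)."""
    percent_d = 100.0 * (ssr - sa) / sa
    return percent_d, 2 * phi_mr * 100, 3 * phi_mr * 100

pd, warn, ctrl = lcs_percent_difference(ssr=11.61, sa=10.0, phi_mr=0.07)
print(pd, warn, ctrl)   # about 16.1, 14.0, 21.0 -> above the warning limit, below the control limit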

33 Use of Internal and External QC Samples
Duplicate Samples - assuming significant activity
X_av = (X_1 + X_2) / 2
When X_av < UBGR, test the statistic |X_1 - X_2|
Warning Limit: 2.83 * u_MR
Control Limit: 4.24 * u_MR
Plot on a control chart for trending - no action based on a single measurement

34 Use of Internal and External QC Samples
Duplicate Samples - assuming significant activity
X_av = (X_1 + X_2) / 2
When X_av > UBGR, test the statistic RPD = |X_1 - X_2| * 100 / X_av
Warning Limit: 2.83 * φ_MR
Control Limit: 4.24 * φ_MR
Plot on a control chart for trending - no action based on a single measurement
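
A combined sketch of the two duplicate-sample tests above (the function name and example values are mine; treating the RPD limits as fractional relative differences, i.e. RPD/100, is my interpretation):

def duplicate_sample_check(x1, x2, ubgr, u_mr, phi_mr):
    """Duplicate-sample QC test: absolute difference vs. 2.83/4.24 * u_MR
    when the average is below the UBGR, relative difference vs.
    2.83/4.24 * phi_MR when it is above."""
    x_av = (x1 + x2) / 2.0
    if x_av < ubgr:
        statistic = abs(x1 - x2)                 # compare to u_MR-based limits
        return statistic, 2.83 * u_mr, 4.24 * u_mr
    statistic = abs(x1 - x2) / x_av              # RPD expressed as a fraction
    return statistic, 2.83 * phi_mr, 4.24 * phi_mr

print(duplicate_sample_check(4.1, 4.6, ubgr=5.0, u_mr=0.35, phi_mr=0.07))
# about (0.5, 0.99, 1.48) -> difference within the warning limit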

35 Use of Internal and External QC Samples
Method Blanks - testing criteria are provided; however, the target value for an analytical blank is zero. For the sample-specific MDC, the critical level should be used to determine whether a blank is statistically positive.
Testing criteria related to u_MR - test the statistic: measured concentration value
Warning Limits: ±2 u_MR
Control Limits: ±3 u_MR
Plot on a control chart for trending - no action based on a single measurement

36 Method Blanks - Example
UBGR = 5 Bq/kg, u_MR = 0.35 Bq/kg, φ_MR = 0.07
Analytical result: X = 0.20 ± 0.10 Bq/kg
Testing criteria related to u_MR:
Warning Limits: ±2 u_MR, or ±0.70 Bq/kg
Control Limits: ±3 u_MR, or ±1.05 Bq/kg
The analytical result is below the warning limit. Although the test allows for a certain degree of contamination in comparison to the action level, the target value is zero, and any blank value greater than the critical value would be considered positive.
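
A minimal sketch of the blank check against the u_MR-based limits (the function name is mine):

def method_blank_check(measured, u_mr):
    """Compare a blank's measured concentration to the +/-2 u_MR warning
    and +/-3 u_MR control limits; the target value is zero."""
    within_warning = abs(measured) <= 2 * u_mr
    within_control = abs(measured) <= 3 * u_mr
    return within_warning, within_control

print(method_blank_check(measured=0.20, u_mr=0.35))   # (True, True): below the warning limit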

37 Use of Internal and External QC Samples
Matrix Spikes - testing criteria are provided. Test the statistic Z:
Z = (SSR - SR - SA) / { φ_MR (SSR² + max(SR, UBGR)²)^(1/2) }
where SSR is the spiked sample result, SA is the spike value, and SR is the unspiked sample result. (Assumes the uncertainty of the spike value is insignificant compared to the uncertainty of the measurements.)
Warning Limits: ±2
Control Limits: ±3
Plot on a control chart for trending - no action based on a single measurement

38 Matrix Spike Example
UBGR = 5 Bq/kg, u_MR = 0.35 Bq/kg, φ_MR = 0.07
SR = 3.5 ± 0.29 Bq/kg, SA = 10.1 ± 0.31 Bq/kg, SSR = 11.2 ± 0.55 Bq/kg. Since SR is less than the UBGR, max(SR, UBGR) = UBGR = 5 Bq/kg.
Z = (SSR - SR - SA) / { φ_MR (SSR² + max(SR, UBGR)²)^(1/2) }
Z = (11.2 - 3.5 - 10.1) / (0.07 * (11.2² + 5²)^(1/2)) = -2.4 / 0.86 = -2.8
Warning Limits: ±2
Control Limits: ±3
Z is less than the lower warning limit (-2) but still greater than the lower control limit (-3)
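
A short sketch of the matrix-spike Z statistic (the function name is mine), reproducing this example:

import math

def matrix_spike_z(ssr, sr, sa, ubgr, phi_mr):
    """Z = (SSR - SR - SA) / (phi_MR * sqrt(SSR^2 + max(SR, UBGR)^2));
    assumes the spike-value uncertainty is negligible."""
    return (ssr - sr - sa) / (phi_mr * math.sqrt(ssr ** 2 + max(sr, ubgr) ** 2))

print(matrix_spike_z(ssr=11.2, sr=3.5, sa=10.1, ubgr=5.0, phi_mr=0.07))  # about -2.8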

39 MQO - Method Uncertainty Requirement: Flexibility in Data Review
When project planners establish the MQO for method uncertainty for method selection and development, the maximum allowable standard deviation, u_MR, at the UBGR is specified.
During data evaluation, the measurement uncertainty of any sample result at an analyte concentration ≤ UBGR should not exceed u_MR.
During data evaluation, the relative measurement uncertainty of any sample result at an analyte concentration > UBGR should not exceed u_MR / UBGR, denoted φ_MR.

40 Example - Evaluation of Data Below and Above the UBGR
UBGR = 1 Bq/L, LBGR = 0.5 Bq/L, u_MR = 0.17 Bq/L, φ_MR = 0.17 or 17%
Any result ≤ 1 Bq/L should have a measurement (combined standard) uncertainty of no more than 0.17 Bq/L.
Any result > 1 Bq/L should have a relative combined standard uncertainty no greater than 17%.
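
A minimal sketch of this data-review rule (the function name is mine):

def uncertainty_within_mqo(result, combined_std_unc, ubgr, u_mr, phi_mr):
    """Results at or below the UBGR are judged against u_MR (absolute);
    results above the UBGR against phi_MR (relative)."""
    if result <= ubgr:
        return combined_std_unc <= u_mr
    return combined_std_unc / result <= phi_mr

print(uncertainty_within_mqo(0.8, 0.15, ubgr=1.0, u_mr=0.17, phi_mr=0.17))  # True
print(uncertainty_within_mqo(2.0, 0.40, ubgr=1.0, u_mr=0.17, phi_mr=0.17))  # False (20% > 17%)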