EPA’s Bioassessment Performance and Comparability “Guidance”

Presentation transcript:


Why Develop this Guidance
- Previous NWQM Conferences expressed need
- Several states, tribes, and others want this guidance
- EPA views this as a critical component to strengthen existing bioassessment programs
- Credible data laws; over-interpreting results
- Links with other bioassessment programs: critical elements, reference condition criteria

Document Outline
- Background: bioassessments and DQOs; performance-based approach and comparability defined; lit review
- Performance-based Methods: performance characteristics defined; relationship to comparability, data quality
- Documenting Performance: how to; challenges; examples
- Determining Comparability: at data, metric, and assessment levels; rules; challenges; examples
- Documentation and Reporting: metadata; forms

EPA Bioassessment Guidance: Performance Includes:
- Recommended performance characteristics that should be documented for a bioassessment protocol and its submethods (e.g., laboratory procedures, taxonomy)
- How to calculate performance characteristics
- Forms to help document performance and recommended metadata (water quality data elements)
- Examples of performance-based methods for different DQOs
- Case study examples that calculated performance for bioassessment protocols or submethods

EPA Bioassessment Guidance: Comparability Includes:
- Recommended rules for combining bioassessment data
- Recommended rules for combining assessments
- Case study examples that examined comparability of different protocols or datasets
- Recommendations to enhance comparability evaluations and promote comparability of information derived from different programs

Bioassessment Protocols Consist of Several Steps, Each Benefiting From a Performance Assessment

Major Project Elements and Comparability Requirements:
- Identify objectives and design of monitoring project: Study Objectives or Monitoring Question; Data-Quality Objectives (DQOs), including sampling design
- Collect field samples and on-site data: Sampling Plan; Field Certification and Training; Quality-Assurance Plan; Field-Activities Safety Plan
- Produce data from laboratory analyses: Quality-Assurance Plan; Specify Method Acceptance Criteria; Laboratory Accreditation; Taxonomic Certification; Method Verification
- Manage data: Required Metadata (data elements), including data-quality documentation

Generalized Flowchart
1. State management objectives
2. Specify data quality objectives (DQOs)
3. Select indicators
4. Specify measurement quality objectives (MQOs) (= acceptance criteria)
5. Collect data
6. Document protocol performance
7. Evaluate performance relative to MQOs
8. Exclude/reject data not meeting MQOs
9. "Calculate" indicators
10. Address management objectives
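A minimal sketch of the screening step in this flowchart (steps 7-9: evaluate performance against MQOs, exclude data that fail, then calculate indicators). The replicate structure, the 20% RSD threshold, and the helper names are illustrative assumptions, not part of the EPA guidance.

```python
# Minimal sketch of the MQO-screening step in the flowchart: keep only sites
# whose replicate index scores meet an assumed precision MQO, then calculate
# a site-level indicator from the surviving data. Thresholds are hypothetical.
from statistics import mean, stdev

def relative_std_dev_pct(scores):
    """Relative standard deviation (RSD, %) of replicate index scores."""
    return 100.0 * stdev(scores) / mean(scores)

def screen_and_summarize(replicates_by_site, max_rsd_pct=20.0):
    """Drop sites failing the precision MQO; return the mean index for the rest."""
    accepted = {}
    for site, scores in replicates_by_site.items():
        if len(scores) >= 2 and relative_std_dev_pct(scores) <= max_rsd_pct:
            accepted[site] = mean(scores)
    return accepted

if __name__ == "__main__":
    replicates = {
        "SITE-01": [72.0, 68.0, 75.0],   # passes the assumed 20% RSD MQO
        "SITE-02": [40.0, 90.0],         # fails: replicates disagree badly
    }
    print(screen_and_summarize(replicates))
```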

Bioassessment Example (parallel to the flowchart above)
- Management objective: determine stream miles that are biologically impaired
- DQO: determine with 90% confidence whether or not a site is impaired
- MQOs: precision of indicators for replicate samples ≤ 20% RSD or ≥ 80% similarity; ≥ 80% discrimination efficiency; stressor gradient MQOs? Sensitivity?
- Indicators: metrics or an index responsive to stressors, and reliable, representative sampling methods
- Collect data: field replicates; split lab samples; sort residue QC; % taxonomic agreement
- Using data meeting the MQOs, calculate assessment values
- Interpret findings in the context of the management objective
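The ≥ 80% discrimination efficiency MQO above can be checked with a short calculation. One common formulation of discrimination efficiency is the percentage of samples from known stressed sites that score below the 25th percentile of reference-site scores; the sketch below assumes that formulation, and all scores are hypothetical.

```python
# Minimal sketch of one common discrimination-efficiency (DE) formulation:
# the percentage of samples from known stressed sites whose index scores fall
# below the 25th percentile of reference-site scores. Scores and the 80% MQO
# are placeholders for illustration.
from statistics import quantiles

def discrimination_efficiency(reference_scores, stressed_scores):
    q25 = quantiles(reference_scores, n=4)[0]   # 25th percentile of reference
    below = sum(1 for s in stressed_scores if s < q25)
    return 100.0 * below / len(stressed_scores)

reference = [78, 82, 75, 90, 85, 80, 88, 79]
stressed = [55, 62, 79, 48, 77, 60, 52, 66]

de = discrimination_efficiency(reference, stressed)
print(f"DE = {de:.1f}% -> {'meets' if de >= 80 else 'fails'} the 80% MQO")
```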

Performance Characteristics
- Tentatively: precision, sensitivity, responsiveness, and bias/accuracy
- Could pertain to both field and laboratory procedures
- Sampling method also includes representativeness
- Need replication at sites along a human disturbance gradient, or HDG (from very best to very worst)
- Reference condition data are key
- Appropriate definition of the HDG is essential

Performance; Comparability: What is it? What do we mean?
- Methods?
- Indicators?
- Reference condition?
- Sampling design?

We Need Your Input

Definitions: Performance Characteristics

Precision (repeatability and reproducibility):
- Repeatability: variability in a biological index or indicator measure using split samples (lab error).
- Reproducibility: variability in a biological index or indicator measure using duplicate field samples (measurement error, which includes lab error).

Sensitivity:
- Level of the generalized stressor gradient (or HDG) at which the biological index or indicator is statistically different from the reference condition.

Bias/Accuracy (false positive or false negative rate):
- Degree to which truly stressed sites are systematically classified as unimpaired (false negative), or truly non-stressed sites are systematically classified as impaired (false positive).

Responsiveness:
- Degree to which a biological index or indicator measure decreases monotonically with increasing stress.
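The repeatability/reproducibility distinction above can be illustrated with paired measurements: split laboratory samples isolate lab error, while duplicate field samples capture total measurement error. The sketch below uses the pooled-duplicate estimator s = sqrt(sum(d^2) / (2k)); the scores are hypothetical.

```python
# Minimal sketch of repeatability vs. reproducibility from paired index scores.
# The pooled-duplicate formula s = sqrt(sum(d^2) / (2k)) is one standard way
# to estimate precision from k duplicate (2-replicate) pairs.
from math import sqrt

def duplicate_precision(pairs):
    """Pooled standard deviation estimated from duplicate pairs."""
    return sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * len(pairs)))

# Split laboratory samples: same field sample processed twice -> lab error only.
lab_splits = [(71.0, 73.0), (55.0, 54.0), (82.0, 79.0)]
# Duplicate field samples: two samples from the same site -> total measurement error.
field_duplicates = [(71.0, 65.0), (55.0, 60.0), (82.0, 74.0)]

print("repeatability  (lab error):   ", round(duplicate_precision(lab_splits), 2))
print("reproducibility (total error):", round(duplicate_precision(field_duplicates), 2))
```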

Two Sources of Error to Address
- Variance in index score: bioassessment precision
- Variance in HDG score: stressor precision

Sensitivity
Method A distinguishes HDG 3 from HDG 2; Method B distinguishes HDG 4 from HDG 2.
(Figure: index scores for methods A and B along the HDG, shown against reference variability.)
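Following the definition of sensitivity given earlier (the HDG level at which the indicator is statistically different from the reference condition), a minimal sketch might scan HDG classes from least to most disturbed and report the first one a Welch t-test separates from reference. The scores, the 0.05 significance level, and the use of scipy are assumptions for illustration, not part of the guidance.

```python
# Minimal sketch: find the least-disturbed HDG class whose index scores are
# statistically distinguishable from reference scores (Welch's t-test,
# alpha = 0.05). All scores and thresholds are hypothetical.
from scipy import stats

reference = [82, 85, 79, 88, 84, 81, 86]
scores_by_hdg = {                       # HDG 1 = least disturbed ... 6 = most
    2: [80, 83, 78, 85, 82],
    3: [74, 77, 70, 76, 72],
    4: [60, 64, 58, 66, 61],
}

def most_sensitive_class(reference, scores_by_hdg, alpha=0.05):
    for hdg in sorted(scores_by_hdg):   # scan from least disturbed upward
        p = stats.ttest_ind(reference, scores_by_hdg[hdg], equal_var=False).pvalue
        if p < alpha:
            return hdg
    return None

print("method distinguishes reference from HDG", most_sensitive_class(reference, scores_by_hdg))
```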

Precision

Responsiveness
None of these index values overlaps with the reference or with each other, so 5 distinct classes of stress are identified.
(Figure: index values plotted along an axis from identical/similar to reference to dissimilar.)

Responsiveness
Three of these index values do not overlap with the reference, but the value for HDG 5 overlaps with HDG 4; therefore, only 3 distinct classes of stress are distinguished (HDG 2, 4, and 6).
(Figure: index values plotted along an axis from identical/similar to reference to dissimilar.)
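The overlap reasoning on these two slides can be made concrete by counting HDG classes whose approximate 95% confidence intervals overlap neither the reference interval nor each other. The sketch below uses a normal-approximation CI and hypothetical scores chosen to loosely reproduce the second case (only HDG 2, 4, and 6 distinguished).

```python
# Minimal sketch: count HDG classes whose approximate 95% confidence intervals
# for the mean index score overlap neither the reference interval nor one
# another. Example scores are hypothetical; the CI uses a normal approximation.
from statistics import mean, stdev

def ci95(scores):
    half = 1.96 * stdev(scores) / len(scores) ** 0.5
    m = mean(scores)
    return (m - half, m + half)

def overlaps(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

reference = [84, 86, 82, 88, 85, 83, 87]
scores_by_hdg = {
    2: [78, 80, 76, 79, 77],
    4: [68, 70, 66, 71, 69],
    5: [66, 69, 64, 70, 67],   # overlaps HDG 4 -> not a distinct class
    6: [52, 55, 50, 56, 53],
}

intervals = {hdg: ci95(s) for hdg, s in scores_by_hdg.items()}
ref_ci = ci95(reference)

distinct = []
for hdg, ci in sorted(intervals.items()):
    others = [ref_ci] + [intervals[h] for h in distinct]
    if not any(overlaps(ci, other) for other in others):
        distinct.append(hdg)

print("distinct stress classes:", distinct)   # -> [2, 4, 6]
```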

Bias / Assessment Accuracy
The non-impaired mean and 95% CI are based on bioassessment data from several reference sites; several test sites in the HDG 1-3 classes and in the HDG 4-6 classes are needed to test for false positive and false negative rates.
(Figure: non-impaired vs. impaired score ranges, illustrating false negative and false positive classifications.)
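A minimal sketch of the false-positive / false-negative calculation described on this slide, assuming a simple decision rule: a site is called impaired when its index score falls below the lower bound of an approximate 95% CI around the reference mean. The decision rule and all scores are illustrative assumptions.

```python
# Minimal sketch of false-positive / false-negative rates for an assessment.
# Decision rule (assumed): impaired if the index score falls below the lower
# bound of an approximate 95% CI around the reference mean.
from statistics import mean, stdev

reference = [84, 86, 82, 88, 85, 83, 87]
low_stress_sites = [85, 84, 88, 79, 86]    # HDG 1-3: treated as truly non-stressed
high_stress_sites = [70, 74, 84, 68, 72]   # HDG 4-6: treated as truly stressed

threshold = mean(reference) - 1.96 * stdev(reference) / len(reference) ** 0.5

false_positive = sum(s < threshold for s in low_stress_sites) / len(low_stress_sites)
false_negative = sum(s >= threshold for s in high_stress_sites) / len(high_stress_sites)

print(f"impairment threshold: {threshold:.1f}")
print(f"false positive rate:  {false_positive:.0%}")   # non-stressed called impaired
print(f"false negative rate:  {false_negative:.0%}")   # stressed called unimpaired
```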