Deana M. Crumbling, M.S.
Technology Innovation Office, U.S. Environmental Protection Agency, Washington, D.C.
(703) 603-0643

Northeast States’ Improving the Quality of Site Characterization Conference, June 4 and 6, 2002

What Could Be Wrong with the Traditional Approach?

2 USEPA Technology Innovation Office
Advocates for better technologies and strategies to clean up contaminated sites:
- Site investigation/characterization
- Site remediation
- Monitoring during or after remedial action
Acts as an agent for change: disseminates others’ good ideas.
Cleanup Information Website:

3 (Word cloud of competing concerns:) Data quality, remedial systems optimization, decision quality/uncertainty management, sound science, PBMS, DL, QL, dynamic work plans, field analytical methods, DQOs, budgets, NELAC, QA, risk assessment, education, states, enforcement, contracting, decision theory, precautionary principle, political and economic constraints, innovation, SAPs, QAPPs, lab certification.

4 Putting all the Pieces Together: Manage Decision Uncertainty

5 Take-Home Message #1
Using SOUND SCIENCE in the cleanup of contaminated sites means that the scale of data generation and interpretation must closely “match” the scale of the project decisions being based on those data. Sound science also means managing uncertainty, since an exact match usually is not feasible. The current environmental data quality model is inadequate to ensure that this matching occurs.

6 Take-Home Message #2
- The Triad Approach uses innovative data generation and interpretation tools to make scientific defensibility cost-effective for contaminated site management.
- Triad Approach = integrated systematic project planning, dynamic work plans, and real-time analysis to decrease time and costs and increase decision certainty.
- Theme for the Triad Approach = explicitly identify and manage the largest sources of decision error, especially the sampling representativeness of data.

7 Data Quality as a Tool to Achieve Decision Quality

8 First-Generation Data Quality Model
The SYSTEM functions as if it believes that:
Screening Methods → Screening Data → Uncertain Decisions
“Definitive” Methods → “Definitive” Data → Certain Decisions
In other words, Methods = Data = Decisions.
Instead, distinguish analytical methods from data from decisions.

9 First-Generation Data Quality Model: Assumptions
- “Data quality” depends on analytical methods.
- Using regulator-approved methods ensures “definitive data.”
- QC checks that use ideal matrices are representative of method performance for real-world samples.
- Laboratory QA is substitutable for project QA.
- One-size-fits-all methods eliminate the need for analytical chemistry expertise.

10 Non-Representative Sample + Perfect Analytical Chemistry = “BAD” DATA
Distinguish analytical quality from data quality: the data used for project decision-making are generated on samples.

11 Distinguishing Concepts
- Analytical quality: method selection, method modifications, data assessment/analytical integrity.
- Overall data quality: analytical quality plus representative sampling; together these manage uncertainty in data generation.
- Decision quality: overall data quality plus clarified assumptions, drawn conclusions, and non-scientific considerations; together these manage uncertainty in decision-making.

12 So, how should “Data Quality” be defined?
Data quality = the ability of data to provide information that meets user needs (condensed from USEPA Office of Environmental Information QMP, 2000).
- Users need to make correct decisions.
- Data quality is a function of the data’s:
  - ability to represent the “true state” in the context of the decision to be made (the decision defines the appropriate scale over which the “true state” should be measured, i.e., the scale of data generation);
  - information content (including its uncertainty).

13 Second-Generation Data Quality Model: Scientific Foundation
- “Data quality” = the data’s ability to support decisions.
- Anything that compromises data representativeness compromises data quality.
- Data representativeness = sampling representativeness + analytical representativeness.
- Project-specific planning matches the scale(s) of data generation to the scale(s) of decision-making.
- Technical expertise is required to manage sampling and analytical uncertainties.

14 The Data Quality “Chain”
Goal → Sampling (sample support) → Analysis → Decision-Making → DECISION

15 Sample Support: Critical to Representativeness (Sample Volume & Orientation)
The decision driving sample collection: assess contamination resulting from atmospheric deposition. (Figure: three candidate sample supports, #1–#3.)

16 Sample Support: Critical to Representativeness (Sample Volume & Orientation)
The decision driving sample collection: assess contamination resulting from atmospheric deposition. We assume the volume of a sample should have no effect on the concentration of contaminant in that sample.

17 Sample Support: Critical to Representativeness (Sample Volume & Orientation)
The Nugget Effect: the same contaminant mass in a nugget, carried through sample prep in different sample volumes, produces different measured concentrations.
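To make the nugget effect concrete, here is a minimal arithmetic sketch (the masses are illustrative, not taken from the presentation):

```python
# Nugget effect: the same contaminant mass reports very different
# concentrations depending on the mass of the sample it lands in.
# Illustrative numbers only.
nugget_mg = 1.0                      # contaminant mass in a single nugget
for sample_g in (10.0, 100.0):       # two different sample supports
    conc_mg_per_kg = nugget_mg / (sample_g / 1000.0)
    print(f"{sample_g:g} g sample -> {conc_mg_per_kg:g} mg/kg")
# 10 g sample -> 100 mg/kg; 100 g sample -> 10 mg/kg. A 10-fold difference
# caused entirely by sample support, not by the analysis.
```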

18 The Data Quality “Chain”
Goal → Sampling (sampling design, sample support, sample preservation, sub-sampling) → Analysis (sample preparation method(s), extract cleanup method(s), determinative method(s), e.g., Method 8270, result reporting) → Decision-Making → DECISION
All links in the data quality chain must be intact for decision quality to be supported!

19 Summing Uncertainties
Uncertainties add in quadrature: a² + b² = c², where a = sampling uncertainty, b = analytical uncertainty, and c = total uncertainty. (Figure: Examples 1–3 compare the total when one component is 3× or 1/3 the size of the other.)
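Written out, with a worked case matching the 3× ratio pictured on the slide (a minimal sketch):

```latex
% Sampling and analytical uncertainties add in quadrature:
\sigma_{\mathrm{total}} = \sqrt{\sigma_{\mathrm{samp}}^{2} + \sigma_{\mathrm{anal}}^{2}}
% Worked case: if sampling uncertainty is 3x the analytical uncertainty,
\sigma_{\mathrm{total}} = \sqrt{(3\sigma_{\mathrm{anal}})^{2} + \sigma_{\mathrm{anal}}^{2}}
                        = \sqrt{10}\,\sigma_{\mathrm{anal}} \approx 3.16\,\sigma_{\mathrm{anal}}
% i.e., the analysis contributes only 1/10 of the total variance, so even
% halving analytical uncertainty shrinks the total by only about 4%.
```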

20 Partitioning Data Uncertainty
StdDev(sampling) : StdDev(analytical) = Samp:Anal ratio
Brownfields project example: scrap yard site with contaminated soil.
Effect of matrix on analytical variability for B(a)P:
- Using LCS data (no matrix effect): 6,520 : 4.4 = 1,464 : 1
- Using MS/MSD data (matrix included): 6,520 : 12.7 = 513 : 1
Different metals (LCS data used to estimate analytical variability):
- High spatial variability (Pb): 3,255 : 3 = 1,085 : 1
- Natural background present (As): 22.4 : 7 = 3 : 1
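Assuming the quadrature relationship from the previous slide, a small sketch (the ratios are taken from the slide; the helper function is mine) shows how little of the total variance the analysis contributes at these ratios:

```python
# Share of total data variance contributed by analysis, given the
# sampling-to-analytical standard-deviation ratio R.
# Since variances add in quadrature: analytical share = 1 / (1 + R^2).
def analytical_share(r: float) -> float:
    return 1.0 / (1.0 + r * r)

for label, r in [("B(a)P, LCS", 1464), ("B(a)P, MS/MSD", 513),
                 ("Pb, high spatial variability", 1085),
                 ("As, natural background", 3)]:
    print(f"{label}: analysis = {100 * analytical_share(r):.2g}% of total variance")
# Only for As (ratio 3:1) does analytical quality matter much: 10% of variance.
```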

21 Example of Variability: Sample Location vs. Analytical Method
Sample location accounts for ~95% of the variability; the analytical method (between methods) accounts for ~5%. Paired results (on-site vs. lab): 39,800 vs. 41,400; 500 vs. 416; 164 vs. 136; 27,800 vs. 42,800; 24,400 vs. 27,700; 1,280 vs. 1,220; … vs. 286.
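A back-of-the-envelope check of the ~95%/~5% split, using the paired results above (the last pair is omitted because its on-site value is missing from the transcript); the decomposition approach here is mine, not necessarily the one used in the talk:

```python
import statistics as stats

# Paired soil results: (on-site, fixed-lab) for the same sample locations.
pairs = [(39800, 41400), (500, 416), (164, 136),
         (27800, 42800), (24400, 27700), (1280, 1220)]

means = [(a + b) / 2 for a, b in pairs]          # location-to-location signal
half_diffs = [(a - b) / 2 for a, b in pairs]     # method-to-method disagreement

var_location = stats.variance(means)             # between-location variance
var_method = sum(d * d for d in half_diffs) / len(half_diffs)

total = var_location + var_method
print(f"sample location: {100 * var_location / total:.0f}% of variance")
print(f"analytical method: {100 * var_method / total:.0f}% of variance")
# Prints roughly 97% vs. 3%: consistent with the slide's ~95% / ~5%.
```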

Marrying Analytical Methods to Make Sound Decisions Involving Heterogeneous Matrices
- Costly definitive analytical methods: low DLs plus analyte specificity manage analytical uncertainty (= analytical representativeness = analytical quality). Result: definitive analytical quality but screening sampling quality.
- Cheaper/screening analytical methods: high spatial density manages sampling uncertainty (= sampling representativeness = sampling quality). Result: definitive sampling quality but screening analytical quality.

Marrying Analytical Methods to Make Sound Decisions Involving Heterogeneous Matrices
Costly definitive methods (low DL + analyte specificity) manage analytical uncertainty; cheaper/screening methods (high spatial density) manage sampling uncertainty. Used together, they form Collaborative Data Sets.

Marrying Analytical Methods to Make Sound Decisions Involving Heterogeneous Matrices
Managing analytical and sampling uncertainty together yields Decision Quality Data: reliable (yet cost-effective), scientifically defensible decisions.
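One common way to build a collaborative data set is to calibrate a cheap, high-density field method against a few paired fixed-lab results. The sketch below is illustrative only (all numbers are hypothetical, not from the talk) and needs Python 3.10+ for statistics.linear_regression:

```python
import statistics as stats

# Hypothetical collaborative data set: many cheap field-kit results plus a
# few split samples also analyzed by a fixed lab. Regressing lab against
# field results calibrates the dense field data, so sampling uncertainty
# (density) and analytical uncertainty (kit bias) are co-managed.
field_split = [12, 35, 60, 110, 240, 480]   # field-kit results, ppm
lab_split   = [15, 40, 55, 130, 255, 500]   # paired fixed-lab results, ppm

fit = stats.linear_regression(field_split, lab_split)

# Apply the calibration to the full high-density field data set.
all_field = [12, 18, 35, 42, 60, 75, 110, 150, 240, 320, 480]
calibrated = [fit.slope * x + fit.intercept for x in all_field]
print(f"lab ~= {fit.slope:.2f} * field + {fit.intercept:.1f} (ppm)")
```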

25 Improve Decision Quality: Manage Uncertainties
From a few costly fixed-lab analyses ($) to many inexpensive field analyses (¢). (Figure: three examples comparing analytical vs. sampling uncertainty.)
- Ex 1: fixed-lab analytical data; sampling uncertainty dominates.
- Ex 2: field analytical data; sampling uncertainty controlled through increased density.
- Ex 3: fixed-lab data; decreased sampling variability after removal of hot spots.

26 Managing Decision Uncertainty Using Modern Tools

27 A Systems Approach Framework: The Triad Approach
- Systematic project planning
- Dynamic work plan strategy
- Real-time measurement technologies

28 Unifying Concept for Triad: Managing Uncertainty
Systematic planning is used to proactively:
- Manage uncertainty about project goals: identify decision goals with tolerable overall uncertainty; identify the major uncertainties (which cause decision error); identify strategies to manage each major uncertainty.
- Manage uncertainty in data: sampling uncertainty (manage sample representativeness) and analytical uncertainty (especially if field methods are used).
- Bring multidisciplinary expertise to bear: a TEAM is the best way to supply the needed knowledge.

Dynamic Work Plans
- Real-time decision-making “in the field”: evolve the CSM in real time; implement a pre-approved decision tree using senior staff; use contingency planning for the most seamless activity flow possible to reach project goals in the fewest mobilizations.
- Real-time decisions need real-time data; mix and match:
  - An off-site lab with short turnaround? (Screening analytical methods in a fixed lab?)
  - On-site analysis? (A mobile lab with conventional equipment? Portable kits and instruments?)
- In all cases, data of known quality must be generated.

30 Generating Real-Time Data Using Field Methods: Manage Uncertainty through Systematic Planning
- Clearly defined data uses are needed: tie them to project goals.
- Understand the dynamic work plan: branch points and work flow.
- Match project-specific QA/QC protocols to the intended data use.
- Select field analytical technologies to support the dynamic work plan (the greatest source of cost savings) and to manage sampling uncertainty (improves decision quality).
- Select fixed-lab methods (as needed) to manage uncertainties in field data (just ONE aspect of QC for field data) and to supply analyte-specific data and/or lower quantitation limits (if needed for regulatory compliance, risk assessment, etc.).

31 Sample Representativeness is Key!
- Cheaper analyses permit increased sample density.
  - New software for statistical/geostatistical decision support: the VSP software package (free), the SADA software package (free), and FIELDS/SADA.
- Real-time measurements support real-time decision-making: rapid feedback for course correction leads to smarter sampling.
- Data quality: focus on overall data uncertainty; analytical uncertainty is usually a small fraction.
- Sample representativeness can finally be addressed defensibly and affordably!

32 Case Study: Wenatchee Tree Fruit Site
- Pesticide immunoassay (IA) kits guided the dynamic work plan: remove and segregate contaminated soil for disposal.
  - 230 IA analyses (with thorough QC) managed sampling uncertainty: very high confidence that all contamination above action levels was located and removed.
  - 29 fixed-lab samples for 33 analytes managed field analytical uncertainty as additional QC on critical samples: confirmed and perfected the field kit action levels.
- Clean closure data set: 33 fixed-lab samples for analyte-specific pesticide analysis demonstrated full compliance with all regulatory requirements for all 33 pesticide analytes to >95% statistical confidence the first time!
- Projected cost: ~$1.2M; actual: $589K (~50% savings).
- Field work completed in <4 months with a single mobilization.
http://cluin.org/char1_edu.cfm#site_char
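For intuition about attaching “>95% statistical confidence” to an all-clean closure data set, here is one common nonparametric argument; it is a sketch of a standard approach, not necessarily the calculation used at Wenatchee:

```python
# If every one of n randomly located closure samples is below the action
# level, the probability of seeing that when a fraction p of the site
# actually exceeds the level is (1 - p)^n.
n, p = 33, 0.09
prob_all_clean = (1 - p) ** n
print(f"(1 - {p})^{n} = {prob_all_clean:.3f}")
# ~0.044 < 0.05, i.e., >95% confidence that less than 9% of the site
# exceeds the action level.
```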

33 Terminology to Link Data Quality with Decision Quality

34 “Data Quality” Terminology
Current terminology usage does not focus on the goal of decision quality.
- Irony: great attention is paid to the quality of individual data points, yet the overall quality of decisions can easily remain unknown.
- Current usage does not distinguish: methods vs. data vs. decisions; the factors that impact each step of the process; or the relationships between different aspects of quality.

35 Proposed Clarification of Terms: Quality Assurance
- Project QA: identify the causes of potentially intolerable decision errors and the strategies to manage and prevent those errors.
- Data QA: manage both sampling and analytical uncertainties to the degree needed to avoid decision errors. Analytical representativeness is evaluated, including the impact of sample/matrix effects on analytical performance; sample representativeness must also be evaluated.
- Lab QA: manage the technical performance of analytical instruments, processes, and operators to meet the lab’s QA goals. Sample/matrix effects on analytical performance may or may not be evaluated, depending on contract specifications.

36 Analytical QA Terminology
- Demonstration of method applicability: shows that a particular method, project-specific SOP, and selected QC acceptance criteria are appropriate for a project-specific application or site-specific matrix.
- Demonstration of proficiency: shows that a particular operator or lab can perform a method properly.

37 Proposed Clarification of Terms: Data Quality
- Decision quality data* = effective data* = data shown to be effective for decision-making.
- Screening quality data* = data that provide some useful information but are too uncertain to support decision-making alone.
- Collaborative data sets = distinct data sets used in concert to co-manage sampling and/or analytical uncertainties to an acceptable level.
* Includes sampling uncertainty; the nature of the analytical method is irrelevant.

38 Misleading Terminology: “Field Screening”
This term falsely implies that all methods run in the field are screening methods, and therefore that all data produced in the field are of screening quality.

39 “Effective Data” = “Decision Quality Data”
Data of known quality that can be logically demonstrated to be effective for making the specified decision, because both the sampling and analytical uncertainties are managed to the degree necessary to meet clearly defined (and stated) decision confidence goals.

40 The Diffusion of Innovation
“At first people refuse to believe that a strange new thing can be done, then they begin to hope it can be done—then it is done and all the world wonders why it was not done centuries ago.” —Frances Hodgson Burnett