Quantifying Indicator Uncertainty

Presentation transcript:

Quantifying Indicator Uncertainty
Jay Messer
U.S. EPA - National Center for Environmental Assessment
November 23, 2018

Importance of indicator uncertainty
- Uncertainty in indicators can lead to mistaken ideas about current environmental status
- Uncertainty in indicators can lead to mistaken ideas about environmental trends
- Uncertainty in performance measures can lead to mistaken ideas about program performance (e.g., were performance goals met or not?)

Importance of indicator uncertainty
Reviews of EPA’s Report on the Environment 2008 - Quantitative treatment of uncertainty is essential!
- EPA’s Science Advisory Board
- Public comment (Federal Register)
- Interagency government review

Importance of indicator uncertainty

Prevalence of uncertainty estimates in environmental indicator reports
Quantitative treatment of indicator uncertainty remains the exception, rather than the rule:
- Very few US environmental indicator reports include uncertainty estimates.
- Where they do, the information is seldom used in an interpretive manner.

EPA’s Report on the Environment 2008
- Has only qualitative information on uncertainty (“limitations”)
- Of the 85 ROE indicators, only 15% include uncertainty estimates in their original sources.

EPA’s Strategic Plan & Performance Reports
- No targets or performance measures include uncertainty estimates

ROE Indicator Pilot Study Objectives
- Better characterize and quantify uncertainty in current state, change, and trend estimates in ROE indicators
- Develop the most appropriate ways to scale ROE indicators so that they are relevant and comparable at regional or sub-regional scales
- Determine how sensitive ROE indicators are to environmental management in the face of natural and other phenomena

Major sources of uncertainty
- Measurement error
- Modeling error
- Non-comparable measurements
- Treatment of missing data
- Lack of representative sampling
- “Elephants” (very large entities)
- Sampling error and statistical models
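To make the combined effect of these sources concrete, the sketch below (hypothetical values throughout, not drawn from any ROE indicator) uses a simple Monte Carlo to stack measurement error on top of sampling error and read off the resulting spread of an estimated indicator mean:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "true" site values for an indicator (e.g., a concentration)
true_sites = rng.normal(loc=10.0, scale=3.0, size=200)   # population of sites
n_sample = 30                                             # sites actually visited
meas_sd = 1.5                                             # assumed lab/measurement error

def one_realization():
    """Draw a sample of sites, add measurement error, return the estimated mean."""
    sampled = rng.choice(true_sites, size=n_sample, replace=False)
    measured = sampled + rng.normal(0.0, meas_sd, size=n_sample)
    return measured.mean()

estimates = np.array([one_realization() for _ in range(5000)])
print(f"True mean:         {true_sites.mean():.2f}")
print(f"Mean of estimates: {estimates.mean():.2f}")
print(f"SD of estimates (sampling + measurement error combined): {estimates.std():.2f}")
```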

Let’s take a look at some examples

Importance of sampling error
- Individual sites: Brasstown Creek, NC
- Stream restoration
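As a purely illustrative sketch of the single-site problem (the numbers are invented, not Brasstown Creek data), a handful of pre- and post-restoration visits often cannot separate an apparent improvement from within-site sampling error:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical pre- and post-restoration measurements at one site
pre = rng.normal(loc=45.0, scale=12.0, size=8)    # e.g., an index score, 8 visits
post = rng.normal(loc=52.0, scale=12.0, size=8)   # apparent improvement of ~7 points

t, p = stats.ttest_ind(post, pre, equal_var=False)
diff = post.mean() - pre.mean()
se = np.sqrt(post.var(ddof=1) / len(post) + pre.var(ddof=1) / len(pre))
ci = (diff - 1.96 * se, diff + 1.96 * se)

print(f"Observed change: {diff:.1f}  95% CI: ({ci[0]:.1f}, {ci[1]:.1f})  p = {p:.2f}")
# With only a handful of visits, the CI typically spans zero: the "improvement"
# cannot be separated from within-site sampling error.
```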

Populations of sites: Chesapeake Bay

Populations of sites – Gulf of Mexico
[Table of index scores (WQI, SQI, CH, BI, FTCI, and an overall “All” row) across columns v1, v2, and v3; only scattered cell values survive in the transcript (e.g., All: 1.8, 2.4, 2.2).]

EPA’s Report on the Environment 2008
Among the indicators in ROE08:
- 31 are based on “inventories”
- 28 are based on probability samples
- 26 are based on other designs

Indicators used in Uncertainty Pilot
Inventory:
- High-priority cleanup sites where contamination is not continuing to spread above levels of concern
- Population served by community water systems with no reported violations
Probability:
- Coastal water quality index
- National average blood lead levels
Other:
- Reported toxic chemicals in wastes released, treated, recycled, or recovered for energy use
- Ambient concentrations of particulate matter
- Fish faunal intactness

Contaminated groundwater movement at cleanup sites
Type: Inventory
Sources of uncertainty:
- On-site measurements
- Determination of status
- Completeness
- Documentation

Water systems with no reported violations
Type: Inventory
Sources of uncertainty:
- Detection of violations (measurement and sampling error)
- Reporting of violations

Blood lead levels
Type: Probability
Sources of uncertainty:
- Sampling error
- Collection error
- Measurement error
- Data entry and management error
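For a probability-sample indicator like this one, sampling error on the national mean can be approximated directly from the microdata. The sketch below uses simulated NHANES-style values and a crude linearization standard error; a real analysis would use the survey's stratum/PSU design variables or replicate weights:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical survey microdata: blood lead (ug/dL) and sampling weights
blood_pb = rng.lognormal(mean=np.log(1.2), sigma=0.5, size=1000)
weights = rng.uniform(500, 5000, size=1000)   # persons represented by each respondent

w_mean = np.sum(weights * blood_pb) / np.sum(weights)

# Crude linearization SE for the weighted mean; ignores clustering and stratification.
se = np.sqrt(np.sum(weights**2 * (blood_pb - w_mean)**2)) / np.sum(weights)

print(f"Weighted mean blood lead: {w_mean:.2f} ug/dL  (approx. SE {se:.3f})")
```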

Coastal water quality index
Type: Probability
Sources of uncertainty:
- Sampling error
- Measurement error
- Calibration error
- Data entry and management error
- Aggregation error
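Aggregation error is the least familiar item on this list. The toy sketch below (the component thresholds and the 10% measurement error are assumptions, not the actual index definition) shows how, when an index takes the worst of several categorized components, small measurement errors near a threshold can flip the reported category:

```python
import numpy as np

rng = np.random.default_rng(3)

def wq_category(din, dip, chl):
    """Toy water-quality index: rate each component, then take the worst rating.
    Thresholds are placeholders, not the actual coastal index definition."""
    ratings = [
        0 if din < 0.1 else (1 if din < 0.5 else 2),    # dissolved inorganic N (mg/L)
        0 if dip < 0.01 else (1 if dip < 0.05 else 2),  # dissolved inorganic P (mg/L)
        0 if chl < 5 else (1 if chl < 20 else 2),       # chlorophyll-a (ug/L)
    ]
    return max(ratings)   # 0 = good, 1 = fair, 2 = poor

# One hypothetical station, measured once
obs = dict(din=0.48, dip=0.04, chl=18.0)
base = wq_category(**obs)

# Propagate ~10% relative measurement error through the aggregation
cats = [wq_category(*(v * rng.normal(1.0, 0.10) for v in obs.values()))
        for _ in range(10000)]
flip_rate = np.mean(np.array(cats) != base)
print(f"Baseline category: {base};  probability a remeasurement flips it: {flip_rate:.2f}")
```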

Ambient PM concentrations
Type: Other
Sources of uncertainty:
- Measurement error
- Missing values (days and years)
- Calibration error
- Data entry and management error
- Representativeness
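A quick sketch (synthetic data, not an actual monitor record) of why missing days matter: when a gap coincides with the seasonal peak, an annual mean computed from the remaining days is systematically off, which is the kind of error data-completeness rules are meant to guard against:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic daily PM2.5 (ug/m3) with a winter-peaking seasonal cycle plus noise
days = np.arange(365)
daily = 12 + 6 * np.cos(2 * np.pi * days / 365) + rng.normal(0, 4, size=365)

every_third = daily[::3]        # a 1-in-3 day monitoring schedule, no gaps
winter_missing = daily[90::3]   # same schedule, but January-March lost

print(f"True annual mean:              {daily.mean():5.1f}")
print(f"1-in-3 day schedule:           {every_third.mean():5.1f}")
print(f"Schedule with Jan-Mar missing: {winter_missing.mean():5.1f}  (misses the winter peak)")
```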

Toxic wastes released, recycled, etc.
Type: Other (model-based)
Sources of uncertainty:
- Filing decisions
- Estimations of wastes released, etc.
- Data entry and record-keeping errors
- “Elephants” (very large entities)

Fish faunal intactness
Type: Other
Sources of uncertainty:
- Completeness of sample data (both site and region)
- Taxon identification
- Representativeness of species distribution

Root cause analysis: Community Water Systems Indicator

Origin of indicator data: CWSs monitor for drinking water contaminants. CWSs report contaminant levels that exceed the maximum contaminant level (MCL) and/or surface water treatment rule treatment technique (SWTR TT) violations of health-based standards to State or Tribal drinking water programs, which report the violation(s) to EPA.

Data flow:
- At a CWS: samples water on a daily basis; sends water samples to a certified laboratory for testing; reports MCL and/or SWTR TT violation(s) to the State or Tribal drinking water program.
- At a certified laboratory: tests the water sample to determine whether any contaminants exceed the MCL; reports results to the CWS (in some cases bypassing the CWS and reporting directly to the State or Tribal drinking water program).
- At the State or Tribal drinking water program: prepares a violation report; submits the violation report to EPA (via SDWIS/FED).
- At US EPA HQ: collects violation reports from states and compiles them in SDWIS/FED; prompts states to resubmit inaccurate or incomplete violation reports; performs triennial audits to assess the accuracy and completeness of violation reports in SDWIS/FED.
- To prepare the ROE indicator: ROE selects the data fields appropriate to include in the indicator and sums the selected fields to produce the data cuts for presentation.
- Indicator presented: “Populations Served by Community Water Systems with No Reported Health-Based Violations.”

Uncertainty elements:
- Water samples not collected using the normal protocol, or mistakenly not sent for testing.
- False positive or false negative laboratory test results.
- Laboratory mistakenly does not report results to the CWS or to the State or Tribal drinking water program.
- CWS incorrectly identifies that a violation has or has not occurred.
- Compliance determination error.
- State or Tribal drinking water program does not submit the violation report within the 60-day window at the end of each quarter.
- Violation form is inaccurate or incomplete.
- States do not report certain types of violations on a consistent basis (e.g., SWTR TT violations).
- Reported violations inaccurately estimate the actual population affected.
- Underreporting of monitoring and reporting (M/R) violations could mask MCL and SWTR TT violations.
- Discrepancies between state violation data and data in SDWIS/FED; data entry problems; SDWIS/FED software limitations.
- Low rate of violation report resubmission by State and Tribal drinking water programs (i.e., violation reports not resubmitted within the 30-day window to verify and correct them).
- ROE may make data processing errors.
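The flow above also suggests a simple way to see how reporting errors propagate into the final indicator. The sketch below is purely illustrative: the number of systems, the true violation rate, and the underreporting and false-report rates are assumptions, not SDWIS/FED statistics:

```python
import numpy as np

rng = np.random.default_rng(13)

# Hypothetical universe of community water systems (CWSs) and populations served
n_cws = 50_000
pop_served = rng.lognormal(mean=7.0, sigma=1.5, size=n_cws)   # people per system
true_violation = rng.random(n_cws) < 0.08                      # assumed 8% truly violate

# Reporting error: some true violations never reach SDWIS/FED,
# and a few compliant systems are recorded as violators.
p_missed = 0.30     # assumed underreporting rate
p_false = 0.005     # assumed false-report rate
reported = np.where(true_violation,
                    rng.random(n_cws) > p_missed,    # violation caught and reported
                    rng.random(n_cws) < p_false)     # spurious report

def pct_pop_no_violation(flag):
    return 100 * pop_served[~flag].sum() / pop_served.sum()

print(f"Indicator with perfect reporting: {pct_pop_no_violation(true_violation):.1f}%")
print(f"Indicator as actually reported:   {pct_pop_no_violation(reported):.1f}%")
# Underreporting inflates the apparent share of the population served by
# systems with no reported health-based violations.
```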

Uncertainty pilot questions
- Are all of the major sources of uncertainty and variability identified?
- Can the major sources of uncertainty in current status be quantified with the data currently available?
- Can the major sources of uncertainty and variability in trends be quantified with the data currently available?
- Is the uncertainty small enough to detect change in the indicator values over time?

Uncertainty summaries
[Summary table: rows are the pilot indicators grouped by design type — Inventory (31): groundwater, drinking water; Probability sampling (28): coastal WQ, blood lead; Other designs (26): toxic releases, particulates, fish fauna — with columns for Sources, Status, Trends, and Sensitivity. Only scattered cell values survive in the transcript (e.g., groundwater: Yes*, Yes*/?; drinking water: Yes, ?; toxic releases: No).]

Pilot Conclusions (tentative)
- Most sources of uncertainty in indicators can be identified.
- It may not be possible to quantify uncertainty in status or trends of indicators not based on inventories or probability sampling.
- The practical effects of uncertainty on change or trend detection remain largely unexplored.

A final caveat – the importance of indicator scale
- National trends may mask important regional, state, and local variation.
- Smaller sample sizes mean additional uncertainty at regional, state, and local scales.
- Indicators or performance measures may require time and space scales that are “just right.”
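The sample-size point can be made with one line of arithmetic: for a rough normal-approximation confidence interval, the half-width grows as 1/sqrt(n), so splitting a national sample across regions widens each regional interval accordingly (the n = 1000 and SD = 20 below are placeholders):

```python
import numpy as np

# How the half-width of a 95% confidence interval grows when a national
# probability sample (hypothetical n = 1000, SD = 20) is split across regions.
sd_national = 20.0
n_national = 1000

for n_regions in (1, 10, 50):
    n_per = n_national // n_regions
    half_width = 1.96 * sd_national / np.sqrt(n_per)
    print(f"{n_regions:>2} region(s): n = {n_per:4d} per region, "
          f"95% CI half-width = +/-{half_width:.1f}")
```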

Effect of sampling error on ability to detect a trend or achieve a target
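One way to make this slide's question quantitative is a small power simulation: hold the true trend fixed, vary the sampling error in the annual indicator estimate, and count how often a simple least-squares trend test detects it. The slope, record length, and error magnitudes below are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

def power_to_detect_trend(slope, sd, n_years=10, n_sim=2000, alpha=0.05):
    """Monte Carlo power: fraction of simulated records in which an ordinary
    least-squares trend test finds the (true) slope significant."""
    years = np.arange(n_years)
    hits = 0
    for _ in range(n_sim):
        y = slope * years + rng.normal(0.0, sd, size=n_years)
        res = stats.linregress(years, y)
        hits += res.pvalue < alpha
    return hits / n_sim

# Same true trend (-1 unit/year), increasing sampling error in the annual estimate
for sd in (1.0, 3.0, 6.0):
    print(f"annual-estimate SD = {sd}: power = {power_to_detect_trend(-1.0, sd):.2f}")
```

The same framework can be pointed at a fixed target instead of a trend: replace the regression test with a comparison of the estimated indicator value against the target, and the detection rate becomes the probability of correctly concluding the target was met.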