Emissions Factors Uncertainty Primer August 28, 2007

2. What are Emissions Factors?
Emissions Factors (EFs) are a low-cost, low-burden means to estimate emissions
– EFs are average values gleaned from a few emissions tests at a subset of sources
Typical EF units are lbs of pollutant per unit of fuel input (see the arithmetic sketch below)
– EFs were developed for use in the national emissions inventory
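To make the arithmetic concrete, here is a minimal sketch of how an EF is applied to activity data; the factor and fuel throughput are hypothetical illustrations, not values from AP-42 or this presentation.

```python
# Minimal sketch of EF arithmetic: emissions = EF x activity.
# Both numbers below are hypothetical illustrations.
ef = 0.8              # lbs of SO2 emitted per ton of coal burned (hypothetical EF)
coal_burned = 12_000  # tons of coal burned per year (hypothetical activity data)

annual_emissions = ef * coal_burned
print(f"Estimated emissions: {annual_emissions:,.0f} lbs/yr")  # 9,600 lbs/yr
```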

3. What's wrong with EFs?
Nothing, when used in the proper context
– Developing an annual, national inventory
However, EFs are often used out of context
– Individual sites use EFs rather than direct measurements:
  – To determine program applicability
  – To establish permit limits
  – To demonstrate compliance
  – To calculate fees
  – As the basis for TRI reporting

4. What's wrong with EFs?
– Such out-of-context use often ignores potential consequences:
  – Half of sources' actual emissions exceed the values estimated using EFs
Moreover, the Inspector General and other stakeholders found:
– Few high-quality EFs
– A difficult process for generating new EFs
– No accounting for EF uncertainty
– No guidance for using EFs out of context

5. How Has MPG Responded?
Convened stakeholders
Created a streamlined EF development process
– Captures emissions test data electronically, assigns a quality rating automatically, and will post results online
Drafted a peer-reviewed EF Uncertainty Assessment and is seeking comments on it
– Comment period extended 3 times; the new deadline is 10/31

6. What did we learn from the Uncertainty Assessment?
Uncertainty is the lack of knowledge regarding the true value
– Includes variability, systematic error, and random error
– Can be expressed as a probability distribution (see the sketch after this slide)
– Depends on:
  – Type of pollutant (gaseous criteria, PM, or HAP)
  – Use of controls (controlled or uncontrolled)
  – Number of emissions tests performed
  – Number of similar units nearby
  – Decision level (the percentile appropriate for the program)
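A minimal sketch of what "expressed as a probability distribution" can look like in practice, assuming a lognormal form (consistent with slide 7). The published EF and the geometric standard deviation below are made-up illustrations, not values from the Assessment.

```python
# Sketch: treat the uncertainty around a published EF as a lognormal
# distribution and read off the value at a chosen decision level.
# The EF and geometric standard deviation (gsd) are illustrative only.
import numpy as np
from scipy import stats

ef = 0.5   # published EF, e.g. lbs pollutant per ton of fuel (hypothetical)
gsd = 2.0  # assumed geometric standard deviation of the uncertainty

# Lognormal whose median equals the published EF
dist = stats.lognorm(s=np.log(gsd), scale=ef)

for decision_level in (0.50, 0.65, 0.90, 0.95):
    value = dist.ppf(decision_level)  # EF value at this percentile
    print(f"{decision_level:.0%} decision level: {value:.3f}")
```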

7. What did we learn from the Uncertainty Assessment?
Probability distributions for HAP, PM, and gaseous criteria pollutant EFs:
– Are based on 42 A-rated and 1 B-rated EFs
– Were generated from Monte Carlo simulation and repeated sampling (sketched below)
– Exhibit lognormal or Weibull characteristics (no negative values and a long tail to the right)
– Allow calculation of the expected range of EF values
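The Assessment's exact simulation is not reproduced here; the sketch below only illustrates the repeated-sampling idea. It draws small sets of "emissions tests" from a synthetic skewed population, recomputes the EF (the sample mean) each time, and shows that the resulting distribution of EF estimates is non-negative, right-tailed, and tighter when more tests support the factor.

```python
# Illustrative Monte Carlo / repeated sampling, not EPA's actual procedure.
# The underlying "test population" is synthetic.
import numpy as np

rng = np.random.default_rng(42)
tests = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # synthetic test results

def ef_estimates(n_tests, n_draws=50_000):
    """EFs (sample means) from n_draws repeated samples of n_tests tests each."""
    samples = rng.choice(tests, size=(n_draws, n_tests))
    return samples.mean(axis=1)

for n_tests in (3, 25):
    efs = ef_estimates(n_tests)
    p10, p50, p95 = np.percentile(efs, [10, 50, 95])
    print(f"{n_tests:>2} tests: 10th={p10:.2f}, median={p50:.2f}, 95th={p95:.2f}")
```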

8. What did we learn from the Uncertainty Assessment?
Expected range of EF values:

                                Fewer than 3 tests        25 or more tests
Pollutant                       10th pct    95th pct      10th pct    95th pct
HAP                             0.2 * EF    13.4 * EF     0.1 * EF    3.9 * EF
PM, condensable                 0.2 * EF    6.9 * EF      0.1 * EF    3.6 * EF
PM, filterable, controlled      0.4 * EF    3.9 * EF      0.3 * EF    2.7 * EF
PM, filterable, uncontrolled    0.5 * EF    2.7 * EF      0.4 * EF    2.2 * EF
Gaseous criteria pollutants     0.3 * EF    5.4 * EF      0.3 * EF    2.8 * EF
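The multipliers above translate directly into a range calculation. A small helper of our own (not from the Assessment) makes the lookup explicit:

```python
# Expected EF value ranges from the table above.
# Keys: pollutant -> test-count bucket -> (10th pct, 95th pct) multipliers.
RANGE_FACTORS = {
    "HAP":                         {"<3": (0.2, 13.4), ">=25": (0.1, 3.9)},
    "PM condensable":              {"<3": (0.2, 6.9),  ">=25": (0.1, 3.6)},
    "PM filterable, controlled":   {"<3": (0.4, 3.9),  ">=25": (0.3, 2.7)},
    "PM filterable, uncontrolled": {"<3": (0.5, 2.7),  ">=25": (0.4, 2.2)},
    "Gaseous criteria pollutants": {"<3": (0.3, 5.4),  ">=25": (0.3, 2.8)},
}

def expected_range(ef, pollutant, tests):
    """Return the (10th percentile, 95th percentile) range for a published EF."""
    lo, hi = RANGE_FACTORS[pollutant][tests]
    return lo * ef, hi * ef

# Example: a HAP EF of 0.5 supported by fewer than 3 emissions tests
print(expected_range(0.5, "HAP", "<3"))  # (0.1, 6.7)
```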

9. What did we learn from the Uncertainty Assessment?
In general, the median of existing EFs falls around the 65th percentile of the uncertainty distribution
Uncertainty is reduced with more supporting data (additional emissions tests)
– The effect diminishes after about 10 tests
The Assessment includes a procedure to calculate the effect of uncertainty when more than one similar unit is nearby (illustrated below)
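The Assessment's multi-unit procedure is not reproduced here; the sketch below only illustrates the underlying statistical effect, assuming independent, identically distributed units: summing emissions across several similar units narrows the relative spread of the total.

```python
# Illustration (not the Assessment's procedure): with several similar,
# independently varying units, the 95th-percentile-to-median ratio of the
# summed emissions shrinks as the unit count grows.
import numpy as np

rng = np.random.default_rng(0)
n_draws = 100_000

for n_units in (1, 2, 5, 10):
    draws = rng.lognormal(mean=0.0, sigma=0.7, size=(n_draws, n_units))
    totals = draws.sum(axis=1)  # combined emissions across the units
    ratio = np.percentile(totals, 95) / np.median(totals)
    print(f"{n_units:>2} similar units: 95th pct / median = {ratio:.2f}")
```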

10. How Has MPG Responded?
No action on guidance until the uncertainty method is finalized
– Commenters have been overly concerned about the perceived impact of potential guidance rather than focusing on the draft method

11. Next Steps
Continue to refine the revised EF development process
– Add new pollutants and seek new partners for the pilot program
Prepare and improve WebFIRE, the interactive EF website
– Convert more hard-copy records into electronic format

12. Next Steps
Respond to comments and finalize the uncertainty method
Begin internal Agency discussions with programs concerning EF use
Continue updating stakeholders on program progress