PM2.5 Model Performance Evaluation: Purpose and Goals
PM Model Evaluation Workshop, February 10, 2004, Chapel Hill, NC
Brian Timin, EPA/OAQPS

Purpose
To discuss PM2.5 and regional haze model performance issues relevant to SIP modeling. The discussions and information will be used to enhance the model performance evaluation section of the PM2.5 and Regional Haze modeling guidance.

Goals
- For everyone in the community to learn more about the latest work on PM model performance evaluation
- To gather enough information to be able to revise the guidance
- To listen to opinions and recommendations

PM2.5 Model Performance Evaluation: What's in the Modeling Guidance?
PM Model Evaluation Workshop, February 10, 2004, Chapel Hill, NC
Brian Timin, EPA/OAQPS

Contents
- Status of the guidance
- What's in the guidance
- Review of Chapter 16: model performance

Status of DRAFT Guidance
- Draft "Guidance for Demonstrating Attainment of Air Quality Goals for PM2.5 and Regional Haze," January 2001
- Living document: may be revised as needed and posted on EPA's website
- Guidance will be finalized as part of the PM2.5 implementation rule (2004)

What's in the Guidance
Part I: Using Model Results
- Attainment test
  - Annual PM2.5 NAAQS
  - 24-hr PM2.5 NAAQS
- Regional haze reasonable progress test
- "Hot spot" modeling
- Using weight of evidence
- Data gathering needs
- Required documentation

What's in the Guidance (cont.)
Part II: Generating Model Results
- Conceptual description
- Modeling protocol
- Selecting a model (or models)
- Choosing days
- Selecting domain and spatial resolution
- Developing meteorological inputs
- Developing emissions inputs
- Evaluating model performance (Chapter 16)
- Evaluating control strategies

Overview of Chapter 16
"How Do I Assess Model Performance and Make Use of Diagnostic Analyses?"

Model Performance: Introduction
- How well does the model replicate observed concentrations of PM mass and its components (and precursors)?
- How accurately does the model characterize the sensitivity of component concentrations to changes in emissions?

Types of Analyses
Operational
- Statistics
- Scatter plots
- Time series plots
Diagnostic
- Ratios of indicator species
- Process analysis
- Sensitivity tests

"Big Picture" Operational Evaluation
Graphical displays (PM2.5 and PM components)
- Time series plots
- Scatter plots
- Tile plots
- Q-Q plots
Temporal resolution
- Episodes, seasonal, annual

Operational Evaluation: PM Species
- PM2.5 mass
- Sulfate
- Nitrate
- Mass associated with sulfate
- Mass associated with nitrate
- Elemental carbon
- Organic carbon (organic mass)
- Inorganic primary PM2.5 (IP)
- Mass of individual constituents of IP

Operational Evaluation: Gaseous Species
- Ozone
- SO2
- CO
- NO2
- NOy
- PAN
- Nitric acid
- Ammonia
- Hydrogen peroxide

Evaluation: Statistical Metrics
- Key question: how well does the model predict concentrations, spatially averaged near a monitor and averaged over the modeled days, compared with the corresponding monitored observations?
- Basic metric: normalized gross error, averaged over monitor-days
- Good model performance matters most at monitors that are exceeding the standards

Statistics in the Current Guidance
- Normalized gross error
- Normalized bias
- Fractional error (means and standard deviation)
- Fractional bias (means and standard deviation)
- Aggregated statistics, averaged over multiple sites
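The four metrics listed in the guidance can be sketched as follows. This is a minimal illustration using the definitions commonly applied in PM model evaluation (per-pair normalization, averaged over monitor-days); the guidance itself should be consulted for the exact formulations.

```python
def performance_stats(obs, mod):
    """Paired model/observation statistics (in percent).
    Pairs with obs <= 0 are dropped, since metrics normalized by the
    observation are undefined or unstable at zero concentration."""
    pairs = [(o, m) for o, m in zip(obs, mod) if o > 0]
    n = len(pairs)
    return {
        # Mean normalized gross error: average of per-pair |mod - obs| / obs
        "norm_gross_error": 100.0 * sum(abs(m - o) / o for o, m in pairs) / n,
        # Mean normalized bias: average of per-pair (mod - obs) / obs
        "norm_bias": 100.0 * sum((m - o) / o for o, m in pairs) / n,
        # Fractional error/bias: normalized by the model-obs pair sum,
        # so the metrics stay bounded even when obs is small
        "fractional_error": 100.0 * sum(2 * abs(m - o) / (m + o) for o, m in pairs) / n,
        "fractional_bias": 100.0 * sum(2 * (m - o) / (m + o) for o, m in pairs) / n,
    }
```

With obs = [10, 20] and mod = [12, 16] µg/m³, the gross error is 20% while the bias is 0%: the over- and under-predictions cancel in the bias but not in the error, which is why the guidance lists both.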

Calculation of Statistics: Issues
- Many ways to calculate statistics: averaging across days, averaging across sites
- Similar but distinct metrics: normalized mean error vs. mean normalized error
- Low concentrations: certain metrics are not appropriate when concentrations are very low
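The "similar but different" point can be made concrete. These two definitions are standard; the contrived observation values below are illustrative only.

```python
def mean_normalized_error(obs, mod):
    """Normalize each pair first, then average.
    Low-concentration pairs can dominate the result."""
    return 100.0 * sum(abs(m - o) / o for o, m in zip(obs, mod)) / len(obs)

def normalized_mean_error(obs, mod):
    """Sum the errors, then normalize by the summed observations.
    Effectively concentration-weighted, so low-obs pairs matter less."""
    return 100.0 * sum(abs(m - o) for o, m in zip(obs, mod)) / sum(obs)
```

For obs = [1, 20] and mod = [2, 21] (same 1 µg/m³ error at both points), the mean normalized error is 52.5% while the normalized mean error is about 9.5% — the same data, wildly different numbers, driven entirely by the low-concentration pair.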

Performance Goals
- "It is difficult to establish generally applicable numerical performance goals"
- Model performance is not particularly important for components with small observed concentrations relative to other components; in a relative attainment test, a small observed component cannot have a large influence
- "How good should a State expect performance of a model to be? Frankly, there is little basis for making recommendations at present (2001)."
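The "small component cannot have a large influence" point follows from the arithmetic of a relative test, in which modeled relative response factors (future/base ratios) scale observed species concentrations. The sketch below uses hypothetical concentrations and species names for illustration.

```python
def projected_pm25(observed, rrf):
    """Relative attainment test sketch: projected design value is the
    sum over species of the observed concentration times that species'
    modeled relative response factor (future model / base model)."""
    return sum(observed[s] * rrf[s] for s in observed)

# Hypothetical speciated design value (ug/m3); nitrate is a minor component
obs = {"sulfate": 7.0, "organic": 4.0, "ec": 0.8, "crustal": 0.7, "nitrate": 0.5}
```

Even if the model's nitrate response is wrong by a factor of two, the projected total moves by only a fraction of a µg/m³, because the error is multiplied by a small observed concentration.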

Performance Goals
- Expect performance for PM components to be worse than for ozone; ozone goals are not appropriate
- Numbers listed in the guidance as example aggregated normalized gross errors, averaged from several limited PM applications available at the time (before 2001):
  - PM2.5: ~30-50%
  - Sulfate: ~30-50%
  - Nitrate: ~20-70%
  - EC: ~15-60%
  - OC: ~40-50%

Performance Goals: Relative Proportions
- Major components (> 30% of PM2.5 mass): agree within +/-10 percentage points of the relative portion
  - Example: if sulfate is 50% of mass, the goal would be to predict sulfate at 40-60% of total mass
- Minor components: agree within +/-5 percentage points
- Proportions are difficult to assess if one component is far off (too high or too low)
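A proportion check of this kind is simple to automate. This is a sketch of the goals as stated on the slide (major component threshold of 30% of mass, +/-10 points major, +/-5 points minor); the function name and inputs are illustrative.

```python
def proportion_goals(obs_frac, mod_frac, major_cutoff=0.30):
    """Compare modeled vs. observed composition fractions (0-1) of
    PM2.5 mass against the draft-guidance goals: major components
    (> 30% of mass) within +/-10 percentage points, minor components
    within +/-5. Returns a pass/fail flag per species."""
    results = {}
    for species, o in obs_frac.items():
        tol = 0.10 if o > major_cutoff else 0.05
        results[species] = abs(mod_frac[species] - o) <= tol
    return results
```

The slide's sulfate example passes (50% observed, anything from 40-60% modeled), while a minor component such as nitrate at 10% observed fails if modeled at 17%.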

Other Analyses
Analyses to address model response to emissions changes:
- Weekend/weekday emissions (not clear whether this is appropriate for PM)
- Ratios of indicator species
  - Many ratios were developed for ozone chemistry; several exist for PM, e.g. (NH4 + NH3) / (HNO3 + NO3 + SO4)
  - Most PM ratio techniques require hard-to-find trace-gas measurements (e.g. NH3 and HNO3)
- Retrospective analyses
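The PM indicator ratio named on the slide can be computed as below. This is a sketch only: it assumes the ratio is evaluated on a molar basis (converting µg/m³ with the molecular weights shown), which is one common convention but is not stated on the slide, and it uses the slide's formula as written.

```python
# Molecular weights (g/mol) for converting ug/m3 to molar units
MW = {"NH3": 17.0, "NH4": 18.0, "HNO3": 63.0, "NO3": 62.0, "SO4": 96.0}

def ammonia_indicator_ratio(conc_ugm3):
    """Molar ratio of total reduced nitrogen (NH4 + NH3) to total
    nitrate plus sulfate (HNO3 + NO3 + SO4), the indicator-species
    ratio mentioned for PM chemistry. Loosely, high values suggest an
    ammonia-rich regime and low values an ammonia-limited one."""
    mol = {k: v / MW[k] for k, v in conc_ugm3.items()}
    return (mol["NH4"] + mol["NH3"]) / (mol["HNO3"] + mol["NO3"] + mol["SO4"])
```

Note the denominator requires HNO3 and the numerator NH3 — exactly the trace-gas measurements the slide flags as difficult to find.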

Diagnostic Tests
Sensitivity analyses: is the model especially sensitive to an input or combination of inputs?
- Initial and boundary conditions
- Emissions inputs
- Grid size and number of layers
- Alternative meteorological fields
Uses:
- Prioritize future data gathering
- Assess the robustness of a strategy
- Prioritize control efforts
Process analysis
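A brute-force sensitivity test of this kind reduces to comparing a base run against a perturbed run. The normalized coefficient below is one common way to summarize such a pair; the function and the numbers in the example are illustrative, not from the guidance.

```python
def normalized_sensitivity(base_conc, perturbed_conc, delta_frac):
    """First-order normalized sensitivity: fractional change in the
    predicted concentration per fractional change in the perturbed
    input (e.g. boundary conditions or an emissions category changed
    by delta_frac, such as 0.20 for a +20% perturbation)."""
    return ((perturbed_conc - base_conc) / base_conc) / delta_frac
```

For example, if a +20% emissions perturbation moves a predicted PM2.5 concentration from 12.0 to 12.6 µg/m³, the normalized sensitivity is 0.25 — a weak response that might point diagnostic attention elsewhere.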

Next Steps
- Update the modeling guidance:
  - Metric definitions and calculations
  - Statistical benchmarks
  - Diagnostic analyses
  - Other analyses to test the model's relative response to emissions changes
- Use workshop materials and discussion to help inform decisions
- Looking for recommendations and opinions