1
PM2.5 Model Performance Evaluation: Purpose and Goals
PM Model Evaluation Workshop
February 10, 2004, Chapel Hill, NC
Brian Timin, EPA/OAQPS
2
Purpose
To discuss PM2.5 and Regional Haze model performance issues that are relevant to SIP modeling. The discussions and information will be used to enhance the model performance evaluation section of the PM2.5 and Regional Haze modeling guidance.
3
Goals
- For everyone in the community to learn more about the latest work on PM model performance evaluations
- To gather enough information to be able to revise the guidance
- To listen to opinions and recommendations
4
PM2.5 Model Performance Evaluation: What’s in the Modeling Guidance?
PM Model Evaluation Workshop
February 10, 2004, Chapel Hill, NC
Brian Timin, EPA/OAQPS
5
Contents
- Status of guidance
- What’s in the guidance
- Review of Chapter 16: Model performance
6
Status of DRAFT Guidance
- Draft “Guidance for Demonstrating Attainment of Air Quality Goals for PM2.5 and Regional Haze”, January 2001
- Living document: may be revised as needed and posted on EPA’s website
  http://www.epa.gov/scram001/guidance/guide/draft_pm.pdf
- Will finalize guidance as part of the PM2.5 implementation rule (2004)
7
What’s in the Guidance
Part I: Using Model Results
- Attainment test
  - Annual PM2.5 NAAQS
  - 24-hr PM2.5 NAAQS
- Regional haze reasonable progress test
- “Hot spot” modeling
- Using weight of evidence
- Data gathering needs
- Required documentation
8
What’s in the Guidance (cont’d)
Part II: Generating Model Results
- Conceptual description
- Modeling protocol
- Selecting a model(s)
- Choosing days
- Selecting domain & spatial resolution
- Developing meteorological inputs
- Developing emissions inputs
- Evaluating model performance (Chapter 16)
- Evaluating control strategies
9
Overview of Chapter 16
How Do I Assess Model Performance and Make Use of Diagnostic Analyses?
10
Model Performance: Introduction
- How well is the model able to replicate observed concentrations of PM mass and its components (and precursors)?
- How accurately does the model characterize the sensitivity of component concentrations to changes in emissions?
11
Types of Analyses
- Operational
  - Statistics
  - Scatter plots
  - Time series plots
- Diagnostic
  - Ratios of indicator species
  - Process analysis
  - Sensitivity tests
12
“Big Picture” Operational Evaluation
- Graphical displays of PM2.5 and PM components (see the plotting sketch below)
  - Time series plots
  - Scatter plots
  - Tile plots
  - Q-Q plots
- Temporal resolution
  - Episodes, seasonal, annual
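As an illustration of the scatter and Q-Q displays listed above, here is a minimal sketch in Python; the arrays are hypothetical daily-average modeled and observed PM2.5 values at a single monitor, not data from the workshop.

```python
# A minimal sketch (hypothetical data) of two "big picture" displays:
# a paired scatter plot and a Q-Q plot of modeled vs. observed PM2.5.
import numpy as np
import matplotlib.pyplot as plt

obs = np.array([12.1, 8.4, 15.3, 22.0, 9.7, 18.6, 11.2, 14.8])  # observed (ug/m3)
mod = np.array([10.5, 9.9, 13.1, 25.4, 8.2, 16.0, 13.7, 12.9])  # modeled (ug/m3)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))
lim = [0, max(obs.max(), mod.max()) * 1.1]

# Scatter plot: day-by-day pairs; the 1:1 line marks perfect agreement.
ax1.scatter(obs, mod)
ax1.plot(lim, lim, "k--")
ax1.set(xlabel="Observed PM2.5 (ug/m3)", ylabel="Modeled PM2.5 (ug/m3)",
        title="Scatter")

# Q-Q plot: sorted obs vs. sorted mod compares the two distributions
# without requiring day-by-day pairing.
ax2.scatter(np.sort(obs), np.sort(mod))
ax2.plot(lim, lim, "k--")
ax2.set(xlabel="Observed quantiles (ug/m3)", ylabel="Modeled quantiles (ug/m3)",
        title="Q-Q")

plt.tight_layout()
plt.show()
```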
13
Operational Evaluation: Species
PM species:
- PM2.5 mass
- Sulfate
- Nitrate
- Mass associated with sulfate
- Mass associated with nitrate
- Elemental carbon
- Organic carbon (organic mass)
- Inorganic primary PM2.5 (IP)
- Mass of individual constituents of IP
14
Operational Evaluation: Species
Gaseous species:
- Ozone
- SO2
- CO
- NO2
- NOy
- PAN
- Nitric acid
- Ammonia
- Hydrogen peroxide
15
Evaluation: Statistical Metrics
- Key question: how well does the model predict spatially averaged concentrations near a monitor, averaged over the modeled days, compared with the corresponding monitored observations?
- Basic metric: normalized gross error, averaged over monitor-days
- Good model performance matters most at monitors that are exceeding the standards
16
Statistics in the Current Guidance
- Normalized gross error
- Normalized bias
- Fractional error (means and standard deviations)
- Fractional bias (means and standard deviations)
- Aggregated statistics
  - Averaged over multiple sites
(Common formulations of these metrics are sketched below.)
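For reference, one common formulation of these metrics is given below, with $M_i$ and $O_i$ the paired modeled and observed concentrations over $N$ monitor-days; the draft guidance may define the exact variants slightly differently, so treat these as reference formulations rather than the guidance's own definitions.

$$
\mathrm{MNB}=\frac{1}{N}\sum_{i=1}^{N}\frac{M_i-O_i}{O_i},
\qquad
\mathrm{MNGE}=\frac{1}{N}\sum_{i=1}^{N}\frac{\lvert M_i-O_i\rvert}{O_i}
$$

$$
\mathrm{FB}=\frac{1}{N}\sum_{i=1}^{N}\frac{M_i-O_i}{(M_i+O_i)/2},
\qquad
\mathrm{FE}=\frac{1}{N}\sum_{i=1}^{N}\frac{\lvert M_i-O_i\rvert}{(M_i+O_i)/2}
$$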
17
Calculation of Statistics: Issues
- Many ways to calculate statistics
  - Averaging across days
  - Averaging across sites
- Similar, but different, metrics
  - Normalized mean error vs. mean normalized error (see the sketch below)
- Low concentrations
  - Certain metrics are not appropriate when concentrations are very low, since observation-normalized quantities blow up as the observed value approaches zero
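To make the distinction concrete, here is a minimal sketch in Python using hypothetical modeled/observed pairs; it shows that normalized mean error (NME) and mean normalized error (MNE) can differ noticeably, because MNE weights low-concentration days as heavily as high-concentration ones.

```python
# A minimal sketch (hypothetical values, ug/m3) of two similar-sounding
# but different metrics: NME vs. MNE.
import numpy as np

obs = np.array([2.0, 4.0, 20.0])
mod = np.array([3.0, 3.0, 24.0])

# NME: sum of absolute errors, normalized by the sum of observations.
nme = np.abs(mod - obs).sum() / obs.sum()

# MNE: errors normalized pair-by-pair, then averaged; low observed
# values get the same weight as high ones, so they can dominate.
mne = (np.abs(mod - obs) / obs).mean()

print(f"NME = {nme:.1%}")  # 23.1%
print(f"MNE = {mne:.1%}")  # 31.7%
```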
18
Performance Goals
- “It is difficult to establish generally applicable numerical performance goals”
- Model performance is not particularly important for components with small observed concentrations relative to other components
  - In a relative attainment test, a small observed component cannot have a large influence
- “How good should a State expect performance of a model to be? Frankly, there is little basis for making recommendations at present (2001).”
19
Performance Goals
- Expect performance for PM components to be worse than for ozone; ozone goals are not appropriate
- Numbers listed in the guidance as example aggregated normalized gross errors, averaged from several limited PM applications at the time (before 2001):
  - PM2.5: ~30-50%
  - Sulfate: ~30-50%
  - Nitrate: ~20-70%
  - EC: ~15-60%
  - OC: ~40-50%
20
Performance Goals
Relative proportions:
- Major components (> 30% of PM2.5)
  - Agree within ±10 percentage points of the observed relative proportion
  - Example: if sulfate is 50% of observed mass, the goal would be to predict sulfate at 40-60% of total mass
- Minor components
  - Agree within ±5 percentage points of the observed relative proportion
- Difficult to assess proportions if one component is far off (too high or too low)
(A sketch of this proportion check follows.)
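A minimal sketch of this proportion check, using hypothetical component concentrations; the function name and the handling of the major/minor threshold are illustrative choices, not taken from the guidance.

```python
# A minimal sketch (hypothetical species values, ug/m3) of the
# relative-proportion check: each component's share of total PM2.5 mass
# should agree within +/-10 percentage points (major components, >30%
# of observed mass) or +/-5 percentage points (minor components).
def check_proportions(obs, mod):
    """obs, mod: dicts mapping component name -> concentration (ug/m3)."""
    obs_total, mod_total = sum(obs.values()), sum(mod.values())
    for comp in obs:
        obs_frac = obs[comp] / obs_total
        mod_frac = mod[comp] / mod_total
        tol = 0.10 if obs_frac > 0.30 else 0.05  # major vs. minor component
        status = "OK" if abs(mod_frac - obs_frac) <= tol else "outside goal"
        print(f"{comp:8s} obs {obs_frac:5.1%}  mod {mod_frac:5.1%}  {status}")

obs = {"sulfate": 7.5, "nitrate": 1.5, "OC": 4.0, "EC": 1.0, "other": 1.0}
mod = {"sulfate": 6.0, "nitrate": 2.5, "OC": 3.5, "EC": 0.8, "other": 1.2}
check_proportions(obs, mod)
```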
21
Other Analyses
- Analyses to address model response to emissions changes
  - Weekend/weekday emissions (not sure if this is appropriate for PM)
- Ratios of indicator species
  - Many ratios were developed for ozone chemistry; several exist for PM, e.g. NH4+NH3 / HNO3+NO3+SO4 (see the sketch below)
  - Most PM ratio techniques require hard-to-find trace gas measurements (e.g., NH3 and HNO3)
- Retrospective analyses
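A minimal sketch of computing the PM indicator ratio listed above, assuming the grouping (NH4 + NH3) / (HNO3 + NO3 + SO4) on a molar basis; the slide does not spell out the grouping or units, so this is one interpretation for illustration only, with hypothetical concentrations.

```python
# A minimal sketch (assumed grouping and units) of the indicator ratio
# (NH4 + NH3) / (HNO3 + NO3 + SO4), computed on a molar basis from
# hypothetical mass concentrations (ug/m3).
MW = {"NH4": 18.0, "NH3": 17.0, "HNO3": 63.0, "NO3": 62.0, "SO4": 96.0}

def indicator_ratio(conc_ugm3):
    """Moles of reduced nitrogen over moles of nitrate + sulfate species."""
    mol = {sp: c / MW[sp] for sp, c in conc_ugm3.items()}
    return (mol["NH4"] + mol["NH3"]) / (mol["HNO3"] + mol["NO3"] + mol["SO4"])

modeled  = {"NH4": 2.0, "NH3": 1.0, "HNO3": 1.5, "NO3": 2.5, "SO4": 6.0}
observed = {"NH4": 2.4, "NH3": 0.8, "HNO3": 2.0, "NO3": 2.0, "SO4": 6.5}
print(f"modeled ratio:  {indicator_ratio(modeled):.2f}")
print(f"observed ratio: {indicator_ratio(observed):.2f}")
```

Comparing the modeled and observed ratios, rather than the raw concentrations, gives a check on whether the model sits in the right chemical regime even when absolute concentrations are off.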
22
Diagnostic Tests
- Sensitivity analyses: is the model especially sensitive to an input or combination of inputs? (see the sketch below)
  - Initial and boundary conditions
  - Emissions inputs
  - Grid size and number of layers
  - Alternative met fields
- Prioritize future data gathering
- Assess robustness of a strategy
- Prioritize control efforts
- Process analysis
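A minimal sketch of a brute-force sensitivity test of the kind described above; run_model here is a hypothetical stand-in that maps two scaled inputs to a PM2.5 concentration, not a real air quality model, and the numbers are illustrative only.

```python
# A minimal sketch (toy surrogate, not a real model) of a one-at-a-time
# sensitivity test: perturb one input, rerun, and compare the change in
# a target output against the base case.
def run_model(emissions_scale, boundary_scale):
    """Toy surrogate: PM2.5 (ug/m3) from local emissions plus inflow."""
    return 9.0 * emissions_scale + 4.0 * boundary_scale

base = run_model(1.0, 1.0)
cases = [("emissions +20%", dict(emissions_scale=1.2, boundary_scale=1.0)),
         ("boundary  +20%", dict(emissions_scale=1.0, boundary_scale=1.2))]
for name, kwargs in cases:
    delta = run_model(**kwargs) - base
    print(f"{name}: delta PM2.5 = {delta:+.2f} ug/m3 ({delta / base:+.1%})")
```

In practice each "case" is a full model rerun with a perturbed input file; the comparison logic is the same.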
23
Next Steps
- Update modeling guidance
  - Metric definitions and calculations
  - Statistical benchmarks
  - Diagnostic analyses
  - Other analyses to test the model’s relative response to emissions changes
- Use workshop materials and discussion to help inform decisions
- Looking for recommendations and opinions