
Calculating Statistics: Concentration Related Performance Goals James W. Boylan Georgia Department of Natural Resources PM Model Performance Workshop Chapel Hill, NC February 11, 2004

Outline
Performance Statistics
– Standard Bias and Error Calculations
Model Performance Goals for PM
– Speciated Bias and Error Goals
– Relative Proportions Goals

Performance Metrics (computed over N pairs of modeled concentrations $M_i$ and observed concentrations $O_i$):

Mean Bias (μg/m³): $MB = \frac{1}{N}\sum_{i=1}^{N}(M_i - O_i)$
Mean Error (μg/m³): $ME = \frac{1}{N}\sum_{i=1}^{N}|M_i - O_i|$
Mean Normalized Bias (%) (−100% to +∞): $MNB = \frac{100}{N}\sum_{i=1}^{N}\frac{M_i - O_i}{O_i}$
Mean Normalized Error (%) (0% to +∞): $MNE = \frac{100}{N}\sum_{i=1}^{N}\frac{|M_i - O_i|}{O_i}$
Normalized Mean Bias (%) (−100% to +∞): $NMB = 100\,\frac{\sum_{i=1}^{N}(M_i - O_i)}{\sum_{i=1}^{N}O_i}$
Normalized Mean Error (%) (0% to +∞): $NME = 100\,\frac{\sum_{i=1}^{N}|M_i - O_i|}{\sum_{i=1}^{N}O_i}$
Mean Fractional Bias (%) (−200% to +200%): $MFB = \frac{200}{N}\sum_{i=1}^{N}\frac{M_i - O_i}{M_i + O_i}$
Mean Fractional Error (%) (0% to +200%): $MFE = \frac{200}{N}\sum_{i=1}^{N}\frac{|M_i - O_i|}{M_i + O_i}$
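A minimal Python sketch of these eight definitions (the function name and array interface are illustrative, not from the slides); the later illustrations reuse this function:

```python
import numpy as np

def pm_metrics(model, obs):
    """Bias/error metrics for paired model/observation concentrations (ug/m3)."""
    m = np.asarray(model, dtype=float)
    o = np.asarray(obs, dtype=float)
    d = m - o
    return {
        "MB":  d.mean(),                            # mean bias, ug/m3
        "ME":  np.abs(d).mean(),                    # mean error, ug/m3
        "MNB": 100 * (d / o).mean(),                # -100% to +inf
        "MNE": 100 * (np.abs(d) / o).mean(),        # 0% to +inf
        "NMB": 100 * d.sum() / o.sum(),             # -100% to +inf
        "NME": 100 * np.abs(d).sum() / o.sum(),     # 0% to +inf
        "MFB": 200 * (d / (m + o)).mean(),          # -200% to +200%
        "MFE": 200 * (np.abs(d) / (m + o)).mean(),  # 0% to +200%
    }
```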

Example (basketball-score analogy)
Final score: Georgia Tech 88, North Carolina 77
GT showed a positive bias of 11 points:
NB = 11/77 = 14.3%
FB = 2 × 11/(88 + 77) = 13.3%

Performance Metrics
Mean Normalized Bias and Error
– Usually associated with an observation-based minimum threshold
– Some components of PM can be very small, making it difficult to set a reasonable minimum threshold value without excluding a majority of the data points
– Without a minimum threshold, very large normalized biases and errors can result when observations are close to zero, even though the absolute biases and errors are very small; a few data points can dominate the metric (illustrated below)
– Overestimations are weighted more heavily than equivalent underestimations
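A quick numeric illustration of how one near-zero observation dominates the normalized metrics (values invented for illustration; pm_metrics is the sketch above):

```python
obs   = [10.0, 10.0, 0.1]   # ug/m3; the third observation is near zero
model = [11.0,  9.0, 1.1]   # every absolute error is only 1 ug/m3
stats = pm_metrics(model, obs)
print(round(stats["MB"], 2))   # 0.33  -> small absolute bias
print(round(stats["MNB"], 1))  # 333.3 -> dominated by the 0.1 ug/m3 point
```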

Performance Metrics
Normalized Mean Bias and Error
– Biased towards overestimations
Mean Fractional Bias and Error
– Bounds the maximum bias and error
– Gives additional weight to underestimations and less weight to overestimations relative to the normalized metrics (see the comparison below)
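A short comparison of the two behaviors, reusing pm_metrics from the sketch above: a factor-of-2 overprediction and a factor-of-2 underprediction receive asymmetric normalized scores but symmetric, bounded fractional scores.

```python
over  = pm_metrics([2.0], [1.0])   # model doubles the observation
under = pm_metrics([1.0], [2.0])   # model halves the observation
print(over["MNB"], under["MNB"])   # +100.0 vs. -50.0  (asymmetric)
print(over["MFB"], under["MFB"])   # ~ +66.7 vs. -66.7 (symmetric, bounded)
```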

Example Calculations
Mean Normalized Bias and Error
– Most biased and least useful of the three metrics
Normalized Mean Bias and Error
Mean Fractional Bias and Error
– Least biased and most useful of the three metrics
[Table: example modeled and observed concentrations (μg/m³) with MB, NMB, MNB, MFB, ME, NME, MNE, and MFE; the numeric values did not survive transcription]

SAMI Model Performance Summary
[Table: number of observations, mean concentration (μg/m³), MB, NMB, MNB, MFB, ME, NME, MNE, and MFE for SO4, NO3, NH4, ORG, EC, Soils, PM2.5, PM10, coarse PM (PMC), and bext; the numeric values did not survive transcription]

Proposed Performance Goals
Based on Mean Fractional Error (MFE) and Mean Fractional Bias (MFB) calculations
Performance goals should vary as a function of species concentration:
– More abundant species should have MFE ≤ +50% and MFB ≤ ±30%
– Less abundant species should have less stringent performance goals
Goals should be continuous functions with the following features (a reconstruction of the goal equations follows below):
– Asymptotically approaching +50% MFE and ±30% MFB when the concentrations (mean of the observed and modeled concentrations) are greater than 2.5 μg/m³
– Approaching +200% MFE and ±200% MFB when the concentrations are extremely small
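The goal curves themselves appear only as figures in the transcript. A hedged reconstruction, assuming the exponential forms later published for these goals: they match both asymptotes above and reproduce the 81.3% / ±46.2% goals quoted in the "Species X" example below at a single common mean concentration.

$$\text{MFE goal} = 50\% + 150\%\,e^{-\bar{C}/0.75}, \qquad \text{MFB goal} = \pm\!\left(30\% + 170\%\,e^{-\bar{C}/0.5}\right)$$

where $\bar{C} = (C_O + C_M)/2$ is the mean of the observed and modeled concentrations in μg/m³.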

Proposed Mean Fractional Error and Bias Goals

Example Calculations – Species X
[Table: modeled and observed concentrations (μg/m³) with FB (%) and FE (%) for Day 1 – Site A, Day 1 – Site B, Day 2 – Site A, and Day 2 – Site B; most values did not survive transcription (one day/site pair shows +100.0%, and the average FE is 79.8%)]
Average concentration: C̄ = 0.5 × (C_O + C_M) [values not recoverable]
MFE performance goal for "Species X" = 81.3%
MFB performance goal for "Species X" = ±46.2%
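A small Python sketch of the reconstructed goal equations (function names are illustrative, and the coefficients are the assumed values from the reconstruction above; a mean concentration of about 1.175 μg/m³ reproduces both goal values quoted for "Species X"):

```python
import math

def mfe_goal(c_mean):
    """MFE goal (%) vs. mean concentration (ug/m3), reconstructed form."""
    return 50.0 + 150.0 * math.exp(-c_mean / 0.75)

def mfb_goal(c_mean):
    """MFB goal (+/- %) vs. mean concentration (ug/m3), reconstructed form."""
    return 30.0 + 170.0 * math.exp(-c_mean / 0.5)

# Goals relax toward 200% at very low concentrations and tighten
# toward +50% / +/-30% above roughly 2.5 ug/m3:
print(round(mfe_goal(2.5), 1), round(mfb_goal(2.5), 1))      # 55.4 31.1
# The "Species X" goals are reproduced near C = 1.175 ug/m3:
print(round(mfe_goal(1.175), 1), round(mfb_goal(1.175), 1))  # 81.3 46.2
```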

Mean Fractional Error Goal

Mean Fractional Bias Goal

SAMI – 6 Episodes

VISTAS – July 1999 Episode

VISTAS – January 2002 Episode

Relative Proportions (RP) Performance Goals
EPA draft guidance (2001):
– "For major components (i.e., those observed to comprise at least 30% of measured PM 2.5), we propose that the relative proportion predicted for each component averaged over modeled days with monitored data agrees within about 20% of the averaged observed proportion. For minor observed components of PM, we suggest a goal that the observed and modeled absolute proportion of each minor component agree within 5%."

The slide's equation survives only as fragments; a plausible reconstruction of the RP bias it defines, as the difference between day-averaged modeled and observed component proportions:

$$\text{RP Bias} = \frac{1}{N}\sum_{i=1}^{N}\frac{C^{\,component}_{m,i}}{C^{\,Total}_{m,i}} \;-\; \frac{1}{N}\sum_{i=1}^{N}\frac{C^{\,component}_{o,i}}{C^{\,Total}_{o,i}}$$

Example Calculation
Calculating component proportions based on concentrations averaged over multiple days can hide poor model performance:

          Observed RP (%)   Modeled RP (%)
Day 1          50                95
Day 2          50                95
Day 3          50                 5
Day 4          50                 5
Average        50                50

The modeled proportion is far off on every individual day, yet the four-day averages agree exactly.
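A tiny sketch of the same point, using the numbers from the table above:

```python
obs_rp   = [50, 50, 50, 50]   # observed relative proportion (%) by day
model_rp = [95, 95, 5, 5]     # modeled relative proportion (%) by day

avg_bias  = sum(model_rp) / 4 - sum(obs_rp) / 4
daily_err = [abs(m - o) for m, o in zip(model_rp, obs_rp)]

print(avg_bias)    # 0.0 -> averaging first suggests perfect agreement
print(daily_err)   # [45, 45, 45, 45] -> the model is 45 points off every day
```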

Relative Proportions for SAMI – Observed vs. Simulated

Proposed Relative Proportions Performance Goals
Propose to use an equation that accounts for the day-to-day variability of species relative proportions (sketched in code below):
– RP ≥ 30%: Error ≤ 10%
– RP ≤ 15%: Error ≤ 5%
– RP between 15% and 30%: Error ≤ [RP]/3
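This piecewise goal is continuous: the [RP]/3 ramp meets 5% at RP = 15% and 10% at RP = 30%. A minimal Python sketch, with an illustrative function name:

```python
def rp_error_goal(rp_percent):
    """Allowed error (%) in a component's relative proportion,
    given its relative proportion (%) of total PM2.5."""
    if rp_percent >= 30.0:
        return 10.0              # abundant components: error <= 10%
    if rp_percent <= 15.0:
        return 5.0               # minor components: error <= 5%
    return rp_percent / 3.0      # linear ramp between 15% and 30%

print(rp_error_goal(15.0), rp_error_goal(22.5), rp_error_goal(30.0))  # 5.0 7.5 10.0
```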

Proposed Relative Proportions Performance Goals

Concluding Remarks
Recommended performance values are model goals, not model criteria
– Failure to meet the proposed performance goals should not prohibit the modeling from being used for regulatory purposes
– Goals help identify areas that can be improved upon in future modeling
If performing episodic modeling, performance evaluation should be done on an episode-by-episode basis
If performing annual modeling, performance evaluation should be done on a month-by-month basis

Concluding Remarks (cont.)
As models mature, performance goals can be made more restrictive by simply:
– Adjusting the coefficients in the MFE and MFB goal equations
– Lowering the relative proportion error goals
Q: Is there a need for performance goals for gaseous precursors or wet deposition species?
– They are part of a "one-atmosphere" modeling system
– If not, these species should still be evaluated to help identify potential problems with PM model performance

Questions?