AN EVALUATION OF THE ETA-CMAQ AIR QUALITY FORECAST MODEL AS PART OF NOAA’S NATIONAL PROGRAM

AN EVALUATION OF THE ETA-CMAQ AIR QUALITY FORECAST MODEL AS PART OF NOAA’S NATIONAL PROGRAM
Brian Eder*, Daiwen Kang*, Ken Schere*, Robert Gilliam*, Jonathan Pleim*
Atmospheric Modeling Division, Air Resources Laboratory, NOAA
* On assignment to NERL, EPA, RTP, NC 27711
August 26, 2003

Forecast Configuration (Models-3 CMAQ)
- Eta Meteorology
- CB-IV Mechanism
- SMOKE Emissions (Offline)
- 12 km grid resolution
- 22 Vertical Layers
- 48 Hr. Forecast (12Z Initialization): 7 July – 30 September (7 July – 31 August shown)
- 48 Hr. Forecast (Corrected Land-use): 12 – 19 August
- Domain (map shown on slide)

This evaluation used:
- Hourly O3 concentrations (ppb) from EPA’s AIRNOW network (521 stations), 7 July – 31 August
- A suite of statistical metrics for both discrete forecasts and categorical forecasts, applied to the hourly, maximum 1-hr, and maximum 8-hr O3 simulations

Two Forecast / Evaluation Types
- Discrete Forecasts: [Observed] versus [Forecast]
- Category Forecasts (Two Category): Observed Exceedances / Non-Exceedances versus Forecast Exceedances / Non-Exceedances

Discrete Forecast / Evaluation Statistics
AIRNOW [Observed] versus [Forecast]
- Summary statistics
- Regression
- Biases
- Errors
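The discrete metrics reported on the following slides (r, MB, NMB, RMSE, NME) are standard paired statistics computed from matched AIRNOW observations and CMAQ forecasts. The sketch below illustrates those textbook definitions in Python; the function and variable names are illustrative and are not taken from the actual evaluation code.

```python
import numpy as np

def discrete_stats(obs, fcst):
    """Discrete-evaluation statistics for matched observation/forecast arrays (ppb).

    Illustrative sketch of the standard definitions of the metrics
    reported in this evaluation: MB, NMB, RMSE, NME, and r.
    """
    obs = np.asarray(obs, dtype=float)
    fcst = np.asarray(fcst, dtype=float)
    diff = fcst - obs
    return {
        "MB":   diff.mean(),                             # mean bias (ppb)
        "NMB":  100.0 * diff.sum() / obs.sum(),          # normalized mean bias (%)
        "RMSE": np.sqrt((diff ** 2).mean()),             # root mean square error (ppb)
        "NME":  100.0 * np.abs(diff).sum() / obs.sum(),  # normalized mean error (%)
        "r":    np.corrcoef(obs, fcst)[0, 1],            # Pearson correlation coefficient
    }
```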

Category Forecast / Evaluation – Two Category Forecasts
Observed Exceedances / Non-Exceedances versus Forecast Exceedances / Non-Exceedances

                              Observed Exceedance
                              No            Yes
Forecast           Yes         a             b
Exceedance         No          c             d
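Each matched observation/forecast pair is classified against the relevant exceedance threshold (125 ppb for maximum 1-hr O3, 85 ppb for maximum 8-hr O3) to populate the four cells above. A minimal sketch of that tabulation follows; the names are illustrative, and the cell roles (a = false alarm, b = hit, c = correct non-exceedance, d = miss) are an assumption inferred from the categorical statistics reported later in the deck, not quoted from the slides.

```python
import numpy as np

def contingency_counts(obs, fcst, threshold):
    """2x2 exceedance contingency counts for matched obs/forecast arrays (ppb).

    Assumed cell roles (inferred, not quoted from the slides):
      a: forecast exceedance, observed non-exceedance (false alarm)
      b: forecast exceedance, observed exceedance     (hit)
      c: forecast non-exceedance, observed non-exceedance
      d: forecast non-exceedance, observed exceedance (miss)
    """
    obs_ex = np.asarray(obs, dtype=float) >= threshold
    fcst_ex = np.asarray(fcst, dtype=float) >= threshold
    a = int(np.sum(fcst_ex & ~obs_ex))
    b = int(np.sum(fcst_ex & obs_ex))
    c = int(np.sum(~fcst_ex & ~obs_ex))
    d = int(np.sum(~fcst_ex & obs_ex))
    return a, b, c, d
```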

Category Forecast Metrics
- Accuracy (A): Percent of forecasts that correctly predict the event or non-event.
- Bias (B): Indicates whether exceedances are under-predicted (false negatives) or over-predicted (false positives).
- False Alarm Rate (FAR): Percent of times a forecasted high-ozone event did not occur.

Category Forecast Metrics (continued)
- Critical Success Index (CSI): How well the high-ozone events were predicted.
- Probability of Detection (POD): Ability to predict high-ozone events.
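The five categorical metrics can be written directly in terms of the four contingency cells. The formulas below are standard forecast-verification definitions; with the cell roles assumed above they closely reproduce the maximum 8-hr O3 categorical statistics reported later in the deck, so they are offered as a plausible reading of how A, B, FAR, CSI, and POD were computed, not as the authors' actual code.

```python
def categorical_stats(a, b, c, d):
    """Categorical metrics from the 2x2 contingency cells defined earlier."""
    n = a + b + c + d
    return {
        "A":   100.0 * (b + c) / n,      # accuracy: % of correct exceedance/non-exceedance forecasts
        "B":   (a + b) / (b + d),        # bias: forecast exceedances / observed exceedances
        "FAR": 100.0 * a / (a + b),      # false alarm rate (%)
        "CSI": 100.0 * b / (a + b + d),  # critical success index (%)
        "POD": 100.0 * b / (b + d),      # probability of detection (%)
    }

# Check against the max 8-hr O3 counts reported later in the deck
# (a = 3,276, b = 149, c = 20,979, d = 65):
# categorical_stats(3276, 149, 20979, 65)
# -> A ~ 86.3, B ~ 16.0, FAR ~ 95.6, CSI ~ 4.3, POD ~ 69.6
```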

Max 1-hr O3, 7 July – 31 August
Contingency counts: a = 151, b = 1, c = 24,227, d = 4; n = 24,383
(Scatter plot of CMAQ versus AIRNOW with fitted regression line shown; regression coefficients were not recovered in the transcript.)

Summary Statistics – Max 1-hr O3 (CMAQ vs. AIRNOW), n = 24,383

Discrete Evaluation:
- r = 0.60
- MB = 15.1 ppb, NMB = 26.9%
- RMSE = 21.9 ppb, NME = 31.7%

Categorical Evaluation (Ozone ≥ 125 ppb):
- A = 99.4%, B = 25.5, FAR = 99.4%, CSI = 0.6%, POD = 16.7%

(The slide also tabulated distributional statistics for CMAQ and AIRNOW: Mean, SD, CV, Max, percentiles, and Min; those values were not recovered in the transcript.)

Temporal Evaluation – Max 1-hr O3 (time series, 7 July – 31 August, shown)

Spatial Evaluation – Max 1-hr O3 Correlation (station map, legend from 0.00 to 1.00); Mean = 0.60

Spatial Evaluation – Max 1-hr O3 Mean Bias (station map, legend from -10 to 50 ppb); Mean = 15.1 ppb

Spatial Evaluation – Max 1-hr O3 Root Mean Square Error (station map, legend from 0 to 50 ppb); Mean = 21.9 ppb

Max 8-hr O3
Contingency counts: a = 3,276, b = 149, c = 20,979, d = 65; n = 24,469
(Scatter plot of CMAQ versus AIRNOW with fitted regression line shown; regression coefficients were not recovered in the transcript.)

Summary Statistics – Max 8-hr O3 (CMAQ vs. AIRNOW), n = 24,469

Discrete Evaluation:
- r = 0.57
- MB = 17.6 ppb, NMB = 35.8%
- RMSE = 23.0 ppb, NME = 39.1%

Categorical Evaluation (Ozone ≥ 85 ppb):
- A = 86.3%, B = 16.0, FAR = 95.6%, CSI = 4.2%, POD = 69.6%

(Distributional statistics: CV = 24.5% for CMAQ, 29.7% for AIRNOW; the remaining tabulated values, Mean, SD, Max, percentiles, and Min, were not recovered in the transcript.)

Temporal Evaluation – Max 8-hr O3 (time series, 7 July – 31 August, shown)

Spatial Evaluation – Max 8-hr O3 Correlation (station map, legend from 0.00 to 1.00); Mean = 0.57

Spatial Evaluation – Max 8-hr O3 Mean Bias (station map, legend from -10 to 50 ppb); Mean = 17.6 ppb

Spatial Evaluation – Max 8-hr O3 Root Mean Square Error (station map, legend from 0 to 50 ppb); Mean = 23.0 ppb

Land-Use Error
Land-use fields associated with Eta were being post-processed incorrectly. As a result:
- Most of the domain was classified as water.
- Dry deposition was greatly under-simulated.
This error was discovered and corrected by NCEP on September 9th:
- An eight-day period (12 – 19 August) was re-simulated.
- Positive biases were cut in half, and errors were also reduced.

Comparison Between Initial and Corrected Simulations, August 12 – 19
Table columns: Run, r, MB (ppb), NMB (%), RMSE (ppb), NME (%), A (%), B, FAR (%), CSI (%), POD (%); rows: Initial and Corrected runs, tabulated separately for Max 1-hr O3 and Max 8-hr O3.
(The numeric entries were not recovered in the transcript.)

Temporal Evaluation (Corrected, 12 – 19 August) – Max 1-hr O3 and Max 8-hr O3 (time series shown)

Summary
The Eta-CMAQ modeling system performed reasonably well in this, its first attempt at forecasting ozone concentrations (max 1-hr / max 8-hr O3):
- Correlation: 0.60 / 0.57
- Bias: 15.1 ppb (26.9%) / 17.6 ppb (35.8%)
- Error: 21.9 ppb (31.7%) / 23.0 ppb (39.1%)
- Accuracy: 99.4% / 86.3%

An error was discovered in Eta’s post-processed land-use designation that resulted in:
- under-estimation of dry deposition, and
- hence over-simulation of O3 concentrations.

Once corrected, the positive biases and errors were greatly reduced (max 1-hr / max 8-hr O3):
- Bias: 7.6 ppb (13.0%) / (20.1%)
- Error: 16.6 ppb (21.7%) / (26.3%)
(Correlation and accuracy for the corrected run, and the max 8-hr ppb values, were not recovered in the transcript.)

Contact: Brian K. Eder