Evaluation of Models-3 CMAQ - 2001 Annual Simulation
Brian Eder, Shaocai Yu, Robin Dennis, Alice Gilliland, Steve Howard, Alfreida Rankin

Presentation transcript:

Evaluation of Models-3 CMAQ - 2001 Annual Simulation
Brian Eder, Shaocai Yu, Robin Dennis, Alice Gilliland, Steve Howard, Alfreida Rankin
Atmospheric Modeling Division, National Exposure Research Laboratory, U.S. Environmental Protection Agency
Atmospheric Sciences Modeling Division, Air Resources Laboratory, National Oceanic and Atmospheric Administration
Research Triangle Park, NC 27711

CMAQ General Description
CMAQ is an Eulerian model that simulates the atmospheric and surface processes affecting the transport, transformation and deposition of air pollutants and their precursors. CMAQ employs a “one atmosphere” philosophy that tackles the complex interactions among multiple atmospheric pollutants across a wide range of scales. Pollutants considered within CMAQ include tropospheric ozone, particulate matter, airborne toxics, and acidic and nutrient species. The model also calculates visibility parameters.

CMAQ Evaluation
CMAQ needs to be evaluated against observational data to:
- characterize its performance
- establish confidence in the regulatory community
The evaluation compares simulated concentrations with ambient measurements from three networks:
- IMPROVE
- CASTNet
- STN
Unfortunately, these networks have different sampling protocols:
- temporal issues: weekly vs. daily sampling
- spatial issues: remote, rural, and urban siting
- measurement issues: filter media and flow rates
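Because the networks sample on different schedules (24-h filter samples versus weekly CASTNet filters), model output generally has to be averaged over each sample's collection window before paired statistics can be computed. The sketch below is a minimal illustration of that pairing step, not the processing code used in this study; the pandas DataFrame layout and the column names (site, date, start_date, duration_days) are assumptions made for the example.

```python
import pandas as pd

def pair_model_with_obs(model_daily: pd.DataFrame, obs: pd.DataFrame) -> pd.DataFrame:
    """Average daily model values over each observed sample's collection window.

    Hypothetical inputs (column names assumed for this sketch):
      model_daily: ['site', 'date', 'model']            -- one row per site-day
      obs:         ['site', 'start_date', 'duration_days', 'obs']
                   (duration_days = 1 for 24-h filters, 7 for weekly filters)
    Returns one row per observed sample with the window-averaged model value.
    """
    records = []
    for row in obs.itertuples(index=False):
        window = pd.date_range(row.start_date, periods=row.duration_days, freq="D")
        model_window = model_daily[(model_daily["site"] == row.site) &
                                   (model_daily["date"].isin(window))]["model"]
        if len(model_window) == row.duration_days:   # keep only complete model windows
            records.append({"site": row.site,
                            "start_date": row.start_date,
                            "obs": row.obs,
                            "model": model_window.mean()})
    return pd.DataFrame(records)
```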

Species: SO4, NO3, NH4, PM2.5, OC, EC (μg m-3)
Statistics: summary statistics and plots
- Mean Bias (MB)
- Normalized Mean Bias (NMB)
- Root Mean Square Error (RMSE)
- Normalized Mean Error (NME)
Observational Data:
- IMPROVE (111 sites)
- CASTNet (73 sites)
- STN (varied)
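For reference, these summary statistics can be computed from paired observation/model arrays as sketched below. This is an illustrative implementation using the conventional sum-based definitions of NMB and NME (sum of differences, or of absolute differences, divided by the sum of observations); the exact formulation used in the study is assumed, not quoted.

```python
import numpy as np

def summary_stats(obs: np.ndarray, mod: np.ndarray) -> dict:
    """Paired model-vs-observation statistics; obs and mod must be aligned 1:1."""
    diff = mod - obs
    return {
        "n":        obs.size,
        "MB":       diff.mean(),                             # Mean Bias
        "NMB (%)":  100.0 * diff.sum() / obs.sum(),          # Normalized Mean Bias
        "RMSE":     float(np.sqrt(np.mean(diff ** 2))),      # Root Mean Square Error
        "NME (%)":  100.0 * np.abs(diff).sum() / obs.sum(),  # Normalized Mean Error
    }
```

Applied to the paired SO4 samples from each network, a routine like this yields the per-network entries shown in the results table that follows.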

SO4 – Annual Results

                 CASTNet    IMPROVE    STN
n                3,737      13,447     6,970
r                0.92       0.86       0.78
MB (μg m-3)      -0.12      0.04       0.22
NMB (%)          -4.0       2.0        6.0
RMSE (μg m-3)    1.12       1.29       2.32
NME (%)          24.0       40.0       43.0

[NMB and NME plots by network omitted]
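As a rough consistency check (assuming NMB is the sum of model-minus-observation differences divided by the sum of observations), the ratio MB/NMB recovers the implied mean observed concentration: roughly 0.12 / 0.04 ≈ 3.0 μg m-3 of SO4 at CASTNet sites and 0.04 / 0.02 ≈ 2.0 μg m-3 at the more remote IMPROVE sites.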