Preliminary evaluation of the 2002 Base B1 CMAQ simulation: Spatial Analysis

Presentation transcript:

Preliminary evaluation of the 2002 Base B1 CMAQ simulation: Spatial Analysis
- A more complete statistical evaluation, including diurnal variations, of the summer portion (May 15 – September 30, 2002) is available on the NYSDEC FTP site: ftp://ftp.dec.state.ny.us/dar/air_research/kevin
- Here we present a subset of the analysis for selected species on a spatial basis: daily maximum 8-hour O3, FRM PM2.5, and the major speciated components (EPA STN)
- Only data from AQS are considered here; analyses of IMPROVE, CASTNet, and special-study data are on the FTP site
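The statistics mapped on the following slides are computed per monitoring site from paired daily observed (O) and predicted (P) values. A minimal sketch of that computation is shown below; the function and variable names are illustrative assumptions, not the actual NYSDEC processing scripts.

```python
import numpy as np

def site_stats(obs, pred, min_days=10):
    """Per-site statistics from paired daily observed (O) and predicted (P) values.

    Returns None when fewer than `min_days` valid pairs remain, mirroring the
    >= 10-day requirement used for displaying statistics at a given site.
    """
    obs = np.asarray(obs, dtype=float)
    pred = np.asarray(pred, dtype=float)
    keep = ~np.isnan(obs) & ~np.isnan(pred)   # drop days missing either value
    obs, pred = obs[keep], pred[keep]
    if obs.size < min_days:
        return None
    diff = pred - obs
    return {
        "obs_avg": obs.mean(),                        # panel (A): observed average
        "rmse": float(np.sqrt((diff ** 2).mean())),   # panel (B): root mean square error
        "mb": diff.mean(),                            # panel (C): mean bias, MB = P - O
    }
```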

Daily maximum 8-hour average O3 with a minimum observed threshold of 40 ppb at 208 SLAMS/NAMS sites across the OTR and Virginia: (A) observed averages, (B) root mean square error (RMSE), and (C) mean bias (MB=P-O). At least 10 days were required for displaying the statistics at a given site. Data from the July 6-9 period are excluded since they may be affected by the Canadian forest fires.
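The daily maximum 8-hour average is built from hourly O3, and a day is retained only when the observed maximum meets the threshold (40 ppb here, 60 ppb on the next slide). A simplified sketch, assuming 24 complete hourly values per day and ignoring the regulatory handling of windows that cross midnight:

```python
import numpy as np

def daily_max_8hr(hourly_o3):
    """Daily maximum 8-hour running-mean O3 (ppb) from 24 hourly values.

    Simplified: windows are kept within the calendar day and data-completeness
    rules are ignored, unlike the full regulatory definition.
    """
    hourly_o3 = np.asarray(hourly_o3, dtype=float)
    means = [hourly_o3[i:i + 8].mean() for i in range(hourly_o3.size - 7)]
    return max(means)

def apply_obs_threshold(obs_dmax, pred_dmax, threshold_ppb=40.0):
    """Retain only days whose observed daily maximum meets the cutoff
    (40 ppb for this slide, 60 ppb for the next); predictions stay paired."""
    obs_dmax = np.asarray(obs_dmax, dtype=float)
    pred_dmax = np.asarray(pred_dmax, dtype=float)
    keep = obs_dmax >= threshold_ppb
    return obs_dmax[keep], pred_dmax[keep]
```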

Daily maximum 8-hour average O3 with a minimum observed threshold of 60 ppb at 208 SLAMS/NAMS sites across the OTR and Virginia: (A) observed averages, (B) root mean square error (RMSE), and (C) mean bias (MB=P-O). At least 10 days were required for displaying the statistics at a given site. Data from the July 6-9 period are excluded since they may be affected by the Canadian forest fires.

Daily average FRM PM2.5 mass at 257 SLAMS/NAMS sites across the OTR and Virginia: (A) observed averages, (B) root mean square error (RMSE), and (C) mean bias (MB=P-O). At least 10 days were required for displaying the statistics at a given site, and only those days when the observed and predicted mass agreed to within a factor of 25 are included. Data from the July 6-9 period are excluded since they may be affected by the Canadian forest fires.
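For the PM2.5 and speciated-mass slides, the obs/model pairs are screened before the site statistics are computed: days when observed and predicted mass disagree by more than a factor of 25 are dropped, as are the July 6-9 fire-influenced days. A hedged sketch, assuming a simple tabular layout with illustrative column names:

```python
import pandas as pd

def screen_pm_pairs(df, max_ratio=25.0):
    """Screen daily obs/model PM pairs before computing site statistics.

    `df` is assumed to hold columns 'date', 'obs', 'pred' (column names are
    illustrative). Pairs are dropped when predicted and observed mass differ
    by more than a factor of `max_ratio`, and the July 6-9, 2002 days
    (possible Canadian forest-fire influence) are excluded.
    """
    dates = pd.to_datetime(df["date"])
    fire_days = pd.to_datetime(["2002-07-06", "2002-07-07",
                                "2002-07-08", "2002-07-09"])
    ratio = df["pred"] / df["obs"]
    within_factor = (ratio <= max_ratio) & (ratio >= 1.0 / max_ratio)
    return df[within_factor & ~dates.isin(fire_days)]
```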

Daily average SO4 mass at 49 EPA/STN sites across the OTR and Virginia: (A) observed averages, (B) root mean square error (RMSE), and (C) mean bias (MB=P-O). At least 10 days were required for displaying the statistics at a given site, and only those days when the observed and predicted mass agreed to within a factor of 25 are included. Data from the July 6-9 period are excluded since they may be affected by the Canadian forest fires.

Daily average NO3 mass at 49 EPA/STN sites across the OTR and Virginia: (A) observed averages, (B) root mean square error (RMSE), and (C) mean bias (MB=P-O). At least 10 days were required for displaying the statistics at a given site, and only those days when the observed and predicted mass agreed to within a factor of 25 are included. Data from the July 6-9 period are excluded since they may be affected by the Canadian forest fires.

Daily average NH4 mass at 49 EPA/STN sites across the OTR and Virginia: (A) observed averages, (B) root mean square error (RMSE), and (C) mean bias (MB=P-O). At least 10 days were required for displaying the statistics at a given site, and only those days when the observed and predicted mass agreed to within a factor of 25 are included. Data from the July 6-9 period are excluded since they may be affected by the Canadian forest fires.

Daily average EC mass at 49 EPA/STN sites across the OTR and Virginia: (A) observed averages, (B) root mean square error (RMSE), and (C) mean bias (MB=P-O). At least 10 days were required for displaying the statistics at a given site, and only those days when the observed and predicted mass agreed to within a factor of 25 are included. Data from the July 6-9 period are excluded since they may be affected by the Canadian forest fires.

Daily average OM (=1.8×blank-corrected OC) mass at 49 EPA/STN sites across the OTR and Virginia: (A) observed averages, (B) root mean square error (RMSE), and (C) mean bias (MB=P-O). At least 10 days were required for displaying the statistics at a given site, and only those days when the observed and predicted mass agreed to within a factor of 25 are included. Data from the July 6-9 period are excluded since they may be affected by the Canadian forest fires.
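The OM values compared on this slide are derived from the measured OC rather than measured directly: OM = 1.8 × blank-corrected OC. A one-line sketch of that conversion (the blank value is network-specific and shown only as a placeholder):

```python
def organic_mass(oc_measured, oc_blank=0.0):
    """OM = 1.8 x blank-corrected OC.

    The blank correction is network-specific and not specified here; 0.0 is
    only a placeholder. The 1.8 multiplier converts carbon mass to total
    organic mass, accounting for non-carbon atoms in organic aerosol.
    """
    return 1.8 * max(oc_measured - oc_blank, 0.0)
```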

Daily average soil/crustal mass at 49 EPA/STN sites across the OTR and Virginia: (A) observed averages, (B) root mean square error (RMSE), and (C) mean bias (MB=P-O). At least 10 days were required for displaying the statistics at a given site, and only those days when the observed and predicted mass agreed to within a factor of 25 are included. Data from the July 6-9 period are excluded since they may be affected by the Canadian forest fires.