REGIONAL AND LOCAL-SCALE EVALUATION OF MM5 METEOROLOGICAL FIELDS FOR VARIOUS AIR QUALITY MODELING APPLICATIONS

1 Jason Brewer, 2* Pat Dolwick, and 3* Rob Gilliam
1 Department of Marine, Earth, and Atmospheric Sciences, North Carolina State University, Raleigh, North Carolina
2 Air Quality Modeling Group, Office of Air Quality Planning and Standards (OAQPS), USEPA, Research Triangle Park, North Carolina
3 Atmospheric Modeling Division, National Exposure Research Laboratory (NERL), USEPA, Research Triangle Park, North Carolina
* On assignment from the Air Resources Laboratory, NOAA

1. Introduction / Background

Prognostic meteorological models are often used in a retrospective mode to provide inputs to air quality models used for environmental planning. These inputs govern the advection, diffusion, chemical transformation, and eventual deposition of pollutants within regional air quality models such as CMAQ (the Community Multiscale Air Quality modeling system) [1], and they are being investigated for use in local-scale assessments such as AERMOD [2]. The air quality models have consistently been subjected to rigorous performance assessments, but in many cases the meteorological inputs to these models are accepted as-is, even though this component of the modeling arguably contains more uncertainty, which could significantly affect the results of the analysis [3]. Before initiating the air quality simulations, it is important to identify the biases and errors associated with the meteorological modeling. The goal of the meteorological evaluation [4] is to move toward an understanding of how the bias and error in the meteorological input data affect the resultant air quality modeling.
Typically, there are two specific objectives: 1) determine whether the meteorological model output fields represent a reasonable approximation of the actual meteorology that occurred during the modeling period (the "operational" evaluation), and 2) identify and quantify how the existing biases and errors in the meteorological predictions may affect the air quality modeling results (the "phenomenological" evaluation). This analysis examines the performance of the Penn State University / National Center for Atmospheric Research mesoscale model known as MM5 [5] for two separate years (2001 and 2002) at two separate model resolutions (36 and 12 km). The model evaluation is summarized for the entire domain, for individual subregions within the domain, and at certain individual sites to assess the suitability of the data to drive regional-scale photochemical models (e.g., CMAQ) versus local-scale dispersion models (e.g., AERMOD). The operational evaluation includes statistical comparisons of model/observed pairs (e.g., bias, index of agreement, root mean square error) for multiple meteorological parameters (e.g., temperature, water vapor mixing ratio, winds). The phenomenological evaluation is based on existing air quality conceptual models and assesses performance for varied phenomena such as trajectories, low-level jets, frontal passages, and air mass residence time, using a different universe of statistics such as false alarm rates and probabilities of detection. This poster is only able to show a small subset of all the completed analyses on which the conclusions are based.
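As a concrete illustration of the operational statistics named above (bias, error, root mean square error, index of agreement), the sketch below shows how they are computed from matched model/observation pairs. This is a minimal example of our own, not the AMET implementation; the function name and sample values are hypothetical.

```python
import numpy as np

def operational_stats(model, obs):
    """Bias, mean absolute error, RMSE, and Willmott's index of
    agreement for matched model/observation pairs."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    diff = model - obs
    bias = diff.mean()                  # mean(model - obs)
    mae = np.abs(diff).mean()           # mean absolute error
    rmse = np.sqrt((diff ** 2).mean())  # root mean square error
    # Index of agreement (Willmott, 1981): 1 = perfect, 0 = no skill.
    obar = obs.mean()
    denom = ((np.abs(model - obar) + np.abs(obs - obar)) ** 2).sum()
    ioa = 1.0 - (diff ** 2).sum() / denom if denom > 0 else float("nan")
    return {"bias": bias, "mae": mae, "rmse": rmse, "ioa": ioa}

# Hypothetical hourly 2-m temperatures (K) at one site:
stats = operational_stats(model=[271.2, 273.0, 276.5],
                          obs=[273.0, 274.1, 276.0])
# stats["bias"] < 0 here, i.e., the model runs cold on average.
```

Compositing a bias of this kind by hour and by month is how a feature such as the overnight January cold bias discussed in the conclusions panel would be diagnosed.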
3. Sample Regional-Scale Operational Evaluation ( km MM5)

[Figure panel]

2. 2001 and 2002 MM5 Configuration

- Model version: 2001 (36): ; 2001 (12): (w/ minor fixes to KF2 & Reisner 2); 2002 (36): ; 2002 (12):
- Domain size: 2001/2002 (36): 165 * 129 * ; 2001/2002 (12): 290 * 251 * 34
- Major physics options:
  - Radiation: RRTM long-wave radiation
  - Cumulus parameterization: Kain-Fritsch 1 (2001/36 km only), Kain-Fritsch 2
  - Microphysics: Reisner 2 (2001), Reisner 1 (2002)
  - Land surface model / PBL scheme: Pleim-Xiu / Asymmetric Convective Model (ACM1)
- Analysis nudging (12 km): winds (aloft): 1.0E-4; winds (surface): 1.0E-4; temperature (aloft): 1.0E-4; temperature (surface): N/A; moisture (aloft): 1.0E-5; moisture (surface): N/A
- Run durations: 5.5-day individual runs, within 7 two-month simulations
- Evaluation software: Atmospheric Model Evaluation Tool (AMET)

7. Conclusions

- All four sets of meteorological model output fields represent a reasonable approximation of the actual meteorology that occurred during the modeling period (see panel 3). Qualitative comparisons of synoptic patterns (not shown) indicate the model captures large-scale features such as high pressure domes and upper-level troughs.
- Certainly, the most troublesome aspect of meteorological model performance is the surface temperature "cold bias" during the winter, especially January. Across the four MM5 simulations, the January cold bias typically averaged around 2-3 deg C, and the effect is largest overnight (panel 5d). The resultant tendency is to overestimate stability in the lowest layers, which could have a significant impact on the air quality results, as pollutants emitted at the surface may not be properly mixed.
- Generally, bias/error does not appear to be a function of region. However, individual model/observation comparisons in space and time can show large deviations. Caution should be exercised when using these meteorological data for air quality modeling in the Rocky Mountain region, where the model errors/biases are much larger than in other regions analyzed (see panel 3).
- Care will have to be exercised when using these MM5 results on the local scale. When averaged regionally, there is little to no bias in wind direction, but as shown in panel 4, local variances can be considerably higher. Users of Gaussian-plume-based models should scrutinize the MM5 performance closely over their areas of interest.
- The model is generally unbiased for precipitation at large scales (panel 5a), though the 2001 results appear to match the observations better than the 2002 results, perhaps indicating that use of the Reisner 2 microphysics scheme was justified.
- The "key site" analysis shown in panel 6 examined MM5 performance over a specific ozone event in the Ohio Valley. These evaluations can be time-consuming but are important for identifying appropriate modeling episodes.
- This evaluation is not entirely complete. We would like to do more analysis on cloud coverage, PBL heights, model performance as a function of meteorological regime (clusters), and wind field comparisons against trajectory models.

Note: All four of these data sets are available. If you are interested in acquiring the data, please contact Pat Dolwick. The transfer process requires the user to provide USB drives.

Acknowledgements / References

The MM5 runs evaluated as part of this study were completed by Alpine Geophysics (2001 simulations) and Computer Sciences Corporation (2002 simulations). The authors would like to thank Dennis McNally, Lara Reynolds, and Allan Huffman for the effort they put into completing the meteorological modeling.

[1] Byun, D.W., and K.L. Schere, 2006: Review of the Governing Equations, Computational Algorithms, and Other Components of the Models-3 Community Multiscale Air Quality (CMAQ) Modeling System. Applied Mechanics Reviews, Vol. 59, No. 2 (March 2006).
[2] U.S. Environmental Protection Agency: User's Guide for the AMS/EPA Regulatory Model AERMOD, EPA-454/B, September.
[3] Tesche, T.W., D.E. McNally, and C. Tremback, 2002: "Operational evaluation of the MM5 meteorological model over the continental United States: Protocol for annual and episodic evaluation." Submitted to USEPA as part of Task Order 4TCG (July 2002).
[4] U.S. Environmental Protection Agency: Guidance on the Use of Models and Other Analyses for Demonstrating Attainment of Air Quality Goals for Ozone, PM2.5, and Regional Haze, Draft 3.2, September.
[5] Grell, G.A., J. Dudhia, and D.R. Stauffer, 1994: "A Description of the Fifth-Generation Penn State/NCAR Mesoscale Model (MM5)", NCAR/TN-398+STR, 138 pp.

4. Sample Local-Scale Operational Evaluation Results (3 locations: Birmingham, Detroit, and Seattle)

[Figure panels: MM5 predictions vs. NWS observations for temperature, moisture, wind speed, and wind direction, for 2001 and 2002 at 36 km and 12 km. Table: temperature bias (K) and error (K) by quarter (1Q-4Q) at DET, BHM, and SEA.]

5. Sample Phenomenological Evaluation Results ( km MM5)

a) Observed vs. modeled precipitation (May 2002)
b) Seasonally-averaged vertical profiles, model vs. obs (GSO), winter and summer
c) Detailed temperature performance
d) Diurnal temperature performance

6. "Key Site" Evaluation Results

Northern Indiana – August 3, 2002 (12 km MM5)
Cincinnati, OH – Summer 2002 (12 km MM5)

Note: 2-meter temperature (T), mixing ratio (Q), wind speed (WS), and wind direction (WD) are in units of K, g/kg, m/s, and degrees, respectively.
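The phenomenological evaluation relies on categorical scores such as probability of detection (POD) and false alarm rate, which come from a 2x2 contingency table of event hits, misses, and false alarms (e.g., did the model reproduce an observed frontal passage or low-level jet?). A minimal sketch with hypothetical counts; here "false alarm rate" is computed as the false alarm ratio FAR = false alarms / (hits + false alarms), one common convention, which may differ from the exact definition used on the poster.

```python
def categorical_scores(hits, misses, false_alarms):
    """POD and FAR from a 2x2 forecast/observation contingency table.
    hits: event observed and modeled; misses: observed but not modeled;
    false_alarms: modeled but not observed."""
    pod = hits / (hits + misses) if (hits + misses) else float("nan")
    far = (false_alarms / (hits + false_alarms)
           if (hits + false_alarms) else float("nan"))
    return pod, far

# Hypothetical counts: 18 events captured, 6 missed, 4 spurious.
pod, far = categorical_scores(hits=18, misses=6, false_alarms=4)
# pod = 0.75; far = 4/22, roughly 0.18
```

A perfect simulation gives POD = 1 and FAR = 0; tracking both guards against a model that "detects" every event only by over-predicting occurrences.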