2002 MM5 Model Evaluation: 12 vs. 36 km Results. Chris Emery, Yiqin Jia, Sue Kemball-Cook, and Ralph Morris, ENVIRON International Corporation; Zion Wang, UCR.

2002 MM5 Model Evaluation: 12 vs. 36 km Results
Chris Emery, Yiqin Jia, Sue Kemball-Cook, and Ralph Morris, ENVIRON International Corporation; Zion Wang, UCR CE-CERT
Western Regional Air Partnership (WRAP) Regional Modeling Center (RMC)
National RPO Meeting, May 25, 2004

2002 MM5 Evaluation Review
IA/WI 2002 MM5 configuration on National RPO 36 km grid, except:
> Used MM5 v3.6.2
> Invoked Reisner II, disregarded INTERPX
Evaluation methodology:
> Synoptic evaluation
> Statistical evaluation using METSTAT and surface data (WS, WD, T, RH)
> Evaluation against upper-air observations
Compared statistical performance against EDAS and VISTAS

METSTAT Evaluation Package
Statistics:
> Absolute bias and error, RMSE, IOA
Daily and, where appropriate, hourly evaluation
Statistical performance benchmarks:
> Based on an analysis of > 30 MM5 and RAMS runs
> Not meant as a pass/fail test, but to put modeling results into perspective
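The surface statistics named above (bias, gross error, RMSE, and index of agreement) can be sketched as follows. This is a minimal illustration of the standard metric definitions, not the METSTAT package itself; the real tool also handles wind-direction wrap-around, station matching, and subdomain aggregation.

```python
import numpy as np

def metstat_metrics(pred, obs):
    """Compute METSTAT-style statistics for paired model predictions
    and observations (e.g. hourly surface temperature at matched sites).
    Sketch of the standard definitions only."""
    pred = np.asarray(pred, dtype=float)
    obs = np.asarray(obs, dtype=float)
    resid = pred - obs
    bias = resid.mean()                  # mean bias (signed)
    gross_error = np.abs(resid).mean()   # mean absolute ("gross") error
    rmse = np.sqrt((resid ** 2).mean())  # root mean square error
    # Index of Agreement (Willmott, 1981): 1.0 indicates perfect agreement
    obar = obs.mean()
    denom = ((np.abs(pred - obar) + np.abs(obs - obar)) ** 2).sum()
    ioa = 1.0 - (resid ** 2).sum() / denom
    return {"bias": bias, "error": gross_error, "rmse": rmse, "ioa": ioa}
```

For a perfect simulation the bias, error, and RMSE are all zero and the IOA is 1; a uniform +1 offset yields bias = error = RMSE = 1 with IOA between 0 and 1.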

Datasets for Met Evaluation
NCAR dataset ds472: airport surface met observations
Twice-daily upper-air profile observations (~120 in US)
> Temperature
> Moisture
Scatter plots of performance metrics:
> Include box for benchmark
> Include historical MM5/RAMS simulation results
> WS RMSE vs. WD gross error
> Temperature bias vs. temperature error
> Humidity bias vs. humidity error
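A scatter plot of this kind, with a box marking the benchmark region, can be sketched as below. The benchmark thresholds used here (WS RMSE ≤ 2 m/s, WD gross error ≤ 30 deg) are illustrative defaults, not necessarily the exact values used in the presentation.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for scripted use
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle

def benchmark_scatter(ws_rmse, wd_gross_error,
                      ws_rmse_bench=2.0, wd_err_bench=30.0):
    """Scatter WS RMSE vs. WD gross error, one point per run or
    subdomain, with a dashed rectangle marking the region where a
    run meets both benchmarks. Benchmark values are illustrative."""
    fig, ax = plt.subplots()
    ax.scatter(wd_gross_error, ws_rmse)
    # Points falling inside this box meet both benchmarks
    ax.add_patch(Rectangle((0.0, 0.0), wd_err_bench, ws_rmse_bench,
                           fill=False, linestyle="--"))
    ax.set_xlabel("Wind direction gross error (deg)")
    ax.set_ylabel("Wind speed RMSE (m/s)")
    return fig, ax
```

Plotting each subdomain's statistics this way makes it easy to see at a glance which runs fall inside the benchmark box and how a new simulation compares with the historical MM5/RAMS population.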

Subdomains for Model Evaluation
1 = Pacific NW, 2 = SW, 3 = North, 4 = Desert SW, 5 = CENRAP North, 6 = CENRAP South, 7 = Great Lakes, 8 = Ohio Valley, 9 = SE, 10 = NE, 11 = Mid-Atlantic

Evaluation of 36-km WRAP MM5 Results
Model performed reasonably well for the eastern subdomains, but not in the West (the WRAP region)
> General cool, moist bias in the western US
> Difficulty resolving western US orography? May get better performance with higher resolution
> Pleim-Xiu scheme optimized more for the eastern US? More optimization needed for desert and rocky ground?
MM5 performs better in winter than in summer
> Weaker forcing in summer
July 2002: Desert SW subdomain exhibits a low temperature bias and a high humidity bias

Comparison: EDAS vs. WRAP MM5
Could the 36-km MM5 biases be caused by the analyses used to nudge (FDDA) the model?
We evaluated the EDAS analysis fields to see whether biases exist
> Used METSTAT to examine the EDAS surface fields
Input EDAS fields do not show the cold, moist bias seen in the 36-km MM5 simulation, but the wind speed underestimation bias is present
> Performance issues are not due to the EDAS analysis fields and must be generated internally by MM5

Comparison: VISTAS vs. WRAP MM5
Evaluated the VISTAS 2002 MM5 simulation to see whether a similar bias exists
> Different configuration: KF II, Reisner I
Both MM5 simulations had trouble in the western U.S.; the same subdomains lie outside the statistical benchmarks
Both MM5 simulations performed better in winter than in summer

Comparison: VISTAS vs. WRAP MM5
VISTAS:
> Better simulation of PBL temperature and humidity profiles
> Less surface humidity bias in the western U.S.
> Markedly better summer precipitation field
WRAP:
> Less surface temperature bias than VISTAS during winter
Overall, VISTAS did better in the West
> Further tests indicate the use of KF II has a larger effect on performance than Reisner I

Addition of 12-km WRAP Grid
IC/BCs extracted from the 36-km MM5 fields
3-D FDDA fields extracted from the 36-km MM5 fields
Preliminary 5-day run starting 12Z July 1

Comparison: 12 vs. 36-km WRAP MM5
Performance scatter plots prepared:
> Directly compare 36-km statistics with 12-km statistics for each western subregion
> Provide mean statistics over the July 1-6 preliminary test period

Comparison: 12 vs. 36-km WRAP MM5
Results:
> No significant or consistent impact on wind speed/direction performance
> Temperature bias dramatically improved for all areas, but gross error worsens
> Impacts on humidity performance are minor, and worse in the Desert SW
There appear to be larger issues that 12-km grid resolution alone does not resolve
> Note that all IC/BC and 3-D FDDA fields are derived from the 36-km results
> This issue is addressed in the 12-km sensitivity tests