IS WRF REALLY IMPROVING? A COMPREHENSIVE VERIFICATION OVER THE PACIFIC NORTHWEST
Cliff Mass and David Ovens, University of Washington

A lot of effort has been expended… We have all worked hard over the past ten years transitioning from MM5 to WRF. In addition, a great deal of effort has gone into improving physics parameterizations and numerics, and into adding new modeling options.

But… does WRF, with all its improvements, verify better than MM5 for key case studies and over extended verification periods? Do we even have the tools and capabilities to monitor the evolving quality of our modeling systems? Is it possible that some of the “improvements” have actually detracted from modeling-system skill when used with other components?

In general, we don’t have satisfactory answers to these questions. Neither NCEP nor the DTC nor any other national entity appears to have such information. We need mechanisms and capabilities in place to evaluate and guide our model development.

What have been the results over the Pacific Northwest, where much of this information is available?

Northwest U.S. MM5 and WRF: real-time forecasts since 1995. Now running:
– MM5 (36-12 km) nested in the NWS NAM
– WRF ARW 3.0 ( km) nested in the NWS GFS
– WRF currently uses Thompson microphysics, the YSU PBL, the NOAH LSM, RRTM LW, Dudhia SW, and K-F cumulus
– MM5 uses the MRF PBL and K-F cumulus
Extensive multi-year verification against quality-controlled (QC) observations. We have run extensive tests of WRF V3.1, of MM5 driven by the GFS, and of a collection of varying physics options, including with and without the LSM.
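
For orientation, the WRF physics suite above maps onto namelist.input (&physics) options roughly as follows. This is an illustrative sketch using the standard ARW V3.x option indices, not the actual UW operational namelist.

```python
# Sketch: the WRF-ARW physics suite above, expressed as the standard
# V3.x namelist.input (&physics) option indices. Illustrative only,
# not the actual UW operational namelist.
wrf_physics = {
    "mp_physics": 8,          # Thompson microphysics
    "ra_lw_physics": 1,       # RRTM longwave radiation
    "ra_sw_physics": 1,       # Dudhia shortwave radiation
    "bl_pbl_physics": 1,      # YSU boundary layer
    "sf_surface_physics": 2,  # Noah LSM
    "cu_physics": 1,          # Kain-Fritsch cumulus
}
print("\n".join(f"{k} = {v}" for k, v in wrf_physics.items()))
```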

The Analysis: based on this extensive series of runs, let us try to answer (for the Northwest) the following questions:
– What have we gained by moving to WRF?
– What have we lost?
– What advantages can one realize from V3.1?
– Is the NOAH LSM a plus or a minus for the key parameters?
– Are we making progress?

0000 UTC (5 PM PDT) MAE, July-August 2008, with LSM

1200 UTC (5 AM PDT) MAE, July-August 2008, with LSM

0000 UTC (4 PM PST) MAE, Jan-Feb 2009, with LSM

1200 UTC (4 AM PST) MAE, Jan-Feb 2009, with LSM

What do verification scores tell us about MM5 and WRF? The LSM greatly improves the dewpoint forecasts, so WRF with the LSM is much better for dewpoint than MM5 without it. For temperature, the LSM helps in the afternoon but hurts in the morning. WRF is better than MM5 for wind direction. For precipitation, MM5 is better in summer and WRF in winter. There is very little difference in wind speed.
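
The MAE figures above come from station verification. As a minimal sketch of the computation involved (with hypothetical matched forecast/observation pairs; variable names are illustrative):

```python
import numpy as np

def mae(forecast, observed):
    """Mean absolute error over matched forecast/observation pairs."""
    forecast, observed = np.asarray(forecast), np.asarray(observed)
    return np.mean(np.abs(forecast - observed))

def bias(forecast, observed):
    """Mean error (bias): positive means the model runs high."""
    forecast, observed = np.asarray(forecast), np.asarray(observed)
    return np.mean(forecast - observed)

# Hypothetical 2-m temperature pairs (deg C) at a verification time:
fcst = [12.1, 15.3, 9.8, 14.0]
obs  = [11.5, 16.0, 10.2, 13.1]
print(f"MAE = {mae(fcst, obs):.2f} C, bias = {bias(fcst, obs):+.2f} C")
```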

Scatter Diagrams Can Reveal the Subtleties of Model Performance
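
As a sketch of the idea (entirely synthetic data), plotting forecast against observed values with a 1:1 line exposes conditional biases, such as a warm bias only at cold temperatures, that a single MAE number hides:

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic example: a model with a warm bias at sub-freezing
# temperatures, the kind of conditional error a scatter diagram
# makes obvious while aggregate MAE does not.
rng = np.random.default_rng(42)
obs = rng.uniform(-15, 30, 500)                 # observed 2-m temps (C)
fcst = (obs + np.where(obs < 0, 4.0, 0.0)       # warm bias below freezing
            + rng.normal(0.0, 1.5, obs.size))   # random forecast error

fig, ax = plt.subplots()
ax.scatter(obs, fcst, s=8, alpha=0.4)
lims = [-20, 35]
ax.plot(lims, lims, "k--", label="perfect forecast (1:1)")
ax.set_xlabel("Observed 2-m temperature (C)")
ax.set_ylabel("Forecast 2-m temperature (C)")
ax.legend()
plt.show()
```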

Cold Bias

Model Warm Bias for Cold Temps

Model Cold Bias

Model Warm Bias

A National Effort for WRF Verification is Required to Guide Our Work. We have pieces of the puzzle: the Developmental Testbed Center (DTC) is a natural center for such activities, and powerful verification capabilities have been developed (the Model Evaluation Tools, MET). We need a long-term baseline of model performance for the best or most promising combinations of model physics options. The DTC should take on this key responsibility as an honest, unbiased evaluator of model performance.
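
A long-term baseline is conceptually simple. As a hedged sketch (the file daily_mae.csv, its columns, and the 90-day window are all hypothetical), a rolling statistic over an archived verification record is enough to show whether a configuration change actually helped:

```python
import pandas as pd

# Hypothetical archive (daily_mae.csv) with one row per day per model
# configuration: columns date, config, mae_t2m (2-m temperature MAE).
df = pd.read_csv("daily_mae.csv", parse_dates=["date"]).sort_values("date")

# A 90-day rolling-mean MAE per configuration: a simple long-term
# baseline against which physics or version changes can be judged.
for config, grp in df.groupby("config"):
    rolling = grp.set_index("date")["mae_t2m"].rolling("90D").mean()
    print(f"{config}: latest 90-day mean MAE = {rolling.iloc[-1]:.2f} C")
```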

What do verification scores tell us about WRF? The LSM greatly improves the dewpoint forecasts. For temperature, the LSM helps in the afternoon but hurts in the morning. The CAM and new RRTMG radiation schemes have very similar verification scores.

PBL tests with WRF 3.1: three new PBL schemes (MYNN, QNSE, and Pleim-Xiu) have been tested for a stable case in January.
– Our current scheme: YSU = Yonsei University non-local-K scheme with an explicit entrainment layer and a parabolic K profile.
– MYNN = Mellor-Yamada-Nakanishi-Niino Level 2.5 PBL; predicts sub-grid TKE terms.
– QNSE = Quasi-Normal Scale Elimination PBL; a TKE-prediction option that uses a new theory for stably stratified regions.
– Pleim-Xiu = Asymmetric Convective Model with non-local upward mixing and local downward mixing.
– BouLac = Bougeault-Lacarrère PBL (new in 3.1, not yet tested), designed for use with the BEP urban model.
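
For reference, these schemes are selected in WRF via the bl_pbl_physics namelist option. The index values below are the standard ARW V3.1 assignments, shown as a sketch of how the sensitivity runs are switched; they are not taken from the talk itself.

```python
# Standard WRF-ARW V3.1 bl_pbl_physics indices for the schemes
# discussed above (sketch of how the sensitivity runs differ).
pbl_schemes = {
    "YSU": 1,      # Yonsei University non-local-K (current operational choice)
    "QNSE": 4,     # Quasi-Normal Scale Elimination, TKE-based
    "MYNN2.5": 5,  # Mellor-Yamada-Nakanishi-Niino Level 2.5, predicts TKE
    "ACM2": 7,     # Pleim-Xiu / Asymmetric Convective Model, version 2
    "BouLac": 8,   # Bougeault-Lacarrere, for the BEP urban model (untested here)
}
for name, index in pbl_schemes.items():
    print(f"bl_pbl_physics = {index}  ({name})")
```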

PBL tests with WRF 3.1: the LSM is a bigger contributor to temperature errors than the PBL schemes.
– No obvious improvement has been noted.
– Further tests and analysis are needed.

Cases without LSM compared to the case with the best 2-m temperature score.

LSM with best 2-m Temperature

Cases without LSM compared to the case with the best 2-m temperature score.

LSM with best 2-m Temperature

2-m Temperatures, LSM vs. no LSM [side-by-side scatter panels: LSM | no LSM]

2-m Temperatures, LSM vs. no LSM [side-by-side scatter panels: LSM | no LSM]

Some Conclusions: the LSM greatly improves dewpoint forecasts. It improves maximum temperature forecasts but degrades minimum temperature forecasts.

[Additional scatter-diagram slides, each with side-by-side panels: LSM | no LSM]
