1 Mesoscale Modeling Branch: Where We Are and Where We’re Going
Geoff DiMego, 301-763-8000 ext. 7221, 7 December 2011
NCEP “Where the Nation’s climate and weather services begin”

2 T O P I C S
Who we are (hasn’t changed, so I left it out)
Observation Processing and Quality Control
GSI Analysis and Data Assimilation
HiResWindow and SPC runs
NAM: NMMB + NEMS + Nesting
Convergence of NAM + RUC/RR
RTMA + Delayed Mesoscale Analysis
B A C K U P S L I D E S

3 Obs Processing & QC Highlights: Current & Future
BUFRLIB upgrade w/NCO (Q2FY2011)
Mesonet Metadata effort (thank$ to Curtis Marshall & Tim McClung)
–Broad & Aggressive Collection
–Storage in MySQL database
NRL aircraft QC package (Q? 2012)
–Ascents/descents generated as profiles
–Diagnose PBL ht from profiles (critical Ri)
–Enables PBL verification & analysis
Level II Radar QC
–Fixed major bug in height assignment
–Refined use of Level 2.5 (AK only) & Level 3
–Higher quality VAD wind profiles
400 day MySQL obs-dump database translation effort
[Figure: NAM 12 hr Forecast Ri-Based PBL Height with Verifying RAOBs]
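Since the slide mentions diagnosing PBL height from aircraft profiles via a critical Richardson number, here is a minimal illustrative sketch of a bulk-Richardson-number PBL-height diagnostic; the 0.25 threshold, function name, and interpolation details are assumptions, not the NRL/EMC package.

```python
import numpy as np

G = 9.81          # gravity (m s-2)
RI_CRIT = 0.25    # assumed critical bulk Richardson number (illustrative)

def pbl_height_bulk_ri(z, theta_v, u, v):
    """Return the lowest height where the bulk Richardson number, computed
    between the surface level and each level aloft, first exceeds RI_CRIT.
    Inputs are 1-D arrays ordered bottom-up: z (m AGL), theta_v (K),
    u/v wind components (m s-1)."""
    du = u - u[0]
    dv = v - v[0]
    shear2 = np.maximum(du**2 + dv**2, 1e-6)     # avoid divide-by-zero
    ri_bulk = G * (theta_v - theta_v[0]) * (z - z[0]) / (theta_v[0] * shear2)
    above = np.where(ri_bulk > RI_CRIT)[0]
    if above.size == 0:
        return z[-1]                              # PBL top not found below profile top
    k = above[0]
    if k == 0:
        return z[0]
    # linear interpolation between the bracketing levels
    frac = (RI_CRIT - ri_bulk[k - 1]) / (ri_bulk[k] - ri_bulk[k - 1])
    return z[k - 1] + frac * (z[k] - z[k - 1])
```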

4 Relationships in Metadata MySQL Database (Courtesy Steve Levine)
Updated station dictionaries
Expanded uselists
Updated dynamic reject lists
Derived direction-dependent reject lists
Deriving time-dependent reject lists

Gridpoint Statistical Interpolation
Multi-Agency development effort led by NCEP/EMC
–ESRL/GSD + ESRL/PSD
–JCSDA: NOAA, NESDIS, NASA, DOD
–Code management (Subversion) with regression testing
Community supported via DTC in Boulder – GSI Workshop and Tutorial June 28 - July 1, 2011
Includes hybrid approach for EnKF + 3D- or 4D-Var
Works for global & NEMS/NMMB on full or subset of model domain / resolution
While full EnKF tested for global, regional tested with simple use of hybrid and we get improved results using GEFS as input ensemble
Will test SREF for finer scale mass-wind & flow dependent BE
Will then test HRRRE-TL storm scale ensemble of opportunity to extract cross-covariances of state variables with fields such as reflectivity
[Figure: GEFS & experimental EnKF]
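As a rough illustration of the hybrid approach mentioned above (a static background-error covariance blended with an ensemble-derived one, the ensemble supplied by, e.g., GEFS), here is a minimal sketch; the weights, localization handling, and array shapes are illustrative assumptions, not the GSI implementation.

```python
import numpy as np

def hybrid_background_error(B_static, ensemble, localization, beta_ens=0.5):
    """Blend a static background-error covariance with an ensemble-derived
    covariance, the basic idea behind the GSI hybrid EnVar option.
    B_static:     (n, n) climatological covariance
    ensemble:     (m, n) array of m ensemble member states
    localization: (n, n) correlation matrix that tapers spurious long-range
                  ensemble covariances (elementwise/Schur product)
    beta_ens:     weight given to the ensemble part (illustrative value)"""
    m = ensemble.shape[0]
    perts = ensemble - ensemble.mean(axis=0)     # ensemble perturbations
    P_ens = perts.T @ perts / (m - 1)            # sample covariance
    P_ens_loc = localization * P_ens             # localized ensemble covariance
    return (1.0 - beta_ens) * B_static + beta_ens * P_ens_loc
```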

6 GSI Upgrade for NAM Includes:
Global upgrade Spring 2011
–Faster code (~9%), improved optimization and additional options
–Recomputed background errors
–Limit moisture to be >= 1.e-10 in each outer iteration and at the end of analysis
–Locate buoys at 10 m (from 20 m)
–Ambiguous vector QC for ASCAT data
Satellite radiance related changes
–Update to radiative transfer model - CRTM
–Inclusion of Field of View Size/Shape/Power for radiative transfer
–Relax AMSU-A Channel 5 QC
–Remove down weighting of collocated radiances
–Inclusion of uniform (higher resolution) thinning for satellite radiances
Stratospheric satellite
–Improved OMI QC
–Removal of redundant SBUV/2 total ozone
–Retune SBUV/2 ozone ob errors
–Inclusion of SBUV from NOAA-19
New ob sources for NAM Fall 2011
New conventional obs
–MESONET Ps, T, q
–ACARS moisture (WVSS-II)
–MAP Profiler winds
–RASS Profiler Tv
–WINDSAT & ASCAT ocean winds (from scatterometer)
New unconventional obs
–Satellite Radiances: AMSUA from Aqua & NOAA-19; HIRS4 & MHS from NOAA-19; IASI from METOP-A
–Refractivity: GPS radio occultation

~2.7 Million New Observations Per Day after NAM Upgrade
Upper Air ~43K
–RASS 126 t = 1403
–MAP 227 uv = 8859
–AIRCAR 133 q = 8533
–WDSATR 289 uv = 17392
–WDSATR 290 uv = 7198
Surface ~1170K
–MESONET 188 q =
–MESONET 188 t =
–MESONET 188 p =
New Satellite ~1528K
–Radiance: NOAA19 AMSUA = ; NOAA19 HIRS4 = ; AQUA AMSUA = ; IASI METOPA =
–Refractivity: GPS-RO [COSMIC] =

March 2011 Upgrade of HiResWindow (Briefing Package can be seen HERE)
[Map of HiResWindow domains with their 00Z, 06Z, 12Z and 18Z run cycles, including the new Guam domain]
4.0 km WRF-NMM & 5.15 km WRF-ARW; 48 hr fcsts from both (unless there are hurricanes)
Expanded PR/Hispaniola domain
Sized to fit on the previous CCS!
Upgrade NMM & ARW to WRF v3.2 with improved passive advection in both cores
Add Guam runs
Add product generation: High Resolution Ensemble Forecast (HREF), BUFR, and SPC hourly max, fire wx and 80 m AGL fields
Now on NOMADS & ftp server; NOW on SBN/NOAAPORT too!!!
Daily displays of these runs can be seen at: and
Matt Pyle’s full CONUS NMM runs [ /00 or /12 ] for SPC can be seen at

9 New Output Fields from HiResW [also added to NAM and soon to GFS]
Hourly maxima of:
–1000 m reflectivity
–updraft velocity
–downdraft velocity
–updraft helicity
–10 m wind speed
–2 m temperature
–2 m RH
Hourly minima of:
–2 m temperature
–2 m RH
80 m AGL U + V wind
80 m AGL temperature
80 m AGL spec humidity
80 m AGL pressure
Radar echo top height (18 dBZ level)
Richardson Number based PBL height
Ventilation Rate
Transport Wind
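To illustrate one of the derived fields, here is a minimal sketch of an 18 dBZ radar echo-top diagnostic from a single model reflectivity column; the bottom-up level ordering and linear interpolation are assumptions, not the operational post-processor.

```python
import numpy as np

def echo_top_height(z, refl_dbz, threshold=18.0):
    """Height (m) of the highest level at which reflectivity meets or exceeds
    `threshold` dBZ. `z` and `refl_dbz` are 1-D columns ordered bottom-up.
    Returns NaN when the threshold is never reached."""
    hits = np.where(refl_dbz >= threshold)[0]
    if hits.size == 0:
        return np.nan
    k = hits[-1]                       # topmost level meeting the threshold
    if k == len(z) - 1:
        return z[k]
    # interpolate toward the first level above that falls below the threshold
    frac = (refl_dbz[k] - threshold) / (refl_dbz[k] - refl_dbz[k + 1])
    return z[k] + frac * (z[k + 1] - z[k])
```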

10 HiResWindow WRF v3.1+ Configurations (No Parameterized Convection)
Dynamic Core: WRF-NMM | WRF-ARW
Horizontal Spacing: 4.0 km | 5.1 km
Vertical Domain: 35 levels, 50 mb top, Sigma-Pressure | 35 levels, 50 mb top, Sigma
PBL/Turbulence: MYJ | YSU
Microphysics: Ferrier | WSM3
Land-Surface: NOAH | NOAH
Radiation (Shortwave/Longwave): GFDL/GFDL (Lacis-Hansen/Fels-Schwartzkopf) | Dudhia/RRTM
Advection of Passive Variables: Conservative Positive Definite | Monotonic Positive Definite

11 HiResWindow Evaluations
HPC liked it (acceptable QPF bias), but SPC didn’t like it (anemic storm structure) … sigh …
Mitigation for SPC: Matt Pyle will continue to run twice daily WRF-NMM runs with the old passive advection scheme (but using v3.3 eventually). Matt Pyle Webpage

12 Plans For HiResWindow
Upgrade ARW to WRF version 3.3
Replace WRF-NMM with NEMS-NMMB
Increase resolution to ~3 km
Expand to full CONUS
–CONUS, Hawaii & Guam at 00z and 12z
–Alaska, Puerto Rico-Hispaniola at 06z and 18z
–How soon can AWIPS distribution adapt to this?
Improve Initialization of HiResWindow runs
–GSI using all available data & mini-NDAS
–GSI adapted specially for Level II winds
–Digital filter with Level II reflectivity (ala RUC/RR)
Some or all of the above
Replace HREF product stream with routine construction of HRRRE-TL

13 There is Agreement & Commitment on a ‘One NOAA’ Modeling Framework
This goes back to the first days of Admiral L.
The ultimate target is a completed NOAA framework of ESMF components within which NOAA scientists can work efficiently
Consistency with NUOPC is expected as well
NCEP has been building NEMS for this purpose
Community involvement is expected / encouraged
Support for ESMF has moved permanently from NCAR/SCD to NOAA/ESRL

14 NEMS Component Structure
[Diagram: MAIN contains EARTH(1:NM), which couples Atm, Ocean and Ice components. Within the NEMS layer, the Atm component holds the GFS, NMM (with Nest Domains(1:ND)), FIM and ARW cores, each with Dyn/Phy/Wrt (Solver/Wrt for NMM) sub-components, plus an Ensemble Coupler, a common physics layer, GOCART and WRF Chem. All boxes represent ESMF components. Below the first dashed line, the source codes are organized by the model developers.]

15 Runtime & optimal node apportionment for NMMB nesting with a Fire Wx nest over CONUS (30 nodes): 12 hr fcst in 1619 s [Matt Pyle]
–12 km parent: 3/30 nodes or 10%
–4 km CONUS nest: 17/30 or 57%
–1.33 km CONUS FireWx nest: 5/30 or 17%
–6 km Alaska nest: 2/30 or 7%
–3 km Hawaii nest: 1.5/30 or 5%
–3 km Puerto Rico nest: 1.5/30 or 5%

16 WRF-NMM takes 3.6 times longer to run comparable nesting with Fire Wx nest over CONUS (30 nodes): 12 hr fcst in 5857 s [Matt Pyle]
–12 km parent: 30/30 nodes or 100%
–4 km CONUS nest: 30/30 or 100%
–1.33 km CONUS FireWx nest: 30/30 or 100%
–4 km* Alaska nest: 30/30 or 100%
–4 km* Hawaii nest: 30/30 or 100%
–4 km* Puerto Rico nest: 30/30 or 100%

17 More Stats Showing Improved Computational Speed & Efficiency
Runtime for NAM with 5 nests on 72 nodes:
–Current opnl code: > 4 hours
–New code: 70 minutes
New NAM is doing 11 times more work than the current NAM, but uses only 7.7 times more compute resources!
IBM estimates the NMMB will easily scale to at least 24,000 processors (if we could ever get them)

18 Why Does NMMB Run So Much Faster?
Runtimes: NMMB 1619 s vs NMM 5857 s (3.6x faster)
Contribution to speed-up:
–New model dynamics (NMMB vs NMM): ~2%
–Infrastructure (NEMS vs WRF): ~2%
–Nesting: ~96%. NMMB nesting is NMMB specific and sits outside of the NEMS infrastructure, with processor apportionment, 1-way nests solved simultaneously, and ~core-independent* behavior. WRF-NMM nesting is part of the WRF infrastructure, with no processor apportionment and 1-way nests solved sequentially.
–Horizontal resolution step-down ratio: 0% (this relates to flexibility, not speed). NMMB allows any integer ratio, e.g. 2:1, 3:1, 4:1, …; WRF-NMM allows only 3:1*.
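A back-of-the-envelope way to see why the nesting strategy dominates the speed-up: with processor apportionment the one-way nests advance concurrently and wall-clock time is set by the slowest nest, while solving nests sequentially hands the whole machine to each nest in turn even when a small nest cannot exploit it. The workloads, node shares, and scaling limits below are purely illustrative, not measured NAM numbers.

```python
def wallclock(work, nodes, max_useful_nodes):
    """Time for one nest: work units divided by the nodes it can actually
    exploit (strong scaling stalls beyond max_useful_nodes)."""
    return work / min(nodes, max_useful_nodes)

# Hypothetical per-nest workloads (node-seconds) and scaling limits.
nests = {
    # name:            (work, max_useful_nodes)
    "12 km parent":    (3000,  8),
    "4 km CONUS":      (17000, 30),
    "1.33 km FireWx":  (5000,  10),
    "6 km Alaska":     (2000,  6),
    "3 km Hawaii":     (1500,  4),
    "3 km PuertoRico": (1500,  4),
}
total_nodes = 30

# NMMB-style: each nest gets a dedicated share of the 30 nodes and all nests
# advance at the same time, so the run finishes when the slowest nest does.
shares = {"12 km parent": 3, "4 km CONUS": 17, "1.33 km FireWx": 5,
          "6 km Alaska": 2, "3 km Hawaii": 1.5, "3 km PuertoRico": 1.5}
simultaneous = max(wallclock(w, shares[n], m) for n, (w, m) in nests.items())

# WRF-NMM-style: every nest is handed the full machine but solved one after
# another, so the times add up and small nests waste most of the nodes.
sequential = sum(wallclock(w, total_nodes, m) for w, m in nests.values())

print(f"simultaneous (apportioned nodes): {simultaneous:.0f} s")
print(f"sequential (full machine each):   {sequential:.0f} s")
```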

Hypothetical NMMB Simultaneous Run: Global [with Igor & Julia] and NAM [with CONUS nest]
[Animation showing a 27 km global NMMB, a 12 km NAM NMMB with a 4 km NAM-nest NMMB, and 9 km movable NMMB nests for Igor and Julia]

20 Hypothetical NMMB Simultaneous Run
This graphic demonstrates both where we are with NMMB nesting and where we want to go.
This loop was made by super-imposing just two NMMB runs:
–NAM with a fixed CONUS nest (it could have included Alaska, Puerto Rico-Hispaniola and a Fire Weather nest too)
–global NMMB with two movable nests for Igor and Julia (it could have included a movable nest inside Igor and/or Julia as well)
This is all done with one-way interactive lateral boundaries, but if you took away the heavy black grid outlines, you'd be hard pressed to pick out where they were.
Once the nesting is generalized, everything in the loop will be doable in a single NMMB executable.
This greatly facilitates running global and regional forecasts concurrently, which is our goal in the JPSS era when sat obs are to be delivered quickly enough to start the global & regional runs at the earlier regional time.

21 October 2011 NAM Upgrade
Current NAM:
–WRF-NMM (E-grid)
–4/Day = 6 hr update
–Forecasts to 84 hours
–12 km horizontal grid spacing
New NAM (see briefings here & here):
–NEMS based NMMB B-grid replaces E-grid
–Parent remains 12 km to 84 hr
–Four Fixed Nests Run to 60 hr: 4 km CONUS nest, 6 km Alaska nest, 3 km HI & PR nests
–Single placeable 1.33 km or 1.5 km FireWeather/IMET/DHS run to 36 hr

22 NPS & Changes to NDAS
NEMS Preprocessing System (NPS) for NMMB (Matt Pyle)
–To create the first guess at the start of the NDAS (at time T-12 hr), NPS uses GFS spectral coefficients rather than post-processed pressure level fields on a 1 deg lat/lon grid as has to be done with the WRF Preprocessing System (WPS)
–Lateral boundary conditions also based on GFS spectral coefficients (as is done in current NAM but not in WRF REAL)
Changes to the NAM Data Assimilation System (NDAS)
–First guess at T-12 reflects relocation of tropical cyclones
–Use of 1/12th deg SST (RTG_SST_HR) in place of 1/2 deg
–GSI updates 2 m temperature & moisture and 10 m winds with portion of 1st layer correction
–Updated background errors for NMMB
–5X divergence damping in NMMB in NDAS only

23 Much Better NDAS First Guess [vs RAOBs]
[Fit-to-RAOB plots for Z, T, V and RH, March 2011; Black/Solid = Opnl, Red/Dash = Parallel]

24 Scaled down BMJ convection for NMMB nests
Different model forecast customers interpret high-resolution guidance differently (e.g. HPC vs SPC)
With the NMMB implementation in NAM, an effort was made to satisfy both camps – sorta kinda.
New scaling factor in the BMJ allows for relaxation toward moister profiles in finer grid-spacing runs:
–Smaller modification of thermodynamic profiles
–Goal is to improve QPF performance in nests without destroying fine-scale forecast structure
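As a schematic of the scaled-down BMJ idea only (not the NAM code): the adjustment toward the convective reference profile is damped by a factor that shrinks with grid spacing, so fine nests retain more explicit structure. The linear form, the 12 km reference spacing, and the function name are assumptions.

```python
def scaled_bmj_adjustment(t_model, t_reference, dx_km, dx_ref_km=12.0):
    """Relax the model thermodynamic value toward the BMJ reference profile,
    damping the adjustment on finer grids (schematic only, not the NAM code).
    t_model, t_reference: profile values (K); dx_km: grid spacing of the run."""
    scale = min(1.0, dx_km / dx_ref_km)   # full adjustment at >= 12 km, weaker on nests
    return t_model + scale * (t_reference - t_model)

# e.g. on a 4 km nest only a third of the 12 km adjustment is applied:
print(scaled_bmj_adjustment(300.0, 297.0, dx_km=4.0))   # -> 299.0
```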

25 6 km NMMB nest 48 h total precip ending /00Z
[Left panel] w/o parameterized convection: Max precip = 4.91”; SPC likes it, but HPC hates it.
[Right panel] w/ scaled down BMJ convection: Max precip = 3.39”; HPC loves it, but SPC hates it.
More acceptance by SPC after anemic vertical velocities are corrected with bug fix in August.

26 Non-Severe Weather Applications of the NAM 4 km Nest
The NAM nests were not designed or tuned to provide the severe weather guidance needed by SPC
The NAM nests were designed to provide NWS WFOs and other users with basic weather guidance, e.g. QPF
The nest resolutions were selected to match the NDFD grids on which WFOs produce their gridded forecasts
Currently, the NAM-DNG that WFOs use to initialize their GFE is downscaled from NAM’s 12 km to local NDFD resolutions [ km] by the not-so-accurately-named “smartinit” processing
Having NAM nests will mean very little (if any) downscaling will be needed to produce NAM-DNG

27 Dissemination of NAM Nests via NAM-DNG
NAM-DNG is already distributed to WFOs via AWIPS-SBN and thus available to private sector users via NOAAPORT. This is the primary distribution mechanism for NAM nest fields including QPF and newly added simulated reflectivity.
Also available on ftp servers and NOMADS. Viewable from Eric Rogers’ most excellent webpages.
New double resolution NAM-DNG grids will be made for CONUS and Alaska which anticipate the future move of NDFD to those resolutions and recognize & support the fact that a majority of WFOs are already doing their forecast prep at those double resolutions.
NWS/HQ, the OSIP/TOC/SBN enterprise, NCO & EMC have geared up to distribute the new NAM-DNG grids. Only remaining choke-point is at the TOC.

28

NOAA/ARL’s HYSPLIT Dispersion Model
Wild-fire smoke applications driven by NAM, NAM nests & FireWx IMET Support runs via NOAA/ARL’s READY-testbed site
Example for March 11, 2011 fires in Central OK: Harrah and Chatow counties

Irene Assessment: Placeable FWIS Nest in NAM Parallel
This is a fixed nest run at 1.33 km resolution within the 4 km CONUS NAM nest.
First placement was outside the NAM’s CONUS nest and failed. Eric Rogers ran the 24th 12z case over the counter.
Starting with the 18z run on 8/25, FWIS was placed by the SDM ahead of (initially) or over Irene (later, sadly) as it moved up the east coast.

Convergence of NAM & RR into hourly NARRE & HRRRE
There is a signed agreement between NCEP/EMC and ESRL/GSD to build an hourly updated NARRE
Based on NEMS common modeling infrastructure
Ensembles:
–Sample uncertainty within membership: Initial & Lateral Boundary conditions; Dynamics & Physics
–Provide full description of uncertainty
–Can adapt to rapidly evolving science of underlying data assimilation and modeling

34 2012
NAM:
–NEMS based NMMB B-grid replaces E-grid
–Parent remains at 12 km to 84 hr
–Multiple Nests Run to 60 hr: 4 km CONUS nest, 6 km Alaska nest, 3 km HI & PR nests
–Reinstate Fire Weather/IMET Support/DHS run to 36 hr: locate a single km run in either CONUS or Alaska
Rapid Refresh:
–WRF-based ARW
–NCEP’s GSI analysis
–Expanded 13 km domain to include Alaska
–Experimental 3 km HRRR
[Map: RUC-13 CONUS domain, WRF-Rapid Refresh domain (2010), original CONUS domain, experimental 3 km HRRR]

35 ? North American Rapid Refresh ENSEMBLE (NARRE)
NMMB (from NCEP) & ARW (from ESRL) dynamic cores
Common use of NEMS infrastructure and GSI analysis
Common NAM parent domain at km
Initially ~6 member ensemble made up of equal numbers of NMMB- & ARW-based configurations
Hourly updated with forecasts to 24 hours
NMMB & ARW control data assimilation cycles with 3 hour pre-forecast period (catch-up) with hourly updating
NAM & SREF 84 hr forecasts are extensions of the 00z, 06z, 12z, & 18z runs – for continuity’s sake
–SREF will be at same km resolution as NARRE by then
–SREF will have 21 members plus 6 from NARRE for total of 27
NARRE requires an increase in current HPCC funding

? High Resolution Rapid Refresh ENSEMBLE (HRRRE)
Each member of NARRE contains 3 km nests
–CONUS, Alaska, Hawaii & Puerto Rico/Hispaniola nests
–The two control runs initialized with radar data & other hi-res obs
This capability puts NWS/NCEP [+OAR/ESRL] in a position to
–Provide NextGen Enroute AND Terminal guidance (FWIS-like)
–Provide PROBABILITY guidance with full Probability Density Function specified, hence uncertainty information too
–Provide a vehicle to improve assimilation capabilities using hybrid (EnKF+4DVar) technique with current & future radar & satellite
–Address Warn-on-Forecast as resolutions evolve towards ~1 km
NAM nests are extensions of the 00z, 06z, 12z & 18Z runs.
HRRRE requires an increase in HPCC funding over and above that required for the NARRE

37 In the Meantime, Implement NARRE-TL*
North American Rapid Refresh Ensemble (NARRE) Time-Lagged [TL] System [courtesy of Binbin Zhou]
Hourly updated 12/13 km ensemble for aviation out to 12 hr
Combines members from RR & NAM over CONUS & Alaska
NARRE-TL example member combination for 06z cycle run:
–4 NAM cycles (6z & previous 0z, 18z and 12z)
–6 RR cycles (6z & previous 5z, 4z, 3z, 2z, and 1z)
Member Weighting = 1 - forecast range (hr)/30
*To be implemented with the Rapid Refresh in Q2FY2012; replaces VSREF
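A minimal sketch of the time-lagged weighting quoted above, applied to the 06z member combination from the slide; evaluating the forecast range at the cycle's analysis time and normalizing the weights to sum to one are simplifying assumptions (in the real system the range, and hence the weight, changes with product lead time).

```python
def member_weight(init_hour, valid_hour):
    """Weight = 1 - forecast range (hr)/30, from the slide; older cycles
    (longer forecast ranges into the same valid time) get smaller weights."""
    fcst_range = (valid_hour - init_hour) % 24
    return max(0.0, 1.0 - fcst_range / 30.0)

# 06z NARRE-TL cycle: 4 NAM cycles and 6 Rapid Refresh cycles (from the slide)
members = [("NAM", h) for h in (6, 0, 18, 12)] + \
          [("RR", h) for h in (6, 5, 4, 3, 2, 1)]

valid = 6  # weights evaluated at the common analysis hour (assumption)
raw = {(m, h): member_weight(h, valid) for m, h in members}
total = sum(raw.values())
weights = {k: w / total for k, w in raw.items()}   # normalize (assumption)

for (model, init), w in weights.items():
    print(f"{model} {init:02d}z  weight = {w:.3f}")
```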

38 High Resolution Rapid Refresh Ensemble (HRRRE) Time-Lagged [TL] System
Take advantage of 5 existing convection allowing runs
Example: 14 member combination for 06Z cycle run
–4 NAM-nest cycles (6z & previous 0z, 18z and 12z)
–2 HRW-ARW cycles (previous 0z and 12z)*
–2 HRW-NMM cycles (previous 0z and 12z)*
–2 Pyle-SPC cycles (previous 0z and 12z)
–4 HRRR cycles (6z & previous 5z, 4z and 3z)**
Cores: NAM-N: NMMB; HRW: ARW; HRW: NMM; P-SPC: NMM; HRRR
*Eastern 2/3 of CONUS for now
**runs to 15 hr only

Next steps for NARRE-TL & HRRRE-TL
Build them and run them in/off parallels
Share with users including NWS, AWC, CoSPA & FAA
Perform Verifications
–Icing vs CIP (Current Icing Product)
–Reflectivity, Echo Top and VIL
–Lightning?
Combine with AFWA’s 10 member 4 km ensemble for CONUS
Combine with hybrid approach (at least)
Implement HRRRE-TL as replacement for HREF
Pursue use of cluster analysis to find most meaningful SREF members for hybrid approach
Frequency matching bias correction & other promising post processing techniques
Ultimately, more computing is needed to allow more convection allowing forecast runs to reduce our dependence on time-lagged members

Brilliant Minds Think Alike
Israel Jirak from SPC constructed the Storm Scale Ensemble of Opportunity (SSEO), comprised of 7 convection-allowing members from 00 UTC: NSSL WRF-ARW (01), HRW WRF-ARW East (02), HRW WRF-ARW East 12-hr time lag (03), CONUS WRF-NMM (04), HRW WRF-NMM East (05), HRW WRF-NMM East 12-hr time lag (06), NMMB-Nest (07).
I know Israel also put together a 12 UTC run, where the membership changes a bit.
Also, SSEO only covered the eastern CONUS domain since that is what is covered by the current HiResWindow.

Figure 1. Experimental model performance based on participant feedback from subjective evaluation surveys conducted during the QPF component of the 2011 HWT Spring Experiment. Experimental deterministic models were compared to the operational 12km NAM while experimental ensembles (SSEO and SSEF) were compared to the operational SREF.

Real Time Mesoscale Analysis (RTMA): First Phase of Analysis of Record
[Google Map of 4 RTMA Domains, courtesy of Yan Zheng, University of Utah]
Analyzed every hour on the NWS’ NDFD grids:
–10 m wind + est. analysis uncertainty
–2 m temperature + est. analysis uncertainty
–2 m dew point + est. analysis uncertainty
–Sfc pressure + est. analysis uncertainty
–1 hr precip (Stage 2)
–GOES Eff. Cloud Amount

RTMA - Really Good News Thanks to the efforts of Jamie Vavra (OST), OSIP approval has been obtained to declare RTMA operational. The vote was unanimous with all regions in attendance. RTMA is considered to have met its FOC conditions. Continued effort by NCEP/EMC to improve it and extend it to more variables is assumed.

RTMA Winter UPGRADE PACKAGE (to follow Rapid Refresh)
–Expand CONUS RTMA-2.5km domain further north into Canada to provide support for Northwest RFC
–Double the resolution for Alaska RTMA from 6 km to 3 km
–Add Juneau RTMA at 1.5 km resolution
–Add routine cross-validation to RTMA runs
–Other RTMA Enhancements (blended first guess, analysis of wind gust, visibility, analysis error, use/reject lists, etc.)

EXPANDED CONUS RTMA-2.5km
[Map: NWRFC and NDFD CONUS domains]
Add support of the Northwest RFC by expanding the CONUS 2.5-km domain northern boundary from 51°N to 56°N. In practice, added 220 pts in the y-direction.
Produce two GRIB2 files: one for the true NDFD CONUS and the other for the NWRFC domain.
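As a quick consistency check of the numbers above (our arithmetic, not from the slide):

```python
# 56N - 51N = 5 degrees of latitude ~ 5 * 111 km ~ 555 km.
# At 2.5 km grid spacing that is ~222 rows, consistent with the ~220
# points added in the y-direction.
print(round(5 * 111.0 / 2.5))   # -> 222
```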

ALASKA RTMA
[Map: Alaska-3km NDFD domain and Juneau-1.5km domain; contours: terrain in meters]
–Change spatial resolution from 6-km to 3-km (doubling the resolution like was done last year for CONUS)
–Blend forecasts from Rapid Refresh and NAM to make first guess (work in progress)
–Make mesonet observation accept list (Levine)
JUNEAU RTMA
–New system at 1.5-km resolution.

ROUTINELY COMPUTE CROSS-VALIDATION
–Make multiple disjoint datasets for each ob type, each containing about 10% of the data but uniformly distributed. Datasets contain representative data from all the geographical regions observed but without the redundancy of close pairs or tight clusters
–Constructed with the help of a Hilbert curve
–For each analysis, randomly pick one of the disjoint datasets to use for cross-validation
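A minimal sketch of the withheld-data scheme described above: split the observations into disjoint ~10% subsets, then randomly withhold one subset per analysis for cross-validation. The plain shuffle below stands in for the Hilbert-curve ordering used to keep each subset geographically representative; names and signatures are illustrative, not the operational code.

```python
import random

def make_disjoint_subsets(obs_ids, n_subsets=10, seed=None):
    """Split observation IDs into n_subsets disjoint groups of roughly equal
    size. (The operational construction orders stations along a Hilbert curve
    first so each subset samples all regions; a plain shuffle is used here.)"""
    rng = random.Random(seed)
    ids = list(obs_ids)
    rng.shuffle(ids)
    return [ids[i::n_subsets] for i in range(n_subsets)]

def pick_withheld(subsets, rng=random):
    """For one analysis cycle, randomly choose a subset to withhold for
    cross-validation; the remaining observations are assimilated."""
    k = rng.randrange(len(subsets))
    withheld = set(subsets[k])
    assimilated = [o for s in subsets for o in s if o not in withheld]
    return withheld, assimilated
```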

Other RTMA Enhancements
–Add blending of first guess 10 m winds with 10 m winds from Hurricane WRF to improve handling of tropical systems
–Add analysis of wind gust & horizontal visibility
–Improve global rescaling of analysis uncertainty
–Synchronize all RTMA applications (RTMA CONUS, Alaska, Hawaii, Puerto Rico and Guam) to use the same code and features (e.g. FGAT, bias correction, improved observation accept and reject lists)

49 Future Plans for RTMA, AoR & DNG
Use HiResWindow as first guess for Guam
Expand RTMA variables: sea-level pressure, cloud amount, cloud base height, PBL height, etc.
Add GLERL coastal observation adjustment to increase ob density along coast of (at least) the Great Lakes
Improve wind analyses over oceans (sat winds)
Apply non-linear quality control within the GSI
Bias correct 1st guess prior to applying smartinit
Pursue dynamical methods sensitive to terrain for downscaling the wind
Pursue Delayed Mesoscale Analysis (now funded)
Apply DNG to RUC
Add/Fix Weather Type for DNG NAM & GFS
Retire DGEX [once DNG-NAM beats it]

T h a n k s ! A n y Q U E S T I O N S ? 50

B A C K U P S L I D E S 51

52 Air Quality Modeling Progress
Meteorological Model Coupling
–Coupled with NEMS-NMM-B for all CMAQ domains (CONUS, AK, HI)
CMAQ Model (Developmental testing)
–Included wild fire smoke sources
–Retrospective tests of 4 km CMAQ driven by NAM-B nest
–Real-time testing of AIRNOW PM data assimilation
HYSPLIT Regional Model
–Experimental interim dust system over CONUS
–Improvements to Volcanic Ash and RSMC Capabilities

53 Air Quality Modeling FY12 Plans
Upgrade to CMAQ V4.7
–Update Emissions from 2008 NEI, include smoke
–Tight grid coupling w/ NAM-B
–Improved gas/aerosol mechanisms
–Implement surface PM data assimilation
Global NGAC
–Implement on-line dust system
Developmental Testing
–4 km NAM-B nest coupling
–NGAC full PM LBC coupling with CMAQ
–Surface O3 and GOES/MODIS AOD data assimilation

Air Quality Implementation Matrix 54