WRF Tutorial For Version 2.2 / Synoptic Lab 10/3/2007 Robert Fovell


WRF Tutorial For Version 2.2 / Synoptic Lab
10/3/2007 Robert Fovell
rfovell@ucla.edu
http://tinyurl.com/2uagrc or http://www.atmos.ucla.edu/~fovell/WRF/wrf_tutorial_2007.ppt.htm

Background on WRF model
“Weather Research and Forecasting”
• Co-developed by research and operational communities
• ARW core: “Advanced Research WRF”
• NMM core: “Nonhydrostatic Mesoscale Model”
• Supersedes MM5 and Eta models
• Current version 2.2
• Platforms include Linux and Mac OS X

WRF advantages
• Better numerics than MM5: Arakawa C grid, Runge-Kutta scheme, odd-order advection with implicit diffusion
• Much less diffusive, larger effective resolution, permits longer time steps
• Better handling of topography than Eta (original NAM); the NAM model is now WRF-NMM
• Fortran 95 (MM5 was F77)
• NetCDF, GRIB1 and GRIB2

Further advantages
• MPI from the ground up
• Allows real-data and idealized simulations in the same framework
• Plug-in architecture (different groups will supply WRF “cores”)
• Recently added: moving nests and nudging
• NetCDF output works with many great tools, such as the NetCDF operators: http://nco.sourceforge.net/

WRF disadvantages
• Bleeding edge
• Smaller range of physics choices (though more modern)
• Software design is unintuitive for physical scientists
• Can take hours to compile, but does not need frequent recompiling
• Comparatively slower than MM5
• NetCDF files can be huge

WRF and related software
• WRF Preprocessing System (WPS): replaces/supersedes the WRF SI
• WRF-ARW model: single node and MPI
• WRF postprocessing software: RIP (read/interpolate/plot), GrADS
• Specific to the “hurricane” Synoptic Lab environment
• Neglecting for now: GRIB2, ARWpost

Web resources
• WRF model users site: http://www.mmm.ucar.edu/wrf/users/user_main.html
• ARW users’ guide: http://www.mmm.ucar.edu/wrf/users/docs/user_guide/contents.html
• WRF-ARW/WPS online tutorial: http://www.mmm.ucar.edu/wrf/OnLineTutorial/index.htm
• WRF namelist description: http://www.mmm.ucar.edu/wrf/users/docs/user_guide/users_guide_chap5.html#Nml
• Tutorial presentations: http://www.mmm.ucar.edu/wrf/users/tutorial/tutorial_presentation.htm

My resources
• This presentation (PPT format): http://www.atmos.ucla.edu/~fovell/WRF/wrf_tutorial_2007.ppt
• WRF on Mac OS X: http://www.atmos.ucla.edu/~fovell/WRF/WRF_ports.html and http://macwrf.blogspot.com

Setup on “hurricane” machines
Presumed:
• tcsh environment
• Intel Fortran compiler (64-bit)
• my environment setup employed
• precompiled versions of WRF, WPS, RIP and wrf_to_grads

Environment setup
> …is the command-line prompt
If you don’t have a .cshrc file (worth saving) - recommended:
> cp /home/fovell/.cshrc .
> source .cshrc
If you want to keep your present .cshrc:
> cp /home/fovell/cshrc_fovell.csh .
> ./cshrc_fovell.csh
[you need to have the compiler environment set up already]

This environment uses my versions of netcdf, mpich, grads, RIP:
# RGF additions [abridged]
setenv RIP_ROOT /home/fovell/RIP4
setenv GADDIR /home/fovell/lib/grads
setenv GASCRP /home/fovell/gradslib
#
alias lsm 'ls -alt | more'
alias rm 'rm -i'
alias cp 'cp -i'
alias mv 'mv -i'
alias trsl 'tail -f rsl.out.0000'
alias mpirun 'nohup time /home/fovell/mpich-1.2.7p1/bin/mpirun'
alias w2g '/home/fovell/WRF2GrADS/wrf_to_grads'
setenv P4_GLOBMEMSIZE 4096000
setenv P4_SOCKBUFSIZE 65536
unlimit
limit coredumpsize 0

Set up a run directory
> cd
> mkdir FELIX
> cd FELIX
> cp /home/fovell/WRFtutorial/make_all_links.csh .
> make_all_links.csh
> cp /home/fovell/WRFtutorial/namelist.* .
[copies namelist.input, namelist.wps]

WRF for real-data run: Hurricane Felix (2007)
[This example uses data that will not remain online]

WPS overview
Tasks (controlled by namelist.wps):
(1) set up a domain (can be reused): geogrid.exe
(2) unpack parent model data (e.g., from GFS, NAM, etc.): ungrib.exe
(3) prepare unpacked data for WRF: metgrid.exe

namelist.wps
&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2007-09-02_00:00:00','2007-09-02_00:00:00',
 end_date = '2007-09-03_12:00:00','2007-09-03_12:00:00',
 interval_seconds = 10800,
 io_form_geogrid = 2,
/
For start_date and end_date, you need one column for each domain. interval_seconds is the parent model data frequency (here, 3 h).
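As a sanity check, the number of analysis times that ungrib and metgrid will process is implied by start_date, end_date, and interval_seconds. A minimal sketch of the arithmetic, assuming GNU date is available:

```shell
# Times from namelist.wps (&share)
start="2007-09-02 00:00:00"
end="2007-09-03 12:00:00"
interval_seconds=10800   # parent model data frequency (3 h)

# Convert to epoch seconds (GNU date syntax)
s=$(date -u -d "$start" +%s)
e=$(date -u -d "$end" +%s)

# Number of analysis times, inclusive of both endpoints
ntimes=$(( (e - s) / interval_seconds + 1 ))
echo "$ntimes"   # 13 for this 36 h period
```

Each of those 13 times becomes one met_em file later in the WPS sequence.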

namelist.wps (cont.)
&geogrid
 parent_id = 1, 1,
 parent_grid_ratio = 1, 3,
 i_parent_start = 1, 53,
 j_parent_start = 1, 65,
 e_we = 70, 259,
 e_sn = 40, 199,
 geog_data_res = '2m','2m',
 dx = 36000,
 dy = 36000,
 map_proj = 'lambert',
 ref_lat = 15.0,
 ref_lon = -75.0,
 truelat1 = 29.6,
 truelat2 = 29.6,
 stand_lon = -75.0,
 geog_data_path = '/home/fovell/WPS_GEOG/geog'
/
…there is more…
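The physical extent of the outer domain follows directly from e_we, e_sn, and dx/dy. Since e_we and e_sn count staggered points, there are (e_we - 1) grid cells east-west. A quick arithmetic check in POSIX shell:

```shell
# Outer-domain settings from &geogrid
e_we=70; e_sn=40
dx=36000; dy=36000   # meters

# e_we/e_sn are staggered-point counts, so (e_we - 1) cells span the domain
width_km=$(( (e_we - 1) * dx / 1000 ))
height_km=$(( (e_sn - 1) * dy / 1000 ))
echo "${width_km} km x ${height_km} km"   # 2484 km x 1404 km
```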

geogrid - create domain
> geogrid.exe
* creates geo_em.d01.nc (a NetCDF file)
* look for “Successful completion of geogrid.”
> plotgrids.exe
* creates gmeta
> idt gmeta
* uses NCAR Graphics tool to view the domain

‘gmeta’ file

> ncdump geo_em.d01.nc | more
netcdf geo_em.d01 {
dimensions:
	Time = UNLIMITED ; // (1 currently)
	DateStrLen = 19 ;
	west_east = 69 ;
	south_north = 39 ;
	south_north_stag = 40 ;
	west_east_stag = 70 ;
	land_cat = 24 ;
	soil_cat = 16 ;
	month = 12 ;
variables:
	char Times(Time, DateStrLen) ;
	float XLAT_M(Time, south_north, west_east) ;
		XLAT_M:FieldType = 104 ;
		XLAT_M:MemoryOrder = "XY " ;
		XLAT_M:units = "degrees latitude" ;
		XLAT_M:description = "Latitude on mass grid" ;
		XLAT_M:stagger = "M" ;
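When scripting, the grid dimensions can be pulled out of the ncdump header rather than read by eye. A sketch, where the string below stands in for real `ncdump -h geo_em.d01.nc` output:

```shell
# Stand-in for: ncdump -h geo_em.d01.nc
header='	west_east = 69 ;
	south_north = 39 ;
	west_east_stag = 70 ;
	south_north_stag = 40 ;'

# Matching on $1 avoids also catching the _stag dimensions
we=$(printf '%s\n' "$header" | awk '$1 == "west_east"   { print $3 }')
sn=$(printf '%s\n' "$header" | awk '$1 == "south_north" { print $3 }')
echo "mass grid: ${we} x ${sn}"   # mass grid: 69 x 39
```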

Parent model data issues
Sources include GFS, NAM, NARR reanalysis data, etc. You need a different Vtable (variable table) for each source, e.g., Vtable.GFS, Vtable.AWIP (NAM), Vtable.NARR, etc. Look in /home/fovell/WRFtutorial.

Accessing parent model data
> link_grib.csh /home/fovell/2007090200/gfs
* links to where the parent model data for this case resides
** data files start with ‘gfs’
> ln -sf /home/fovell/WRFtutorial/Vtable.GFS Vtable
* specifies the appropriate Vtable
> ungrib.exe
* extracts parent model data
* look for “Successful completion of ungrib.”

Next step: metgrid
> metgrid.exe
...hopefully you see...
“Successful completion of metgrid.”
...Output looks like...
met_em.d01.2007-09-02_00:00:00.nc
met_em.d01.2007-09-02_03:00:00.nc
met_em.d01.2007-09-02_06:00:00.nc
met_em.d01.2007-09-02_09:00:00.nc
met_em.d01.2007-09-02_12:00:00.nc
met_em.d01.2007-09-02_15:00:00.nc
met_em.d01.2007-09-02_18:00:00.nc
met_em.d01.2007-09-02_21:00:00.nc
met_em.d01.2007-09-03_00:00:00.nc
met_em.d01.2007-09-03_03:00:00.nc
met_em.d01.2007-09-03_06:00:00.nc
met_em.d01.2007-09-03_09:00:00.nc
met_em.d01.2007-09-03_12:00:00.nc

ncdump on a metgrid file
netcdf met_em.d01.2007-09-02_00:00:00 {
dimensions:
	Time = UNLIMITED ; // (1 currently)
	DateStrLen = 19 ;
	west_east = 69 ;
	south_north = 39 ;
	num_metgrid_levels = 27 ;
	num_sm_levels = 4 ;
	num_st_levels = 4 ;
	south_north_stag = 40 ;
	west_east_stag = 70 ;
	z-dimension0012 = 12 ;
	z-dimension0016 = 16 ;
	z-dimension0024 = 24 ;
This data source has 27 vertical levels. This will vary with the source.

WRF model steps
Tasks (both use namelist.input):
(1) Run real.exe (to finish creating the WRF model input data)
(2) Run wrf.exe
namelist.input is configured separately from namelist.wps but includes overlapping information.

namelist.input
&time_control
 run_days = 0,
 run_hours = 36,
 run_minutes = 0,
 run_seconds = 0,
 start_year = 2007, 2007,
 start_month = 09, 09,
 start_day = 02, 02,
 start_hour = 00, 00,
 start_minute = 00, 00,
 start_second = 00, 00,
 end_year = 2007, 2007,
 end_month = 09, 09,
 end_day = 03, 03,
 end_hour = 12, 12,
 end_minute = 00, 00,
 end_second = 00, 00,
For start_*, end_*: one column per domain

namelist.input (cont.)
 input_from_file = .true., .true.,
 history_interval = 60, 60,
 frames_per_outfile = 6, 6,
 restart = .false.,
 restart_interval = 5000,
• interval_seconds matches namelist.wps
• input_from_file should normally be ‘true’ for each domain
• history_interval - how frequently (in minutes) output is created
• frames_per_outfile - number of writes in each history file
• If you wish to restart the model, set restart = .true. (and set the model start_* data to the restart time)
• restart_interval - frequency (min) for writing restart files
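Together these settings determine how many wrfout files a run produces. For the 36 h FELIX run, the counts can be sketched with shell arithmetic:

```shell
# From namelist.input
run_hours=36
history_interval=60      # minutes between history writes
frames_per_outfile=6     # writes per output file

# One write at t=0 plus one per history interval
nframes=$(( run_hours * 60 / history_interval + 1 ))

# Ceiling division: files needed to hold all frames
nfiles=$(( (nframes + frames_per_outfile - 1) / frames_per_outfile ))
echo "$nframes frames in $nfiles files"   # 37 frames in 7 files
```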

namelist.input (cont.)
&domains
 time_step = 150,
 time_step_fract_num = 0,
 time_step_fract_den = 1,
 max_dom = 1,
 s_we = 1, 1, 1,
 e_we = 70, 259, 94,
 s_sn = 1, 1, 1,
 e_sn = 40, 199, 91,
 s_vert = 1, 1, 1,
 e_vert = 31, 31, 31,
 num_metgrid_levels = 27,
 dx = 36000, 12000, 333,
 dy = 36000, 12000, 333,
 grid_id = 1, 2, 3,
 parent_id = 0, 1, 2,
 i_parent_start = 0, 53, 30,
 j_parent_start = 0, 65, 30,
 parent_grid_ratio = 1, 3, 3,
 parent_time_step_ratio = 1, 3, 3,
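A common rule of thumb (not part of the namelist itself) is that the ARW time step in seconds should not exceed roughly 6 × dx in km, and each nest's step is the parent's divided by parent_time_step_ratio. Checking the values above:

```shell
# Domain 1 settings from &domains
dx_m=36000
time_step=150
parent_time_step_ratio=3   # for domain 2

# Rule of thumb: dt_max ~ 6 * dx(km); 150 s is comfortably under it
dt_max=$(( 6 * dx_m / 1000 ))
dt_nest=$(( time_step / parent_time_step_ratio ))
echo "dt_max=${dt_max}s dt_nest=${dt_nest}s"   # dt_max=216s dt_nest=50s
```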

namelist.input (cont.)
 mp_physics [microphysics] = 1, 1,
 ra_lw_physics [longwave radiation] = 1, 1,
 ra_sw_physics [shortwave radiation] = 1, 1,
 radt [radiation time step; min] = 10, 10,
 sf_sfclay_physics [surface layer] = 1, 1,
 sf_surface_physics [surface] = 1, 1,
 bl_pbl_physics [boundary layer] = 1, 1,
 bldt [boundary layer time step; min] = 0, 0,
 cu_physics [cumulus scheme] = 1, 0,
 cudt [cumulus time step; min] = 5,
 isfflx = 1,
 ifsnow = 0,
 icloud = 1,
 surface_input_source = 1,
 num_soil_layers = 5,
 mp_zero_out = 0,

Notes on physics
• You need to use the SAME microphysics (mp) scheme in each domain, but can use different cumulus (cu) schemes
• Some physics combinations work better than others, and some don’t work at all -- this is only lightly documented
• bldt = 0 means the boundary layer scheme is called every time step

namelist.input (cont.)
&dynamics
 w_damping = 0,
 diff_opt [subgrid turbulence] = 1,
 km_opt [ “ ] = 4,
 diff_6th_opt [numerical smoothing] = 0,
 diff_6th_factor [ “ ] = 0.12,
 base_temp = 290.,
 damp_opt = 0,
 zdamp = 5000., 5000., 5000.,
 dampcoef = 0.01, 0.01, 0.01,
 khdif = 0, 0, 0,
 kvdif = 0, 0, 0,
Only some diff_opt/km_opt combinations make sense, and the choices are resolution-dependent. More info: http://www.mmm.ucar.edu/wrf/users/tutorial/tutorial_presentation.htm

http://www.mmm.ucar.edu/wrf/users/tutorial/200707/WRF_Physics_Dudhia.pdf

real.exe
• Has changed a lot since version 2.1.2
• The number of vertical model levels is now specified with real.exe
• num_metgrid_levels comes from the parent model; you set e_vert (the number of WRF levels) here:
 e_vert = 31, 31, 31,
 num_metgrid_levels = 27,
• You can reset the WRF levels by rerunning real.exe
• You can also specify which levels you want

Setting levels in namelist.input (optional)
WRF uses “sigma” or “eta” coordinates (1.0 is the model bottom, 0.0 is the top). The lines below, added to &domains in namelist.input (presuming e_vert = 51), request a model-top pressure of 50 mb (5000 Pa) and concentrate vertical resolution in the lower troposphere:
 p_top_requested = 5000,
 eta_levels = 1.00, 0.9969, 0.9935, 0.9899, 0.9861, 0.9821,
              0.9777, 0.9731, 0.9682, 0.9629, 0.9573, 0.9513,
              0.9450, 0.9382, 0.9312, 0.9240, 0.9165, 0.9088,
              0.9008, 0.8925, 0.8840, 0.8752, 0.8661, 0.8567,
              0.8471, 0.8371, 0.8261, 0.8141, 0.8008, 0.7863,
              0.7704, 0.7531, 0.7341, 0.7135, 0.6911, 0.6668,
              0.6406, 0.6123, 0.5806, 0.5452, 0.5060, 0.4630,
              0.4161, 0.3656, 0.3119, 0.2558, 0.1982, 0.1339,
              0.0804, 0.0362, 0.0000,
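eta_levels must run from 1.0 down to 0.0, be strictly decreasing, and contain exactly e_vert values. A quick awk check of the monotonicity and count (a sketch; in practice paste the levels from your own namelist):

```shell
eta="1.00 0.9969 0.9935 0.9899 0.9861 0.9821 0.9777 0.9731 0.9682 0.9629
0.9573 0.9513 0.9450 0.9382 0.9312 0.9240 0.9165 0.9088 0.9008 0.8925
0.8840 0.8752 0.8661 0.8567 0.8471 0.8371 0.8261 0.8141 0.8008 0.7863
0.7704 0.7531 0.7341 0.7135 0.6911 0.6668 0.6406 0.6123 0.5806 0.5452
0.5060 0.4630 0.4161 0.3656 0.3119 0.2558 0.1982 0.1339 0.0804 0.0362
0.0000"

# Prints "OK <count>" if strictly decreasing, "BAD <count>" otherwise
result=$(printf '%s\n' $eta | awk '
    BEGIN { ok = 1 }
    NR > 1 && $1 + 0 >= prev { ok = 0 }
    { prev = $1 + 0 }
    END { print (ok ? "OK" : "BAD"), NR }')
echo "$result"   # OK 51 (matches e_vert = 51)
```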

Run real.exe
> mpirun -np 2 real.exe
wrf@iniki.atmos.ucla.edu's password:
starting wrf task 0 of 2
starting wrf task 1 of 2
2.624u 1.248s 0:12.63 30.5% 0+0k 0+0io 0pf+0w
> tail rsl.out.0000
--> extrapolating TEMPERATURE near sfc: i,j,psfc, p target
d01 2007-09-03_12:00:00 forcing artificial silty clay loam
LAND CHANGE = 0 WATER CHANGE = 0
d01 2007-09-03_12:00:00 Timing for processing 0 s.
LBC valid between these times 2007-09-03_09:00:00.0000 2007-09-03_12:00:00
d01 2007-09-03_12:00:00 Timing for output 0 s.
d01 2007-09-03_12:00:00 Timing for loop # 13 = 0 s.
d01 2007-09-03_12:00:00 real_em: SUCCESS COMPLETE REAL_EM INIT

Aside: password-less execution
The last slide’s mpirun command asked for 2 CPUs (-np 2). By default, 2 CPUs on the same workstation are accessed. To avoid being asked for a password:
> cd ~/.ssh
> ssh-keygen -t dsa
[then hit return 4 times]
Your public key has been saved in /home/wrf/.ssh/id_dsa.pub.
The key fingerprint is:
cc:78:50:1e:77:23:ca:8f:81:3d:f0:d2:a4:8a:2e:a7 wrf@iniki.atmos.ucla.edu
> cp id_dsa.pub authorized_keys
[if it does not already exist]
> cd ../FELIX

Run wrf.exe
• Output of real.exe is wrfbdy_d01 and wrfinput_d01 (NetCDF files)
• Additional wrfinput files are created for nests if max_dom > 1
• Run the model:
> mpirun -np 4 wrf.exe &
• Creates wrfout_d01* files keyed by simulation date, and rsl.out/rsl.error files for each CPU requested

FELIX output
The namelist is set up to do a 36 h run. Look for this at the end of the rsl.out.0000 file:
d01 2007-09-03_12:00:00 wrf: SUCCESS COMPLETE WRF
Output files created:
wrfout_d01_2007-09-02_00:00:00
wrfout_d01_2007-09-02_06:00:00
wrfout_d01_2007-09-02_12:00:00
wrfout_d01_2007-09-02_18:00:00
wrfout_d01_2007-09-03_00:00:00
wrfout_d01_2007-09-03_06:00:00
wrfout_d01_2007-09-03_12:00:00
There are seven files because history_interval was 60 min and frames_per_outfile was 6.
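Given the start time and the history settings, the expected wrfout names can be generated ahead of time, which is handy when scripting postprocessing. A sketch assuming GNU date (for the @epoch syntax):

```shell
# One history file every 6 h: frames_per_outfile (6) x history_interval (60 min)
start_epoch=$(date -u -d "2007-09-02 00:00:00" +%s)   # GNU date
run_hours=36
hours_per_file=6

names=""
h=0
while [ "$h" -le "$run_hours" ]; do
    f=$(date -u -d "@$(( start_epoch + h * 3600 ))" +"wrfout_d01_%Y-%m-%d_%H:%M:%S")
    names="$names $f"
    h=$(( h + hours_per_file ))
done
echo $names   # the seven filenames listed above
```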

Postprocessing WRF output: RIP and GrADS (Vis5D and ARWpost also exist)

RIP
• RIP operates in batch mode, using input scripts
• RIP can overlay fields, do arbitrary cross-sections, calculate trajectories, and create Vis5D output files
• RIP tasks include:
 (1) unpack model output data (ripdp_wrf)
 (2) create RIP plotting scripts (rip.in files)
 (3) execute scripts (rip)
• RIP can create a LOT of output files

RIP procedure
> ripdp_wrf run1 all wrfout_d01*
[this creates a new dataset called ‘run1’ and uses all wrfout_d01 files created]
> rip run1 rip.T2.in
[the rip.T2.in file is a script containing RIP plotting commands]
[the output file, rip.T2.cgm, is a graphics metafile]
You can view the cgm file using idt or ictrans.

36 h forecast (2 m T - color; SLP - contour; 10 m winds - vector)

RIP script
===========================================================================
feld=T2; ptyp=hc; vcor=s; levs=1fb; cint=0.5; cmth=fill;>
arng; cbeg=283; cend=309; cosq=0,violet,12.5,blue,25,green,37.5,>
light.green,50,white,62.5,yellow,75,orange,87.5,red,100,brown
feld=U10,V10; ptyp=hv; vcmx=20.0; colr=black; linw=1; intv=2;
feld=slp; ptyp=hc; vcor=s; levs=1fb; cint=4; nohl;colr=blue;linw=2;nolb
feld=map; ptyp=hb; colr=dark.blue; linw=2;
feld=tic; ptyp=hb
http://www.mmm.ucar.edu/mm5/documents/ripug_V4.html

GrADS and wrf_to_grads
• GrADS produces beautiful graphics
• Batch scriptable AND interactive
• Interactive mode is good for overlaying different datasets and computing difference fields [can also be done in RIP]
• Doesn’t create huge numbers of intermediate files like RIP
• Arbitrary cross-sections are very difficult to construct

GrADS procedure
(1) Copy control_file from /home/fovell/WRFtutorial and edit it
(2) Select the variables desired and define the wrfout files to be accessed (next slide)
(3) > w2g control_file run1g
Creates run1g.ctl and run1g.dat
http://grads.iges.org/grads/head.html

control_file
-3 ! times to put in GrADS file, negative ignores this
0001-01-01_00:00:00
0001-01-01_00:05:00
0001-01-01_00:10:00
end_of_time_list
! 3D variable list for GrADS file
! indent one space to skip
U ! U component of wind
V ! V component of wind
UMET ! U component of wind - rotated (diagnostic)
VMET ! V component of wind - rotated (diagnostic)
W ! W component of wind
THETA ! Theta
TK ! Temperature in K
TC ! Temperature in C
[A list of available 2D fields follows]

control_file (cont.)
! A list of files to read here
! Indent not to read
! Full path OK
wrfout_d01_2007-09-02_00:00:00
wrfout_d01_2007-09-02_06:00:00
wrfout_d01_2007-09-02_12:00:00
wrfout_d01_2007-09-02_18:00:00
wrfout_d01_2007-09-03_00:00:00
wrfout_d01_2007-09-03_06:00:00
wrfout_d01_2007-09-03_12:00:00
end_of_file_list
! Now we check to see what to do with the data
real ! real (input/output) / ideal / static
1 ! 0=no map background in grads, 1=map background in grads
-1 ! specify grads vertical grid
 ! 0=cartesian,
 ! -1=interp to z from lowest h
 ! 1 list levels (either height in km, or pressure in mb)
1000.0 950.0 900.0 850.0 800.0 750.0

Running GrADS
> gradsnc -l
[GrADS graphics output window opens]
ga-> open run1g
[ga-> is the GrADS environment prompt]
ga-> /home/fovell/WRFtutorial/T2_movie.gs
[executes this GrADS script; hit return to advance a frame]
ga-> quit
[to exit]

36 h forecast (2 m T and 10 m winds)

= end =