1 Keith Kelley Presentation 2 Weather Forecasting with Parallel Computers

2 Numerical Weather Prediction (NWP) Weather Forecast Computer Modeling

5 Weather Model: defined ● Mathematical model – Mathematical description of a system ● Runs on a computer – Practically speaking, a parallel computer ● Results in a numerical prediction

6 Types of weather model ● Climate model ● Forecast model: global ● Forecast model: regional (or mesoscale) ● Atmospheric dispersion models ● Others, including special types for cyclones

7 Forecast Models: basis ● Fluid dynamics ● Thermodynamics ● Specifically: ● Atmospheric dynamics ● Atmospheric thermodynamics
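
For orientation (an illustration of mine, not from the slide), the dynamical core of such a model integrates equations of roughly this form, e.g. the horizontal momentum equation and the ideal gas law:

$$ \frac{D\mathbf{v}}{Dt} = -\frac{1}{\rho}\nabla p \;-\; f\,\hat{\mathbf{k}}\times\mathbf{v} \;+\; \mathbf{g} \;+\; \mathbf{F}, \qquad p = \rho R T $$

Here v is velocity, rho density, p pressure, f the Coriolis parameter, g gravity, F friction, R the gas constant, and T temperature; thermodynamic energy and moisture equations complete the set.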

8 Forecast Models Global ● IFS ● GEM ● GFS ● NOGAPS ● UM ● JMA ● GME ● ARPEGE Regional ● MM5 ● NAM ● RUC ● RAMS ● WRF ● RAQMS ● HIRLAM ● LAPS

9 How the models work ● Choose an area and set a grid ● Gather weather readings at the grid points ● Run the model in small time steps until goal ● Compare to reality
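
A minimal sketch of that loop, assuming a toy 1-D grid and a single advected quantity (my illustration, not WRF code): set up the grid, fill it with initial values standing in for the gathered readings, then advance in small time steps on a GPU.

#include <cstdio>
#include <cuda_runtime.h>

// One forward-in-time, upwind-in-space step of 1-D advection,
// du/dt + c*du/dx = 0: each thread updates one grid point.
__global__ void advect_step(const float *u_old, float *u_new,
                            int n, float c, float dt, float dx)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (i == 0) { u_new[0] = u_old[0]; return; }   // fixed inflow boundary
    u_new[i] = u_old[i] - c * dt / dx * (u_old[i] - u_old[i - 1]);
}

int main()
{
    const int   n  = 1024;                          // grid points over the chosen area
    const float dx = 1.0f, dt = 0.5f, c = 1.0f;     // CFL number c*dt/dx = 0.5

    float *u_a, *u_b;
    cudaMallocManaged(&u_a, n * sizeof(float));
    cudaMallocManaged(&u_b, n * sizeof(float));
    for (int i = 0; i < n; ++i)                     // "readings": here a synthetic bump
        u_a[i] = (i > 100 && i < 200) ? 1.0f : 0.0f;

    for (int step = 0; step < 500; ++step) {        // small time steps until the goal
        advect_step<<<(n + 255) / 256, 256>>>(u_a, u_b, n, c, dt, dx);
        cudaDeviceSynchronize();
        float *tmp = u_a; u_a = u_b; u_b = tmp;     // new field becomes old field
    }
    printf("u[400] after 500 steps: %f\n", u_a[400]);
    cudaFree(u_a); cudaFree(u_b);
    return 0;
}

A real model does the same thing in three dimensions, with many coupled fields and a physics step at every time step, and the result is then compared against what actually happened.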

10 Computers used ● From ENIAC ● To the most powerful supercomputers ● Clusters ● Even single processor computers ● Now Nvidia GPUs

11 WRF Model ● Weather Research and Forecasting ● Most widely used model ● Currently version 3.1 ● Replaces MM5 ● Mesoscale (regional) model ● First released in 2000 ● Used by the National Weather Service and the US military ● Multiple variants, including ARW, NMM, and HWRF

12 WRF Developers ● National Center for Atmospheric Research ● National Oceanic and Atmospheric Administration ● National Centers for Environmental Prediction ● Forecast Systems Laboratory ● Air Force Weather Agency ● Naval Research Laboratory ● University of Oklahoma ● Federal Aviation Administration and others

13 WRF Design Goals ● Designed to replace MM5, RUC, and Eta ● Modular, flexible, maintainable, and extensible ● Overcome the restrictions of older programming languages ● Completely new code base ● Run on different types of parallel computers

14 WRF Development Teams ● Numerics and Software ● Data Assimilation ● Analysis and Validation ● Community Involvement ● Operational Implementation

15 WRF Working Groups ● Dynamic Model Numerics ● Software Architecture, Standards, and Implementation ● Analysis and Visualization ● Data Handling and Archiving ● others

16 Realizing WRF ● Model design ● Model development ● Software infrastructure design ● Software design and development

17 Model Design ● Pluggable physics modules to share with other packages ● Major types: microphysics, cumulus parameterization, planetary boundary layer, turbulence, radiation
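
One way to picture "pluggable" physics, using hypothetical names (this is not WRF's actual interface): the driver calls whatever scheme was selected through a common signature, so schemes can be swapped out or shared with other packages.

#include <stdio.h>

/* Hypothetical plug-in interface: any microphysics scheme matching
 * this signature can be selected by the driver at run time. */
typedef void (*microphysics_fn)(float *qv, float *qc, int n, float dt);

/* A stand-in scheme: condense any vapor above a fixed threshold. */
static void simple_scheme(float *qv, float *qc, int n, float dt)
{
    (void)dt;                               /* a real scheme would use the time step */
    for (int i = 0; i < n; ++i) {
        float excess = qv[i] - 0.01f;
        if (excess > 0.0f) { qv[i] -= excess; qc[i] += excess; }
    }
}

int main(void)
{
    microphysics_fn microphysics = simple_scheme;   /* "plug in" a scheme */
    float qv[4] = {0.005f, 0.020f, 0.015f, 0.008f}; /* vapor mixing ratios */
    float qc[4] = {0};                              /* cloud water */
    microphysics(qv, qc, 4, 60.0f);
    printf("qc[1] = %f\n", qc[1]);
    return 0;
}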

18 Software Infrastructure Design ● WRF Advanced Software Framework (ASF) ● four-dimensional variational data assimilation (4DVAR) ● single-source-code methodology ● use of modern programming-language constructs in Fortran 90 ● a layered software architecture with well-defined interfaces ● multi-level parallel decomposition ● a code registry database ● application program interfaces (APIs) ● a nesting/coupling infrastructure that is scalable and efficient ● choice of a storage order and loop nesting order
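
On the last bullet, a generic illustration (sizes and names are mine): the loop nesting order is chosen so that the innermost loop runs over the index that is contiguous in memory; WRF's own convention is an i-k-j ordering.

#include <stdio.h>

#define NI 64    /* west-east   */
#define NK 32    /* vertical    */
#define NJ 64    /* south-north */

static float field[NJ][NK][NI];   /* i varies fastest in memory */

int main(void)
{
    double sum = 0.0;
    for (int j = 0; j < NJ; ++j)            /* outermost: slowest-varying index */
        for (int k = 0; k < NK; ++k)
            for (int i = 0; i < NI; ++i)    /* innermost: unit-stride accesses */
                sum += field[j][k][i];
    printf("sum = %f\n", sum);
    return 0;
}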

19 WRF Software Architecture

20 WRF Software Design ● Parallelism: two-level decomposition ● Hierarchical software design – Driver layer – Model layer – Mediation layer ● External libraries for platform support, data input
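
A toy sketch of the two-level decomposition, with invented sizes (not WRF's actual code): the domain is split into patches, one per distributed-memory process, and each patch into tiles, one per shared-memory thread, loosely mirroring the driver/mediation/model layering.

#include <stdio.h>

#define NX 100            /* domain size, west-east   */
#define NY 80             /* domain size, south-north */
#define PATCHES_X 2
#define PATCHES_Y 2
#define TILES_PER_PATCH 4

int main(void)
{
    int px = NX / PATCHES_X, py = NY / PATCHES_Y;       /* patch extent */
    for (int pj = 0; pj < PATCHES_Y; ++pj)
        for (int pi = 0; pi < PATCHES_X; ++pi) {        /* one patch per process */
            int ty = py / TILES_PER_PATCH;              /* split patch into row tiles */
            for (int t = 0; t < TILES_PER_PATCH; ++t) { /* one tile per thread */
                int i0 = pi * px,          i1 = i0 + px;
                int j0 = pj * py + t * ty, j1 = j0 + ty;
                /* the model layer would compute on [i0,i1) x [j0,j1) here */
                printf("patch (%d,%d) tile %d: i=[%d,%d) j=[%d,%d)\n",
                       pi, pj, t, i0, i1, j0, j1);
            }
        }
    return 0;
}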

21 WRF on CUDA ● Done by John Michalakes, one of the main developers of WRF ● Part of the main source tree ● Results in significant performance gains ● Not a full port, but an acceleration of certain kernels: ● WRF Single-Moment 5-class (WSM5) Cloud Microphysics ● WRF Fifth-Order Positive-Definite Tracer Advection
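
To show what accelerating a kernel means in practice, here is an illustrative CUDA kernel of my own (not the real WSM5 code): column-independent physics maps naturally onto one GPU thread per horizontal grid point, each sweeping its own vertical column.

#include <cstdio>
#include <cuda_runtime.h>

// Toy "microphysics": condense vapor above saturation, column by column.
__global__ void saturation_adjust(float *qv, float *qc, float qsat,
                                  int ncols, int nlev)
{
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (col >= ncols) return;
    for (int k = 0; k < nlev; ++k) {        // walk up this thread's column
        int idx = col * nlev + k;
        float excess = qv[idx] - qsat;
        if (excess > 0.0f) { qv[idx] -= excess; qc[idx] += excess; }
    }
}

int main()
{
    const int ncols = 4096, nlev = 27, n = ncols * nlev;
    float *qv, *qc;
    cudaMallocManaged(&qv, n * sizeof(float));
    cudaMallocManaged(&qc, n * sizeof(float));
    for (int i = 0; i < n; ++i) { qv[i] = 0.012f; qc[i] = 0.0f; }

    saturation_adjust<<<(ncols + 127) / 128, 128>>>(qv, qc, 0.01f, ncols, nlev);
    cudaDeviceSynchronize();
    printf("qc[0] = %f\n", qc[0]);
    cudaFree(qv); cudaFree(qc);
    return 0;
}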

22 WRF CUDA Power (WSM5)

23 WRF CUDA Code Excerpt

//---------------------------------------------------------------
// psfrz: freezing of rain water [HL A20] [LFO 45]
//        (T<T0, QR->QS)
//---------------------------------------------------------------
if( supcol > 0. && qr_k > 0. ) {
  float temp = rsloper[k] ;
  temp = temp*temp*temp*temp*temp*temp*temp ;
  float pfrzdtr = MIN(20.*(pi*pi)*pfrz1*n0r*denr/den[k]
                      *(exp(pfrz2*supcol)-1.)*temp*dtcld, qr_k) ;
  qs_k = qs_k + pfrzdtr ;
  t_k  = t_k + xlf/cpm_k*pfrzdtr ;
  qr_k = qr_k - pfrzdtr ;
}

24 Model to Weather Forecast ● Practical limit of about 6-7 days of useful forecast per model run ● Model Output Statistics ● Ensemble forecasting
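
A toy example of the ensemble idea, with invented numbers: run the model several times from slightly perturbed initial states, then summarize the members, for instance by their mean and spread.

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* five forecast temperatures (degC) from five perturbed model runs */
    float members[5] = {21.3f, 22.1f, 20.8f, 23.0f, 21.7f};
    int n = 5;
    float mean = 0.0f, var = 0.0f;
    for (int i = 0; i < n; ++i) mean += members[i] / n;
    for (int i = 0; i < n; ++i) var  += (members[i] - mean) * (members[i] - mean) / n;
    printf("ensemble mean %.1f degC, spread %.1f degC\n", mean, sqrtf(var));
    return 0;
}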

25 References
● mmm.ucar.edu
● wrf-model.org
– Michalakes, J., J. Dudhia, D. Gill, J. Klemp, and W. Skamarock: "Design of a Next-Generation Regional Weather Research and Forecast Model", Towards Teracomputing, World Scientific, River Edge, New Jersey, 1998, pp. 117-124.
– Michalakes, J., S. Chen, J. Dudhia, L. Hart, J. Klemp, J. Middlecoff, and W. Skamarock (2001): "Development of a Next Generation Regional Weather Research and Forecast Model", in Developments in Teracomputing: Proceedings of the Ninth ECMWF Workshop on the Use of High Performance Computing in Meteorology, eds. Walter Zwieflhofer and Norbert Kreitz, World Scientific, Singapore, pp. 269-276.
– Michalakes, J., M. McAtee, and J. Wegiel: "Software Infrastructure for the Weather Research and Forecast Model", in Proceedings of UGC 2002, June 2002, Austin, Texas, 13 pp.
– more...

26 Question ● Q: What part of the WRF code was sped up by using a GPU? ● A: The microphysics kernel

