
1 NEMS 3.0 and beyond. Mark Iredell, June 30, 2010

2 NEMS 3.0 and beyond
Status
 – Structure: eight layers; coupling
 – Rules: regression; Subversion
 – Tasks: IO; documentation
 – Schedule
Proposal
 – Libraries
 – Postprocessor
 – Insulation

3 NEMS 3.0 Component Structure [component-tree diagram]. MAIN drives NEMS, which holds the ensemble coupler (EnsCpl) and EARTH(1:M); each EARTH couples Ocean, Atmos, and Ice; Atmos holds the cores NMM, GFS, and FIM, each with Dyn, Phy, and Wrt components over Domains(1:N), plus Chem. Below the dashed line the cores can be organized as best suits their own needs.

4 NEMS hierarchy of components
A. NEMS – couples the ensemble
B. EARTH – couples atm/ocn
C. ATM – couples gfs/nmm
D. NMMdriver – couples nests
E. NMMinstance – couples dyn/phys
F. PHYS – couples physics/chemistry/land
G. CHEM

5 NEMS hierarchy rules for each component
 – One coupling per component, generally.
 – The clock received is read only; use the start and end times given.
 – A new clock may be created at each level. This local clock can be updated in a loop; the local clock increment is the coupling interval.
 – Read own config file to determine options such as the local clock increment.
 – Create own internal state as usual.
 – Create import and export states of children, including couplers, as usual.
 – Attach the import state to the child's import and attach the child's export to its export.
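To make these rules concrete, here is a minimal plain-Fortran sketch of a component run routine that follows them. It stands in for the actual ESMF calls, and the file name, namelist, and child routine (comp.configure, comp_nml, child_run) are illustrative placeholders, not NEMS names: the parent clock is only read, and a local clock is advanced by the coupling interval read from the component's own configure file.

  ! Schematic sketch only: plain Fortran standing in for ESMF component code.
  module comp_sketch
    implicit none
  contains
    subroutine comp_run(start_time, end_time)
      real, intent(in) :: start_time, end_time   ! parent clock: read only
      real :: local_time, coupling_interval
      integer :: iunit, ios
      namelist /comp_nml/ coupling_interval

      ! Read own config file to find options such as the local clock increment.
      open(newunit=iunit, file='comp.configure', status='old', iostat=ios)
      if (ios == 0) then
        read(iunit, nml=comp_nml, iostat=ios)
        close(iunit)
      end if
      if (ios /= 0) coupling_interval = 3600.0   ! fall back to a default

      ! A new local clock, created at this level, advances by the coupling
      ! interval; the clock received from the parent is never modified.
      local_time = start_time
      do while (local_time < end_time)
        call child_run(local_time, local_time + coupling_interval)
        local_time = local_time + coupling_interval
      end do
    end subroutine comp_run

    subroutine child_run(t0, t1)
      real, intent(in) :: t0, t1
      print *, 'child runs from', t0, 'to', t1
    end subroutine child_run
  end module comp_sketch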

6 Choosing a core
 – The Atmosphere component chooses the core or cores to run.
 – In Atmosphere Init, the core name is read in from the configure file.
 – Atmosphere calls the specific core component's setservices and links it to the generic Atmos core component name.
 – Each possible core not used has a setservices stub file to resolve links.
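A hedged sketch of this selection pattern in plain Fortran; the routine names (core_setservices, nmm_setservices, and so on) are illustrative rather than the actual NEMS symbols, and the real code registers ESMF entry points instead of printing messages.

  ! Sketch of core selection driven by the configure file.
  module core_select
    implicit none
  contains
    subroutine core_setservices(core_name)
      character(len=*), intent(in) :: core_name   ! read from the configure file in Atmosphere Init
      select case (trim(core_name))
      case ('nmm')
        call nmm_setservices()
      case ('gfs')
        call gfs_setservices()
      case ('fim')
        call fim_setservices()
      case default
        print *, 'unknown core: ', trim(core_name)
      end select
    end subroutine core_setservices

    ! In NEMS each unused core is satisfied by a stub setservices file so the
    ! executable still links; here the stubs simply announce themselves.
    subroutine nmm_setservices()
      print *, 'NMM setservices registered'
    end subroutine nmm_setservices
    subroutine gfs_setservices()
      print *, 'GFS setservices registered'
    end subroutine gfs_setservices
    subroutine fim_setservices()
      print *, 'FIM setservices stub (core not built)'
    end subroutine fim_setservices
  end module core_select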

7 Run-time Registry
Each component lists fields and their disposition (History, Restart, Import, Export, Ownership) in a run-time registry. For example:

  NMMB-Dynamics  Temperature    H R I E O
  GFS-Phys       Temperature    H - I E O
  GFS-Phys       Precipitation  H - - - -

Names are linked to internal state names in code and can be aliased to CF names. Ownership can be shared during development and single during operations.
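A registry entry could be pictured as the derived type below; this is an illustrative sketch, not the actual NEMS registry format, and the CF aliases shown are examples only.

  module registry_sketch
    implicit none
    type :: registry_entry
      character(len=32) :: internal_name   ! name used in the component's internal state
      character(len=32) :: cf_alias        ! optional alias to a CF standard name
      logical :: history, restart, import, export, ownership
    end type registry_entry
  contains
    subroutine show(e)
      type(registry_entry), intent(in) :: e
      print '(a,2x,5l2)', trim(e%internal_name), &
           e%history, e%restart, e%import, e%export, e%ownership
    end subroutine show
  end module registry_sketch

  program registry_demo
    use registry_sketch
    implicit none
    type(registry_entry) :: t_dyn, p_phys
    ! NMMB-Dynamics temperature: History, Restart, Import, Export, Ownership
    t_dyn  = registry_entry('temperature', 'air_temperature', &
                            .true., .true., .true., .true., .true.)
    ! GFS-Phys precipitation: History only
    p_phys = registry_entry('precipitation', 'precipitation_flux', &
                            .true., .false., .false., .false., .false.)
    call show(t_dyn)
    call show(p_phys)
  end program registry_demo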

8 Same code can be used for both 1-way and 2-way coupling

  subroutine atmos_run
    do timeloop
      run global
      couple g2m
      run meso
      couple m2g
    end do

Processor sets (set up in atmos_init):
               1-way coupling              2-way coupling
               (N=total atmos, G=global)   (N=total atmos)
  run global   1:G                         1:N
  couple g2m   1:N                         1:N
  run meso     G+1:N                       1:N
  couple m2g   0                           1:N

9 Same code can be used for both standalone runs (global-only and meso-only)

  subroutine atmos_run
    do timeloop
      run global
      couple g2m
      run meso
      couple m2g
    end do

Processor sets (set up in atmos_init):
               standalone global   standalone meso
  run global   1:N                 0
  couple g2m   0                   0
  run meso     0                   1:N
  couple m2g   0                   0
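The point of these two slides is that atmos_run itself never changes; only the processor sets built in atmos_init differ between 1-way, 2-way, and standalone runs. Below is a schematic sketch of that idea using the slide's 1:N task notation; the range arrays and the on() helper are illustrative stand-ins for the real processor-set tests against the MPI rank.

  program atmos_run_sketch
    implicit none
    ! Illustrative task numbering 1:N, matching the slide notation.
    integer :: mype
    integer :: global_pes(2), meso_pes(2), g2m_pes(2), m2g_pes(2)  ! [first,last]; [1,0] = empty
    integer :: istep

    mype = 1
    ! Example: 1-way coupling with N=8 total atmos tasks, G=4 global tasks.
    global_pes = [1, 4]
    meso_pes   = [5, 8]
    g2m_pes    = [1, 8]
    m2g_pes    = [1, 0]    ! empty: no meso-to-global feedback in 1-way mode

    do istep = 1, 2
      if (on(global_pes)) print *, 'task', mype, 'step', istep, ': run global'
      if (on(g2m_pes))    print *, 'task', mype, 'step', istep, ': couple g2m'
      if (on(meso_pes))   print *, 'task', mype, 'step', istep, ': run meso'
      if (on(m2g_pes))    print *, 'task', mype, 'step', istep, ': couple m2g'
    end do

  contains
    logical function on(range)
      integer, intent(in) :: range(2)
      on = (mype >= range(1) .and. mype <= range(2))
    end function on
  end program atmos_run_sketch

For 2-way coupling, atmos_init would set every range to 1:N; for a standalone meso run it would leave the global and coupler sets empty.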

10 Code is slightly different to enable concurrent 2-way coupling

  subroutine earth_run
    do timeloop
      run atmos
      run ocean
      couple (skin or classic)
    end do

Processor sets (set up in earth_init):
                2-way concurrent            2-way ~concurrent
                (N=total earth, A=atmos)    (N=total earth, A=atmos)
  run atmos     1:A                         1:N
  run ocean     A+1:N                       1:N
  couple        1:N                         1:N

11 Generic coupling component

  subroutine generic_run
    do timeloop
      do n = 1, N models
        run model(n)
        coupler1(n)
      end do
      coupler2
    end do

Processor sets by mode:
                1-way (standalone if N=1)   2-way seq   2-way con   2-way ~con
  run model(n)  Pn                          all         Pn          Pn
  coupler1(n)   Pn+Cn                       all         0           0
  coupler2      0                           0           all         all
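A schematic sketch of the generic run sequence: coupler1(n) follows each model inside the loop, while coupler2 runs once per timestep after all models. Which coupler actually executes, and on which tasks, depends on the mode chosen at init time; the logical switches here are illustrative stand-ins for that configuration.

  program generic_run_sketch
    implicit none
    integer, parameter :: nsteps = 2, nmodels = 3
    logical :: use_coupler1, use_coupler2
    integer :: istep, n

    use_coupler1 = .false.   ! per-model coupling: 1-way and 2-way sequential modes
    use_coupler2 = .true.    ! end-of-step coupling: 2-way concurrent / ~concurrent modes

    do istep = 1, nsteps
      do n = 1, nmodels
        call run_model(n)
        if (use_coupler1) call run_coupler1(n)
      end do
      if (use_coupler2) call run_coupler2()
    end do

  contains
    subroutine run_model(n)
      integer, intent(in) :: n
      print *, 'step', istep, ': run model', n
    end subroutine run_model
    subroutine run_coupler1(n)
      integer, intent(in) :: n
      print *, 'step', istep, ': coupler1 for model', n
    end subroutine run_coupler1
    subroutine run_coupler2()
      print *, 'step', istep, ': coupler2 (all models)'
    end subroutine run_coupler2
  end program generic_run_sketch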

12 NEMS issues in high-level coupling between atmosphere, waves, ice, ocean, and land
 – Under ESMF NEMS, there should not be much difference between SPMD and MPMD codes. However, the MPMD code might not be as portable; MPMD will allow more flexible optimization, though.
 – Because some physics interacts rapidly, some coupling would have to be frequent and possibly iterative. The fast-mode components may need to run serially on the atmospheric component's processors.
 – The fast and slow "modes" of model components may have different import and export requirements, and notably different processor sets.
 – Assume the "fast" physics is column-oriented and does not require horizontal differencing. It may require coupling to different grids, however. If the "fast" physics does not require different grids, physics subprograms with plug-compatible interfaces may suffice instead of full gridded components.
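As a sketch of the last point, a plug-compatible column interface might look like the routine below: the caller passes one column of state and receives tendencies, with no knowledge of the grid or decomposition. The argument list and the trivial relaxation "physics" are illustrative only.

  program column_physics_sketch
    implicit none
    integer, parameter :: nlev = 4, ncols = 3
    real :: t(nlev, ncols), q(nlev, ncols), t_sfc(ncols)
    real :: t_tend(nlev), q_tend(nlev)
    integer :: i

    t = 280.0
    q = 0.005
    t_sfc = [285.0, 290.0, 288.0]

    ! The driver loops over whatever columns it owns and calls the column
    ! routine one column at a time; the routine never sees the grid.
    do i = 1, ncols
      call fast_sfc_column(nlev, 60.0, t(:, i), q(:, i), t_sfc(i), t_tend, q_tend)
      print *, 'column', i, 'lowest-level T tendency:', t_tend(1)
    end do

  contains
    subroutine fast_sfc_column(nlev, dt, t, q, t_sfc, t_tend, q_tend)
      integer, intent(in)  :: nlev          ! levels in this column
      real,    intent(in)  :: dt            ! fast physics timestep (s)
      real,    intent(in)  :: t(nlev)       ! temperature (K)
      real,    intent(in)  :: q(nlev)       ! specific humidity (kg/kg)
      real,    intent(in)  :: t_sfc         ! surface temperature (K)
      real,    intent(out) :: t_tend(nlev)  ! temperature tendency (K/s)
      real,    intent(out) :: q_tend(nlev)  ! humidity tendency (1/s)
      ! Placeholder "physics": relax the lowest level toward the surface.
      t_tend = 0.0
      q_tend = 0.0
      t_tend(1) = 0.1 * (t_sfc - t(1)) / dt
    end subroutine fast_sfc_column
  end program column_physics_sketch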

13 Possible NEMS Earth system model component tree [diagram]. The tree shows the Atmosphere (Dynamics and Physics, with radiation, surface turbulence, etc. as column subprograms), a fast-slow coupler, fast surface components (Ocean, Waves, Ice, Land), and the corresponding slow-process components (Ocean, Waves, Ice, Land). The color key distinguishes component classes, coupler classes, and column subprograms; C marks iteration.

14 Earth system model load-balance scenario, assuming 3 fast timesteps for every 1 slow timestep [processor-versus-time diagram]. Each fast step runs the fast ice, land, waves, and ocean components plus surface-atmosphere coupling alongside the atmosphere, while the slow ice, land, waves, and ocean components run concurrently on other processors; fast-slow two-way coupling occurs once per slow timestep.
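The cadence in this scenario can be sketched as two nested time loops, three fast steps per slow step, with fast-slow two-way coupling once per slow step. The print statements below stand in for component runs that, in the scenario above, would actually execute concurrently on separate processor sets.

  program fast_slow_cadence
    implicit none
    integer, parameter :: nslow = 2            ! slow timesteps to run
    integer, parameter :: nfast_per_slow = 3   ! fast steps per slow step
    integer :: islow, ifast

    do islow = 1, nslow
      do ifast = 1, nfast_per_slow
        print *, 'slow step', islow, ', fast step', ifast, &
                 ': fast ice/land/waves/ocean, sfc-atm coupling, atmosphere'
      end do
      print *, 'slow step', islow, ': slow ice/land/waves/ocean (run concurrently)'
      print *, 'slow step', islow, ': fast-slow two-way coupling'
    end do
  end program fast_slow_cadence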

15 Regression Testing and Subversion
 – All NEMS developers have full access to the NEMS development trunk as well as their own branches.
 – Merging back to the trunk is the responsibility of the developer.
 – The full regression test set must be passed before committing back to the trunk. The regression test set ensures certain capabilities are not broken; it may take a few hours to run.
 – (Current) One day's email notice is given before a commit.
 – (Future) Changes are signed off using the Trac ticketing system.

16 NEMS developers
 – Ed Colon: makefiles, scripts, regression
 – Nicole McKee: documentation, web, testing
 – Ratko Vasic: upgrades, regression, atmos coupling
 – Jun Wang: IO, post, configuration
 – Weiyu Yang: ensemble, earth coupling, ESMF

17 NEMS developers
 – NAM: Tom Black, Dusan Jovic, Jim Abeles
 – GFS: S. Moorthi, Henry Juang
 – Land: Jesse Meng, Jim Geiger
 – GOCART: Sarah Lu, Arlindo da Silva
 – FIM: Tom Henderson

18 XML post control file for NMMB (only the element values survive in this transcript): BGDAWP, grid 255, ncep_nco, v2003, local_tab_no, fcst, oper, fcst hour, nws_ncep, nmm_8km, jpeg2000, fltng_pnt, lossless; example field entry: PRES, hybrid_lvl, 1, 2 0 0 -2 -2.

19 XML post control file, continued (only the element values survive in this transcript): BGRDSF, grid 255, ncep_nco, v2003, local_tab_no, fcst, oper, fcst hour, nws_ncep, nmm_8km, jpeg2000, lossless; example field entry: PRES, hybrid_lvl, 1, 0 -2.

20 Results

B-grid 8 km output from NEMS NMMB (954*835, 1098 fields):
                          Disk space      Wall time    Total memory
  grib1                   950 MB          275          740 MB
  parallel grib2 (JPEG)   296 MB (31%)    220 (80%)    1168 MB (157%)

GFS master files from NEMS GFS (1152*576, 664 fields):
                          Disk space      Wall time    Total memory
  grib1                   605 MB          72           0.85 GB
  parallel grib2 (JPEG)   195 MB (32%)    60 (83%)     1.1 GB (129%)

(Percentages are relative to the grib1 values.)

21 NEMS delivery plans
2010
 – NMMB with nests
2010-2011
 – GFS
 – GEFS
 – Post on quilt
2011
 – FIM
 – Multimodel ensemble
 – GRIB2 output
 – NMM nested in GFS
2012
 – Moving nests
 – Coupled ocean-atmosphere
 – Tiled land model
 – netCDF output

22 Proposals
The Global branch (GFS and GEFS) should buy in as the Meso branch has.
 – Moorthi is working on bringing NEMS up to the GFS Model 9.0.0 level.

23 Proposals
Bring non-scientific library support and management within the software development team, perhaps in collaboration with NCO:
 – GRIB2
 – IPLIB
 – Physical constants module

24 Proposals
Develop the postprocessor as both a standalone code and one that runs on the quilt. Management of the scientific postprocessor code would be done outside of NEMS management; that is, regression testing of NEMS and of the postprocessor would be independent. This could be implemented using tagged versions.

25 Proposals
Insulate other scientific codes, such as column physics, from NEMS. Column physics could become an independently managed library. Similarly, the NAM and GFS science codes could be insulated as well; their development would be independent of NEMS regression testing. This could be implemented using tagged versions.

26 Code Management Relationships [diagram; the legend marks which codes are NEMS managed].

