Report on POP & CICE of RACM components Jaromir Jakacki, IO PAS.

Report on POP & CICE of RACM components Jaromir Jakacki, IO PAS Boulder, CO, 2010.

Main problems during the period
- Extended domain in POP
- NetCDF output of the ocean component does not work correctly (fixed temporarily; a proper fix requires a different makefile)
- The kinetic energy problem was resolved once the winds are damped; successful 6-year integration of the ocn/ice/datm/slnd components
- The negative-area problem was also fixed when the coupling time step was set to 1800 s

Extended domain
- Monthly climatology data (temperature and salinity from PHC 3.0) were interpolated onto the extended domain
- Code was added to read these data and the mask (ocn_init_mct routine)
- Code was added to interpolate the data in time and merge them with the ocean domain at each coupling time step (ocn_setdef_mct routine)
- Everything was tested and seems to be working fine; only the mask will require a little work

Extended domain – interpolation of data (at coupling time step)
1) Read in the data (monthly SST and SSS from PHC 3.0, plus the mask)
2) Interpolate the data in ocn_comp_mct.F90 at each coupling time step using a simple linear method, Y = AX + B. For two months, Y1 = AX1 + B and Y2 = AX2 + B, so A = (Y2 - Y1)/(X2 - X1) and B = Y2 - AX2
3) Merge the interpolated data with the ocean sea surface temperature
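The slope-intercept scheme above can be sketched in a few lines (a minimal Python illustration only; the mid-month day values and sample SST values are assumptions for the example, not the actual model code):

```python
# Mid-month "day" coordinates of two bracketing climatology records
# (values assumed for illustration)
x1, x2 = 15.5, 45.0

def interp_clim(y1, y2, x1, x2, day):
    """Linear interpolation Y = A*X + B between two monthly values:
    A = (Y2 - Y1)/(X2 - X1), B = Y2 - A*X2."""
    a = (y2 - y1) / (x2 - x1)   # slope A
    b = y2 - a * x2             # intercept B
    return a * day + b

# Interpolate a sample SST value to a day between the two records
sst = interp_clim(271.4, 272.8, x1, x2, 30.25)
```

At day = X1 the formula returns Y1 exactly and at day = X2 it returns Y2, so the interpolated field matches the monthly means at the record midpoints.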

Extended domain (24 points; the current version has 48): SST_OUT = SST_CLIM*mask + SST_POP*(1-mask)
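The merging formula on the slide can be sketched as follows (a toy Python illustration; the mask values are assumptions, with mask = 1 in the climatology buffer zone and mask = 0 in the model interior):

```python
def merge_sst(sst_clim, sst_pop, mask):
    """Blend climatology and model SST per the slide formula:
    SST_OUT = SST_CLIM*mask + SST_POP*(1-mask).
    mask=1 gives pure climatology, mask=0 pure POP, in between blends."""
    return [c * m + p * (1.0 - m)
            for c, p, m in zip(sst_clim, sst_pop, mask)]

# Toy 1-D example: buffer point, transition point, interior point
sst_out = merge_sst([275.0, 275.0, 275.0],   # climatology SST
                    [273.0, 273.0, 273.0],   # POP SST
                    [1.0, 0.5, 0.0])         # merge mask
# sst_out -> [275.0, 274.0, 273.0]
```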

Interpolated and merged extended domain (2nd result): SST from POP and climatology SST (PHC)

Interpolated extended domain (1st result)

Interpolated and merged extended domain (2nd result): SSS from POP and climatology SSS

Examples of results from ocn and ice components

Future plans
- We are going to switch to another grid (it will not exactly match the current one)
- The mask for merging SST and SSS requires some work, but only the mask; the system is ready for running, and the mask can be replaced at any time
- After switching to the new grid we will be able to begin integration; I plan to start working with the ocn/ice/wrf/slnd model using a converted old ocean restart file

New grid

Important things
- NetCDF in the ocean model: optional compilation of io_netcdf.F90, io.F90, io_types.F90 and io_binary.F90
- Diagnostic output in all models (performance): there is a debugging level from 0 to 4 (not sure), but there are a lot of write statements from all cores
- Sometimes the model hangs after saving the ocean restart file (I have no idea why; there is no output, and the jobs are killed on wall time): about 20% of jobs
- Computer resources: the waiting time for one 256-processor job is about 2 days!

Performance chart for all models (figure: x-axis = number of cores, y-axis = average time [sec])

Code for linear interpolation (just for checking; maybe somebody will spot a mistake; not for presentation)

  ! Mid-month day values for the 12 monthly records plus the wrap-around months
  data data_days /-14.5, 15.5, 45.0, 74.5, 105.0, 135.5, 166.0, 196.5, &
                  227.5, 258.0, 288.5, 319.0, 349.5, 380.5/

  ! interpolation of temperature: copy the two bracketing monthly SST fields
  call mct_aVect_copy(avSST12, av_xp1, temp_flds(xp1), 'fld')
  call mct_aVect_copy(avSST12, av_xp2, temp_flds(xp2), 'fld')
  ! slope A and intercept B of Y = A*X + B, evaluated at the current day (rday)
  av_A%rAttr(1,:) = (av_xp2%rAttr(1,:) - av_xp1%rAttr(1,:)) / &
                    (data_days(xp2) - data_days(xp1))
  av_B%rAttr(1,:) = av_xp2%rAttr(1,:) - av_A%rAttr(1,:)*data_days(xp2)
  av_l_temp%rAttr(1,:) = av_A%rAttr(1,:)*rday + av_B%rAttr(1,:)

  ! interpolation of salinity: same scheme with the monthly SSS fields
  call mct_aVect_copy(avSSS12, av_xp1, salt_flds(xp1), 'fld')
  call mct_aVect_copy(avSSS12, av_xp2, salt_flds(xp2), 'fld')
  av_A%rAttr(1,:) = (av_xp2%rAttr(1,:) - av_xp1%rAttr(1,:)) / &
                    (data_days(xp2) - data_days(xp1))
  av_B%rAttr(1,:) = av_xp2%rAttr(1,:) - av_A%rAttr(1,:)*data_days(xp2)
  av_l_salt%rAttr(1,:) = av_A%rAttr(1,:)*rday + av_B%rAttr(1,:)