IDC HPC User Forum, Weather & Climate Panel
September 2009, Broomfield, CO
Panel questions: one response per question; limit each response to one slide.
Panel Format
- Sequence: alphabetical
- A few bullet points for each question; each participant may address or supplement them
- After each panel member has finished, we move on to the next question
- Moderators may adjust depending on discussion and time constraints
Panel Members
Moderators: Steve Finn & Sharan Kalwani

Participant        Affiliation
Jim Doyle          DoD HPC Modernization Program
Jim Hack           ORNL
John Michalakes    NCAR
Henry Tufo         University of Colorado
Q1. Relative importance of data, resolution, and microphysics
Please quantify the relative importance of improvements in observational data, grid resolution, and cloud microphysics for future forecast accuracy.
For prediction:
1. Observations, and understanding of observations
   - Data assimilation
   - Ensembles
2. Physics
   - Scale-appropriate parameterizations
   - Sensitivities
   - Superparameterizations
3. Resolution
   - Explicitly resolve scales
   - Convergence studies, feeding back into prediction
Q2. Adaptive mesh or embedded grids: their impact
Please discuss the use of adaptive mesh or embedded grids (urban areas, terrain impact on advection, ...), and how increased future use would affect system requirements such as interconnects.
- Nesting
  - Domains interact sequentially
  - Scatter-gather of 3D fields between domains
- Spatial refinement: in place, adding cells
- Temporal refinement (future)
- Adaptivity (future)
- Coupling: load balancing, bandwidth
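The scatter-gather step above can be sketched in a few lines. This is a simplified illustration, not the actual nesting machinery of any model: `scatter` and `gather` are hypothetical names, the refinement here is plain replication/block-averaging (real models interpolate and apply feedback masks), and the example is 2D rather than 3D.

```python
import numpy as np

def scatter(parent, i0, j0, ni, nj, ratio=3):
    """Copy a parent-domain subregion onto a finer nest grid.
    Nearest-neighbor replication stands in for real interpolation."""
    sub = parent[i0:i0 + ni, j0:j0 + nj]
    return np.kron(sub, np.ones((ratio, ratio)))

def gather(parent, nest, i0, j0, ratio=3):
    """Feed the nest solution back to the parent by block-averaging
    each ratio-by-ratio patch of nest cells onto one parent cell."""
    ni, nj = nest.shape[0] // ratio, nest.shape[1] // ratio
    blocks = nest.reshape(ni, ratio, nj, ratio)
    parent[i0:i0 + ni, j0:j0 + nj] = blocks.mean(axis=(1, 3))
    return parent
```

In a distributed run, these two operations are the communication hot spots: every coupling step moves whole 3D fields between the processor sets owning the parent and the nest, which is why nesting stresses interconnect bandwidth and load balancing.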
Q3. Ratio of data to compute: background
What are your bytes per flop for future requirements?
- Assuming the question means "bytes of main memory per sustained flop/s" (D. H. Bailey)
- Current: lots of headroom
  - ~2000 ops per cell per second; ~800 bytes (4-byte floats) per cell = 0.4 bytes per op
- Future
  - Resolution follows the 3/4 rule (2/3 in practice)
  - Adding physics or chemistry should not upset this ratio
- This is a relatively *low* ratio compared with some other benchmarks (Geerd Hoffmann of DWD cited 4 bytes/flop at the October 2007 HPC User Forum in Stuttgart)
Q4. Open-source codes in the community
What are the importance and impact of open-source / community-code applications such as CCSM, WRF, ...?
- A common modeling tool fosters interaction, outreach, and ultimately advancement of the science
- Relevant HPC application benchmarks
Q5. Data and collaboration, formats, future needs
What is the level of collaboration and standardization in data management and in observational and results databases (e.g., common file formats, web-based data)? What is needed in the future?
- Scientific and technical interoperability for multi-model simulation systems
- Metadata formalisms, conventions, and infrastructure:
  - Earth System Curator (www.earthsystemcurator.org)
  - Earth System Grid (www.earthsystemgrid.org)
Q6. Ensemble models: your experiences
Has the use of ensemble models had any positive or negative impact on code scaling requirements?
- Ensembles have a positive effect on scaling because they are trivially scalable: members run independently
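"Trivially scalable" here means each ensemble member is an independent model run with no inter-member communication, so members can be farmed out across nodes with near-perfect efficiency. A minimal sketch, where `run_member` is a hypothetical stand-in for a full model integration with perturbed initial conditions (threads are used only to keep the example self-contained; real ensembles launch separate jobs or processes):

```python
from concurrent.futures import ThreadPoolExecutor
import random

def run_member(seed):
    """Stand-in for one model integration; the seed perturbs
    the (fake) initial conditions of this member."""
    rng = random.Random(seed)
    return sum(rng.gauss(0, 1) for _ in range(1000))  # fake forecast metric

def run_ensemble(n_members):
    # Members share nothing, so the ensemble scales to as many
    # workers as there are members with no communication cost.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(run_member, range(n_members)))
```

The per-member scaling pressure is thus relaxed: instead of one run scaling to N cores, each of M members only needs to scale to N/M cores.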
Q7. Obligatory question (no pun intended!): cloud computing, your views (unfiltered)
What is your current / future interest in grid or cloud computing?
- Computational grids are not feasible for tightly coupled parallel applications
- Reproducibility across platforms is also an issue
- Data and observing grids are useful
- WRF is used in LEAD (portal.leadproject.org)