IDC HPC USER FORUM Weather & Climate PANEL September 2009 Broomfield, CO Panel questions: 1 response per question Limit length to 1 slide.

Panel Format
- Sequence: alphabetical
- A few bullet points for each question; each participant can address or supplement it
- After each panel member has finished, we move on to the next question
- Moderators can adjust depending on discussion and time constraints

Panel Members
Moderators: Steve Finn & Sharan Kalwani

Panel Participant    Affiliation
Jim Doyle            DoD HPC Modernization Program
Jim Hack             ORNL
John Michalakes      NCAR
Henry Tufo           University of Colorado

Q1. Relative importance of data/resolution/microphysics
Please quantify the relative importance of improvements in observational data, grid resolution, and cloud microphysics for future forecast accuracy.
For prediction:
1. Observations and understanding of observations
   - Data assimilation
   - Ensembles
2. Physics
   - Scale-appropriate
   - Sensitivities
   - Superparameterizations
3. Resolution
   - Explicitly resolve scales
   - Convergence studies, feeding back to prediction

Q2. Adaptive mesh or embedded grids: their impact
Please discuss the use of adaptive mesh or embedded grids (urban areas, terrain impact on advection, etc.), and how future increased use would impact system requirements such as system interconnects.
- Nesting
  - Domains interact sequentially
  - Scatter/gather of 3-D fields between domains
- Spatial refinement: in place, adding cells
- Temporal refinement (future)
- Adaptivity (future)
- Coupling
- Load balancing, bandwidth
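The scatter/gather step above can be sketched in one dimension. This is an illustrative toy only, not WRF's actual nesting code: real nesting operates on 3-D fields with higher-order interpolation, and the `scatter`/`gather` names and the linear/block-average scheme here are assumptions for the sketch.

```python
# Toy one-way nesting in 1-D: parent -> nest (scatter), nest -> parent (gather).
# Hypothetical helpers; a 3:1 refinement ratio is assumed as the default.

def scatter(coarse, ratio=3):
    """Interpolate a parent-domain field onto the nest (linear, 1-D)."""
    fine = []
    for i in range(len(coarse) - 1):
        for k in range(ratio):
            w = k / ratio
            fine.append((1 - w) * coarse[i] + w * coarse[i + 1])
    fine.append(coarse[-1])  # include the final parent point
    return fine

def gather(fine, ratio=3):
    """Feed the nest back to the parent by block-averaging nest cells."""
    return [sum(fine[i:i + ratio]) / ratio
            for i in range(0, len(fine) - 1, ratio)]

parent = [280.0, 281.0, 283.0, 282.0]   # e.g. temperatures on the coarse grid
nest = scatter(parent)                  # parent -> nest before the nest's steps
parent_fb = gather(nest)                # nest -> parent after the nest's steps
```

Because the domains interact sequentially, each exchange is a synchronization point, which is why the slide ties nesting to interconnect bandwidth and load balancing.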

Q3. Ratio of data to compute: background
What are your bytes per flop for future requirements?
Assuming the question means "bytes of main memory per sustained flop/s" (D. H. Bailey):
- Current: lots of headroom
  - ~2000 ops per cell per second
  - ~800 bytes (4-byte floats) per cell
  - = 0.4 bytes per op
- Future
  - Resolution follows the 3/4 rule (2/3 in practice)
  - Adding physics or chemistry should not upset this ratio
- This is a relatively *low* ratio compared to some other benchmarks (per Geerd Hoffmann of DWD at the Oct 2007 HPC User Forum in Stuttgart)
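The 0.4 figure above is just the per-cell byte count divided by the per-cell op rate. A minimal sketch of that arithmetic, using only the numbers from the slide (the helper name is my own):

```python
# Back-of-envelope bytes-per-op estimate from the slide's numbers.

def bytes_per_op(bytes_per_cell: float, ops_per_cell: float) -> float:
    """Main-memory bytes held per sustained operation."""
    return bytes_per_cell / ops_per_cell

# ~800 bytes per cell (200 four-byte floats), ~2000 ops per cell:
ratio = bytes_per_op(bytes_per_cell=800.0, ops_per_cell=2000.0)
print(ratio)  # 0.4 bytes per op, as quoted on the slide
```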

Q4. Open-source codes in the community
What is the importance and impact of open-source / community code applications such as CCSM, WRF, etc.?
- A common modeling tool fosters interaction, outreach, and ultimately advancement of the science
- Relevant HPC application benchmarks

Q5. Data and collaboration, formats, future needs
What is the level of collaboration and standardization of data management and of observational & results databases, such as use of common file formats, web-based data, etc.? What is needed in the future?
- Scientific and technical interoperability for multi-model simulation systems
- Metadata formalisms, conventions, infrastructure: Earth System Curator, Earth System Grid

Q6. Ensemble models: your experiences
Has the use of ensemble models had any positive or negative impact in reducing code scaling requirements?
- Ensembles have a positive effect on scaling because they are trivially scalable: members run independently of one another
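A minimal sketch of why ensembles are "trivially scalable". This toy is my own illustration, not from the slide: `run_member` stands in for one perturbed forecast run, and in practice each member would be a separate MPI job rather than a Python call.

```python
# Each ensemble member is an independent forecast with its own perturbed
# initial state; members share no data, so they parallelize with zero
# inter-member communication (embarrassingly parallel).

def run_member(seed: int) -> float:
    """Stand-in for one perturbed forecast run (hypothetical toy model)."""
    state = float(seed)          # perturbed initial condition
    for _ in range(1000):
        state = 3.9 * state / (1.0 + state * state)  # toy nonlinear update
    return state

# Members can be mapped onto as many nodes as are available; a plain
# loop here, a batch scheduler or MPI in practice.
ensemble = [run_member(seed) for seed in range(8)]
```

Since doubling the member count doubles the usable parallelism without any new communication, ensembles raise aggregate machine utilization even when a single member's strong scaling has plateaued.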

Q7. Obligatory question (no pun intended!): cloud computing, your views (unfiltered)
What is your current / future interest in grid or cloud computing?
- Computational grids are not feasible for tightly coupled parallel applications
- Reproducibility across platforms is also an issue
- Data and observing grids are useful
- WRF is used in LEAD (portal.leadproject.org)