CyberShake Study 2.3 Readiness Review

Study 2.3 Overview
- Compare codes and velocity models
  - RWG V3.0.3 vs. AWP-ODC-SGT
  - CVM-S4 vs. CVM-H 11.9
    - Different version of CVM-H than previous runs; adds San Bernardino and Santa Maria basins
- 286 Southern California sites
- 0.5 Hz deterministic post-processing only

Proposed Study Sites

SGT Computational Requirements
- SGTs computed on Blue Waters
- Computational time: 8.4 M SUs
  - RWG: 16k SUs/site x 286 sites = 4.6 M SUs
  - AWP: 13.5k SUs/site x 286 sites = 3.8 M SUs
  - 22.35 M SU allocation, 22 M SUs remaining
- Storage: 44.7 TB
  - 160 GB/site x 286 sites = 44.7 TB
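For reference, a minimal Python sketch that reproduces the arithmetic above. The per-site costs, site count, and allocation figure come from this slide; the variable names are illustrative only.

    # Rough check of the SGT compute and storage totals quoted above.
    SITES = 286

    rwg_sus = 16_000 * SITES        # RWG V3.0.3: ~4.6 M SUs
    awp_sus = 13_500 * SITES        # AWP-ODC-SGT: 13.5k SUs/site (quoted as ~3.8 M total)
    total_sus = rwg_sus + awp_sus   # ~8.4 M SUs against the 22.35 M SU allocation

    sgt_tb = 160 * SITES / 1024     # 160 GB/site of SGTs -> ~44.7 TB

    print(f"SGT compute: {total_sus / 1e6:.1f} M SUs, SGT storage: {sgt_tb:.1f} TB")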

PP Computational Requirements
- Post-processing on Stampede
- Computational time: 4,000 SUs/site x 286 sites = 1.1 M SUs
  - 4.1 M SU allocation, 3.9 M SUs remaining
- Storage: 44.7 TB input, 13 TB output
  - 44.7 TB of SGT inputs; will need to rotate out
  - Seismograms: 46 GB/site x 286 sites = 12.8 TB
  - PSA files: 0.8 GB/site x 286 sites = 0.2 TB
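The post-processing totals follow the same per-site scaling; a similar sketch, again using only the figures quoted on this slide:

    # Rough check of the Stampede post-processing totals quoted above.
    SITES = 286

    pp_sus = 4_000 * SITES          # ~1.1 M SUs against the 4.1 M SU allocation
    seis_tb = 46 * SITES / 1024     # seismograms: ~12.8 TB
    psa_tb = 0.8 * SITES / 1024     # PSA files: ~0.2 TB
    sgt_in_tb = 160 * SITES / 1024  # SGT inputs to stage in (and rotate out): ~44.7 TB

    print(f"PP compute: {pp_sus / 1e6:.1f} M SUs, "
          f"output: {seis_tb + psa_tb:.1f} TB, SGT inputs: {sgt_in_tb:.1f} TB")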

Long-Term Storage
- 44.7 TB SGTs
  - To be archived to tape (NCSA? TACC? Somewhere else?)
- 13 TB seismograms, PSA data
  - Have been using SCEC storage - scec-04?
- 5.5 TB workflow logs
  - Can compress after mining for stats
- CyberShake database
  - 1.4 B entries, 330 GB data (scaling issues?)
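Whether 1.4 B rows is a scaling problem depends mostly on indexing and query patterns, but the average row width implied by the figures above is easy to check (a rough sketch; "entries" is read here as database rows):

    # Average row width implied by the database figures quoted above.
    rows = 1.4e9
    size_bytes = 330 * 1024**3   # 330 GB of data
    print(f"~{size_bytes / rows:.0f} bytes/row across {rows / 1e9:.1f} B rows")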

Verification Work
- 4 sites (WNGC, USC, PAS, SBSM)
- 4 code/velocity-model combinations:
  - RWG V3.0.3, CVM-S
  - RWG V3.0.3, CVM-H
  - AWP, CVM-S
  - AWP, CVM-H
- Plotted against previously calculated RWG V3 results
- Expect RWG V3 to be slightly higher than the others

WNGC: CVM-S and CVM-H comparison plots (RWG V3.0.3 - green, AWP - purple, RWG V3 - orange)

USC: CVM-S and CVM-H comparison plots (RWG V3.0.3 - green, AWP - purple, RWG V3 - orange)

PAS: CVM-S and CVM-H comparison plots (RWG V3.0.3 - green, AWP - purple, RWG V3 - orange)

SBSM: CVM-S and CVM-H comparison plots (RWG V3.0.3 - green, AWP - purple, RWG V3 - orange)

SBSM Velocity Profile

Estimated Duration
- Limiting factors:
  - Blue Waters queue time
    - Uncertain how many sites can run in parallel
  - Blue Waters → Stampede transfer
    - 100 MB/sec seems sustainable from tests, but could get much worse
    - 50 sites/day; unlikely to reach
- Estimated completion by end of June
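To make the transfer limit concrete, a small sketch of the best-case throughput, assuming the full ~160 GB of SGTs per site (figure from the SGT requirements slide) must move over a sustained 100 MB/sec link:

    # Best-case Blue Waters -> Stampede transfer throughput at 100 MB/sec sustained.
    RATE_MB_PER_S = 100
    GB_PER_SITE = 160                                        # SGT volume per site

    seconds_per_site = GB_PER_SITE * 1024 / RATE_MB_PER_S    # ~27 minutes/site
    sites_per_day = 86_400 / seconds_per_site                # ~53 sites/day at best
    days_of_transfer = 286 / sites_per_day                   # ~5.4 days of pure transfer

    print(f"~{sites_per_day:.0f} sites/day best case, "
          f"~{days_of_transfer:.1f} days of transfer for 286 sites")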

Risks
- Stampede becomes busier
  - Post-processing still probably shorter than the SGT calculations
- CyberShake database unable to handle the data volume
  - Would need to create other DBs, move to a distributed DB, or change technologies