CyberShake Study 16.9 Science Readiness Review


Study 16.9 Scientific Goals
- Expand CyberShake to Central California
- Calculate a 1 Hz map with Vs min = 900 m/s
- Use two velocity models: CCA-06 and a CCA 1D model
- Calculate hazard at sites of interest to PG&E
- Compare Southern California and Central California results at overlapping sites

Proposed Study Sites (438)
- 10 km spacing (purple)
- 5 km spacing (green)
- CISN + OBS stations (orange)
- Missions (blue)
- USGS California Gazetteer locations (red)
- PG&E pumping stations (cyan)
- Diablo Canyon was removed, but a CISN station is ~1.4 km away
- Pink box is 180 x 240 km

Study 16.9 Data Products
- CCA-06 and CCA 1D Central California hazard maps
  - RotD100 at 2, 3, 4, 5, 7.5, and 10 sec
  - Geometric mean at 2, 3, 5, and 10 sec
- Hazard curves for 438 sites x 2 velocity models, at 2 s, 3 s, 4 s, 5 s, 7.5 s, and 10 s
- Seismograms for all ruptures (~438M)
- Peak amplitudes in the database for 2 s, 3 s, 4 s, 5 s, 7.5 s, and 10 s RotD100, RotD50, and geometric mean SA
- Durations in the database for velocity and acceleration: 5-75%, 5-95%, 20-80%

Study 16.9 Notables
- First Central California CyberShake calculation
- First study with the CCA models
- First study with a different minimum Vs
- First study with the new workflow approach on Titan, enabling end-to-end CyberShake
- First deterministic study to include duration calculations

Study 16.9 Parameters
- 1.0 Hz deterministic, Vs min = 900 m/s
- 175 m grid spacing
- dt = 0.00875 sec
- nt = 23,000 timesteps (201.25 sec)
- CCA-06 model
  - Volumes extending outside CCA-06 will use CVM-S4.26 if possible, then the SCEC 1D model
- CCA 1D model
- UCERF 2
- Graves & Pitarka (2014) rupture variations
  - 200 m rupture grid point spacing
  - Source filtered at 2.0 Hz
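As a quick sanity check, the timing parameters above can be multiplied out; a minimal sketch (the only inputs are dt, nt, and the 2.0 Hz source filter from the parameter list):

```python
# Sanity check on the Study 16.9 simulation timing parameters.
dt = 0.00875          # timestep, seconds
nt = 23000            # number of timesteps
sim_time = nt * dt    # total simulated time per rupture
print(f"Simulated time: {sim_time:.2f} sec")   # 201.25 sec, matching the slide

# The sampling rate comfortably resolves the 2.0 Hz source filter:
# the Nyquist frequency is far above 2 Hz.
nyquist = 1.0 / (2 * dt)
print(f"Nyquist frequency: {nyquist:.1f} Hz")
```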

Inclusion of Northern SAF Events
- Using a 200 km cutoff, 1/3 of sites capture northern events
- Will continue to use the 200 km cutoff, so some sites will include northern SAF events and some won't
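The cutoff rule above amounts to a per-site distance filter on rupture geometry. A minimal sketch (the coordinates and helper names are illustrative, not CyberShake's; the production code selects ruptures from the UCERF 2 geometry):

```python
import math

CUTOFF_KM = 200.0  # site-to-rupture cutoff distance from the slide

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def includes_rupture(site, rupture_point):
    """A rupture point is included for a site if it lies within the cutoff."""
    return haversine_km(*site, *rupture_point) <= CUTOFF_KM

site = (35.0, -120.0)          # hypothetical Central California site
northern_saf = (37.0, -122.0)  # hypothetical northern SAF rupture point
print(includes_rupture(site, northern_saf))  # False: ~286 km, beyond the cutoff
```

Sites near the northern edge of the study region pass this test for some northern SAF rupture points while southern sites do not, which is why only about a third of sites capture those events.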

Computational Requirements
- Per site: 1220-1940 node-hrs
  - SGTs (20-30%):
    - Titan: 400 node-hrs (12,000 SUs)
    - Blue Waters: 400 node-hrs (6,400 SUs)
  - Post-processing (PP, 60-75%):
    - Titan: 1440 node-hrs (43,200 SUs)
    - Blue Waters: 720 node-hrs (23,040 SUs)
    - More expensive on Titan due to 16 cores/node
- Total computational time
  - Titan (219 runs): 425K node-hours / 15.9M SUs
  - Blue Waters (657 runs): 1.00M node-hours
- Titan has 23M SUs remaining in 2016
- Blue Waters has 3M node-hrs remaining (11/16?)
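The per-site figures above can be rolled up into study totals; a minimal sketch (the 30 SUs/node-hour factor is inferred from the 400 node-hrs = 12,000 SUs line, and the products below are a lower bound, since the slide's 425K / 1.00M totals also cover overhead and reruns):

```python
# Rough roll-up of the Study 16.9 compute estimate, using per-site figures
# from this slide.  Titan charges 30 SUs per node-hour (12,000 / 400).
TITAN_SU_PER_NODE_HR = 30

titan_per_site = 400 + 1440   # SGT + post-processing node-hrs on Titan
bw_per_site = 400 + 720       # SGT + post-processing node-hrs on Blue Waters

titan_total = 219 * titan_per_site   # 402,960 node-hrs
bw_total = 657 * bw_per_site         # 735,840 node-hrs

print(f"Titan:       {titan_total:,} node-hrs = {titan_total * TITAN_SU_PER_NODE_HR:,} SUs")
print(f"Blue Waters: {bw_total:,} node-hrs")
```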

Storage Requirements
- Titan purged storage: 398 TB SGTs + 3.3 TB data products
- Blue Waters purged storage: 1193 TB SGTs + 10 TB data products
- Will clean up as we go to avoid exceeding quotas
- SCEC archived: 13.3 TB (seismograms, PSA, RotD, durations)
- Database: 918 GB (geometric mean at 4 periods, RotD100 at 6, RotD50 at 6, 8 durations)
- Temporary: 1 TB (workflow logs)
- Shared SCEC disks have 109 TB free
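A quick check that the retained (non-purged) products fit within the stated shared-disk headroom; a minimal sketch using only the figures on this slide:

```python
# Retained Study 16.9 products vs. shared SCEC disk headroom.
archived_tb = 13.3   # seismograms, PSA, RotD, durations
database_tb = 0.918  # peak amplitudes and durations (918 GB)
logs_tb = 1.0        # temporary workflow logs
free_tb = 109.0      # free space on shared SCEC disks

needed_tb = archived_tb + database_tb + logs_tb
print(f"Retained: {needed_tb:.1f} TB of {free_tb:.0f} TB free")  # ~15.2 TB, ample headroom
```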

Estimated Duration
- Estimated completion is 5 weeks
  - Based on the same node availability as Study 15.4
- Limiting factors:
  - XK node queue time (has been long outside of the Blue Waters reservation)
  - Unscheduled downtime
  - Titan workflow performance
  - New database performance
- Planning to request a reservation on Blue Waters
- Planning to request an increased quota on Titan

Personnel
- Support Scientists: Tom Jordan, Kim Olsen, Rob Graves
- Technical Lead: Scott Callaghan
- NCSA Support: Tim Bouvet, Greg Bauer
- Titan Support: Judy Hill
- USC Support: John Yu
- Workflow Support: Karan Vahi

Pending Science To-dos
- Calculate hazard curves for 3 overlapping test sites with both CCA-06 and CCA 1D
- Calculate hazard curves for 3 non-overlapping sites with varied Vs30 values with both CCA-06 and CCA 1D
- Confirm runs on Titan give the same results as runs on Blue Waters
  - Calculate two duplicate sites on Blue Waters and Titan and confirm the results match

Risks
- Currently unable to run on Titan due to certificate issues
  - Working with OLCF staff to resolve; if not resolved, the full calculation moves to Blue Waters
- Queue times on Blue Waters for XK nodes
- Unforeseen complications with Titan workflows
  - Small tests have worked OK, but issues may appear at scale
- Database performance
  - Study 15.4 was OK, but this study also includes durations
  - Changed the DB configuration for better performance, but it has not been tested in production yet

Action Items
- Calculate test hazard curves for V&V
- Add the duration calculation to the deterministic code
- Hold meetings with OLCF and NCSA

Thanks for your time!