
CyberShake Study 15.3 Science Readiness Review

Study 15.3 Scientific Goals
- Calculate a 1 Hz hazard map of Southern California
- Produce meaningful 2-second results for the UGMS:
  - RotD50 and RotD100 at 2, 3, 4, 5, 7.5, and 10 seconds
  - Contour maps
- Compare 0.5 Hz and 1 Hz hazard maps
- Use the Graves & Pitarka (2014) rupture generator with regularly spaced hypocenters
- 336 sites (10 km mesh + points of interest + "gap" sites)
  - Run the 14 UGMS sites first
- Produce 1 Hz seismograms that could be combined with BBP high-frequency seismograms

Proposed Study Sites (336)
- Green sites are the 50 new "gap" sites

Study 15.3 Data Products
- CVM-S4.26 Los Angeles-area hazard maps:
  - RotD100 at 2, 3, 4, 5, 7.5, and 10 sec
  - RotD50 at 2, 3, 4, 5, 7.5, and 10 sec
  - Geometric mean at 2, 3, 5, and 10 sec
- Hazard curves for 286 sites, at 2 s, 3 s, 5 s, and 10 s
- 336 sets of 2-component SGTs
- Seismograms for all ruptures (~160M)
- Peak amplitudes in the DB for 2 s, 3 s, 5 s, and 10 s RotD100, RotD50, and geometric mean SA
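The hazard curves listed above come from combining, for each rupture, the fraction of its variations that exceed a given shaking level with the rupture's annual probability. A minimal sketch of that standard PSHA combination (illustrative Python only, not the actual CyberShake/OpenSHA code; function and variable names are ours):

```python
import numpy as np

def hazard_curve(im_levels, ruptures):
    """Annual exceedance probability at each intensity-measure level.

    ruptures: iterable of (annual_prob, peak_amps) pairs, where peak_amps
    holds one peak amplitude per rupture variation (slip/hypocenter).
    Assumes independent rupture occurrences, as in standard PSHA.
    """
    im_levels = np.asarray(im_levels, dtype=float)
    non_exceed = np.ones_like(im_levels)
    for annual_prob, peak_amps in ruptures:
        peak_amps = np.asarray(peak_amps, dtype=float)
        # Fraction of this rupture's variations exceeding each IM level
        frac = np.array([(peak_amps > x).mean() for x in im_levels])
        non_exceed *= 1.0 - annual_prob * frac
    return 1.0 - non_exceed  # P(at least one exceedance in one year)
```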

Study 15.3 Notables
- First 1 Hz hazard maps
- First study with RotD50 and RotD100 calculated
- First study to use OLCF Titan
- First study with the Graves & Pitarka (2014) rupture generator with uniformly spaced hypocenters
- First study with 200 m rupture grid point spacing
- First study with the source filtered at a different frequency than the simulation frequency

Study 15.3 Parameters
- 1.0 Hz deterministic
- 100 m grid spacing
- dt = 0.005 sec
- nt = 40,000 timesteps
- CVM-S4.26
- Vs min = 500 m/s
- UCERF2
- Graves & Pitarka (2014) rupture variations
- 200 m rupture grid point spacing
- Source filtered at 2.0 Hz
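As a quick consistency check on these parameters (assuming the common finite-difference rule of thumb of about 5 grid points per minimum wavelength; the exact criterion used by the SGT code may differ):

```python
dx = 100.0        # grid spacing, m
dt = 0.005        # timestep, s
nt = 40000        # number of timesteps
vs_min = 500.0    # minimum S-wave velocity, m/s
ppw = 5.0         # assumed points per minimum wavelength

duration = nt * dt           # 200 s of simulated motion
f_max = vs_min / (ppw * dx)  # highest well-resolved frequency: 1.0 Hz
print(duration, f_max)
```

The 200 s simulated duration is the value the action item "Confirm 200 s is long enough" refers to.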

Rupture Generator Differences
- When rupture geometry was changed from 1000 m to 200 m resolution, hazard curves changed dramatically
- TEST site: blue = 200 m, black = 1000 m (0.5 Hz, UCERF2, 3 sec SA, CVM-S4)

Rupture Generator
- We determined that the change in hazard curves was due to hypocenter undersampling
- Example shown: M6.55, Puente Hills

Rupture Generator Changes
- Previously, the number of realizations was related to fault length
- Now: # of realizations = max(10, C * Area / 10.0), with C = 0.5
- Each realization is a unique slip distribution + hypocenter location
- Supports either random or uniform hypocenter distributions
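The realization-count rule transcribes directly to code (a sketch; the function name is ours):

```python
def num_realizations(area_km2, c=0.5):
    """Number of rupture realizations for a fault of the given area.

    Implements: # of realizations = max(10, C * Area / 10.0), C = 0.5,
    i.e. one realization per 20 km^2 of rupture area, with a floor of 10.
    """
    return max(10, int(round(c * area_km2 / 10.0)))

# e.g. a 600 km^2 rupture surface gets 30 slip/hypocenter realizations
print(num_realizations(600.0))
```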

Rupture Generator v3.3.1
- Use of the new G&P rupture generator (v3.3.1) brought the 1000 m and 200 m curves into agreement
- TEST site: black = 200 m, magenta = 1000 m (0.5 Hz, UCERF2, 3 sec SA, CVM-S4)

Random vs. Uniform Hypocenters
- Variation counts:
  - G&P 2010: 423k
  - Uniform: 485k
  - Random: 542k
- Uniform is easier to interpolate
- WNGC site: black = random, magenta = uniform (0.5 Hz, UCERF2, 3 sec SA, CVM-S4.26)
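To illustrate the two placement schemes (a toy sketch, not the rupture generator's actual logic):

```python
import numpy as np

def hypocenters_along_strike(fault_length_km, n, mode="uniform", rng=None):
    """Toy placement of n hypocenter positions along a fault's strike."""
    if mode == "uniform":
        # Evenly spaced, offset half a spacing from the fault edges
        spacing = fault_length_km / n
        return (np.arange(n) + 0.5) * spacing
    rng = rng or np.random.default_rng()
    return rng.uniform(0.0, fault_length_km, n)  # independent random draws
```

Evenly spaced hypocenters land on a predictable grid, which is what makes interpolating ground motions between neighboring hypocenter locations easier than with random draws.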

Source Filtering, 0.5 Hz Simulation
- Changed the frequency of the 4th-order lowpass Butterworth filtering of the SGT source from 0.5 Hz to 1 Hz
- WNGC site: blue = filtered at 1 Hz, black = filtered at 0.5 Hz

PseudoAA Content by Source Filter
- WNGC site: blue = filtered at 0.5 Hz, green = filtered at 1 Hz
- M8.05, Elsinore

Source Filtering, 1 Hz Simulation
- Changed the frequency of the 4th-order lowpass Butterworth filtering of the SGT source from 1 Hz to 2 Hz
- WNGC site: blue = filtered at 2 Hz, black = filtered at 1 Hz
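This kind of filter is easy to reproduce for comparison; a minimal SciPy sketch (assumed, not the actual SGT-code implementation, which may differ in pass direction or normalization):

```python
from scipy.signal import butter, lfilter

def lowpass_source(src, dt, fc):
    """4th-order lowpass Butterworth filter applied to a source time series.

    src: source time function samples; dt: timestep in seconds (0.005 here);
    fc: corner frequency in Hz (0.5, 1, or 2 in the comparisons above).
    """
    b, a = butter(4, fc, btype="low", fs=1.0 / dt)  # causal, single-pass
    return lfilter(b, a, src)
```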

PseudoAA Content by Source Filter
- WNGC site: blue = filtered at 1 Hz, green = filtered at 2 Hz
- M8.05, Elsinore

Fourier Content by Source Filter
- WNGC site: blue = filtered at 1 Hz, green = filtered at 2 Hz
- M8.05, Elsinore
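Spectral comparisons like this one reduce to plotting the Fourier amplitude spectrum of each seismogram; a minimal sketch (assumed, not the actual plotting code):

```python
import numpy as np

def fourier_amplitude(seis, dt):
    """One-sided Fourier amplitude spectrum of a seismogram."""
    freqs = np.fft.rfftfreq(len(seis), d=dt)
    amps = np.abs(np.fft.rfft(seis)) * dt  # scale to amplitude density
    return freqs, amps
```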

Computational Requirements
- Per site: ~3720 node-hrs
  - SGTs: depends on execution site (~50%)
    - Titan: 2110 node-hrs / 63,300 SUs
    - Blue Waters: 1760 node-hrs / 30,200 SUs
    - Titan is more expensive because of padding in pilot jobs and a different node-hrs -> SUs conversion
  - Post-processing (PP): 1880 node-hrs / 60,200 SUs (~50%)
- Total computational time:
  - Titan (SGTs): 355K node-hours / 10.7M SUs
  - Blue Waters: 928K node-hours
    - SGTs: 275K GPU node-hrs + 21K CPU node-hrs
    - PP: 632K CPU node-hrs
- Titan has 104M SUs remaining
- Blue Waters has 5.3M node-hrs remaining
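The totals are consistent with the per-site costs if the 336 sites split roughly evenly between the two SGT machines, which is how we read the "~50%" notes above; a quick check:

```python
sites = 336
titan_sites = sites // 2        # assumed even split of SGT work
bw_sites = sites - titan_sites

titan_sgt = titan_sites * 2110  # ~354K node-hrs (slide: 355K)
titan_sus = titan_sites * 63300 # ~10.6M SUs (slide: 10.7M)
bw_sgt = bw_sites * 1760        # ~296K node-hrs (slide: 275K GPU + 21K CPU)
bw_pp = sites * 1880            # ~632K CPU node-hrs (slide: 632K)
print(titan_sgt, titan_sus, bw_sgt, bw_pp)
```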

Storage Requirements
- Titan
  - Purged: 526 TB (SGTs and temporary data)
- Blue Waters
  - Delayed purge: 506 TB (for Titan SGTs)
  - Purged: 526 TB SGTs + 9 TB data products
- SCEC
  - Archived: 9.1 TB (seismograms, PSA, RotD)
  - Database: 268 GB (4 periods, 6 …)
  - Temporary: 608 GB (workflow logs)
- Shared SCEC disks have 171 TB free

Estimated Duration
- Limiting factors:
  - XK node queue time (800 XK nodes is 19% of Blue Waters)
  - Titan -> Blue Waters transfer could be a bottleneck if throughput is very high
  - USC HPC downtime for ~1 week in April
- Estimated completion is 12 weeks (11 running + 1 downtime)
  - Based on the same node availability as Study 14.2
- Planning to request a reservation on Blue Waters
- Planning to request high priority on Titan

Personnel Support
- Scientists: Tom Jordan, Kim Olsen, Rob Graves
- Technical lead: Scott Callaghan
- Job submission / run monitoring: Scott Callaghan, David Gill, Phil Maechling
- NCSA support: Omar Padron, Tim Bouvet
- Titan support: Val Anantharaj
- USC support: John Yu, John Mehringer
- Workflow support: Karan Vahi, Gideon Juve

Science To-Dos Pending
- Confirm SGTs from Titan give the same results as SGTs from Blue Waters
  - Calculate two duplicate SGT sites on both Blue Waters and Titan and confirm the results match
- Forward simulations versus reciprocity?
  - Run forward simulations to confirm that reciprocity and forward calculations match
  - Requires converting the SRF to the AWP-ODC source input format (see the sketch below)
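Converting a rupture description (slip rate, subfault area, and strike/dip/rake per point) into the moment-rate sources a forward wave-propagation code consumes boils down to scaling slip rate by rigidity and area and projecting it onto moment-tensor components. A minimal sketch of that physics, using the standard Aki & Richards double-couple formulas (this is not the actual SRF-to-AWP-ODC converter, and real SRF files use different units and layout):

```python
import numpy as np

def moment_rate_tensor(slip_rate, area, mu, strike, dip, rake):
    """Moment-rate tensor time series for one rupture point.

    slip_rate: slip-rate time series (m/s); area: subfault area (m^2);
    mu: rigidity (Pa); angles in degrees. Returns the six independent
    components in the north-east-down (x, y, z) convention.
    """
    phi, delta, lam = np.radians([strike, dip, rake])
    m0_rate = mu * area * np.asarray(slip_rate)  # scalar moment rate, N*m/s
    return {
        "xx": -m0_rate * (np.sin(delta) * np.cos(lam) * np.sin(2 * phi)
                          + np.sin(2 * delta) * np.sin(lam) * np.sin(phi) ** 2),
        "yy":  m0_rate * (np.sin(delta) * np.cos(lam) * np.sin(2 * phi)
                          - np.sin(2 * delta) * np.sin(lam) * np.cos(phi) ** 2),
        "zz":  m0_rate * np.sin(2 * delta) * np.sin(lam),
        "xy":  m0_rate * (np.sin(delta) * np.cos(lam) * np.cos(2 * phi)
                          + 0.5 * np.sin(2 * delta) * np.sin(lam) * np.sin(2 * phi)),
        "xz": -m0_rate * (np.cos(delta) * np.cos(lam) * np.cos(phi)
                          + np.cos(2 * delta) * np.sin(lam) * np.sin(phi)),
        "yz": -m0_rate * (np.cos(delta) * np.cos(lam) * np.sin(phi)
                          - np.cos(2 * delta) * np.sin(lam) * np.cos(phi)),
    }
```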

Risks
- Queue times on Blue Waters for XK nodes
  - Will try to dynamically assign SGT jobs to resources (see the sketch below)
- Unforeseen complications with Titan pilot jobs
  - Small tests have worked OK, but will there be issues at scale?
- Congestion protection events (network overloaded)
  - If triggered consistently, we will need to limit the number of post-processing workflows
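One plausible shape for that dynamic assignment (purely illustrative; the actual scheduler logic, field names, and wait-time estimates are ours): pick whichever resource currently has the shortest estimated queue wait, subject to remaining allocation:

```python
def pick_sgt_resource(resources):
    """Choose an execution site for the next SGT job.

    resources: list of dicts like
      {"name": "titan", "est_queue_wait_hrs": 6.0,
       "remaining_alloc": 104e6, "job_cost": 63300}
    Prefers the shortest queue among sites that can still afford the job.
    """
    viable = [r for r in resources if r["remaining_alloc"] >= r["job_cost"]]
    if not viable:
        raise RuntimeError("no resource has allocation left for this job")
    return min(viable, key=lambda r: r["est_queue_wait_hrs"])["name"]
```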

Action Items
- Confirm 200 s is long enough for the SGT simulation
- Insert ERF 36 hypocenters into the DB
- Decide whether or not to run forward simulations
- Determine why the 2 Hz filtered source isn't showing the expected differences in seismograms or hazard curves
- Select additional sites to help fill in gaps and discontinuities in the hazard map

Thanks for your time!