1
CyberShake Study 14.2 Science Readiness Review
2
Study 14.2 Scientific Goals
- Compare the impact of velocity models on Los Angeles-area hazard maps
  - Velocity models: CVM-S4.26, BBP 1D, CVM-H 11.9 with no GTL
  - Compare to CVM-S, CVM-H 11.9 with GTL
  - Investigate impact of GTL
  - Compare 1D reference model
  - Compare tomographic inversion results
- 286 sites (10 km mesh + points of interest)
3
CVM-S4.26 Model
- Starting point was Po's perturbations on a 500 m grid, minimum Vs = 1000 m/s
- CVM-S4.26 integrates the perturbations with CVM-S4, allowing queries at arbitrary resolution
- Preserves the CVM-S GTL while lowering velocities at rock sites:
  - If "inside the basin" (Vs < 1000 m/s), preserve CVM-S4 material properties
  - If "outside the basin" (Vs > 1000 m/s), trilinearly interpolate Po's perturbations with CVM-S4 (see the sketch after this list)
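Below is a minimal sketch of the combination rule above, assuming a hypothetical query interface; the data structure, helper names, and the multiplicative way the perturbation is applied are illustrative, not the actual UCVM or CVM-S4.26 code.

# Hypothetical sketch of the CVM-S4.26 combination rule; names and the
# multiplicative perturbation are illustrative, not the real implementation.
from dataclasses import dataclass

VS_BASIN_THRESHOLD = 1000.0  # m/s: boundary between "basin" and "rock" points

@dataclass
class Material:
    vp: float   # P-wave velocity (m/s)
    vs: float   # S-wave velocity (m/s)
    rho: float  # density (kg/m^3)

def combine(cvms4: Material, perturbation: float) -> Material:
    """Combine a CVM-S4 value with a perturbation trilinearly
    interpolated from the 500 m perturbation grid at the query point."""
    if cvms4.vs < VS_BASIN_THRESHOLD:
        # "Inside the basin": preserve CVM-S4 material properties (and its GTL).
        return cvms4
    # "Outside the basin": adjust velocities using the interpolated perturbation.
    return Material(vp=cvms4.vp * perturbation,
                    vs=cvms4.vs * perturbation,
                    rho=cvms4.rho)

# Example: a rock site (Vs >= 1000 m/s) gets perturbed velocities.
print(combine(Material(vp=5000.0, vs=2800.0, rho=2700.0), perturbation=0.95))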
4
CVM-S4.26 vs. CVM-S4
5
CVM-H 11.9, no GTL Model
6
BBP 1D Model
7
Proposed Study Sites
8
Study 14.2 Data Products
- 2 CVM-S4.26 Los Angeles-area hazard maps
- 1 BBP 1D Los Angeles-area hazard map
- 1 CVM-H 11.9, no GTL Los Angeles-area hazard map
- Hazard curves for 286 sites x 4 conditions, at 3 s, 5 s, 10 s
- 1144 sets of 2-component SGTs (286 sites x 4 conditions)
- Seismograms for all ruptures (~470M)
- Peak amplitudes in the database for 3 s, 5 s, 10 s
9
Study 14.2 Notables
- First CVM-S4.26 hazard maps
- First CVM-H, no GTL hazard maps
- First 1D hazard maps
- First study using AWP-SGT-GPU
- First CyberShake study using a single workflow on one system (Blue Waters)
10
Study 14.2 Parameters
- 0.5 Hz, deterministic
- CVMs: 200 m grid spacing, minimum Vs = 500 m/s (see the check after this list)
- UCERF 2
- Graves & Pitarka (2010) rupture variations
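As a quick consistency check, the 0.5 Hz deterministic limit follows from the 200 m spacing and the 500 m/s Vs floor under the common finite-difference rule of thumb of about 5 grid points per minimum wavelength; the points-per-wavelength value is an assumption, not stated on the slide.

# Sanity check on the deterministic frequency limit (assumes ~5 grid points
# per minimum wavelength, a common finite-difference rule of thumb).
vs_min = 500.0              # m/s, minimum shear-wave velocity
dx = 200.0                  # m, grid spacing
points_per_wavelength = 5   # assumed, not stated in the study parameters

f_max = vs_min / (points_per_wavelength * dx)
print(f_max)                # 0.5 Hz, matching the stated deterministic limit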
11
Verification
- 4 sites (USC, PAS, WNGC, SBSM)
- AWP-SGT-CPU, CVM-S4.26
- AWP-SGT-GPU, CVM-S4.26
- AWP-SGT-CPU, BBP 1D
- AWP-SGT-GPU, CVM-H 11.9, no GTL
- Plotted with previously calculated curves
12
CVM-S4.26 (CPU): CVM-S4.26 (orange), CVM-S (blue), CVM-H 11.9 (magenta)
13
CVM-S4.26 (GPU): GPU (magenta), CPU (orange)
14
CVM-H, no GTL (CPU): 3 sec; CVM-H 11.9 no GTL (black), CVM-H 11.9 with GTL (purple)
15
BBP 1D (black), CVM-S4 (blue), CVM-H 11.9 (magenta)
16
Computational Requirements
- Computational time: 275K node-hrs (see the budget sketch after this list)
  - SGT computational time: 180K node-hrs
    - CPU: 86K node-hrs
    - GPU: 52K node-hrs
    - Study 13.4 had a 29% overrun on SGTs
  - PP computational time: 95K node-hrs
    - 70K node-hrs
    - Study 13.4 had a 35% overrun on PP
- Current allocation has 3.0M node-hrs remaining
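One plausible reading of these numbers, sketched below, is that the 180K and 95K figures are base estimates (86K + 52K = 138K for SGTs, 70K for post-processing) inflated by roughly the Study 13.4 overrun percentages; that interpretation is an assumption, not something stated on the slide.

# Rough budget arithmetic under the assumption stated above.
sgt_base = 86_000 + 52_000        # CPU + GPU SGT node-hrs = 138K
sgt_budget = sgt_base * 1.29      # ~178K, close to the quoted 180K node-hrs

pp_base = 70_000                  # post-processing node-hrs
pp_budget = pp_base * 1.35        # ~94.5K, close to the quoted 95K node-hrs

allocation_left = 3_000_000       # node-hrs remaining on the current allocation
print(round(sgt_budget + pp_budget))   # ~273K vs. the quoted 275K total
print(allocation_left - 275_000)       # headroom if the 275K estimate holds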
17
Storage Requirements
- Blue Waters
  - Unpurged: 45 TB (SGTs)
  - Purged: 12 TB (seismograms) plus temp space
- SCEC
  - Archived: 12.5 TB (seismograms, PSA files)
  - Database: 210 GB (PSA at 3, 5, 10 s)
  - Temporary: 5.5 TB (workflow logs)
18
Estimated Duration
- Limiting factors:
  - Queue time: especially for XK nodes, it could be a substantial percentage of run time
  - Blue Waters -> SCEC transfer: if Blue Waters throughput is very high, the transfer could be the bottleneck (see the rough estimate after this list)
- With queues, estimated completion is 4 weeks
- With a reservation, completion depends on the reservation size
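For a rough feel of the transfer concern, the sketch below estimates how long the ~12.5 TB SCEC archive would take at an assumed sustained wide-area rate; the rate is purely an assumption, since actual Blue Waters-to-SCEC throughput is not given here.

# Illustrative transfer-time estimate; the sustained rate is assumed.
archive_tb = 12.5          # TB of seismograms and PSA files to archive at SCEC
rate_mb_per_s = 250.0      # assumed sustained Blue Waters -> SCEC rate

hours = archive_tb * 1e6 / rate_mb_per_s / 3600
print(round(hours, 1))     # ~13.9 hours at the assumed rate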
19
Personnel
- Support Scientists: Tom Jordan, Kim Olsen, Rob Graves
- Technical Lead: Scott Callaghan
- Job Submission / Run Monitoring: Scott Callaghan, David Gill, Phil Maechling
- NCSA Support: Omar Padron, Tim Bouvet
- Workflow Support: Karan Vahi, Gideon Juve
20
Risks
- Queue times on Blue Waters: in tests, GPU queue times have at times been > 1 day
- Congestion protection events (network overloaded): if triggered consistently, we will need to either throttle post-processing or suspend the run until improvements are developed
21
Thanks for your time!