The Broadband CyberShake Platform: Improving Seismic Hazard Analysis using USC HPCC
Scott Callaghan
Southern California Earthquake Center, University of Southern California
SC11
Probabilistic Seismic Hazard Analysis
- Builders ask seismologists: "What will the peak ground motion be at my new building in the next 50 years?"
- Seismologists answer this question using Probabilistic Seismic Hazard Analysis (PSHA)
- PSHA results are used in building codes and insurance
- California building codes impact billions of dollars of construction yearly
- SCEC uses the CyberShake platform to perform PSHA
PSHA Reporting
PSHA information is relayed through:
- Hazard curves (for 1 location)
- Hazard maps (for a region)
[Figure: hazard curve for USC, probability of exceeding 0.1g in 50 yrs]
PSHA Methodology
1. Pick a location of interest.
2. Define what future earthquakes might happen.
3. Estimate the magnitude and probability of each earthquake from an earthquake rupture forecast (ERF).
4. Determine the shaking caused by each earthquake at the site of interest (typically performed with attenuation relationships).
5. Aggregate the shaking levels with the probabilities to produce a hazard curve.
6. Repeat for multiple sites for a hazard map.
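The aggregation step above can be sketched in a few lines. This is an illustrative toy, not CyberShake's implementation: the Poisson occurrence model and lognormal shaking model are standard PSHA assumptions, but the function names and all rupture rates and ground-motion values below are made up for the example.

```python
import math

def p_exceed(x, median, sigma):
    """P(ground motion > x | rupture occurs), assuming a lognormal
    shaking model such as an attenuation relationship might supply."""
    z = (math.log(x) - math.log(median)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

def hazard_curve_point(ruptures, x, years=50):
    """Probability of exceeding ground-motion level x (in g) within
    `years` years, assuming Poisson occurrence of each rupture."""
    # Annual exceedance rate: sum over the ERF of
    # (annual rupture rate) * P(ground motion > x | rupture)
    annual_rate = sum(rate * p_exceed(x, median, sigma)
                      for rate, median, sigma in ruptures)
    return 1.0 - math.exp(-annual_rate * years)

# Hypothetical ERF entries: (annual rate, median PGA in g, ln-sigma)
erf = [(0.01, 0.25, 0.5), (0.002, 0.6, 0.5)]
prob = hazard_curve_point(erf, x=0.1)   # one point on the hazard curve
```

Evaluating `hazard_curve_point` over a range of ground-motion levels x traces out the hazard curve for one site; repeating over a grid of sites yields a hazard map.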
CyberShake Computations
Wave propagation simulation:
- Create a 1.5-billion-point mesh with material properties
- Generate Strain Green Tensors (SGTs) for the volume, describing stresses and strains
- Parallel, ~12,000 CPU-hrs
Second Phase Computations
Individual earthquake contributions:
- Use "seismic reciprocity" to simulate seismograms for each of 400,000 earthquakes
- Calculate peak shaking, combine for hazard curve
- Loosely coupled, short-running serial jobs
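The core of each short serial job can be pictured as a convolution: via reciprocity, the seismogram at the site is obtained by convolving the stored SGT response with the rupture's source function, and the peak is then read off. This is a deliberately reduced 1-D toy, assuming made-up signal shapes; the real calculation convolves tensor-valued SGTs with slip on an extended fault.

```python
def synthesize_seismogram(sgt, slip_rate):
    """Toy reciprocity step: convolve a (1-D, toy) SGT response
    with a (1-D, toy) slip-rate function to get a seismogram."""
    n = len(sgt) + len(slip_rate) - 1
    out = [0.0] * n
    for i, g in enumerate(sgt):
        for j, s in enumerate(slip_rate):
            out[i + j] += g * s
    return out

def peak_shaking(seismogram):
    """Peak absolute amplitude, the value fed into the hazard curve."""
    return max(abs(v) for v in seismogram)
```

Because each rupture's synthesis is independent, the 400,000 earthquakes map naturally onto the loosely coupled serial jobs the slide describes.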
High Frequencies
- CyberShake calculated seismograms up to 0.5 Hz
- Higher frequencies influence shorter buildings (1-story building ~ 10 Hz)
- Higher-frequency calculations are more intensive: 2x frequency = 16x computational effort
- Instead, use a stochastic high-frequency approach, already in the SCEC Broadband Platform
- Use it to compute ground motion from 0.5 to 10 Hz
- Created Broadband CyberShake; ran at USC HPCC
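The 16x figure comes from mesh and time-step refinement: resolving twice the frequency halves the grid spacing in each of the 3 spatial dimensions (8x points) and halves the time step (2x steps). A minimal sketch of that scaling argument (function name is ours):

```python
def relative_cost(freq_ratio):
    """Cost of a deterministic 3-D wave-propagation run, relative to a
    baseline, when the maximum frequency is scaled by freq_ratio.
    Grid points scale as freq_ratio**3, time steps as freq_ratio, so
    total work scales as freq_ratio**4."""
    return freq_ratio ** 4

# 2x frequency -> 16x effort, as on the slide.
# Deterministically reaching 10 Hz from 0.5 Hz would be a 20x frequency
# jump, i.e. 160,000x the effort, motivating the stochastic approach.
```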
Computational Requirements (for 1 site)

Component                   | Data   | Executions | Cores/exec | CPU hours
----------------------------|--------|------------|------------|----------
Mesh generation             | 15 GB  | 1          | 160        | 150
SGT simulation              | 40 GB  | 2          | 400        | 12,000
SGT extraction              | 690 GB | 7,000      |            | 250
Low-frequency seismograms   | 10 GB  | 415,000    |            | 1,600
High-frequency seismograms  |        |            |            | 800
PSA calculation             | 90 MB  |            |            | 100
Curve generation            | 1 MB   |            |            | < 1
Total                       | 795 GB | 1,252,000  |            | 14,900

Mesh generation and SGT simulation make up the SGT Creation phase; the remaining components make up Post Processing.
Scientific Workflow Tools
Pegasus:
- Create abstract workflow description (DAX): logical names, dependencies
- Plan workflow for site-specific execution (DAG): logical names resolved; adds stage-in and stage-out of data
Condor:
- Workflow submitted to DAGMan
- Manages workflow execution on remote resources
Globus:
- Sends jobs across the grid
- GridFTP for fast file transfer
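What DAGMan enforces at runtime is just a dependency-respecting execution order over the workflow DAG. The sketch below illustrates that idea in plain Python; it is not the Pegasus or Condor API, and the job names are a hypothetical, drastically simplified CyberShake chain.

```python
from collections import defaultdict, deque

def topo_order(jobs, deps):
    """Return an execution order respecting dependencies.
    `deps` maps each child job to the set of its parent jobs."""
    indegree = {j: 0 for j in jobs}
    children = defaultdict(list)
    for child, parents in deps.items():
        for p in parents:
            indegree[child] += 1
            children[p].append(child)
    ready = deque(j for j in jobs if indegree[j] == 0)
    order = []
    while ready:
        job = ready.popleft()      # DAGMan would submit this job now
        order.append(job)
        for c in children[job]:    # a finished job releases its children
            indegree[c] -= 1
            if indegree[c] == 0:
                ready.append(c)
    if len(order) != len(jobs):
        raise ValueError("cycle detected: not a DAG")
    return order

# Hypothetical, simplified CyberShake-style workflow
jobs = ["mesh", "sgt", "extract", "seismogram", "curve"]
deps = {"sgt": {"mesh"}, "extract": {"sgt"},
        "seismogram": {"extract"}, "curve": {"seismogram"}}
```

In the real system the post-processing stage fans out to hundreds of thousands of independent jobs rather than a single chain, which is exactly where a workflow manager earns its keep.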
Other CyberShake Changes
New rupture realizations: more complexity, less coherence
[Figure: hazard curves with previous (open circle) and new (closed circle) ruptures]
[Figure: plots of slip for previous (top) and new (bottom) ruptures]
Multiple Velocity Models
CyberShake now supports multiple velocity models
[Figure: CVM-H minus CVM-S4, depth 0.0 m]
[Figure: CVM-H 11.2 (red) vs CVM-S4 (blue)]
Broadband Results
[Figure: 710-90 interchange, 0.1 s SA]
[Figure: Perris precariously balanced rock, 1 s SA]
Currently calculating PBRs and seismic stations for validation
Thanks!