High Performance Computing at SCEC
Scott Callaghan
Southern California Earthquake Center, University of Southern California



Why High Performance Computing?
What is HPC?
– Using large machines with many processors to compute quickly
Why is it important?
– It is the only way to perform large-scale simulations
Two main types of HPC projects at SCEC:
– What kind of shaking will this earthquake cause across a region?
– What kind of shaking will this single location experience?

SCEC Scenario Simulations
Simulations of individual earthquakes
– Determine the shaking over a region caused by a single event (usually M > 7)
[Figure: Peak ground velocities for a Mw 8.0 wall-to-wall scenario on the San Andreas Fault (1 Hz), calculated using AWP-ODC on NICS Kraken]

Simulating Large Events
Must break the work up into pieces
– Most commonly, spatially
Then each timestep:
– Give work to each processor
– Run a timestep
– Communicate with neighbors
– Repeat
As the number of processors increases, it becomes harder to get good performance
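The decompose/step/communicate loop above can be sketched as a toy serial stand-in for the parallel pattern. This example splits a 1-D diffusion problem into per-worker chunks and exchanges one ghost cell per edge each timestep; in real SCEC codes such as AWP-ODC the same idea is a 3-D MPI halo exchange, and the function and field names here are illustrative only.

```python
import numpy as np

def simulate(field, nsteps, nworkers, alpha=0.1):
    """Advance a 1-D diffusion field, split spatially across workers."""
    # "Break the work into pieces": one equal-size chunk per worker.
    chunks = np.split(field.copy(), nworkers)
    for _ in range(nsteps):
        # "Communicate with neighbors": each chunk receives one ghost
        # cell from each neighboring chunk (edges are mirrored).
        padded = []
        for i, c in enumerate(chunks):
            left = chunks[i - 1][-1] if i > 0 else c[0]
            right = chunks[i + 1][0] if i < nworkers - 1 else c[-1]
            padded.append(np.concatenate(([left], c, [right])))
        # "Run a timestep": explicit finite-difference update per chunk.
        chunks = [p[1:-1] + alpha * (p[:-2] - 2 * p[1:-1] + p[2:])
                  for p in padded]
    return np.concatenate(chunks)
```

Because the ghost cells carry exactly the neighbor's edge values, the decomposed run reproduces the single-worker run; the communication cost per step is what limits scaling as worker count grows.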

Probabilistic Seismic Hazard Analysis
Builders ask seismologists: "What will the peak ground motion be at my new building in the next 50 years?"
Seismologists answer this question using Probabilistic Seismic Hazard Analysis (PSHA)
– PSHA results are used in building codes and insurance
– California building codes impact billions of dollars of construction yearly

PSHA Reporting
PSHA information is relayed through
– Hazard curves (for one location)
– Hazard maps (for a region)
[Figures: hazard map showing the probability of exceeding 0.1 g in 50 years; hazard curve for downtown LA, marking 0.6 g at 2% in 50 years]
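Reading a design value off a hazard curve is an interpolation: find the ground-motion level at a target exceedance probability, such as the 2%-in-50-years level used in building codes. A minimal sketch, with made-up curve samples chosen to match the downtown-LA example above (0.6 g at 2% in 50 years):

```python
import numpy as np

# Hypothetical hazard-curve samples: ground-motion levels (g) vs.
# probability of exceedance in 50 years. Values are illustrative.
levels = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])
prob_exceed = np.array([0.50, 0.25, 0.08, 0.02, 0.008, 0.003])

def motion_at_probability(target, levels, probs):
    """Interpolate the ground motion at a target exceedance probability.

    Hazard curves decrease monotonically, so interpolate in log-probability
    space with the samples reversed into increasing order (np.interp
    requires an increasing x-axis).
    """
    return float(np.interp(np.log(target), np.log(probs[::-1]), levels[::-1]))

print(motion_at_probability(0.02, levels, prob_exceed))  # 0.6
```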

PSHA Methodology
1. Pick a location of interest.
2. Define what future earthquakes might happen.
3. Estimate the magnitude and probability of each earthquake from an earthquake rupture forecast (ERF).
4. Determine the shaking caused by each earthquake at the site of interest.
5. Aggregate the shaking levels with the probabilities to produce a hazard curve.
Repeat for multiple sites to produce a hazard map.
Step 4 is typically performed with attenuation relationships.
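Step 5 can be sketched for a single point on the hazard curve. Under the common assumption that ruptures occur as independent Poisson processes, per-rupture rates and exceedance probabilities combine into one rate, which converts to a probability of exceedance over the exposure time. The rupture values below are illustrative, not from a real ERF.

```python
import math

# Hypothetical ERF entries: (annual rate of the rupture, probability
# that it exceeds the shaking level of interest at our site).
ruptures = [
    (0.01,   0.90),  # frequent nearby event, strong shaking likely
    (0.002,  0.50),
    (0.0005, 0.10),  # rare distant event, strong shaking unlikely
]

def prob_exceedance(ruptures, years=50):
    """One hazard-curve point: P(level exceeded at least once in `years`).

    Assumes independent Poisson occurrence, so exceedances from all
    ruptures combine into a single Poisson rate.
    """
    rate = sum(annual_rate * p_exceed for annual_rate, p_exceed in ruptures)
    return 1.0 - math.exp(-rate * years)

print(round(prob_exceedance(ruptures), 3))
```

Evaluating this for a range of shaking levels (each level giving different per-rupture exceedance probabilities) traces out the full hazard curve.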

CyberShake Approach
Uses a physics-based approach
– 3-D ground motion simulation with anelastic wave propagation
– Considers ~415,000 rupture variations per site
– ~7,000 ruptures in the ERF within 200 km of the site of interest, with magnitude > 6.5
– Adding variability to each rupture yields the ~415,000 variations
– More accurate than traditional attenuation methods
100+ sites in Southern California are needed to calculate a hazard map
[Figure: LADT probability of exceedance (SA 3.0 s); blue and green – common attenuation relationships; black – CyberShake]
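The per-site rupture selection described above (within 200 km, magnitude above 6.5) is a simple filter over the ERF. A minimal sketch; the record fields and rupture entries are hypothetical, not the actual CyberShake ERF schema:

```python
# Hypothetical rupture records for one site of interest.
ruptures = [
    {"name": "San Andreas (Mojave)", "magnitude": 7.8, "distance_km": 40.0},
    {"name": "Puente Hills",         "magnitude": 7.1, "distance_km": 5.0},
    {"name": "Small local event",    "magnitude": 5.9, "distance_km": 12.0},
    {"name": "Distant rupture",      "magnitude": 7.4, "distance_km": 350.0},
]

def select_for_site(ruptures, max_distance_km=200.0, min_magnitude=6.5):
    """Keep only the ruptures that contribute to this site's hazard."""
    return [r for r in ruptures
            if r["distance_km"] < max_distance_km
            and r["magnitude"] > min_magnitude]

selected = select_for_site(ruptures)
print([r["name"] for r in selected])  # only the first two qualify
```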

Results
[Figures: attenuation-based hazard map and CyberShake hazard map]

Results (difference)
[Figures: CyberShake map compared to the attenuation map; population density]

Some Recent Numbers
Wall-to-wall simulation
– 2 TB of output
– 100,000 processors
CyberShake
– Hazard curves for 223 sites
– 8.5 TB of output files
– 46 PB of file I/O
– 190 million jobs executed
– 4,500 processors for 54 days