
1 Forecasting Earthquake Ground Motions Using Large-Scale Numerical Simulations. Philip J. Maechling, Information Technology Architect, Southern California Earthquake Center. Open Science Grid All Hands Meeting, National CI Panel, University of Southern California, 9 March 2011.

2 The SCEC Partnership: national partners, international partners, core institutions, and participating institutions.

3 SCEC Member Institutions (November 1, 2009). Core Institutions (16): California Institute of Technology; Columbia University; Harvard University; Massachusetts Institute of Technology; San Diego State University; Stanford University; U.S. Geological Survey, Golden; U.S. Geological Survey, Menlo Park; U.S. Geological Survey, Pasadena; University of California, Los Angeles; University of California, Riverside; University of California, San Diego; University of California, Santa Barbara; University of California, Santa Cruz; University of Nevada, Reno; University of Southern California (lead). Participating Institutions (53): Appalachian State University; Arizona State University; Berkeley Geochron Center; Boston University; Brown University; Cal-Poly, Pomona; Cal-State, Long Beach; Cal-State, Fullerton; Cal-State, Northridge; Cal-State, San Bernardino; California Geological Survey; Carnegie Mellon University; Case Western Reserve University; CICESE (Mexico); Cornell University; Disaster Prevention Research Institute, Kyoto University (Japan); ETH (Switzerland); Georgia Tech; Institute of Earth Sciences of Academia Sinica (Taiwan); Earthquake Research Institute, University of Tokyo (Japan); Indiana University; Institute of Geological and Nuclear Sciences (New Zealand); Jet Propulsion Laboratory; Los Alamos National Laboratory; Lawrence Livermore National Laboratory; National Taiwan University (Taiwan); National Central University (Taiwan); Ohio State University; Oregon State University; Pennsylvania State University; Princeton University; Purdue University; Texas A&M University; University of Arizona; UC, Berkeley; UC, Davis; UC, Irvine; University of British Columbia (Canada); University of Cincinnati; University of Colorado; University of Massachusetts; University of Miami; University of Missouri-Columbia; University of Oklahoma; University of Oregon; University of Texas-El Paso; University of Utah; University of Western Ontario (Canada); University of Wisconsin; University of Wyoming; URS Corporation; Utah State University; Woods Hole Oceanographic Institution.

4 SCEC Mission Statement: gather data on earthquakes in Southern California and elsewhere; integrate information into a comprehensive, physics-based understanding of earthquake phenomena; communicate understanding to the world at large as useful knowledge for reducing earthquake risk.

5 SCEC Computing Resources: USC HPCC resources; open-science computing resources (TeraGrid, XD, INCITE, OSG, …); SCEC's extensive use of a distributed high-throughput research computing grid.

6 The SCEC Community Modeling Environment has become a large collaboration involving many components of the U.S. supercomputing infrastructure …


8 Southern California Earthquake Center. NGA (2008) attenuation relations used in the National Seismic Hazard Maps: Boore & Atkinson; Campbell & Bozorgnia; Chiou & Youngs; Abrahamson & Silva. CyberShake (2009) hazard model, PE = 2% in 50 yr.

9 Southern California Earthquake Center: leadership-scale parallel computing. A magnitude 8.0 wall-to-wall scenario, the worst case for the southern San Andreas Fault. Fault length: 545 km; minimum wavelength: 200 m. Dynamic rupture simulation (Pathway 3), performed on Kraken: 7.5 hours on 2,160 cores; 881,475 subfaults; 250 s of rupture; 2.1 TB of tensor time-series output. Wave propagation simulation (Pathway 2), performed on Jaguar: 24 hours on 223,074 cores (220 Tflop/s sustained); 436 billion grid points representing a geologic model of dimension 810 x 405 x 85 km (40-m sampling); 368 s of ground motion (160,000 steps of 0.0023 s), representing seismic frequencies up to 2 Hz. Output data: surface seismograms. Input model: CVM-S4.
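The grid-point and time-step counts quoted on this slide can be sanity-checked with a few lines of arithmetic (a quick back-of-the-envelope check, not SCEC's simulation code):

```python
# Check the slide's figures: an 810 x 405 x 85 km volume sampled at 40-m
# spacing, and 368 s of ground motion at a 0.0023 s time step.

dx = 40.0                      # grid spacing in meters
nx = int(810_000 / dx)         # 20,250 points along 810 km
ny = int(405_000 / dx)         # 10,125 points along 405 km
nz = int(85_000 / dx)          #  2,125 points along 85 km

grid_points = nx * ny * nz     # ~4.36e11, i.e. ~436 billion
steps = round(368 / 0.0023)    # 160,000 time steps

print(f"{grid_points:.3e} grid points, {steps:,} time steps")
```

Both figures match the slide: the product comes to about 4.36 x 10^11 points, and 368 s divided by the 0.0023 s step is exactly 160,000.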

10 Large-scale Globus-, DAGMan-, Pegasus-, and Corral-based distributed high-throughput computing. CyberShake hazard map of the LA region, PoE = 2% in 50 yrs; CyberShake seismograms. CyberShake 1.0 computation (225 sites in the LA region, f < 0.5 Hz): 440,000 simulations per site; 5.5 million CPU hours (a 50-day run on Ranger averaging 4,400 cores); 189 million jobs; 165 TB of total output data; 10.6 TB of stored data; 2.1 TB of archived data.
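The DAGMan/Pegasus workflow stack named on this slide expresses each site's computation as a dependency graph of many small jobs. As a rough illustration only (the job names, submit files, and fan-out below are hypothetical, not the actual CyberShake workflow), a per-site graph in Condor DAGMan's text format can be generated like this:

```python
# Illustrative only: a toy Condor DAGMan description showing the
# fan-out/fan-in pattern a workflow planner such as Pegasus emits.
# Job names and .sub submit files here are hypothetical.

n_ruptures = 4  # real runs fan out to hundreds of thousands of jobs per site

lines = ["JOB sgt_generation sgt.sub"]             # shared up-front step
for i in range(n_ruptures):
    lines.append(f"JOB synth_{i} synth.sub")       # per-rupture seismogram job
    lines.append(f"PARENT sgt_generation CHILD synth_{i}")
lines.append("JOB hazard_curve curve.sub")         # fan-in: combine results
for i in range(n_ruptures):
    lines.append(f"PARENT synth_{i} CHILD hazard_curve")

dag_text = "\n".join(lines)
print(dag_text)
```

At CyberShake scale the same pattern, multiplied across 225 sites and 440,000 simulations per site, is what produces the 189 million jobs quoted above.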

11 I will focus my comments on a specific type of research computing: physics-based forecast models. SCEC (and other research groups) are developing computational models that enable physics-based predictive forecasting, treating the Earth as a deterministic system. Forecast targets include rainfall, flooding, hurricanes, tornadoes, earthquakes, strong ground motions, tsunamis, and volcanoes.

12 US CI Panel Question 1. There is a hierarchy of cyberinfrastructures in place in the US, ranging from campuses and laboratories to a plethora of grids. Will these merge into, or diverge from, a common infrastructure in the days ahead?

13 SCEC: An NSF + USGS Research Center. SCEC is pursuing leadership-class computer systems. The HPC pyramid: HPC centers with 100-TF systems serve 10's of projects; 10's of 10-TF systems serve 1,000's of users; departmental HPC, with 100's of 1-TF systems, serves 10,000's of users; gigaFLOPS workstations serve millions of users. A key function of the NSF supercomputer centers is to provide facilities over and above what can be found in the typical campus or lab environment. Scientific computing needs both compute (more FLOPS) and data (more BYTES), spanning the home, lab, and campus desktop, the traditional HPC environment, and a data-oriented science and engineering environment.

14 HPC Hierarchy for Open Science Research Computing. HPC resource providers range from leadership-class HPC, through open-science HPC centers and grids and departmental HPC, down to workstations; the number of resource providers increases toward the workstation end of the hierarchy.

15 Types of Seismic Hazard Forecasts with a Commercial or Governmental Market (forecast type: forecast users).
Earthquake early warning forecasts: public, press, city, state, and national governments.
Scenario earthquake seismogram forecasts: engineering companies, insurance companies, state and national governments.
Short-term earthquake forecasts: public, press, state and national governments.
Long-term probabilistic seismic hazard forecasts: engineering companies, building code developers, insurance companies, state and national governments.

16 Computational Forecast User Hierarchy. Forecasts used by the public and by governments have the broadest impact. SCEC computational forecast users, from broadest impact to narrowest: public and governmental forecasts; engineering and interdisciplinary forecasting; collaborative forecasting; individual forecasters. The number of forecast providers increases toward the individual end of the hierarchy.

17 Forecast Rigor Must Increase Along with Forecast Impact. Scientific and engineering requirements for forecast modeling systems, across the hierarchy of SCEC computational forecast users (individual forecaster; collaborative forecasting; engineering and interdisciplinary forecasting; public and governmental forecasts): computational codes, structural models, and simulation results versioned with associated tests; development of new computational, data, and physical models; automated retrospective testing of forecast models using community-defined validation problems; automated prospective testing of forecast models over time within a collaborative forecast testing center.

18 US CI Interoperability Needed to Achieve Broad Impact. The hierarchy of SCEC computational forecast users (individual researcher project; collaborative research project; engineering and interdisciplinary research; public and governmental forecasts) parallels the hierarchy of HPC resource providers (workstations; departmental HPC; open-science HPC centers and grids; leadership-class HPC). Effective use of leadership-class HPC requires advancement through smaller-scale modeling, and effective system-science forecasting requires advancement through levels of smaller-scale, less rigorous, interdisciplinary research. Forecast: the US CI will merge into an interoperable continuum, because such collaboration is in the best interests of (a) researchers, (b) resource providers, and (c) forecast users.

19 US CI Panel Question 2. How do standards, APIs, and grid software "libraries" or "SDKs" fit into the US CI picture?

20 SCEC Seismic Hazard Forecast Collaboratory. The collaboratory develops forecast models, evaluates forecast models, and operates forecast models, connected through programmable interfaces to building and tsunami models, seismic data centers, HPC resource providers, and real-time earthquake monitoring.

21 The collaboratory's programmable interfaces connect SCEC computational data models and physics-based simulation codes to external seismic and tsunami models, seismic data centers, HPC resource providers, and real-time earthquake monitoring. They support discovery of and access to digital artifacts, and contribution and annotation of digital artifacts, across all levels of use (individual research project; collaborative research project; engineering and interdisciplinary research; public and governmental forecasts). The forecast-rigor requirements carry over: computational codes, structural models, and simulation results versioned with associated tests; development of new computational, data, and physical models; automated retrospective testing of forecast models using community-defined validation problems; and automated prospective performance evaluation of forecast models over time within a collaborative forecast testing center.

22 US CI Interoperability Needed to Achieve Broad Impact. Required interfaces and APIs, spanning the hierarchy of SCEC computational forecast users (individual research project; collaborative research project; engineering and interdisciplinary research; public and governmental forecasts) and of HPC resource providers (workstations; departmental HPC; open-science HPC centers and grids; leadership-class HPC): shared-resource (grid, cloud) APIs, including authentication; standard public APIs (e.g., HTTP, HTML, PBS); domain-specific APIs (e.g., OPeNDAP); regulated interfaces (e.g., the Common Alerting Protocol, CAP). Conclusion: each level of forecast development requires its own interfaces and APIs, together with the APIs and interfaces of the lower levels.
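The Common Alerting Protocol named above as a "regulated interface" is an OASIS XML standard for public warnings. As a hedged illustration (the identifier, sender, and alert values below are hypothetical, and real earthquake alerts carry many more elements), a minimal CAP 1.2 message can be assembled with the Python standard library:

```python
# Illustrative only: a minimal Common Alerting Protocol (CAP) 1.2 message.
# All values are hypothetical; real CAP alerts include more detail.
import xml.etree.ElementTree as ET

NS = "urn:oasis:names:tc:emergency:cap:1.2"   # CAP 1.2 namespace
ET.register_namespace("", NS)                 # emit CAP as the default namespace

alert = ET.Element(f"{{{NS}}}alert")
for tag, text in [
    ("identifier", "example-eq-0001"),        # hypothetical message ID
    ("sender", "forecasts@example.org"),      # hypothetical sender
    ("sent", "2011-03-09T12:00:00-08:00"),
    ("status", "Exercise"),                   # not an actual alert
    ("msgType", "Alert"),
    ("scope", "Public"),
]:
    ET.SubElement(alert, f"{{{NS}}}{tag}").text = text

info = ET.SubElement(alert, f"{{{NS}}}info")
for tag, text in [
    ("category", "Geo"),
    ("event", "Earthquake"),
    ("urgency", "Future"),
    ("severity", "Moderate"),
    ("certainty", "Possible"),
]:
    ET.SubElement(info, f"{{{NS}}}{tag}").text = text

xml_bytes = ET.tostring(alert, encoding="utf-8")
print(xml_bytes.decode())
```

Because CAP is a fixed, regulated schema rather than a negotiable API, it sits at the top of the interface stack described above: a public or governmental forecast must emit it exactly, whatever lower-level APIs produced the forecast.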

23 End. For more information: www.scec.org

