Philip Maechling Southern California Earthquake Center


SCEC Community Modeling Environment (SCEC/CME): Cyberinfrastructure for Earthquake Science Philip Maechling Southern California Earthquake Center University of Southern California SCEC/UseIT Intern Program June 6, 2005

People on the SCEC/CME Project 02/13/04

SCEC/CME Researchers. Principal Investigators: Tom Jordan (USC), Bernard Minster (Scripps Institution of Oceanography), Carl Kesselman (USC/ISI), Reagan Moore (San Diego Supercomputer Center). Research Leads: Ned Field (USGS), Jacobo Bielak (CMU), Kim Olsen (SDSU), Dave O'Hallaron (CMU), Steve Day (SDSU), Ralph Archuleta (UCSB), Tim Ahern (IRIS), Hans Chalupsky (ISI), Yolanda Gil (ISI). Project Manager: Phil Maechling (USC). 02/13/04

SCEC/CME Project (NSF SCEC/ITR Project). Goal: to develop a cyberinfrastructure that can support system-level earthquake science, the SCEC Community Modeling Environment (CME). Support: 5-yr project funded by the NSF/ITR program under the CISE and Geoscience Directorates. Start date: Oct 1, 2001. Participants span information science (ISI, SDSC) and earth science (USGS, IRIS, SCEC institutions). www.scec.org/cme 02/13/04

Seismic Hazard Analysis Definition: Specification of the maximum intensity of shaking expected at a site during a fixed time interval Example: National seismic hazard maps (http://geohazards.cr.usgs.gov/eq/) Intensity measure: peak ground acceleration (PGA) Interval: 50 years Probability of exceedance: 2% 02/13/04

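
The "2% probability of exceedance in 50 years" convention on this slide maps to an annual exceedance rate under the Poisson assumption that is standard for such hazard maps. A minimal sketch (the Poisson model is the usual convention; it is not stated on the slide itself):

```python
import math

def prob_of_exceedance(annual_rate: float, years: float) -> float:
    """P(at least one exceedance in `years`) under a Poisson model."""
    return 1.0 - math.exp(-annual_rate * years)

def annual_rate_from_prob(prob: float, years: float) -> float:
    """Annual exceedance rate implied by a probability over `years`."""
    return -math.log(1.0 - prob) / years

# 2% in 50 years corresponds to a return period of roughly 2475 years,
# the familiar figure behind national seismic hazard maps.
rate = annual_rate_from_prob(0.02, 50.0)
print(round(1.0 / rate))  # -> 2475
```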

Risk Analysis: A System-Level Problem. Risk = Probable Loss (lives & dollars) = Hazard × Exposure × Fragility. Hazard: faulting, shaking, landsliding, liquefaction. Exposure: extent & density of the built environment. Fragility: structural fragility. 02/13/04
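
The Hazard × Exposure × Fragility product on this slide becomes an expected annual loss once each factor is quantified. A toy sketch; all numbers are invented for illustration and do not come from the slides:

```python
# hazard: annual probability of each PGA level (g) being reached
hazard = {0.2: 0.10, 0.4: 0.02, 0.6: 0.005}
# fragility: mean fraction of value lost at each PGA level
fragility = {0.2: 0.01, 0.4: 0.10, 0.6: 0.35}
# exposure: replacement value of the built environment, in dollars
exposure = 1.0e9

# expected annual loss = sum over shaking levels of
# P(level) * damage ratio(level) * exposed value
annual_loss = sum(p * fragility[pga] * exposure for pga, p in hazard.items())
print(f"${annual_loss:,.0f}/yr expected loss")
```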

The FEMA 366 Report "HAZUS'99 Estimates of Annual Earthquake Losses for the United States", September 2000. U.S. annualized earthquake loss (AEL) is about $4.4 billion/yr. For 25 states, AEL > $10 million/yr. 74% of the total is concentrated in California; 25% is in Los Angeles County alone. 02/13/04
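
A quick arithmetic check of the percentages quoted from FEMA 366 (the dollar figures below are derived from the slide's numbers, not additional data):

```python
ael_us = 4.4e9                 # U.S. annualized earthquake loss, $/yr
ca_share = 0.74 * ael_us       # California: roughly $3.3 billion/yr
la_share = 0.25 * ael_us       # Los Angeles County: roughly $1.1 billion/yr
print(f"CA: ${ca_share/1e9:.2f}B/yr, LA County: ${la_share/1e9:.2f}B/yr")
```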

Pathway 1: Puente Hills M 7.1 Scenario. Map of peak ground acceleration (% g) across Los Angeles County, with Downtown LA marked; legend bins from 0-12 up to 60-72% g. 02/13/04

Three Global Geosystems: the Climate System (atmosphere, hydrosphere, cryosphere, biosphere), Mantle Convection (lithosphere, asthenosphere, deep mantle), and the Core Dynamo (outer core, inner core). 02/13/04

SHA Computational Pathways. (1) Standardized seismic hazard analysis: empirical relationships connect an Earthquake Forecast Model through an Attenuation Relationship to Intensity Measures. (2) Ground motion simulation. (3) Physics-based earthquake forecasting. (4) Ground-motion inverse problem: geology, geodesy, and other data are inverted to improve the Unified Structural Representation (faults, motions, stresses, anelastic model), which feeds the physics-based simulations. Simulation components: FSM = Fault System Model, RDM = Rupture Dynamics Model, AWM = Anelastic Wave Model, SRM = Site Response Model. 02/13/04

SCEC Community Modeling Environment: a collaboratory for system-level earthquake science. Knowledge Representation & Reasoning: Knowledge Server (knowledge base access, inference); Translation Services (syntactic & semantic translation); Knowledge Base containing Ontologies (curated taxonomies, relations & constraints) and Pathway Models (pathway templates, models of simulation codes). Digital Libraries: navigation & queries, versioning, replication; mediated collections with federated access; code repositories (FSM, RDM, AWM, SRM); data collections (data & simulation products). Knowledge Acquisition: acquisition interfaces (dialog planning, pathway construction strategies); Pathway Assembly (template instantiation, resource selection, constraint checking). Grid: pathway execution (policy, data ingest, repository access); grid services (compute & storage management, security); pathway instantiations on computing and storage resources. Users interact across all four layers. 02/13/04

SCEC/CME Computational Pathway Construction. A major SCEC/CME objective is the ability to construct and run complex computational pathways for SHA. Pathway 1 example: the ERF definition, IMR definition, gridded-region definition, and probability of exceedance define the scenario earthquake; hazard curves are calculated (9000 hazard-curve files, 9000 x 0.5 MB = 4.5 GB); an IMR value is extracted per site into a lat/long/amplitude xyz file with 3000 data points (~100 KB); and the hazard map is plotted using GMT map configuration parameters. 02/13/04
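
The Pathway 1 dataflow on this slide (hazard curves for a gridded region, extract one intensity value per curve at a target probability, plot the grid as a map) can be sketched end to end. The curve model below is a toy stand-in for a real ERF + IMR, and every number in it is invented:

```python
import math

def hazard_curve(site, imls):
    """Toy curve: exceedance probability falls off exponentially with IML."""
    x, y = site
    scale = 0.05 + 0.01 * (x + y)  # pretend hazard varies across the grid
    return [(iml, math.exp(-iml / scale)) for iml in imls]

def extract_iml(curve, target_prob):
    """Smallest IML whose exceedance probability drops to the target."""
    for iml, p in curve:
        if p <= target_prob:
            return iml
    return curve[-1][0]

sites = [(x, y) for x in range(3) for y in range(3)]   # gridded region
imls = [i * 0.05 for i in range(1, 21)]                # PGA bins, in g
grid = {s: extract_iml(hazard_curve(s, imls), 0.02) for s in sites}
# `grid` is the xyz data a GMT plotting step would render as the hazard map
print(grid[(0, 0)], grid[(2, 2)])
```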

Example Application of Pathway 1: Scenarios for M 7.4 Southern San Andreas Rupture. Without soil & basin effects vs. with soil & basin effects. Courtesy of Ned Field, USGS, Pasadena 02/13/04

SCEC Collaboratory for system-level earthquake science 02/13/04

Pathway Comparisons. The SCEC/CME computational testbed was used to generate PGV hazard maps using Pathway 1 and Pathway 2 data sets and SCSN observed data: Pathway 1 (all firm soil), Pathway 1 (SCEC CVM 3.0), Pathway 2 (Olsen AWM), and PGV data for Northridge from the SCSN system (Pathway 0). 02/13/04

SCEC IT Challenges. Many geophysical models, computational programs, data sets, and data types. A physics-based approach to earthquake modeling at high spatial-temporal resolution requires large-scale simulations, high-performance computing, and large-scale data management. The distributed SCEC collaboration requires communication tools, distributed model development, and computer-resource sharing. 02/13/04

SCEC/CME Research Areas. Geoscience research areas: Probabilistic Seismic Hazard Analysis; Anelastic Wave Propagation Modeling; Rupture Dynamics Modeling; Data Inversion. IT research areas: High Performance Computing; Grid; Digital Library; Knowledge Representation and Reasoning; 4D Data Visualization; Creation of Computational Pathways; Web Services; Data Integration; Data Standards; Community Computational Models. Outreach and education: undergraduate and graduate research opportunities; access for non-scientific users (emergency management, public). 02/13/04

Composition Analysis Tool (CAT) Interface User building a pathway specification from library of components Errors and fixes generated by ErrorScan algorithm 02/13/04

The SCEC Earthquake Simulation: SCEC ITR Collaboration

Major Earthquakes on the San Andreas Fault, 1680-present: ~1680 M 7.7, 1857 M 7.9, 1906 M 7.8. 02/13/04

TeraShake Simulation Area 02/13/04

33 researchers, 8 institutions: Southern California Earthquake Center, San Diego Supercomputer Center, Information Sciences Institute, Institute of Geophysics and Planetary Physics (UC), University of Southern California, San Diego State University, University of California Santa Barbara, Carnegie Mellon University, ExxonMobil. 02/13/04

TeraShake Peak Ground Velocity Maps NW to SE rupture SE to NW rupture 02/13/04

SCEC/CME Grid Infrastructure. SCEC/CME has established grid-based connectivity, job scheduling, and user and host authentication between SCEC, USC, ISI, SDSC, and PSC. Current SCEC Grid Testbed configuration: SCEC: epi.usc.edu (Sun E3800, 8 CPUs, 8 GB RAM). USC: gravity.usc.edu (Linux, 4 CPUs, 4 GB RAM); condor.usc.edu (Condor pool of 320 Sun workstations); hpc.usc.edu (IBM Linux cluster, 640 CPUs, 320 GB RAM); almaak.usc.edu (Sun Sunfire 15K, 64 CPUs, 256 GB RAM). SDSC: horizon.sdsc.edu (IBM Blue Horizon, 1152 CPUs, 576 GB RAM). ISI: pinto.isi.edu (Linux, 2 CPUs, 500 MHz, 380 MB RAM); giis.scec.org / scec-giis.isi.edu (Linux, 2 CPUs, 1 GHz, 1 GB RAM). PSC: sidecar.psc.edu (Linux, 1 CPU, 1 GB RAM). 02/13/04

SCEC Community Library. Data grid architecture using the SDSC Storage Resource Broker. Supports user-customizable portals and maintains associations between data and metadata. Current collections contain 1.6 million files (10 terabytes): 3D ground motion for the LA Basin (36 scenarios), rupture dynamics, and 4D wavefield data. http://www.sdsc.edu/SCEC 02/13/04
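
The data-to-metadata association this slide attributes to the SRB-based library is the key idea: products are registered with queryable metadata so collections can be navigated without knowing file paths. A minimal in-memory sketch; the field names and paths below are invented for illustration, not SRB's actual schema:

```python
# A toy metadata catalog: register products with attributes, query by them.
catalog = []

def register(path, **metadata):
    """Associate a stored file with arbitrary metadata attributes."""
    catalog.append({"path": path, **metadata})

def query(**criteria):
    """Return paths of all entries whose metadata match every criterion."""
    return [e["path"] for e in catalog
            if all(e.get(k) == v for k, v in criteria.items())]

register("/scec/la3d/run012/pgv.grd", product="PGV", scenario=12, region="LA Basin")
register("/scec/la3d/run013/pgv.grd", product="PGV", scenario=13, region="LA Basin")
register("/scec/rupdyn/case1/slip.bin", product="slip", region="SAF")
print(query(product="PGV", region="LA Basin"))
```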

SCEC UseIT Undergraduate Intern Program 02/13/04

LA3D GeoWall Software 02/13/04

Group Interaction and Collaboration Tools Applying communication tools to help with this distributed collaboration. Web Sites Software Configuration Management Tools Document Management Tools Bug Tracking Tools Data File and Metadata Tools Email Lists 02/13/04

Hosting of SCEC Community Models Provide access to SCEC Community Models, possibly alternative models, and utilities for working with the models. Community Fault Model (CFM-A) Community Velocity Model (CVM.3.0) Community Crustal Motion Map (CMM.3.0.1) 02/13/04

Validation Exercises for Simulation Codes Comparisons for 09/03/02 Yorba Linda Earthquake Data in black, SCEC CVM (FD) in blue, Harvard model (SEM) in red Comparison of Dynamic Rupture Models Rupture Test Case Contours 02/13/04

Hosting of SCEC Community Codes Provide access to SCEC geophysical codes Pathway 2: Olsen AWM CMU Hercules AWM Pathway 1: OpenSHA Pathway 4: Synthetic Seismograms Fréchet Kernels 02/13/04

Supporting and Running Large Scale Simulations 8 Processors (in 2002) 240 Processors (in 2004) 02/13/04

Establishment of SCEC Grid Infrastructure. SCEC/CME has established grid-based connectivity, job scheduling, and user and host authentication between SCEC, USC, ISI, SDSC, PSC, and TeraGrid sites. SCEC Grid Testbed: SCEC: epi.usc.edu (Sun E3800, 8 CPUs, 8 GB RAM). USC: gravity.usc.edu (Linux, 4 CPUs, 4 GB RAM); condor.usc.edu (Condor pool of 320 Sun workstations); hpc.usc.edu (IBM Linux cluster, 640 CPUs, 320 GB RAM); almaak.usc.edu (Sun Sunfire 15K, 64 CPUs, 256 GB RAM). SDSC: horizon.sdsc.edu (IBM DataStar, 1152 CPUs, 576 GB RAM). ISI: pinto.isi.edu (Linux, 2 CPUs, 500 MHz, 380 MB RAM); giis.scec.org / scec-giis.isi.edu (Linux, 2 CPUs, 1 GHz, 1 GB RAM). PSC: sidecar.psc.edu (Linux, 1 CPU, 1 GB RAM). TeraGrid: horizon.sdsc.edu (IBM Blue Horizon, 1152 CPUs, 576 GB RAM). 02/13/04

SCEC Digital Library: Providing Data Management Capabilities. Storage Resource Broker-based digital library. The collection now includes the SCEC/PEER Scenario Ground Motion data collection, the USC Green Tensors data collection (40 TB+ storage), the TeraShake simulations (40 TB+), and the Puente Hills simulation. SCEC Community Library workflow: select scenario (fault model, source model), select receiver (lat/lon), output time-history seismograms. 02/13/04

SDSC Data Visualization 02/13/04

ISI Data Visualization 02/13/04

Example SCEC and SCEC/CME IT-oriented Activities UseIT Intern Program Unified Structural Representation Ground Motion Prediction Communication, Education, Outreach OpenSHA Seismic Hazard Analysis Earthquake Forecasting 02/13/04

Example SCEC and SCEC/CME IT-oriented Activities CyberShake Waveform-based Seismic Hazard Analysis Unified Structural Representation Earthquake Rupture Dynamics Ground Motion Prediction Communication, Education, Outreach TeraShake 2 Dynamic Rupture-based Simulations 02/13/04

CyberShake Project Using 3D Synthetic Seismic Waveforms In Seismic Hazard Analysis 02/13/04

Various IMR Types (Subclasses). An Intensity-Measure Relationship takes Site(s) and a Rupture, with a list of supported IMTs and a list of site-related independent parameters. Attenuation Relationships: a Gaussian distribution is assumed; mean and standard deviation are computed from various parameters given the IMT and IML(s). Vector IMRs: compute the joint probability of exceeding multiple IMTs (Bazzurro & Cornell, 2002). Multi-Site IMRs: compute the joint probability of exceeding IML(s) at multiple sites (e.g., Wesson & Perkins, 2002). Simulation IMRs: exceedance probability computed using a suite of synthetic seismograms.
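
The "Gaussian distribution is assumed" line for attenuation relationships translates directly into an exceedance probability: with ground motion lognormally distributed, P(IM > IML) follows from the predicted mean and standard deviation in log space. A sketch with illustrative numbers (not a published relation's coefficients):

```python
import math

def prob_exceed(iml, mean_ln, sigma_ln):
    """P(IM > iml) for a lognormally distributed intensity measure IM."""
    z = (math.log(iml) - mean_ln) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # 1 - standard normal CDF

# Example: a predicted median PGA of 0.3 g with sigma_ln = 0.6.
# At the median itself, the exceedance probability is exactly 0.5.
p = prob_exceed(0.3, math.log(0.3), 0.6)
print(round(p, 2))  # -> 0.5
```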

Ruptures in the ERF within 200 km of USC 02/13/04
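
Behind this slide is a distance filter: the ERF's rupture list is cut down to those within 200 km of the site. A sketch using great-circle (haversine) distance; the rupture names and coordinates are invented examples, with only the USC location approximately real:

```python
import math

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in km (haversine, mean Earth radius 6371 km)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

USC = (34.02, -118.29)
ruptures = {"near_fault": (34.3, -118.5), "far_fault": (36.5, -121.0)}
nearby = [name for name, (la, lo) in ruptures.items()
          if km_between(*USC, la, lo) <= 200.0]
print(nearby)  # only the nearby rupture survives the 200 km cut
```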

CyberShake Computational Elements. Large (TeraShake-scale) forward calculations for each site; requires calculation of 100,000+ seismograms per site. The SCEC/CME grid-based scientific workflow system is required to work at this scale: access to distributed computing resources, large-scale file management, and high-performance and high-throughput computing. TeraGrid allocation awarded for this effort: 145K SUs (TG-BCS050001N). 02/13/04
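
The workflow-system requirement on this slide comes down to dependency management: each site's hazard result depends on a huge fan-out of seismogram tasks, which all depend on one large wavefield calculation. The toy scheduler below just orders tasks by dependency (Kahn's algorithm); the task names are invented, and real CyberShake runs used grid-based workflow tools over distributed resources, as the slide says:

```python
from collections import deque

def topo_order(deps):
    """Kahn's algorithm over a {task: [prerequisites]} mapping."""
    indeg = {t: len(p) for t, p in deps.items()}
    dependents = {t: [] for t in deps}
    for t, prereqs in deps.items():
        for p in prereqs:
            dependents[p].append(t)
    ready = deque(t for t, d in indeg.items() if d == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for d in dependents[t]:
            indeg[d] -= 1
            if indeg[d] == 0:
                ready.append(d)
    return order

# Tiny stand-in for the real fan-out (100,000+ seismograms per site):
deps = {"velocity_mesh": [], "wavefield_SGT": ["velocity_mesh"]}
deps.update({f"seismogram_{i}": ["wavefield_SGT"] for i in range(3)})
deps["hazard_curve"] = [f"seismogram_{i}" for i in range(3)]
order = topo_order(deps)
print(" -> ".join(order))
```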

End 02/13/04