WGCEP Workshop What Represents Best Available Science in terms of Time-Dependent Earthquake Probabilities? Introduction by Ned Field.

Best Available Science?
- Poisson Model (long-term rates)
- Quasi-Periodic Recurrence Models: BPT renewal; time- or slip-predictable
- Static-Stress Interaction Models: clock change; BPT-step; Rate & State; clock change w/ Rate & State; Hardebeck (2004) approach
- Empirical Rate-Change Models
- Clustering Models: foreshock/aftershock statistics (e.g., STEP; ETAS)
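The contrast between the Poisson and quasi-periodic options above can be made concrete. A minimal sketch (stdlib Python only; the segment parameters are hypothetical, and the BPT distribution is implemented via its equivalent inverse-Gaussian CDF):

```python
import math

def _phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mean, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence `mean` and aperiodicity `alpha`."""
    if t <= 0.0:
        return 0.0
    u = math.sqrt(mean / t) / alpha
    return (_phi(u * (t / mean - 1.0))
            + math.exp(2.0 / alpha**2) * _phi(-u * (t / mean + 1.0)))

def bpt_conditional_prob(elapsed, horizon, mean, alpha):
    """P(event in [elapsed, elapsed + horizon] | no event in [0, elapsed])."""
    f0 = bpt_cdf(elapsed, mean, alpha)
    f1 = bpt_cdf(elapsed + horizon, mean, alpha)
    return (f1 - f0) / (1.0 - f0)

def poisson_prob(horizon, mean):
    """Time-independent probability of at least one event in `horizon`."""
    return 1.0 - math.exp(-horizon / mean)

# Hypothetical segment: 150-yr mean recurrence, aperiodicity 0.5,
# 140 yr since the last rupture, 30-yr forecast window.
p_bpt = bpt_conditional_prob(140.0, 30.0, 150.0, 0.5)
p_poisson = poisson_prob(30.0, 150.0)
```

Because the elapsed time is close to the mean recurrence, the renewal probability exceeds the Poisson probability here; early in the cycle the ordering reverses.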

Summary (in brief) of previous Working Groups on California Earthquake Probabilities (WGCEP, 1988, 1990, 1995, 2002): They generally segmented faults and applied elastic-rebound-theory-motivated (quasi-periodic) renewal models to define time-dependent earthquake probabilities...

Reid's (1910) Elastic Rebound Hypothesis: [diagram: stress builds during loading and drops at each earthquake (EQ), repeating through time]

Reid's (1910) Elastic Rebound Hypothesis: [diagram: as the system becomes noisier, recurrence goes from perfectly periodic to a spread of times described by a lognormal or BPT distribution]

WGCEP 1988: Divided the San Andreas, San Jacinto, Hayward, and Imperial Faults into segments and assumed each ruptures only in a single-magnitude ("characteristic") earthquake.

COV_I = 0.2. Mean recurrence interval from:
1) Average of those observed previously.
2) Slip in last event divided by slip rate.
3) Average slip divided by slip rate.
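Estimates 2) and 3) are simple arithmetic: slip per event divided by long-term slip rate. A sketch with hypothetical numbers for a single fault segment (none of these values are from the slides):

```python
# Hypothetical segment data.
slip_rate_mm_per_yr = 25.0    # long-term fault slip rate
slip_last_event_m = 3.5       # slip in the most recent rupture
avg_slip_per_event_m = 4.0    # average slip over several ruptures

def recurrence_from_slip(slip_m, slip_rate_mm_per_yr):
    """Recurrence interval (yr) = slip per event (m) / slip rate (mm/yr)."""
    return slip_m * 1000.0 / slip_rate_mm_per_yr

t_last = recurrence_from_slip(slip_last_event_m, slip_rate_mm_per_yr)
t_avg = recurrence_from_slip(avg_slip_per_event_m, slip_rate_mm_per_yr)
```

With these numbers, method 2) gives 140 yr and method 3) gives 160 yr; the spread between the estimators is one source of the uncertainty captured by COV_I.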

WGCEP 1990: Updated WGCEP (1988) for the San Francisco Bay Area in light of the 1989 Loma Prieta earthquake (and some new data); e.g., applied a clock change to account for the influence of Loma Prieta on the Peninsula segment (segment #3).
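The clock change used here has a simple form: a static stress step from a nearby earthquake is converted into an equivalent shift along the segment's loading curve by dividing by the tectonic stressing rate. A minimal sketch (the numerical values are hypothetical, not those used by WGCEP 1990):

```python
def clock_change_yr(stress_step_mpa, stressing_rate_mpa_per_yr):
    """Equivalent time shift (yr) from a static stress step.
    Positive advances the clock toward failure; negative
    (a stress shadow) retards it."""
    return stress_step_mpa / stressing_rate_mpa_per_yr

# Hypothetical: a 0.1 MPa stress decrease on a segment loaded at
# 0.01 MPa/yr delays the expected failure time by 10 yr.
delay = clock_change_yr(-0.1, 0.01)
```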

WGCEP 1995
Focused on southern California (SCEC's Phase II report). Innovations:
1) Updated WGCEP (1988) segment probabilities (COV_I = 0.5 ± 0.2)
2) Allowed neighboring segments to sometimes rupture together as "cascade" events
3) Included lesser faults and background seismicity (to account for unknown faults)

WGCEP 1995
Problem: predicted twice as many magnitude 6-7 events as have been observed historically, which led to a lively debate on this apparent earthquake "deficit":
- Need to allow "huge" events (M ≥ 8) potentially anywhere (Jackson, 1996)
- But such events would leave obvious scars (Schwartz, 1996; Hough, 1996)
The problem results from several factors, and solutions exist (e.g., Stirling and Wesnousky, 1997; Stein and Hanks, 1998; Field et al., 1999).
Note: Jackson and Schwartz were both part of the working group, implying a lack of "consensus" … RELM

WGCEP 2002
Focused on the Bay Area. Other innovations:
1) Updated WGCEP (1990) segment probabilities based on a more elaborate earthquake rate model; allowed cascades
2) "Consensus process" rather than consensus model
3) Extensive treatment of epistemic uncertainties (logic-tree branches)

Current Working Group on California Earthquake Probabilities (WGCEP) Development of a Uniform California Earthquake Rupture Forecast (UCERF)

22% of our funding comes from the California Earthquake Authority (CEA)

California Earthquake Authority (CEA):
- Northridge caused 93% of insurers to halt or significantly reduce coverage; the CEA was created (via state legislation) to resolve the crisis.
- The CEA is a privately financed, publicly managed (and tax-exempt) organization that offers basic earthquake insurance for California homeowners and renters.
- It is governed by the CA Governor, Treasurer, Insurance Commissioner, Speaker of the Assembly, and Chairperson of the Senate Rules Committee.
- CEA policies are sold only through participating insurance companies (two-thirds of California homeowners policies) and carry a 15% deductible.
- Today the CEA has $7.2 billion to pay claims.
- The CEA is required by law to use "best-available science."

Best Available Science? California Insurance Code section (a) "Rates shall be based on the best available scientific information for assessing the risk of earthquake frequency, severity and loss.” “Scientific information from geologists, seismologists, or similar experts shall not be conclusive to support the establishment of different rates … unless that information, as analyzed by experts such as the United States Geological Survey, the California Division of Mines and Geology, and experts in the scientific or academic community, clearly shows a higher risk of earthquake frequency, severity, or loss between those most populous rating territories to support those differences.”

"Seismic Event"
"Seismic Event" means one or more earthquakes that occur within a 360-hour period. The seismic event commences upon the initial earthquake, and all earthquakes or aftershocks that occur within the 360 hours immediately following the initial earthquake are considered, for purposes of this policy, to be part of the same seismic event. (from page 6 of CEA's "Basic Earthquake Policy--Homeowners" doc)
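The 360-hour rule lends itself to a simple grouping procedure. A sketch (a hypothetical helper, not CEA code): given event times in hours, each seismic event's window is anchored at its initial earthquake, per the policy wording above.

```python
def group_seismic_events(times_hr, window_hr=360.0):
    """Group chronologically sorted earthquake times (hours) into
    "seismic events": an event starts with an initial quake, and every
    quake within `window_hr` of that *initial* quake belongs to it."""
    events = []
    for t in sorted(times_hr):
        if events and t - events[-1][0] <= window_hr:
            events[-1].append(t)   # within 360 h of the initial quake
        else:
            events.append([t])     # starts a new seismic event
    return events

# Quakes at 0, 100, and 350 h share one event (all within 360 h of
# t = 0); the quake at 400 h starts a new event even though it is
# within 360 h of the quake at 350 h, because the window is anchored
# at the initial quake, not at the most recent one.
groups = group_seismic_events([0.0, 100.0, 350.0, 400.0])
```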

WGCEP Goals: UCERF
- Provide the California Earthquake Authority (CEA) with a statewide, time-dependent ERF that uses "best available science," is endorsed by the USGS, CGS, and SCEC, and is evaluated by a Scientific Review Panel (SRP), CEPEC, and NEPEC
- Coordinated with the next National Seismic Hazard Mapping Program (NSHMP) time-independent model
- Will be used by CEA to set earthquake insurance rates (they want 5-year forecasts, maybe 1-year in the future)

WGCEP Organization & Funding Sources [org chart]:
- Sources of WGCEP funding: NSF, CEA, USGS, State of CA
- Geoscience organizations: USGS (Menlo Park and Golden), CGS, SCEC. SCEC will provide CEA with a single-point interface to the project.
- Management oversight committee (MOC): Thomas H. Jordan (SCEC, chair), Rufus Catchings (USGS, Menlo Park), Jill McCarthy (USGS, Golden), Michael Reichle (CGS)
- Working group leadership (WGCEP ExCom): Ned Field (USGS, chair), Thomas Parsons (USGS, Menlo Park), Chris Wills (CGS), Ray Weldon (U of O), Mark Petersen (USGS, Golden), Ross Stein (USGS, Menlo Park)
- Scientific review panel (SRP): Bill Ellsworth (chair), Art Frankel, David Jackson, Steve Wesnousky, Lloyd Cluff, Allin Cornell, Mike Blanpied, David Schwartz
- Task-oriented subcommittees (Subcom. A, B, C, …), plus many others

Delivery Schedule
- February 8, 2006 (to CEA): UCERF 1.0 & S. SAF Assessment
- Aug 31, 2006 (to CEA): Fault Section Database 2.0; Earthquake Rate Model 2.0 (preliminary for NSHMP)
- April 1, 2007 (to NSHMP): Revised Earthquake Rate Model 2.x (for use in 2007 NSHMP revision)
- September 30, 2007 (to CEA): UCERF 2 (reviewed by SRP, NEPEC, and CEPEC)

Important Lessons from Previous WGCEPs:
1) Everything takes longer than you expect
- Some planned-on innovations won't pan out
- Focus on what's important
2) There will be problems with the final model
- The best time to solve these problems is right away (while fresh in the mind)
- Burnout makes this problematic
Thus: plan for both the near and long term (e.g., build a living, adaptive, extensible infrastructure)

How do we plan to achieve this? Deploy as an extensible, adaptive (living) model, i.e., modifications can be made as warranted by scientific developments, the collection of new data, or the occurrence of significant earthquakes. The model can be "living" to the extent that the update & evaluation process can occur in short order.

UCERF Model Components (generalization of WGCEP-2002): Fault Model(s) → Deformation Model(s) → Earthquake Rate Model(s) → Earthquake Prob Model(s), with a "black box" computation between each pair.

Object-Oriented (Modular) Framework, building on OpenSHA: both the computer code and the models and/or applications.

UCERF Model Components (generalization of WGCEP-2002): Fault Model(s) → Deformation Model(s) [fault-slip rates, at least] → Earthquake Rate Model(s) [long-term rate of all possible events, on and off modeled faults] → Earthquake Prob Model(s) [time-dependent probabilities], with a "black box" computation between each pair.
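The four-component pipeline can be sketched as pluggable stages. This is a toy Python sketch of the architecture described above, not OpenSHA's actual API; the class and function names, the fixed event size, and all numbers are illustrative assumptions:

```python
import math
from dataclasses import dataclass

@dataclass
class FaultSection:
    name: str
    area_km2: float
    slip_rate_mm_yr: float = 0.0   # filled in by the deformation model

SHEAR_MODULUS = 3.0e10  # Pa, a standard crustal value

def deformation_model(faults, slip_rates):
    """Black box #1: attach slip rates (mm/yr) to fault sections."""
    for f in faults:
        f.slip_rate_mm_yr = slip_rates[f.name]
    return faults

def rate_model(faults, moment_per_event_nm):
    """Black box #2: convert each section's moment rate into a long-term
    event rate (events/yr), assuming all moment is released in
    characteristic events of one fixed size (a gross simplification)."""
    rates = {}
    for f in faults:
        # moment rate (N·m/yr) = mu * area (m^2) * slip rate (m/yr)
        moment_rate = SHEAR_MODULUS * f.area_km2 * 1e6 * f.slip_rate_mm_yr * 1e-3
        rates[f.name] = moment_rate / moment_per_event_nm
    return rates

def prob_model(rates, duration_yr):
    """Black box #3: Poisson probabilities from long-term rates
    (a time-dependent model would plug in here instead)."""
    return {k: 1.0 - math.exp(-r * duration_yr) for k, r in rates.items()}

# Hypothetical single-section run.
faults = deformation_model([FaultSection("A", 1000.0)], {"A": 20.0})
rates = rate_model(faults, 3.5e19)      # ~M7 moment per event
probs = prob_model(rates, 30.0)
```

The point of the sketch is the interface: any stage can be swapped out (e.g., a BPT probability model for the Poisson one) without touching the others, which is what "modular" buys in the slides that follow.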

UCERF Model Components: the same Fault Model(s) → Deformation Model(s) → Earthquake Rate Model(s) → Earthquake Prob Model(s) chain, with supporting databases feeding in: Fault Section Database, Paleo Sites Database, GPS Database, Historical Earthquake Catalog, Instrumental Earthquake Catalog.

UCERF Model Components: the object-oriented (modular) framework makes logic trees very easy to handle …

OpenSHA Hazard Curve Calculator: WGCEP-2002 hazard curves (Field et al., 2005, SRL) computed across all 10,000 WGCEP-2002 logic-tree branches using distributed object technologies (Maechling et al., 2005, SRL). Now with NGAs & ERM 2.2 also.

Issues with Logic Trees
1) They take time and resources to implement and document
2) Must be careful about correlations (Page and Carlson, 2006, BSSA)
3) Is anyone using them?
4) How do we know which branches are important or worth pursuing (especially in terms of loss)?
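To make the branch-count concern concrete, here is a sketch of exhaustive logic-tree enumeration. The node names, choices, and weights are hypothetical (the actual WGCEP-2002 tree had 10,000 branches), and the sketch deliberately treats nodes as independent, which is exactly the correlation pitfall noted above:

```python
from itertools import product

# Each node: list of (choice, weight); weights at a node sum to 1.
logic_tree = {
    "deformation_model": [("D2.1", 0.5), ("D2.2", 0.3), ("D2.3", 0.2)],
    "mag_area_relation": [("relA", 0.5), ("relB", 0.5)],
    "prob_model":        [("Poisson", 0.3), ("BPT", 0.7)],
}

def enumerate_branches(tree):
    """Yield (branch_dict, weight) for every path through the tree,
    assuming independent nodes (ignoring correlations between them)."""
    names = list(tree)
    for combo in product(*(tree[n] for n in names)):
        weight = 1.0
        branch = {}
        for name, (choice, w) in zip(names, combo):
            branch[name] = choice
            weight *= w
        yield branch, weight

branches = list(enumerate_branches(logic_tree))
# 3 * 2 * 2 = 12 branches here; the count multiplies with every node,
# which is why a full tree balloons into thousands of branches.
```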

Demo Loss Calculator?

Logic Trees: What we need is not all possible branches, but the minimum number of branches that span the range of viability and importance.

Best Available Science?
- Poisson Model (long-term rates)
- Quasi-Periodic Recurrence Models: BPT renewal; time- or slip-predictable
- Static-Stress Interaction Models: clock change; BPT-step; Rate & State; clock change w/ Rate & State; Hardebeck (2004) approach
- Empirical Rate-Change Models
- Clustering Models: foreshock/aftershock statistics (e.g., STEP; ETAS)