
1 SCEC: An NSF + USGS Research Center. ShakeAlert CISN Testing Center (CTC) Development. Philip Maechling, Information Technology Architect, Southern California Earthquake Center (SCEC). 14 October 2010

2 CTC Progress in 2010
1. Operating an algorithm evaluation system with California-based performance reports and raw data available (2008-present).
2. Changed our automated software testing infrastructure from a web-based (Joomla) system to a server-based (CSEP) system.
3. Added a ShakeMap RSS reader to CSEP as a source of authorized observational data that will be used to evaluate earthquake parameter and ground motion forecasts.
4. Implemented a prototype EEW forecast evaluation test that plots the PGV used in ShakeMaps for each event.
5. Began nightly automated retrieval of observational data from the ShakeMap RSS feed to create observation-based ground motion maps (see the sketch after this list).
6. Started implementing the ground motion forecast evaluation defined in the 2008 CISN Testing Document.
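A minimal sketch of the nightly RSS retrieval in item 5, using only the Python standard library. The feed URL is a placeholder, and the item fields read here (title, link, pubDate) are generic RSS 2.0 elements, not a documented CTC interface.

```python
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.org/shakemap/rss.xml"  # placeholder, not the real feed

def fetch_shakemap_items(url=FEED_URL):
    """Download the RSS feed and return (title, link, pubDate) per item."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        root = ET.fromstring(resp.read())
    return [(item.findtext("title", default=""),
             item.findtext("link", default=""),
             item.findtext("pubDate", default=""))
            for item in root.iter("item")]

if __name__ == "__main__":
    for title, link, pub_date in fetch_shakemap_items():
        print(pub_date, title, link)
```

In the CTC pipeline, each item's link would then be followed to download the per-event ground-motion data used to build the observation-based maps.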

3 EEW Testing Center Provides Ongoing Performance Evaluation
Performance summaries are available through login (www.scec.org/eew). Evaluation results for 2010 include 144 M4+ earthquakes in the CA Testing Region. Cumulative raw summaries (2008-present) are posted at scec.usc.edu/scecpedia/Earthquake_Early_Warning.

4 CTC Progress in 2010 (progress checklist repeated from slide 2)

5 (Diagram) CISN EEW Testing Center (CTC) data flow: the ANSS Earthquake Catalog is retrieved and filtered to produce a filtered earthquake catalog. ShakeAlert earthquake parameter and ground motion forecasts, observed ANSS earthquake parameter and ground motion data, and ShakeMap RSS ground motion observations feed the CTC Forecast Evaluation Processing System, which runs evaluation tests comparing forecasts and observations and loads reports to the CTC web site for the CISN decision and user modules.
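The "Retrieve Data / Filter Catalog" stage in this diagram amounts to selecting qualifying events. A minimal sketch, assuming a simple bounding box as a stand-in for the actual California Testing Region polygon:

```python
from dataclasses import dataclass

@dataclass
class Event:
    event_id: str
    origin_time: str  # ISO-8601 origin time
    lat: float
    lon: float
    mag: float

# Illustrative bounding box roughly covering California; the real testing
# region is a polygon defined in the CISN testing documents.
REGION = {"lat_min": 32.0, "lat_max": 42.0, "lon_min": -125.0, "lon_max": -114.0}

def filter_catalog(events, m_min=3.0, region=REGION):
    """Keep events with M >= m_min inside the testing region."""
    return [e for e in events
            if e.mag >= m_min
            and region["lat_min"] <= e.lat <= region["lat_max"]
            and region["lon_min"] <= e.lon <= region["lon_max"]]
```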

6 (image-only slide)

7 (image-only slide)

8 CTC Progress in 2010 (progress checklist repeated from slide 2)

9 (image-only slide)

10 CTC Progress in 2010 (progress checklist repeated from slide 2)

11 CTC Progress in 2010: The current ShakeAlert CTC retrieves ShakeMap RSS data and plots observations for all M3.0+ earthquakes in the California Testing Region, as shown (left).

12 CTC Progress in 2010 (progress checklist repeated from slide 2)

13 (Diagram repeated from slide 5: CTC forecast evaluation data flow.)

14 CTC Progress in 2010 (progress checklist repeated from slide 2)

15 CTC Progress in 2010: The initial CTC evaluation test is defined in the 2008 CISN EEW Testing Document (as updated July 2010). The previous Algorithm Testing Center did not implement this summary. Access to ShakeMap RSS ground motion observations makes an automated implementation practical.

16 Scientific and Technical Coordination Issues
1. Prioritization of the forecast evaluation tests to be implemented.
2. SCEC science planning of EEW forecast evaluation experiments.
3. Use of EEW in time-dependent PSHA information.
4. Consider extending the ShakeMap format into a CAP-based forecast exchange format (see the sketch after this list): send forecast information (and time of report) to produce:
– ShakeMap Intensity Maps
– ShakeMap Uncertainty Maps
5. Consider ShakeAlert interfaces to support comparative EEW performance tests. Provide access to information for each trigger:
– Stations used in the trigger
– Stations available when declaring the trigger
– Software version declaring the trigger
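Item 4's CAP-based exchange format is only under consideration, so the following is a sketch of what wrapping a ShakeAlert forecast in an OASIS CAP 1.2 message might look like. The sender address and the parameter names (magnitude, latitude, longitude, reportTime) are invented for illustration, not an agreed schema.

```python
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

def forecast_to_cap(event_id, sent, mag, lat, lon, report_time):
    """Serialize one forecast as a CAP 1.2 alert (illustrative fields only)."""
    ET.register_namespace("", CAP_NS)
    alert = ET.Element("{%s}alert" % CAP_NS)
    for tag, text in [("identifier", event_id),
                      ("sender", "ctc@example.org"),  # hypothetical sender
                      ("sent", sent), ("status", "Test"),
                      ("msgType", "Alert"), ("scope", "Restricted")]:
        ET.SubElement(alert, "{%s}%s" % (CAP_NS, tag)).text = text
    info = ET.SubElement(alert, "{%s}info" % CAP_NS)
    for tag, text in [("category", "Geo"), ("event", "EEW Forecast"),
                      ("urgency", "Immediate"), ("severity", "Unknown"),
                      ("certainty", "Possible")]:
        ET.SubElement(info, "{%s}%s" % (CAP_NS, tag)).text = text
    for name, value in [("magnitude", "%.1f" % mag),
                        ("latitude", "%.4f" % lat),
                        ("longitude", "%.4f" % lon),
                        ("reportTime", report_time)]:
        param = ET.SubElement(info, "{%s}parameter" % CAP_NS)
        ET.SubElement(param, "{%s}valueName" % CAP_NS).text = name
        ET.SubElement(param, "{%s}value" % CAP_NS).text = value
    return ET.tostring(alert, encoding="unicode")
```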

17 End

18 Proposed CTC Evaluation Tests

19 Design of an Experiment
Rigorous CISN EEW testing will involve the following definitions (see the configuration sketch after this list):
– Define a forecast
– Define the testing area
– Define the input data used in forecasts
– Define the reference observation data
– Define the measures of success for forecasts
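These five definitions could be collected into a single configuration object per experiment. A sketch with illustrative values; the field names and the example algorithm and data-source strings are assumptions, not CTC settings.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ExperimentDefinition:
    forecast_name: str        # which algorithm's forecast is under test
    testing_region: str       # named region polygon
    m_min: float              # minimum magnitude entering the test
    input_data: str           # data the algorithm may use when forecasting
    reference_data: str       # authorized observations for scoring
    success_measures: tuple   # tests applied to forecast-observation pairs

example = ExperimentDefinition(
    forecast_name="ElarmS",   # hypothetical example algorithm
    testing_region="CA Testing Region",
    m_min=3.0,
    input_data="real-time CISN waveforms",
    reference_data="ANSS catalog + ShakeMap PGV",
    success_measures=("magnitude_error", "location_error", "timeliness"),
)
```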

20 Proposed Performance Measures
Summary reports for each M ≥ M-min. The key document is the 3 March 2008 testing document, which specifies six types of tests:
– Summary 1: Magnitude
– Summary 2: Location
– Summary 3: Ground Motion
– Summary 4: System Performance
– Summary 5: False Triggers
– Summary 6: Missed Triggers

21 Experiment Design
Summary 1.1: Magnitude X-Y Diagram
Measure of Goodness: Data points fall on the diagonal line.
Relevant: T2, T3, T4
Drawbacks: The timeliness element is not represented; unclear which magnitude estimate in the series should be used in the plot.
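A minimal plotting sketch of Summary 1.1; the magnitude pairs are placeholders for the per-event (ANSS catalog magnitude, EEW magnitude estimate) data.

```python
import matplotlib.pyplot as plt

observed = [3.2, 4.1, 4.8, 5.4, 6.0]   # placeholder catalog magnitudes
forecast = [3.5, 3.9, 4.6, 5.6, 5.7]   # placeholder EEW estimates

fig, ax = plt.subplots()
ax.scatter(observed, forecast)
ax.plot([3, 7], [3, 7], linestyle="--")  # perfect-forecast diagonal
ax.set_xlabel("Observed magnitude (ANSS)")
ax.set_ylabel("Forecast magnitude (EEW)")
ax.set_title("Summary 1.1: Magnitude X-Y Diagram")
plt.show()
```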

22 Experiment Design
Summary 1.2: Initial Magnitude Error by Magnitude
Measure of Goodness: Data points fall on the horizontal line.
Relevant: T2, T3, T4
Drawbacks: The timeliness element is not represented.

23 Experiment Design
Summary 1.3: Magnitude Accuracy by Update
Measure of Goodness: Data points fall on the horizontal line.
Relevant: T3, T4
Drawbacks: The timeliness element is not represented.

24 Proposed Performance Measures (six-summary list repeated from slide 20)

25 Experiment Design
Summary 2.1: Cumulative Location Errors
Measure of Goodness: Data points fall on the vertical zero line.
Relevant: T3, T4
Drawbacks: Does not consider magnitude accuracy or timeliness.
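The location error underlying Summary 2.1 is presumably the great-circle distance between forecast and catalog epicenters; a standard haversine sketch:

```python
import math

def epicentral_error_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in km between two epicenters."""
    earth_radius_km = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * earth_radius_km * math.asin(math.sqrt(a))
```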

26 Experiment Design
Summary 2.2: Magnitude and Location Error by Time After Origin
Measure of Goodness: Data points fall on the horizontal zero line.
Relevant: T3, T4
Drawbacks: Event-specific, not cumulative.

27 Proposed Performance Measures (six-summary list repeated from slide 20)

28 Experiment Design
Summary 3.1: Intensity Map Comparisons
Measure of Goodness: The forecast map matches the observed map.
Relevant: T4
Drawbacks: Not a quantitative result.

29 Experiment Design
Summary 3.2: Intensity X-Y Diagram
Measure of Goodness: Data points fall on the diagonal line.
Relevant: T1, T2, T4
Drawbacks: The timeliness element is not represented; unclear which intensity estimate in the series should be used in the plot (T3).

30 Experiment Design
Summary 3.3: Intensity Ratio by Magnitude
Measure of Goodness: Data points fall on the horizontal line.
Relevant: T1, T2, T4
Drawbacks: The timeliness element is not represented; unclear which intensity estimate in the series should be used in the plot.

31 Summary 3.3: Predicted-to-Observed Intensity Ratio by Distance and Magnitude
Measure of Goodness: Data points fall on the horizontal line.
Relevant: T1, T2, T4
Drawbacks: The timeliness element is not represented; unclear which intensity estimate in the series should be used in the plot.

32 Summary 3.3: Evaluate Conversion from PGV to Intensity
The group has proposed evaluating algorithms by comparing intensities, and it provides a formula for converting PGV to intensity.

33 Summary 3.4: Evaluate Conversion from PGV to Intensity
The group has proposed evaluating algorithms by comparing intensities, and it provides a formula for converting PGV to intensity.
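The slides do not say which conversion formula the group provides. One published candidate is the Wald et al. (1999) California relation used by ShakeMap for instrumental intensity, valid roughly for intensities V to IX; the sketch below assumes that relation.

```python
import math

def pgv_to_mmi(pgv_cm_s):
    """Instrumental intensity from PGV in cm/s (Wald et al., 1999)."""
    if pgv_cm_s <= 0:
        raise ValueError("PGV must be positive")
    return 3.47 * math.log10(pgv_cm_s) + 2.35

print(round(pgv_to_mmi(10.0), 1))  # 10 cm/s maps to about MMI 5.8
```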

34 Experiment Design
Summary 3.5: Statistical Error Distribution for Magnitude and Intensity
Measure of Goodness: No missed events or false alarms in the testing area.
Relevant: T4
Drawbacks: (none listed)

35 Experiment Design
Summary 3.6: Mean Time to First Location or Intensity Estimate (small blue plot)
Measure of Goodness: Peak of measures at zero.
Relevant: T1, T2, T3, T4
Drawbacks: Cumulative, and does not involve the accuracy of the estimates.

36 Proposed Performance Measures (six-summary list repeated from slide 20)

37 Experiment Design
No examples for the System Performance summary, defined as Summary 4.1: ratio of reporting versus non-reporting stations.

38 Proposed Performance Measures (six-summary list repeated from slide 20)

39 Experiment Design
Summary 5.1: Missed Event and False Alarm Map
Measure of Goodness: No missed events or false alarms in the testing area.
Relevant: T3, T4
Drawbacks: Definitions for missed events and false alarms must be developed; does not reflect timeliness.

40 Experiment Design
Summary 5.2: Missed Event and False Alarm Map
Measure of Goodness: No missed events or false alarms in the testing area.
Relevant: T3, T4
Drawbacks: Definitions for missed events and false alarms must be developed; does not reflect timeliness.
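A sketch of the event-matching step that Summaries 5.1 and 5.2 require: an alert counts as a hit if a catalog event of sufficient magnitude occurs within assumed time and distance windows; unmatched alerts are false alarms, and unmatched catalog events are missed events. The window sizes are illustrative stand-ins for the definitions the slide says must still be developed. Reuses epicentral_error_km from the Summary 2.1 sketch above.

```python
def classify_alerts(alerts, events, max_dt_s=60.0, max_dist_km=100.0, m_min=4.0):
    """alerts, events: lists of (time_s, lat, lon, mag) tuples.
    Returns (hits, false_alarms, missed_events)."""
    hits = []
    matched_alerts, matched_events = set(), set()
    for i, alert in enumerate(alerts):
        for j, event in enumerate(events):
            if j in matched_events or event[3] < m_min:
                continue
            close_in_time = abs(alert[0] - event[0]) <= max_dt_s
            close_in_space = epicentral_error_km(
                alert[1], alert[2], event[1], event[2]) <= max_dist_km
            if close_in_time and close_in_space:
                hits.append((alert, event))
                matched_alerts.add(i)
                matched_events.add(j)
                break
    false_alarms = [a for i, a in enumerate(alerts) if i not in matched_alerts]
    missed_events = [e for j, e in enumerate(events)
                     if j not in matched_events and e[3] >= m_min]
    return hits, false_alarms, missed_events
```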

41 Proposed Performance Measures (six-summary list repeated from slide 20)

42 Experiment Design
Summary 6.1: Missed Event Map
Measure of Goodness: No missed events in the testing region.
Relevant: T3, T4
Drawbacks: A missed event must be defined; does not indicate timeliness.

43 End

