Vertically Integrated Seismological Analysis II: Inference (S31B-1713)
Nimar S. Arora, Stuart Russell, and Erik B. Sudderth
The Model

    #SeismicEvents ~ Poisson[TIME_DURATION * EVENT_RATE];
    IsEarthQuake(e) ~ Bernoulli(.5);
    EventLocation(e)
        If IsEarthQuake(e) ~ EarthQuakeDistribution()
        Else ~ UniformEarthDistribution();
    Magnitude(e) ~ Exponential(log(10)) + MIN_MAG;
    Distance(e, s) = GeographicalDistance(EventLocation(e), SiteLocation(s));
    IsDetected(e, s) ~ Logistic[SITE_COEFFS(s)](Magnitude(e), Distance(e, s), Distance(e, s)**2);
    #Arrivals(site = s) ~ Poisson[TIME_DURATION * FALSE_RATE(s)];
    #Arrivals(event = e, site = s)
        If IsDetected(e, s) = 1
        Else = 0;
    Time(a)
        If (event(a) = null) ~ Uniform(0, TIME_DURATION)
        Else = IASPEI-TIME(EventLocation(event(a)), SiteLocation(site(a))) + TimeRes(a);
    TimeRes(a) ~ Laplace(TIMLOC(site(a)), TIMSCALE(site(a)));
    Azimuth(a)
        If (event(a) = null) ~ Uniform(0, 360)
        Else = AddAngle(GeographicalAzimuth(EventLocation(event(a)), SiteLocation(site(a))), AzRes(a));
    AzRes(a) ~ Laplace(0, AZSCALE(site(a)));
    Slow(a)
        If (event(a) = null) ~ Uniform(0, 20)
        Else = IASPEI-SLOW(EventLocation(event(a)), SiteLocation(site(a))) + SlowRes(a);
    SlowRes(a) ~ Laplace(0, SLOSCALE);

Markov Chain Monte Carlo

The model, combined with the actual observations of the arrivals, defines a posterior probability density π(x) on the number, type, and locations of the seismic events, where x is a possible world. We use Markov chain Monte Carlo (MCMC; Gilks et al., 1996) methods to infer π(x). In other words, we sample from a Markov chain whose stationary distribution is π(x). To construct this Markov chain, we design moves that transition between possible worlds in the hypothesis space; a generic sketch of the resulting sampler follows this list:
- The birth and death moves create new events and destroy them, respectively.
- The switch-arrival move changes the event associated with an arrival.
- The random-walk move changes the location and other parameters of an event.
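These moves plug into a standard Metropolis-Hastings acceptance loop. Below is a minimal Python sketch of such a sampler, included for illustration only; it is not the VISA implementation, and the world representation, the move functions, and log_posterior are hypothetical placeholders.

    import math
    import random

    def mh_sampler(world, log_posterior, moves, num_iters=10000):
        """Generic Metropolis-Hastings loop over possible worlds.

        world         -- initial hypothesis: a set of events plus the
                         event/arrival associations (any representation)
        log_posterior -- returns log pi(x) up to an additive constant
        moves         -- proposal functions; each takes a world and returns
                         (new_world, log_q_ratio), where log_q_ratio is
                         log q(x | x') - log q(x' | x)
        """
        samples = []
        logp = log_posterior(world)
        for _ in range(num_iters):
            propose = random.choice(moves)  # birth, death, switch-arrival, or random walk
            new_world, log_q_ratio = propose(world)
            new_logp = log_posterior(new_world)
            # Accept with probability min(1, pi(x')/pi(x) * q(x|x')/q(x'|x)).
            if math.log(random.random()) < new_logp - logp + log_q_ratio:
                world, logp = new_world, new_logp
            samples.append(world)
        return samples

The birth and death moves change the number of events, so their log_q_ratio must account for that dimension change; in this model the birth proposal density is obtained by inverting arrivals, as described in the MCMC example below.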
Limitations of the Model
- Based on arrivals identified by automated station processing (i.e., not based on waveforms, yet!).
- Relies only on the first P arrival.

Dataset
- 76 days of parametric data (i.e., arrivals marked by automated station processing) for training.
- 7 days of validation data (results below).
- 7 days of test data (not currently used).

Analysis of Errors
- The Markov chain is not converging fast enough; we need better moves to avoid local minima.
- Automated station processing has a systematic bias toward picking arrivals late; we need to build models directly on waveforms.

MCMC Example
The initial world has a number of spurious events; its proposal density is constructed by inverting the arrivals. The death move quickly kills off most of the spurious events. Over some iterations, all the events are proposed, but their locations may not be very good. Gradually, due to the random-walk and switch-arrival moves, the locations of all the events are improved. The samples collected from the Markov chain can then be used to infer the posterior density.

Evaluation
We assume that LEB (the human-annotated bulletin) is the ground truth. We evaluate our system by comparing it against the performance of SEL3 (the current automated bulletin), using the same arrivals as are available to SEL3. The predictions are evaluated by computing a min-cost max-cardinality matching of the predicted events with the ground-truth events, where the cost of an edge is the distance between the predicted and the true event locations. Any edge with more than 50 seconds or 5 degrees of error is excluded from the matching. We report precision (the percentage of predicted events that are matched), recall (the percentage of true events that are matched), F1 (the harmonic mean of precision and recall), and the average cost of the matching. A sketch of this scoring procedure follows the results table.

Results

                                  F1     Precision / Recall     Error / S.D. (km)     Average log-likelihood
    SEL3 (IDC Automated)          …      … / …                  … / …                 …
    VISA (Best Start)             …      … / …                  … / …                 …
    VISA (SEL3 Start)             …      … / …                  … / …                 …
    VISA (Back-projection Start)  …      … / …                  … / …                 …
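To make the matching-based scoring concrete, here is a minimal Python sketch. It approximates the min-cost max-cardinality matching by assigning infeasible prediction/truth pairs a prohibitively large cost and solving a linear assignment problem with SciPy; the (lon, lat, time) event format and the helper below are illustrative assumptions, not the poster's data structures.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    BIG = 1e9  # cost for infeasible pairs; never chosen while a feasible edge remains

    def great_circle_degrees(a, b):
        """Great-circle separation of two (lon, lat) points, in degrees."""
        lon1, lat1, lon2, lat2 = map(np.radians, (a[0], a[1], b[0], b[1]))
        cos_d = (np.sin(lat1) * np.sin(lat2)
                 + np.cos(lat1) * np.cos(lat2) * np.cos(lon1 - lon2))
        return np.degrees(np.arccos(np.clip(cos_d, -1.0, 1.0)))

    def score(predicted, truth, max_time_err=50.0, max_dist_err=5.0):
        """Match predicted to true events; report precision, recall, F1, avg cost.

        predicted, truth -- lists of (lon, lat, time) tuples (assumed format).
        An edge is feasible only if the time error is <= 50 s and the location
        error is <= 5 degrees; its cost is the location error.
        """
        cost = np.full((len(predicted), len(truth)), BIG)
        for i, p in enumerate(predicted):
            for j, t in enumerate(truth):
                dist = great_circle_degrees(p, t)
                if abs(p[2] - t[2]) <= max_time_err and dist <= max_dist_err:
                    cost[i, j] = dist
        rows, cols = linear_sum_assignment(cost)  # min-cost assignment
        matched = [(i, j) for i, j in zip(rows, cols) if cost[i, j] < BIG]
        precision = len(matched) / max(len(predicted), 1)
        recall = len(matched) / max(len(truth), 1)
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        avg = (sum(cost[i, j] for i, j in matched) / len(matched)
               if matched else float('nan'))
        return precision, recall, f1, avg

Because any feasible edge costs at most 5 degrees while an infeasible one costs 1e9, the solver uses as many feasible edges as possible before resorting to infeasible ones, so after filtering out BIG edges the result is a maximum-cardinality matching of minimum cost.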