Bayesian Model Selection and Multi-target Tracking
Presenters: Xingqiu Zhao and Nikki Hu
Joint work with M. A. Kouritzin, H. Long, J. McCrosky, and W. Sun

University of Alberta
Supported by NSERC, MITACS, PIMS, Lockheed Martin Naval Electronics and Surveillance System, Lockheed Martin Canada, and APR Inc.

Outline
1. Introduction
2. Simulation Studies
3. Filtering Equations
4. Markov Chain Approximations
5. Model Selection
6. Future Work

1. Introduction
Motivation: submarine tracking and fish farming.
Model:
- Signal: (1)

- Observation: (2)
Goal: to find the best estimates of the number of targets and the location of each target.
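The model equations (1) and (2) were rendered as images and did not survive this transcript. As a stand-in, the following is a minimal sketch of a generic multi-target signal/observation pair in the spirit of the slide: each target's state follows a random walk (the signal), and the sensor reports the state corrupted by Gaussian noise (the observation). All dynamics and parameter values here are illustrative assumptions, not the talk's actual model.

```python
import numpy as np

def simulate(k_targets=2, n_steps=100, signal_std=0.1, obs_std=0.5, seed=0):
    """Illustrative stand-in for signal (1) and observation (2):
    random-walk targets observed in additive Gaussian noise."""
    rng = np.random.default_rng(seed)
    x = np.zeros((n_steps, k_targets))  # signal: target positions
    y = np.zeros((n_steps, k_targets))  # observation: noisy positions
    for t in range(1, n_steps):
        x[t] = x[t - 1] + signal_std * rng.standard_normal(k_targets)  # signal step
        y[t] = x[t] + obs_std * rng.standard_normal(k_targets)         # noisy reading
    return x, y
```

The filtering problem is then to recover the number of targets and each path x from the observations y alone.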

2. Simulation Studies

3. Filtering Equations
Notation:
- : the space of bounded continuous functions on ;
- : the set of all càdlàg functions from into ;
- : the space of probability measures;
- : the space of positive finite measures on ;
- : the state space of .

Let,, and. Define

The generator of Let where.

For any, we define where and

Conditions:
C1. and satisfy the Lipschitz conditions.
C2.
C3.
C4.

Theorem 1. Equation (1) has a unique solution a.s., which is an -valued Markov process.

Bayes formula and filtering equations
Theorem 2. Suppose that C1-C3 hold. Then
(i)
(ii) where is the innovation process.
(iii)
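The formulas in Theorem 2 were rendered as images and are lost here. For reference only, the classical FKK (Fujisaki-Kallianpur-Kunita) filtering equation invoked by Theorem 3 below has the textbook form for a signal with generator $L$ and observation $dY_t = h(X_t)\,dt + dW_t$; the talk's reflecting-diffusion model may differ in detail:

```latex
d\pi_t(f) = \pi_t(Lf)\,dt
          + \bigl(\pi_t(hf) - \pi_t(h)\,\pi_t(f)\bigr)\, d\nu_t,
\qquad
\nu_t = Y_t - \int_0^t \pi_s(h)\,ds,
```

where $\pi_t(f) = E\bigl[f(X_t)\mid\mathcal{F}^Y_t\bigr]$ is the optimal filter and $\nu$ is the innovation process appearing in (ii).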

Uniqueness
Theorem 3. Suppose that C1-C4 hold. Let be an -adapted càdlàg process that is a solution of the Kushner-FKK equation where Then, for all, a.s.

Theorem 4. Suppose that C1-C4 hold. If is an -adapted -valued càdlàg process satisfying and then, for all, a.s.

4. Markov Chain Approximations
Step 1: Constructing a smooth approximation of the observation process.
Step 2: Dividing D and. Let, For, let

Note that if is a rearrangement of. Let then. For, let. For, with 1 in the i-th coordinate.

Step 3: Constructing the Markov chain approximations.
Method 1: Let. Set. One can find that and

Define as and for, define as let

Method 2: Let and, Then and Define as for.

Let as, take denote the integer part, set and let satisfy

Then the Markov chain approximation is given by
Theorem 5. in probability on for almost every sample path of.
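The construction above is specific to the talk's setting and its formulas are not recoverable from the transcript. As a generic illustration of the Markov chain approximation idea, Kushner's finite-difference scheme for a one-dimensional diffusion dX = b(X) dt + sigma dW builds, on a uniform grid, a transition matrix whose chain converges weakly to the diffusion as the grid step shrinks. The drift, grid, and step sizes below are illustrative assumptions.

```python
import numpy as np

def mc_approximation(drift, sigma2, grid, dt):
    """Kushner-style Markov chain approximation of a 1-D diffusion
    dX = b(X) dt + sigma dW on a uniform grid: returns the one-step
    transition matrix of the approximating chain."""
    n = len(grid)
    h = grid[1] - grid[0]                                   # uniform grid spacing
    P = np.zeros((n, n))
    for i, x in enumerate(grid):
        b = drift(x)
        p_up = (sigma2 / 2 + h * max(b, 0.0)) * dt / h**2   # jump right
        p_dn = (sigma2 / 2 + h * max(-b, 0.0)) * dt / h**2  # jump left
        if i + 1 < n:
            P[i, i + 1] = p_up
        if i > 0:
            P[i, i - 1] = p_dn
        P[i, i] = 1.0 - P[i].sum()                          # stay put (reflecting at edges)
    return P

# Example: Ornstein-Uhlenbeck drift b(x) = -x, sigma^2 = 1, on [-3, 3]
grid = np.linspace(-3.0, 3.0, 61)
P = mc_approximation(lambda x: -x, 1.0, grid, dt=0.005)
```

With spacing h = 0.1, the time step dt = 0.005 keeps every stay-probability nonnegative, so each row of P is a valid probability distribution.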

5. Model Selection
Assume that the possible number of targets is,. Model k:,. Which model is better?
Bayesian Factors
Define the filter ratio processes as

The Evolution of Bayesian Factors
Let and be independent, and let Y be a Brownian motion on some probability space.
Theorem 6. Let be the generator of,. Suppose that is continuous. Then is the unique measure-valued pair solution of the following system of SDEs, (3)

for, and (4) for, where is the optimal filter for model k, and

Markov chain approximations
Applying the method in Section 4, one can construct Markov chain approximations to equations (3) and (4).
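For intuition, a Bayes factor between two candidate models can also be approximated directly with particle filters: each filter yields an estimate of the model evidence p(y | model), and the Bayes factor is the ratio of the two evidences. The bootstrap filter, Gaussian noise levels, and the two candidate drifts below are illustrative assumptions, not the construction of equations (3) and (4).

```python
import numpy as np

def pf_log_evidence(y, drift, dt=0.1, sig=0.3, obs_std=0.5,
                    n_particles=500, seed=1):
    """Bootstrap particle filter estimate of log p(y | model), where the
    model's signal is the Euler scheme x' = x + drift(x)*dt + sig*sqrt(dt)*N."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)                   # initial particle cloud
    log_ev = 0.0
    for obs in y:
        x = x + drift(x) * dt + sig * np.sqrt(dt) * rng.standard_normal(n_particles)
        w = np.exp(-0.5 * ((obs - x) / obs_std) ** 2)      # Gaussian obs. likelihood
        log_ev += np.log(w.mean() / (obs_std * np.sqrt(2 * np.pi)))
        x = rng.choice(x, size=n_particles, p=w / w.sum()) # resample
    return log_ev

# Log Bayes factor of a mean-reverting vs. a driftless model for data y
y = np.zeros(20)                                           # toy observation record
log_bf = (pf_log_evidence(y, lambda x: -x)                 # model 1
          - pf_log_evidence(y, lambda x: 0.0 * x))         # model 2
```

A positive log_bf favors model 1; in the talk's setting each "model k" would carry k targets rather than a different drift.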

6. Future Work
- Number of targets is a random variable
- Number of targets is a random process