1
The Unscented Particle Filter 2000/09/29 이 시은
2
Introduction Filtering – estimate the states (parameters or hidden variables) as a set of observations becomes available on-line To solve it – model the evolution of the system and the noise Resulting models – non-linearity and non-Gaussian distributions
3
Extended Kalman Filter – linearizes the measurement and evolution models using a Taylor series expansion Unscented Kalman Filter – still does not apply to general non-Gaussian distributions Sequential Monte Carlo methods: particle filters – represent the posterior distribution of the states – any statistical estimate can be computed from it – deal with non-linearities and non-Gaussian distributions
4
Particle Filter – relies on importance sampling – hinges on the design of the proposal distribution Proposals for the Particle Filter – EKF Gaussian approximation – UKF proposal: can control the rate at which the tails go to zero, giving heavier-tailed distributions
5
Dynamic State Space Model Transition equation and measurement equation: x_t = f(x_{t-1}, v_{t-1}), y_t = g(x_t, n_t) Goal – approximate the posterior p(x_{0:t} | y_{1:t}) – and one of its marginals, the filtering density p(x_t | y_{1:t}), recursively
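To make this concrete, a minimal Python sketch of one such state space model; the particular f, g, and noise variances below are illustrative assumptions, not taken from the slides:

import numpy as np

rng = np.random.default_rng(0)

def f(x, v):                        # transition equation: x_t = f(x_{t-1}, v_{t-1})
    return 0.5 * x + 25 * x / (1 + x**2) + v

def g(x, n):                        # measurement equation: y_t = g(x_t, n_t)
    return x**2 / 20 + n

T = 50
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = f(x[t-1], rng.normal(0, np.sqrt(10)))   # process noise v ~ N(0, 10)
    y[t] = g(x[t], rng.normal(0, 1))               # measurement noise n ~ N(0, 1)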
6
Extended Kalman Filter MMSE estimator based on a first-order Taylor expansion of the non-linear f and g around the current estimate of the state
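A minimal scalar EKF step for the toy model above, to illustrate the linearization; F and G are the derivatives of f and g at the current estimate, and Q, R are the assumed noise variances:

def ekf_step(m, P, y_t, Q=10.0, R=1.0):
    # Predict: linearize f around the current estimate m.
    F = 0.5 + 25 * (1 - m**2) / (1 + m**2)**2      # df/dx at m
    m_pred = 0.5 * m + 25 * m / (1 + m**2)
    P_pred = F * P * F + Q
    # Update: linearize g around the predicted mean.
    G = m_pred / 10                                 # dg/dx at m_pred
    S = G * P_pred * G + R                          # innovation variance
    K = P_pred * G / S                              # Kalman gain
    m_new = m_pred + K * (y_t - m_pred**2 / 20)
    P_new = (1 - K * G) * P_pred
    return m_new, P_new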
7
Unscented Kalman Filter Does not approximate the non-linear process and observation models Uses the true non-linear models and instead approximates the distribution of the state random variable Based on the unscented transformation
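A sketch of the unscented transformation in its original (kappa-only) parametrization: propagate 2n+1 deterministically chosen sigma points through a non-linearity and recover the output mean and covariance; the function name and kappa value are illustrative:

def unscented_transform(mean, cov, func, kappa=1.0):
    # mean: (n,) array, cov: (n, n) array, func: R^n -> R^m.
    n = mean.shape[0]
    S = np.linalg.cholesky((n + kappa) * cov)
    sigma = np.vstack([mean, mean + S.T, mean - S.T])   # 2n+1 sigma points
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    ys = np.array([func(s) for s in sigma])             # propagate each point
    y_mean = w @ ys
    d = ys - y_mean
    y_cov = (w[:, None] * d).T @ d                      # weighted outer products
    return y_mean, y_cov

# Example: push a standard 2-D Gaussian through an elementwise sine.
m, C = unscented_transform(np.zeros(2), np.eye(2), np.sin)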
8
Particle Filtering Does not require a Gaussian approximation Many variations, all based on sequential importance sampling – which degenerates with time Hence they include a resampling stage
9
Perfect Monte Carlo Simulation A set of weighted particles (samples) drawn from the posterior: p(x_{0:t} | y_{1:t}) ≈ (1/N) Σ_i δ(x_{0:t} − x_{0:t}^(i)) Expectations then reduce to sample averages: E[f(x_{0:t})] ≈ (1/N) Σ_i f(x_{0:t}^(i))
11
Bayesian Importance Sampling It is usually impossible to sample directly from the posterior, so sample instead from an easy-to-sample proposal distribution q(x_{0:t} | y_{1:t}) and correct with importance weights w(x_{0:t}) ∝ p(x_{0:t} | y_{1:t}) / q(x_{0:t} | y_{1:t})
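A self-contained sketch of self-normalized importance sampling; the unnormalized two-component Gaussian mixture target and the wide Gaussian proposal are illustrative choices:

def p_unnorm(x):                        # unnormalized target density
    return 0.3 * np.exp(-0.5 * (x - 2)**2) + 0.7 * np.exp(-0.5 * (x + 1)**2)

def q_pdf(x, mu=0.0, sig=3.0):          # proposal density (Gaussian)
    return np.exp(-0.5 * ((x - mu) / sig)**2) / (sig * np.sqrt(2 * np.pi))

N = 10000
xs = rng.normal(0.0, 3.0, N)            # sample from the proposal
w = p_unnorm(xs) / q_pdf(xs)            # importance ratios
w /= w.sum()                            # normalize (target is unnormalized)
est = np.sum(w * xs)                    # estimate of E_p[x], true value -0.1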
13
Asymptotic convergence and a central limit theorem hold for the importance sampling estimate under the following assumptions: – the x_{0:t}^(i) are i.i.d. samples drawn from the proposal, the support of the proposal includes the support of the posterior, and the estimate I(f) exists and is finite – the expectations of f and of f² times the importance ratio exist and are finite
14
Sequential Importance Sampling Proposal distribution assumptions – the states form a Markov process – the observations are conditionally independent given the states – the proposal then factorizes as q(x_{0:t} | y_{1:t}) = q(x_{0:t-1} | y_{1:t-1}) q(x_t | x_{0:t-1}, y_{1:t}), so the weights can be updated recursively
15
– if we can sample from the proposal and evaluate the likelihood and transition probabilities, we can generate an initial set of samples and iteratively compute the importance weights: w_t^(i) ∝ w_{t-1}^(i) · p(y_t | x_t^(i)) p(x_t^(i) | x_{t-1}^(i)) / q(x_t^(i) | x_{0:t-1}^(i), y_{1:t})
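Continuing the toy model sketch, one SIS step where the proposal is taken to be the transition prior, so the weight update collapses to multiplication by the likelihood p(y_t | x_t); the function name is an illustrative assumption, and f and rng come from the earlier sketch:

def sis_step(particles, weights, y_t, Q=10.0, R=1.0):
    # Sample x_t^(i) ~ p(x_t | x_{t-1}^(i)) by pushing each particle through f.
    particles = f(particles, rng.normal(0, np.sqrt(Q), particles.shape))
    # Weight update for this proposal: w_t = w_{t-1} * p(y_t | x_t),
    # with p(y_t | x_t) Gaussian around g(x_t, 0) = x_t^2 / 20.
    lik = np.exp(-0.5 * (y_t - particles**2 / 20)**2 / R)
    weights = weights * lik
    return particles, weights / weights.sum()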
16
Choice of proposal distribution The optimal proposal minimizes the variance of the importance weights The popular choice is the transition prior, q(x_t | x_{0:t-1}, y_{1:t}) = p(x_t | x_{t-1}), but it ignores the latest observation A good proposal should move particles towards the regions of high likelihood
17
Degeneracy of SIS algorithm The variance of the importance ratios increases stochastically over time – after a few steps, all but a few particles carry negligible weight
18
Selection (Resampling) Eliminate samples with low importance ratios and multiply samples with high importance ratios Associate with each particle a number of children N_i, with Σ_i N_i = N
19
SIR and Multinomial Sampling Maps the weighted Dirac random measure {x^(i), w^(i)} onto an equally weighted random measure {x^(j), 1/N} The numbers of children are drawn from a multinomial distribution with parameters N and (w^(1), …, w^(N))
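A minimal multinomial (SIR) resampling routine under the same conventions as the sketches above:

def multinomial_resample(particles, weights):
    # Each child index is drawn i.i.d. with probability given by the weights,
    # so the children counts N_i jointly follow a multinomial distribution.
    N = len(particles)
    idx = rng.choice(N, size=N, p=weights)
    return particles[idx], np.full(N, 1.0 / N)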
20
Residual resampling Set N_i = floor(N w^(i)) Perform an SIR procedure to select the remaining N − Σ_i N_i samples with new weights w'^(i) ∝ N w^(i) − N_i Add the results to the current N_i
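A sketch of residual resampling following the recipe above:

def residual_resample(particles, weights):
    N = len(particles)
    counts = np.floor(N * weights).astype(int)     # deterministic part: floor(N w)
    n_rest = N - counts.sum()                      # children still to assign
    if n_rest > 0:
        resid = N * weights - counts               # new (unnormalized) weights
        counts += rng.multinomial(n_rest, resid / resid.sum())   # SIR on residuals
    idx = np.repeat(np.arange(N), counts)
    return particles[idx], np.full(N, 1.0 / N)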
21
Minimum variance sampling – schemes such as systematic sampling use a single uniform draw and N evenly spaced pointers, reducing the variance of the children counts When to resample – a common criterion is the effective sample size N_eff = 1 / Σ_i (w^(i))², resampling only when it falls below a threshold
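A sketch of systematic resampling, one common low-variance scheme, together with an effective-sample-size test often used to decide when to resample; the 0.5 threshold is an illustrative choice:

def systematic_resample(particles, weights):
    # One uniform draw plus N evenly spaced pointers through the CDF.
    N = len(particles)
    positions = (rng.random() + np.arange(N)) / N
    cs = np.cumsum(weights)
    cs[-1] = 1.0                                   # guard against rounding error
    idx = np.searchsorted(cs, positions)
    return particles[idx], np.full(N, 1.0 / N)

def should_resample(weights, threshold=0.5):
    n_eff = 1.0 / np.sum(weights**2)               # effective sample size
    return n_eff < threshold * len(weights)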
22
Generic Particle Filter 1. Initialization, t = 0: for i = 1, …, N, sample x_0^(i) from the prior p(x_0) 2. For t = 1, 2, … (a) Importance sampling step: for i = 1, …, N, sample x_t^(i) ~ q(x_t | x_{0:t-1}^(i), y_{1:t}); evaluate the importance weights; normalize the importance weights (b) Selection (resampling) step (c) Output: the particle set and any estimates computed from it, e.g. the posterior mean
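Putting the pieces above together, a minimal generic particle filter for the simulated data y: transition-prior proposal, likelihood weighting, resampling when needed, and a posterior-mean output at each step. The initialization prior N(0, 1) is an illustrative assumption:

def particle_filter(y, N=500, Q=10.0, R=1.0):
    particles = rng.normal(0.0, 1.0, N)            # 1. initialization, t = 0
    weights = np.full(N, 1.0 / N)
    means = []
    for t in range(1, len(y)):
        particles, weights = sis_step(particles, weights, y[t], Q, R)   # (a)
        if should_resample(weights):                                    # (b)
            particles, weights = systematic_resample(particles, weights)
        means.append(np.sum(weights * particles))                       # (c)
    return np.array(means)

x_hat = particle_filter(y)                         # y from the earlier simulation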
23
Improving Particle Filters Monte Carlo (MC) assumption – the Dirac point-mass approximation provides an adequate representation of the posterior Importance sampling (IS) assumption – it is possible to obtain samples from the posterior by sampling from a suitable proposal and applying importance sampling corrections
24
MCMC Move Step Introduce MCMC steps whose invariant distribution is the posterior If the particles are already distributed according to the posterior, applying a Markov chain transition kernel with that invariant distribution leaves the distribution unchanged while allowing the particles to move to better regions of the state space
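A sketch of one random-walk Metropolis move per particle, with the filtering target p(x_t | x_{t-1}, y_t) ∝ p(y_t | x_t) p(x_t | x_{t-1}) as invariant distribution; x_prev is assumed to hold each particle's parent at t−1, and the step size is an illustrative tuning choice:

def mcmc_move(particles, x_prev, y_t, Q=10.0, R=1.0, step=1.0):
    mean_prev = 0.5 * x_prev + 25 * x_prev / (1 + x_prev**2)   # f with v = 0
    def log_target(x):         # log p(y_t | x) + log p(x | x_prev), up to a constant
        return (-0.5 * (y_t - x**2 / 20)**2 / R
                - 0.5 * (x - mean_prev)**2 / Q)
    prop = particles + step * rng.normal(size=particles.shape)
    log_alpha = log_target(prop) - log_target(particles)       # symmetric proposal
    accept = np.log(rng.random(particles.shape)) < log_alpha
    return np.where(accept, prop, particles)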
25
Designing Better Importance Proposals Move samples to the regions of high likelihood Prior editing – ad-hoc acceptance test applied when proposing particles Local linearization – Taylor series expansion of the likelihood and transition prior to form the proposal – improved simulated annealed sampling algorithm
26
Rejection Methods If the likelihood is bounded, p(y_t | x_t) ≤ M_t, one can sample exactly from the optimal importance distribution p(x_t | x_{t-1}, y_t) by rejection: propose from the transition prior and accept with probability p(y_t | x_t) / M_t
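A sketch of the rejection idea for the toy model: the Gaussian likelihood is bounded by M = 1/sqrt(2*pi*R), so accepting prior draws with probability p(y_t | x)/M yields exact samples from p(x_t | x_{t-1}, y_t); note the number of trials per sample is random and can be large:

def rejection_sample(x_prev, y_t, Q=10.0, R=1.0):
    mean = 0.5 * x_prev + 25 * x_prev / (1 + x_prev**2)
    while True:
        x = rng.normal(mean, np.sqrt(Q))       # propose from the transition prior
        if rng.random() < np.exp(-0.5 * (y_t - x**2 / 20)**2 / R):
            return x                           # accept with prob p(y_t | x) / M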
27
Auxiliary Particle Filters Obtain approximate samples from the optimal importance distribution by introducing an auxiliary variable k, the index of a particle at time t−1 Draw samples from the joint distribution q(x_t, k | y_{1:t}), then discard k
28
Unscented Particle Filter Use the UKF to generate the proposal distribution within a particle filter framework – each particle is propagated with its own UKF, and the resulting Gaussian, which incorporates the latest observation, is used as the proposal
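A sketch of the UKF-as-proposal idea for the scalar toy model: each particle runs one UKF predict/update to obtain a Gaussian N(m, P) conditioned on the current observation; the particle then samples x_t from N(m, P) and weights by p(y_t | x_t) p(x_t | x_{t-1}) / N(x_t; m, P). Function name and kappa are illustrative assumptions:

def ukf_proposal(x_prev, P_prev, y_t, Q=10.0, R=1.0, kappa=2.0):
    # Scalar UKF (state dimension n = 1): 3 sigma points, original weights.
    c = np.sqrt((1 + kappa) * P_prev)
    sig = np.array([x_prev, x_prev + c, x_prev - c])
    wts = np.array([kappa, 0.5, 0.5]) / (1 + kappa)
    # Predict through the transition model (additive noise handled via +Q).
    sx = 0.5 * sig + 25 * sig / (1 + sig**2)
    m_pred = wts @ sx
    P_pred = wts @ (sx - m_pred)**2 + Q
    # Update through the measurement model.
    c2 = np.sqrt((1 + kappa) * P_pred)
    sig2 = np.array([m_pred, m_pred + c2, m_pred - c2])
    sy = sig2**2 / 20
    y_pred = wts @ sy
    Pyy = wts @ (sy - y_pred)**2 + R
    Pxy = wts @ ((sig2 - m_pred) * (sy - y_pred))
    K = Pxy / Pyy
    return m_pred + K * (y_t - y_pred), P_pred - K * Pyy * K   # proposal N(m, P)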
29
Theoretical Convergence Theorem 1 If the importance weight is upper bounded for any (x_{t-1}, y_t), and if one of the selection schemes above is used, then for all t ≥ 0 there exists c_t independent of N such that for any bounded function f, E[((1/N) Σ_i f(x_{0:t}^(i)) − E[f(x_{0:t}) | y_{1:t}])²] ≤ c_t ||f||² / N