1
Particle Filtering
2
Sensors and Uncertainty
- Real-world sensors are noisy and suffer from missing data (e.g., occlusions, GPS blackouts)
- Use sensor models to estimate the ground truth, infer unobserved variables, and make forecasts
3
Hidden Markov Model
- Use observations to get a better idea of where the robot is at time t
- Hidden state variables: x_0, x_1, x_2, x_3, ...
- Observed variables: z_1, z_2, z_3, ...
- Predict, observe, predict, observe, ...
4
Last Class
- Kalman filtering and its extensions
- Exact Bayesian inference for Gaussian state distributions, process noise, and observation noise
- What about more general distributions?
- Key representational issue: how to represent and perform calculations on probability distributions?
5
General problem
- x_t ~ Bel(x_t) (arbitrary p.d.f.)
- x_{t+1} = f(x_t, u, ε_p), where ε_p is process noise (arbitrary model)
- z_{t+1} = g(x_{t+1}, ε_o), where ε_o is observation noise (arbitrary model)
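As a concrete instance of this general model, here is a minimal sketch in Python; the specific dynamics f, observation function g, and noise models below are assumptions made for illustration, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x, u, eps_p):
    """Illustrative nonlinear dynamics x_{t+1} = f(x_t, u, eps_p)."""
    return x + u * np.cos(x) + eps_p

def g(x, eps_o):
    """Illustrative observation model z_{t+1} = g(x_{t+1}, eps_o)."""
    return x**2 + eps_o

# The noise models need not be Gaussian; here the process noise is uniform
# and the observation noise is Laplacian, just to make that point.
x, u = 0.5, 0.2
x_next = f(x, u, rng.uniform(-0.1, 0.1))
z_next = g(x_next, rng.laplace(0.0, 0.05))
```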
6
Particle Filtering (aka Sequential Monte Carlo)
- Represent distributions as a set of particles
- Applicable to non-Gaussian, high-dimensional distributions
- Convenient implementations
- Widely used in vision and robotics
7
Simultaneous Localization and Mapping (SLAM)
- Mobile robots
- Odometry: locally accurate, but drifts significantly over time
- Vision/ladar/sonar: locally inaccurate, but measured in a global reference frame
- Combine the two
- State: (robot pose, map); observations: (sensor input)
8
Particle Representation
- Bel(x_t) = {(w_k, x_k)}
- w_k are weights, x_k are state hypotheses
- Weights sum to 1
- Approximates the underlying distribution
9
Recovering the Distribution
- Kernel density estimation: P(x) = Σ_k w_k K(x, x_k)
- K(x, x_k) is the kernel function
- Better approximation as the number of particles and the kernel sharpness increase
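A minimal sketch of recovering a density value from the weighted particle set, assuming a 1D Gaussian kernel with a hand-picked bandwidth h (the particle values are made up for illustration):

```python
import numpy as np

def kde_estimate(query, particles, weights, h=0.1):
    """Weighted kernel density estimate P(query) ≈ Σ_k w_k K(query, x_k),
    using a 1D Gaussian kernel with bandwidth h."""
    diffs = (query - particles) / h
    kernels = np.exp(-0.5 * diffs**2) / (h * np.sqrt(2.0 * np.pi))
    return np.sum(weights * kernels)

particles = np.array([0.0, 0.5, 1.0, 1.5])   # state hypotheses x_k
weights = np.array([0.1, 0.4, 0.4, 0.1])     # weights w_k, sum to 1
print(kde_estimate(0.8, particles, weights))
```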
10
Monte Carlo Integration
- If P(x) ≈ Bel(x), then E_P[f(x)] = ∫ f(x) P(x) dx ≈ Σ_k w_k f(x_k)
- What might you want to compute?
- Mean: set f(x) = x
- Variance: set f(x) = x² (this gives the second moment; subtract the squared mean)
- P(y): set f(x) = P(y|x), because P(y) = ∫ P(y|x) P(x) dx
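A short sketch of these weighted-sum estimates over an illustrative particle set:

```python
import numpy as np

def weighted_expectation(f, particles, weights):
    """Monte Carlo estimate E_P[f(x)] ≈ Σ_k w_k f(x_k)."""
    return np.sum(weights * f(particles))

particles = np.array([0.0, 0.5, 1.0, 1.5])    # state hypotheses x_k
weights = np.array([0.1, 0.4, 0.4, 0.1])      # weights w_k, sum to 1

mean = weighted_expectation(lambda x: x, particles, weights)
second_moment = weighted_expectation(lambda x: x**2, particles, weights)
variance = second_moment - mean**2            # subtract the squared mean
print(mean, variance)
```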
11
Filtering Steps
- Predict: compute Bel'(x_{t+1}), the distribution of x_{t+1} using the dynamics model alone
- Update: compute x_{t+1} | z_{t+1} with Bayes' rule
- This gives Bel(x_{t+1}) for the next step
12
Predict Step
- Given input particles Bel(x_t)
- The distribution of f(x_t, u_t, ε_p) is determined by propagating individual particles
- Gives Bel'(x_{t+1})
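A minimal predict-step sketch: each particle is pushed through the dynamics with its own sampled process noise (the 1D dynamics and the noise scale are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x, u, eps_p):
    """Illustrative 1D dynamics x_{t+1} = f(x_t, u, eps_p)."""
    return x + u + eps_p

def predict(particles, u, proc_sigma=0.1):
    """Predict step: propagate every particle with independently sampled
    process noise, giving a particle approximation of Bel'(x_{t+1})."""
    eps_p = rng.normal(0.0, proc_sigma, size=particles.shape)
    return f(particles, u, eps_p)

particles = rng.normal(0.0, 1.0, size=100)    # samples from Bel(x_t)
predicted = predict(particles, u=0.5)         # samples from Bel'(x_{t+1})
```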
13
Particle Propagation
14
Update Step
- Goal: compute P(x_{t+1} | z_{t+1}) given Bel'(x_{t+1}) and z_{t+1}
- P(x_{t+1} | z_{t+1}) ∝ P(z_{t+1} | x_{t+1}) P(x_{t+1})
- P(x_{t+1}) = Bel'(x_{t+1}) (given)
- For a state hypothesis x_k ∈ Bel'(x_{t+1}), what is P(z_{t+1} | x_{t+1} = x_k)?
- Solution: weight particles by the likelihood of observing z_{t+1}, given that the particle is the actual state, then resample
15
Update Step
- w_k ∝ w_k' · P(z_{t+1} | x_{t+1} = x_k)
- 1D example: g(x, ε_o) = h(x) + ε_o, with ε_o ~ N(0, σ²)
- P(z_{t+1} | x_{t+1} = x_k) = C exp(-(h(x_k) - z_{t+1})² / (2σ²))
- In general, the distribution can be calibrated using experimental data
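A sketch of this reweighting for the Gaussian sensor model above; the observation function h and the numbers are illustrative assumptions:

```python
import numpy as np

def update_weights(particles, weights, z, h, sigma=0.2):
    """Update step: multiply each weight by the observation likelihood
    P(z | x = x_k) under z = h(x) + eps_o with eps_o ~ N(0, sigma^2),
    then renormalize so the weights again sum to 1."""
    likelihood = np.exp(-0.5 * ((h(particles) - z) / sigma) ** 2)
    new_weights = weights * likelihood
    return new_weights / np.sum(new_weights)

h = lambda x: x                               # sensor observes the state directly
particles = np.array([0.0, 0.5, 1.0, 1.5])    # hypotheses from Bel'(x_{t+1})
weights = np.full(4, 0.25)
weights = update_weights(particles, weights, z=0.9, h=h)
```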
16
Resampling
- Likelihood-weighted particles may no longer represent the distribution efficiently
- Importance resampling: sample new particles with probability proportional to their weights
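A minimal importance-resampling sketch (multinomial resampling: draw with replacement in proportion to the weights; the particle set is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def resample(particles, weights):
    """Draw N new particles with probability proportional to their weights,
    then reset the weights to uniform."""
    n = len(particles)
    idx = rng.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)

particles = np.array([0.0, 0.5, 1.0, 1.5])
weights = np.array([0.05, 0.05, 0.8, 0.1])    # heavily skewed after an update
particles, weights = resample(particles, weights)
```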
17
Sampling Importance Resampling (SIR) variant
- Predict
- Update
- Resample
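Putting the three steps together, one SIR iteration might look like the sketch below; the 1D dynamics, observation model, noise scales, and input/observation sequence are all assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x, u, eps_p):          # illustrative dynamics x_{t+1} = f(x_t, u, eps_p)
    return x + u + eps_p

def h(x):                    # illustrative observation function, z = h(x) + eps_o
    return x

def sir_step(particles, u, z, proc_sigma=0.1, obs_sigma=0.2):
    """One SIR iteration: predict, update, resample."""
    # Predict: propagate each particle with sampled process noise
    particles = f(particles, u, rng.normal(0.0, proc_sigma, particles.shape))
    # Update: weight by the observation likelihood and renormalize
    weights = np.exp(-0.5 * ((h(particles) - z) / obs_sigma) ** 2)
    weights /= np.sum(weights)
    # Resample: draw new particles in proportion to the weights
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

particles = rng.normal(0.0, 1.0, size=500)    # initial belief Bel(x_0)
for u, z in [(0.5, 0.4), (0.5, 1.1), (0.5, 1.6)]:
    particles = sir_step(particles, u, z)
print(particles.mean())                       # point estimate after filtering
```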
19
Particle Filtering Issues
- Variance: the standard deviation of the particle representation scales as ~1/sqrt(N)
- Loss of particle diversity: resampling will likely drop particles with low likelihood, yet they may turn out to be useful hypotheses in the future
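A quick empirical check of the ~1/sqrt(N) behavior, using plain unweighted samples from a standard normal (purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Estimate E[x] for x ~ N(0, 1) with N samples, repeat many times, and look
# at the spread of the estimates: it should roughly halve when N quadruples.
for n in [100, 400, 1600]:
    estimates = [rng.normal(0.0, 1.0, size=n).mean() for _ in range(2000)]
    print(n, np.std(estimates))
```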
20
Other Resampling Variants
- Selective resampling: only resample when the number of "effective particles" falls below a threshold
- Stratified resampling: reduce variance using quasi-random sampling
- Optimization: explicitly choose particles to minimize deviation from the posterior
- ...
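The slide does not give a formula for the effective particle count; one commonly used estimate is N_eff = 1 / Σ_k w_k². A small selective-resampling check under that assumption (the half-N threshold is an illustrative choice):

```python
import numpy as np

def effective_particles(weights):
    """N_eff = 1 / sum_k w_k^2: equals N for uniform weights and approaches 1
    when a single particle carries nearly all the weight."""
    return 1.0 / np.sum(weights ** 2)

weights = np.array([0.7, 0.1, 0.1, 0.1])
if effective_particles(weights) < 0.5 * len(weights):
    print("resample")   # selective resampling: only resample when N_eff is low
```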
21
Storing more information with particles
- Unscented particle filter: each particle represents a local Gaussian and maintains a local covariance matrix; a combination of a particle filter and a Kalman filter
- Rao-Blackwellized particle filter: for a state (x_1, x_2), each particle contains a hypothesis of x_1 and an analytical distribution over x_2; reduces variance
22
Recap
- Bayesian mechanisms for state estimation are well understood
- Representation challenge
- Methods:
  - Kalman filters: highly efficient closed-form solution for Gaussian distributions
  - Particle filters: approximate filtering for high-dimensional, non-Gaussian distributions
- Implementation challenges for different domains (localization, mapping, SLAM, tracking)
23
Midterm Project Report Schedule (tentative)
Tuesday:
- Changsi, Yang, Roland: Indoor person following
- Jiaan and Yubin: Indoor mapping
- You-wei: Autonomous driving
- Santhosh and Yohanand: Robot chess
Thursday:
- Adrija: Dynamic collision checking with point clouds
- Damien: Netlogo
- Jingru and Yajia: Ball collector with robot arm
- Ye: UAV simulation