1
Bayesian Filtering for Location Estimation D. Fox, J. Hightower, L. Liao, D. Schulz, and G. Borriello Presented by: Honggang Zhang
2
Outline Basic idea of Bayes filters Several types of Bayes filters Some applications
3
Bayes Filters
– System state dynamics
– Observation dynamics
We are interested in the belief, or posterior density: estimating the system state from noisy observations.
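In standard Bayes-filter notation, the quantities named on this slide can be written as:

```latex
x_t \sim p(x_t \mid x_{t-1})                 % system state dynamics
z_t \sim p(z_t \mid x_t)                     % observation dynamics
Bel(x_t) = p(x_t \mid z_1, \ldots, z_t)      % belief / posterior density
```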
4
From the above, we construct the two steps of the Bayes filter: Predict and Update. Recall the "law of total probability" and "Bayes' rule".
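Written out in the same notation, the two steps are an application of the law of total probability (predict) and of Bayes' rule (update):

```latex
% Predict (law of total probability over the previous state):
\overline{Bel}(x_t) = \int p(x_t \mid x_{t-1}) \, Bel(x_{t-1}) \, dx_{t-1}

% Update (Bayes' rule, with normalizing constant \eta):
Bel(x_t) = \eta \, p(z_t \mid x_t) \, \overline{Bel}(x_t)
```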
5
Predict and Update steps. Assumption: the system is a Markov process.
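The Markov assumptions behind the recursion, stated in the standard form:

```latex
p(x_t \mid x_{0:t-1}, z_{1:t-1}) = p(x_t \mid x_{t-1})   % state depends only on the previous state
p(z_t \mid x_{0:t},   z_{1:t-1}) = p(z_t \mid x_t)       % observation depends only on the current state
```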
6
Bayes Filter: how do we use it, and what else do we need to know? The motion model, the perceptual model, and an initial belief to start from.
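As an illustration of how these pieces fit together, here is a minimal sketch of one predict/update cycle, assuming a discrete state space with the belief held in a dict; `motion_model`, `perceptual_model`, and the state representation are placeholders, not part of the slides.

```python
def bayes_filter_step(belief, z, states, motion_model, perceptual_model):
    """One predict/update cycle of a discrete Bayes filter.

    belief           : dict mapping state -> probability, Bel(x_{t-1})
    z                : current observation z_t
    states           : iterable of all possible states
    motion_model     : motion_model(x_next, x_prev) = p(x_t | x_{t-1})
    perceptual_model : perceptual_model(z, x) = p(z_t | x_t)
    """
    # Predict: law of total probability over the previous state.
    predicted = {
        x: sum(motion_model(x, xp) * belief[xp] for xp in states)
        for x in states
    }
    # Update: weight by the observation likelihood, then normalize.
    unnormalized = {x: perceptual_model(z, x) * predicted[x] for x in states}
    eta = sum(unnormalized.values())
    return {x: p / eta for x, p in unnormalized.items()}
```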
7
Example 1 Step 0: initialization Step 1: updating
8
Example 1 (continued)
Step 2: predicting
Step 3: updating
Step 4: predicting
9
Several types of Bayes filters
They differ in how they represent the probability densities:
– Kalman filter
– Multi-hypothesis filter
– Grid-based approach
– Topological approach
– Particle filter
10
Kalman Filter
Recall the general problem. Under the Kalman filter's assumptions (linear dynamics and Gaussian noise), the belief is a unimodal Gaussian.
Advantage: computational efficiency
Disadvantage: the assumptions are too restrictive
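A minimal one-dimensional sketch of the Kalman predict/update cycle under these assumptions; the noise parameters and measurements in the usage example are arbitrary placeholders.

```python
def kalman_step(mean, var, u, z, motion_var, sensor_var):
    """One 1-D Kalman filter cycle: the belief stays a single Gaussian.

    mean, var  : current belief N(mean, var)
    u          : expected motion since the last step
    z          : new (noisy) position measurement
    motion_var : variance added by the motion model
    sensor_var : variance of the measurement noise
    """
    # Predict: shift the mean by the motion, grow the uncertainty.
    mean_pred = mean + u
    var_pred = var + motion_var

    # Update: blend prediction and measurement by their precisions.
    k = var_pred / (var_pred + sensor_var)      # Kalman gain
    mean_new = mean_pred + k * (z - mean_pred)
    var_new = (1.0 - k) * var_pred
    return mean_new, var_new


# Example: start uncertain at 0, move +1 per step, observe noisy positions.
belief = (0.0, 10.0)
for z in [1.2, 1.9, 3.1]:
    belief = kalman_step(*belief, u=1.0, z=z, motion_var=0.5, sensor_var=1.0)
```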
11
Multi-hypothesis Tracking
The belief is a mixture of Gaussians; each Gaussian hypothesis is tracked by its own Kalman filter, and the weights are decided by how well the hypotheses predict the sensor measurements.
Advantage:
– can represent multimodal beliefs
Disadvantage:
– computationally expensive
– difficult to decide on the hypotheses
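A simplified one-dimensional sketch of the idea: each hypothesis is a Gaussian tracked by its own Kalman filter, and the mixture weights are re-scaled by how well each hypothesis predicts the measurement. This is an illustration only, not the paper's implementation.

```python
import math

def gaussian_pdf(x, mean, var):
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

def mht_step(hypotheses, u, z, motion_var, sensor_var):
    """hypotheses: list of (weight, mean, var) Gaussian components."""
    updated = []
    for w, mean, var in hypotheses:
        # Each hypothesis runs its own Kalman predict/update.
        mean_pred, var_pred = mean + u, var + motion_var
        k = var_pred / (var_pred + sensor_var)
        mean_new = mean_pred + k * (z - mean_pred)
        var_new = (1.0 - k) * var_pred
        # Re-weight by the measurement likelihood under the predicted hypothesis.
        likelihood = gaussian_pdf(z, mean_pred, var_pred + sensor_var)
        updated.append((w * likelihood, mean_new, var_new))
    total = sum(w for w, _, _ in updated)
    return [(w / total, m, v) for w, m, v in updated]
```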
12
Grid-based Approaches
Use a discrete, piecewise-constant representation of the belief: tessellate the environment into small patches, with each patch holding the belief that the object is in it.
Advantage:
– able to represent arbitrary distributions over the discrete state space
Disadvantage:
– computational and space complexity required to keep the position grid in memory and update it
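A tiny concrete sketch of the grid representation, with a hypothetical 10-cell corridor and made-up motion and sensor probabilities; the belief is simply one probability per cell, updated with the discrete predict/update cycle from earlier.

```python
# Hypothetical 1-D corridor split into 10 cells; belief starts uniform.
n_cells = 10
belief = [1.0 / n_cells] * n_cells

def predict(belief, p_move=0.8):
    """Object moves one cell right with prob. p_move, else stays (wrapping around)."""
    new = [0.0] * len(belief)
    for i, p in enumerate(belief):
        new[i] += (1.0 - p_move) * p
        new[(i + 1) % len(belief)] += p_move * p
    return new

def update(belief, likelihoods):
    """Multiply by per-cell observation likelihoods p(z | cell), then normalize."""
    new = [p * l for p, l in zip(belief, likelihoods)]
    eta = sum(new)
    return [p / eta for p in new]

# A sensor reading that is most consistent with cell 3 (made-up likelihoods).
likelihoods = [0.6 if i == 3 else 0.1 for i in range(n_cells)]
belief = update(predict(belief), likelihoods)
```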
13
Topological Approaches
A graph represents the state space
– nodes represent the object's possible locations (e.g. rooms)
– edges represent connectivity (e.g. hallways)
Advantage:
– efficiency, because the state space is small
Disadvantage:
– coarseness of the representation
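A sketch of what the state space looks like in the topological case, using a hypothetical building graph; the same discrete Bayes-filter update applies, just over a handful of nodes instead of a fine grid.

```python
# Hypothetical building graph: nodes are locations, edges are connections.
adjacency = {
    "office_a": ["hallway"],
    "office_b": ["hallway"],
    "hallway":  ["office_a", "office_b", "lounge"],
    "lounge":   ["hallway"],
}

def motion_model(x_next, x_prev):
    """Stay put or move to a connected location with equal probability."""
    options = [x_prev] + adjacency[x_prev]
    return 1.0 / len(options) if x_next in options else 0.0
```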
14
Particle Filters
Also known as sequential Monte Carlo methods.
The belief is represented by a set of samples, or particles, each carrying a nonnegative weight called an importance factor.
The updating procedure is sequential importance sampling with re-sampling.
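A minimal sketch of one sequential-importance-sampling-with-resampling cycle; `motion_sample` and `perceptual_model` are assumed to be supplied, as in the earlier sketches.

```python
import random

def particle_filter_step(particles, z, motion_sample, perceptual_model):
    """particles: list of states x^(i), treated as an equally weighted sample set.

    motion_sample(x)      : draws x_t ~ p(x_t | x_{t-1} = x)
    perceptual_model(z, x): returns p(z | x), used as the importance factor
    """
    # Predict: move every particle through the motion model.
    predicted = [motion_sample(x) for x in particles]

    # Update: importance factors proportional to the observation likelihood.
    weights = [perceptual_model(z, x) for x in predicted]

    # Resample: draw a new equally weighted set, favoring heavy particles.
    return random.choices(predicted, weights=weights, k=len(particles))
```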
15
Example 2: Particle Filter Step 0: initialization Each particle has the same weight Step 1: updating weights. Weights are proportional to p(z|x)
16
Example 2: Particle Filter
Step 2: predicting. Predict the new locations of particles.
Step 3: updating weights. Weights are proportional to p(z|x).
Step 4: predicting. Predict the new locations of particles.
Particles are more concentrated in the region where the person is more likely to be.
17
Compare Particle Filter with Bayes Filter with Known Distribution
[Figure: predicting and updating steps shown side by side for Example 1 and Example 2]
18
Comments on Particle Filters
Advantage:
– able to represent arbitrary densities
– converges to the true posterior even for non-Gaussian, nonlinear systems
– efficient, in the sense that particles tend to focus on regions of high probability
Disadvantage:
– worst-case complexity grows exponentially in the number of dimensions
19
Comparison (+ : good; 0 : neutral; - : weak)

                  Kalman    Multihypothesis  Grid      Topology  Particle
Belief            Unimodal  Multimodal       Discrete
Accuracy          +         +                0         -         +
Robustness        0         +                +         +         +
Sensor variety    -         -                +         0         +
Efficiency        +         0                -         0         0
Implementation    0         -                0         0         +
20
Example Applications
– Particle filters (unconstrained)
– Particle filters (constrained)
– Combination of particle filters and Kalman filters
21
Sensors
– Ultrasound and infrared ID sensors: less accurate, but give certain identification
– Laser range finder: accurate, but anonymous
22
Example Indoor Environment
Red circles: ultrasound ID sensors
Blue squares: infrared ID sensors
23
Using Particle Filters (unconstrained)
Due to the high noise level of the ultrasound and infrared sensors, we use particle filters.
Whenever a sensor detects the person, the particles are updated.
24
Using Particle Filters (unconstrained) Another Example
25
Using Particle Filters (unconstrained) Another Example
26
Using Particle Filters (constrained)
A more efficient way to use particle filters is to constrain the state space to locations on a Voronoi graph (a structure similar to a skeleton of the environment's free space).
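A simplified sketch of what the constraint means in practice, with a hypothetical graph: each particle lives on an edge of the graph, and the motion model advances it along that edge, branching at nodes. The nodes, edge lengths, and single-transition-per-step behavior are assumptions made to keep the sketch short.

```python
import random

# Hypothetical graph approximating the free space: nodes and edge lengths (meters).
edges = {("a", "b"): 4.0, ("b", "c"): 3.0, ("b", "d"): 5.0}
neighbors = {"a": ["b"], "b": ["a", "c", "d"], "c": ["b"], "d": ["b"]}

def move_on_graph(particle, step):
    """particle = (start_node, end_node, offset along the edge in meters)."""
    start, end, offset = particle
    offset += step
    length = edges.get((start, end)) or edges[(end, start)]
    if offset < length:
        return (start, end, offset)
    # Reached a node: continue onto a randomly chosen adjacent edge.
    new_end = random.choice(neighbors[end])
    return (end, new_end, offset - length)
```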
27
Combine Particle and Kalman Filters to Solve the Data Association Problem
[Figure: laser range finder tracks, with areas covered by ID sensors]
In areas 3 and 4, the identities of A and B are known.
In areas 5 and 6, the ambiguity can be resolved, but additional hypotheses are needed.
28
Combine Particle and Kalman Filters to Solve the Data Association Problem
Individual people are tracked with Kalman filters (using the laser range data).
A particle filter maintains multiple hypotheses with respect to the identities of the people.
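A much-simplified sketch of the data-association idea (not the paper's implementation): the Kalman filters supply accurate but anonymous track positions, and each particle is one hypothesis about which identity belongs to which track; an ID-sensor reading re-weights the hypotheses. The positions, sensor reading, and Gaussian-like score are made up for illustration.

```python
import math
import random

def reweight_identity_hypotheses(particles, track_positions, id_reading):
    """particles       : list of dicts mapping identity -> track index
    track_positions : current track estimates from the per-track Kalman filters
    id_reading      : (identity, position) from an ultrasound/infrared ID sensor
    """
    identity, pos = id_reading
    weights = []
    for assignment in particles:
        track_pos = track_positions[assignment[identity]]
        # Hypotheses whose track for this identity lies near the reading get
        # higher weight (made-up Gaussian-like score with sigma = 1 m).
        weights.append(math.exp(-0.5 * (track_pos - pos) ** 2))
    return random.choices(particles, weights=weights, k=len(particles))


# Two anonymous laser tracks at 2.0 m and 7.5 m; which one is A and which is B?
tracks = [2.0, 7.5]
particles = [{"A": 0, "B": 1}, {"A": 1, "B": 0}] * 50
particles = reweight_identity_hypotheses(particles, tracks, ("A", 2.3))
```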
29
Conclusion
“The Location Stack”: a general framework with a publicly available implementation
Probabilistic techniques have tremendous potential for inference problems
Questions?