TEMPORAL PROBABILISTIC MODELS PT 2

AGENDA
- Kalman filtering
- Dynamic Bayesian Networks
- Particle filtering

KALMAN FILTERING
In a nutshell:
- Efficient filtering in continuous state spaces
- Gaussian transition and observation models
- Ubiquitous for tracking with noisy sensors, e.g. radar, GPS, cameras

HIDDEN MARKOV MODEL FOR ROBOT LOCALIZATION
- Use observations + transition dynamics to get a better idea of where the robot is at time t
- Maintain a belief state $b_t$ over time: $b_t(x) = P(X_t = x \mid z_{1:t})$
[Figure: chain of hidden state variables $X_0 \rightarrow X_1 \rightarrow X_2 \rightarrow X_3$, each $X_t$ emitting an observed variable $z_t$]
- Predict – observe – predict – observe…

BAYESIAN FILTERING WITH BELIEF STATES
- Predict: compute $P(X_t \mid z_{1:t-1})$ using the dynamics alone
- Update: fold in the observation $z_t$
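The slide's equations are images and are not preserved in this transcript; the standard recursion that the two bullets describe is:
\[ P(X_t \mid z_{1:t-1}) = \sum_{x_{t-1}} P(X_t \mid x_{t-1})\, P(x_{t-1} \mid z_{1:t-1}) \quad \text{(predict)} \]
\[ P(X_t \mid z_{1:t}) \propto P(z_t \mid X_t)\, P(X_t \mid z_{1:t-1}) \quad \text{(update)} \]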

IN CONTINUOUS STATE SPACES…

GENERAL BAYESIAN FILTERING IN CONTINUOUS STATE SPACES
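The slide's own formulas are not preserved here; in the continuous case the predict step's sum becomes an integral, giving the standard form:
\[ p(x_t \mid z_{1:t-1}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid z_{1:t-1})\, dx_{t-1} \]
\[ p(x_t \mid z_{1:t}) \propto p(z_t \mid x_t)\, p(x_t \mid z_{1:t-1}) \]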

KEY REPRESENTATIONAL DECISIONS
- Pick a method for representing distributions
  - Discrete: tables
  - Continuous: fixed parameterized classes vs. particle-based techniques
- Devise methods to perform key calculations (marginalization, conditioning) on the representation
  - Exact or approximate?

GAUSSIAN DISTRIBUTION
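The slide's figure is not preserved here; for reference, the standard density is
\[ p(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x - \mu)^2}{2\sigma^2}\right), \qquad x \sim N(\mu, \sigma^2) \]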

LINEAR GAUSSIAN TRANSITION MODEL FOR MOVING 1D POINT
- Consider position and velocity $x_t, v_t$ and time step $h$
- Without noise: $x_{t+1} = x_t + h v_t$, $v_{t+1} = v_t$
- With Gaussian noise of std $\sigma_1$:
\[ P(x_{t+1} \mid x_t) \propto \exp\!\left(-\frac{(x_{t+1} - (x_t + h v_t))^2}{2\sigma_1^2}\right), \]
  i.e. $x_{t+1} \sim N(x_t + h v_t, \sigma_1^2)$

LINEAR GAUSSIAN TRANSITION MODEL
- If the prior on position is Gaussian, then the predicted position is also Gaussian: $N(\mu, \sigma^2) \rightarrow N(\mu + vh,\ \sigma^2 + \sigma_1^2)$

LINEAR GAUSSIAN OBSERVATION MODEL
- Position observation $z_t$ with Gaussian noise of std $\sigma_2$: $z_t \sim N(x_t, \sigma_2^2)$

LINEAR GAUSSIAN OBSERVATION MODEL
- If the prior on position is Gaussian, then the posterior is also Gaussian:
\[ \mu \leftarrow \frac{\sigma^2 z + \sigma_2^2 \mu}{\sigma^2 + \sigma_2^2}, \qquad \sigma^2 \leftarrow \frac{\sigma^2 \sigma_2^2}{\sigma^2 + \sigma_2^2} \]
[Figure: position prior, observation probability, and posterior probability curves]
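A worked check with assumed numbers: a prior $N(\mu = 0, \sigma^2 = 1)$ and an observation $z = 2$ with $\sigma_2^2 = 1$ give posterior mean $(1 \cdot 2 + 1 \cdot 0)/(1 + 1) = 1$ and posterior variance $(1 \cdot 1)/(1 + 1) = 0.5$: the estimate moves partway toward the measurement and becomes more certain.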

MULTIVARIATE GAUSSIANS
- $X \sim N(\mu, \Sigma)$
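For reference, the standard $d$-dimensional density (the slide's figure is not preserved here):
\[ p(x) = \frac{1}{(2\pi)^{d/2} |\Sigma|^{1/2}} \exp\!\left(-\tfrac{1}{2}(x - \mu)^T \Sigma^{-1} (x - \mu)\right) \]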

MULTIVARIATE LINEAR GAUSSIAN PROCESS
- A linear transformation plus multivariate Gaussian noise: $y = Ax + \varepsilon$, with $\varepsilon \sim N(\mu, \Sigma)$
- If the prior state distribution is Gaussian, then the posterior state distribution is Gaussian
- If we observe one component of a Gaussian, then its posterior is also Gaussian

MULTIVARIATE COMPUTATIONS
- Linear transformations of Gaussians: if $x \sim N(\mu, \Sigma)$ and $y = Ax + b$, then $y \sim N(A\mu + b,\ A \Sigma A^T)$
- Consequence: if $x \sim N(\mu_x, \Sigma_x)$, $y \sim N(\mu_y, \Sigma_y)$, and $z = x + y$, then $z \sim N(\mu_x + \mu_y,\ \Sigma_x + \Sigma_y)$
- Conditionals of Gaussians: if $[x_1, x_2] \sim N([\mu_1; \mu_2],\ [\Sigma_{11}, \Sigma_{12}; \Sigma_{21}, \Sigma_{22}])$, then on observing $x_2 = z$ we have $x_1 \sim N(\mu_1 + \Sigma_{12} \Sigma_{22}^{-1}(z - \mu_2),\ \Sigma_{11} - \Sigma_{12} \Sigma_{22}^{-1} \Sigma_{21})$
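A quick NumPy check of the conditioning rule; every number below is a made-up example, not from the slides:

```python
import numpy as np

# Joint Gaussian over [x1, x2] with assumed example parameters
mu1, mu2 = np.array([0.0]), np.array([1.0])
S11, S12 = np.array([[2.0]]), np.array([[0.8]])
S21, S22 = S12.T, np.array([[1.0]])

z = np.array([2.0])                           # observed value of x2
S22_inv = np.linalg.inv(S22)
# x1 | x2=z ~ N(mu1 + S12 S22^-1 (z - mu2), S11 - S12 S22^-1 S21)
cond_mean = mu1 + S12 @ S22_inv @ (z - mu2)   # -> [0.8]
cond_cov = S11 - S12 @ S22_inv @ S21          # -> [[1.36]]
print(cond_mean, cond_cov)
```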

KALMAN FILTER ASSUMPTIONS
- $x_t \sim N(\mu_x, \Sigma_x)$
- $x_{t+1} = F x_t + g + v$, with dynamics noise $v \sim N(0, \Sigma_v)$
- $z_{t+1} = H x_{t+1} + w$, with observation noise $w \sim N(0, \Sigma_w)$

TWO STEPS
- Maintain $\mu_t, \Sigma_t$, the parameters of the Gaussian distribution over state $x_t$
- Predict: compute the distribution of $x_{t+1}$ using the dynamics model alone: $x_{t+1} \sim N(F \mu_t + g,\ F \Sigma_t F^T + \Sigma_v)$; call this $N(\mu', \Sigma')$
- Update: upon observing $z_{t+1}$, compute $P(x_{t+1} \mid z_{t+1})$ with Bayes' rule; the parameters of the final distribution, $\mu_{t+1}$ and $\Sigma_{t+1}$, are derived using the conditional distribution formulas

DERIVING THE UPDATE RULE
(1) Assumption: $\begin{bmatrix} x_t \\ z_t \end{bmatrix} \sim N\!\left(\begin{bmatrix} \mu' \\ a \end{bmatrix}, \begin{bmatrix} \Sigma' & B \\ B^T & C \end{bmatrix}\right)$, with unknowns $a$, $B$, $C$
(2) Assumption: $x_t \sim N(\mu', \Sigma')$
(3) Assumption: $z_t \mid x_t \sim N(H x_t, \Sigma_w)$
(4) Conditioning (1): $z_t \mid x_t \sim N(a + B^T \Sigma'^{-1}(x_t - \mu'),\ C - B^T \Sigma'^{-1} B)$
(5) Setting the mean of (4) equal to (3): $H x_t = a + B^T \Sigma'^{-1}(x_t - \mu') \Rightarrow a = H \mu'$, $B^T = H \Sigma'$
(6) Setting the covariance of (4) equal to (3): $C - B^T \Sigma'^{-1} B = \Sigma_w \Rightarrow C = H \Sigma' H^T + \Sigma_w$
(7) Conditioning (1) on $z_t$: $x_t \mid z_t \sim N(\mu' + B C^{-1}(z_t - a),\ \Sigma' - B C^{-1} B^T)$
(8,9) Kalman filter: $\mu_t = \mu' + \Sigma' H^T C^{-1}(z_t - H \mu')$, $\Sigma_t = \Sigma' - \Sigma' H^T C^{-1} H \Sigma'$

PUTTING IT TOGETHER
- Transition matrix $F$, covariance $\Sigma_x$; observation matrix $H$, covariance $\Sigma_z$
\[ \mu_{t+1} = F \mu_t + K_{t+1}(z_{t+1} - H F \mu_t) \]
\[ \Sigma_{t+1} = (I - K_{t+1} H)(F \Sigma_t F^T + \Sigma_x) \]
  where $K_{t+1} = (F \Sigma_t F^T + \Sigma_x) H^T \left(H (F \Sigma_t F^T + \Sigma_x) H^T + \Sigma_z\right)^{-1}$
- Got that memorized?
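A minimal NumPy sketch of these equations. The filter logic follows the slide; the example matrices, noise levels, and initial belief at the bottom are assumed placeholders:

```python
import numpy as np

def kalman_step(mu, Sigma, z, F, H, Sigma_x, Sigma_z):
    """One predict + update cycle of the Kalman filter."""
    # Predict: push the belief through the linear dynamics
    mu_pred = F @ mu
    Sigma_pred = F @ Sigma @ F.T + Sigma_x
    # Update: the Kalman gain trades off prediction vs. observation
    S = H @ Sigma_pred @ H.T + Sigma_z        # innovation covariance
    K = Sigma_pred @ H.T @ np.linalg.inv(S)   # Kalman gain K_{t+1}
    mu_new = mu_pred + K @ (z - H @ mu_pred)
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma_pred
    return mu_new, Sigma_new

# Example: the 1D point with position and velocity from the earlier slide
h = 0.1                                  # time step (assumed)
F = np.array([[1.0, h], [0.0, 1.0]])     # x' = x + h*v, v' = v
H = np.array([[1.0, 0.0]])               # we observe position only
Sigma_x = 0.01 * np.eye(2)               # dynamics noise (assumed)
Sigma_z = np.array([[0.25]])             # observation noise (assumed)
mu, Sigma = np.zeros(2), np.eye(2)       # initial belief (assumed)
mu, Sigma = kalman_step(mu, Sigma, np.array([0.9]), F, H, Sigma_x, Sigma_z)
print(mu, Sigma)
```

Calling `kalman_step` once per time step implements the predict–observe loop from the HMM slide.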

PROPERTIES OF KALMAN FILTER
- Optimal Bayesian estimate for linear Gaussian transition/observation models
- Needs estimates of the covariances… model identification is necessary
- Extensions to nonlinear transition/observation models work as long as they aren't too nonlinear: the Extended Kalman Filter and the Unscented Kalman Filter

[Figure: tracking the velocity of a braking obstacle, and learning that the road is slick. Annotations: braking begins; velocity initially uninformed; more distance measurements arrive; obstacle slows; actual vs. estimated max deceleration; stopping distance shown as a 95% confidence interval from braking initiated to gradual stop]

NON-GAUSSIAN DISTRIBUTIONS
- Gaussian distributions are a “lump”
[Figure: a multimodal distribution alongside its single-lump Kalman filter estimate]

NON-GAUSSIAN DISTRIBUTIONS
- Integrating continuous and discrete states
[Figure: a belief splitting into two modes after a binary “up”/“down” choice]

EXAMPLE: FAILURE DETECTION
- Consider a battery meter sensor: Battery = true level of battery, BMeter = sensor reading
- Transient failures: the sensor sends garbage at time t
- Persistent failures: the sensor is broken and sends garbage forever

DYNAMIC BAYESIAN NETWORK
[Figure: $Battery_{t-1} \rightarrow Battery_t \rightarrow BMeter_t$; think of this structure “unrolled” forever…]
- $BMeter_t \sim N(Battery_t, \sigma^2)$
- Transient failure model: $P(BMeter_t = 0 \mid Battery_t = 5) = 0.03$
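A sketch of how this mixed observation model might be evaluated, e.g. to weight particles later. The 0.03 garbage probability is from the slide; the sensor noise std and the treatment of the zero reading as a discrete spike are assumptions:

```python
from scipy.stats import norm

P_GARBAGE = 0.03    # from the slide: P(BMeter_t = 0 | Battery_t = 5)
SENSOR_STD = 0.1    # assumed observation noise

def meter_likelihood(z, battery):
    """P(BMeter_t = z | Battery_t = battery): a garbage spike at zero
    mixed with Gaussian noise around the true battery level."""
    spike = P_GARBAGE if z == 0 else 0.0
    return spike + (1 - P_GARBAGE) * norm.pdf(z, loc=battery, scale=SENSOR_STD)
```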

RESULTS ON TRANSIENT FAILURE
[Figure: $E(Battery_t)$ over time as the meter reads …; a transient failure occurs; estimates shown with and without the transient failure model]

RESULTS ON PERSISTENT FAILURE
[Figure: $E(Battery_t)$ over time as the meter reads …; a persistent failure occurs; estimate shown with the transient model only]

PERSISTENT FAILURE MODEL
[Figure: the DBN above extended with $Broken_{t-1} \rightarrow Broken_t$ feeding into $BMeter_t$]
- $BMeter_t \sim N(Battery_t, \sigma^2)$
- $P(BMeter_t = 0 \mid Battery_t = 5) = 0.03$
- $P(BMeter_t = 0 \mid Broken_t) = 1$
- An example of a Dynamic Bayesian Network (DBN)

RESULTS ON PERSISTENT FAILURE
[Figure: $E(Battery_t)$ over time as the meter reads …; a persistent failure occurs; estimates shown with the transient model and with the persistent failure model]

HOW TO PERFORM INFERENCE ON A DBN?
- Exact inference on the “unrolled” BN: variable elimination, eliminating old time steps
  - But after a few time steps, all variables in the state space become dependent! The sparsity structure is lost
- Approximate inference: particle filtering

PARTICLE FILTERING (AKA SEQUENTIAL MONTE CARLO)
- Represent distributions as a set of particles
- Applicable to non-Gaussian, high-dimensional distributions
- Convenient implementations
- Widely used in vision, robotics

PARTICLE REPRESENTATION
- $Bel(x_t) = \{(w^k, x^k)\}$, where the $w^k$ are weights and the $x^k$ are state hypotheses
- Weights sum to 1
- Approximates the underlying distribution

PARTICLE FILTERING
- Represent the distribution at time t as a set of N “particles” $S_t^1, \dots, S_t^N$
- Repeat for t = 0, 1, 2, …
  - Sample $S[i]$ from $P(X_{t+1} \mid X_t = S_t^i)$ for all i
  - Compute the weight $w[i] = P(e \mid X_{t+1} = S[i])$ for all i
  - Weighted resampling step: sample $S_{t+1}^i$ from $S[\cdot]$ according to the weights $w[\cdot]$
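A minimal sketch of this loop in NumPy, using a placeholder 1D random-walk dynamics model and Gaussian sensor (both assumptions, since the slide leaves the models abstract):

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, z, dyn_std=0.1, obs_std=0.5):
    """One cycle: sample from dynamics, weight by the observation, resample."""
    N = len(particles)
    # Sample S[i] from P(X_{t+1} | X_t = S_t^i): placeholder random-walk dynamics
    proposals = particles + rng.normal(0.0, dyn_std, size=N)
    # Compute w[i] = P(z | X_{t+1} = S[i]): placeholder Gaussian sensor model
    weights = np.exp(-0.5 * ((z - proposals) / obs_std) ** 2)
    weights /= weights.sum()
    # Weighted resampling: draw N particles in proportion to their weights
    return proposals[rng.choice(N, size=N, p=weights)]

particles = rng.normal(0.0, 1.0, size=1000)   # initial belief (assumed)
for z in [0.2, 0.4, 0.5]:                     # made-up observations
    particles = particle_filter_step(particles, z)
print(particles.mean())                       # posterior mean estimate
```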

BATTERY EXAMPLE
[Figure sequence over the DBN ($Battery_{t-1} \rightarrow Battery_t \rightarrow BMeter_t$, $Broken_{t-1} \rightarrow Broken_t$), stepping through two particle filtering cycles:]
- Sampling step
- Suppose we now observe BMeter = 0. P(BMeter = 0 | sample) = ?
- Compute weights (drawn as particle size)
- Weighted resampling
- Sampling step
- Now observe $BMeter_t = 5$
- Compute weights (particles get weight 1 or 0)
- Weighted resampling

APPLICATIONS OF PARTICLE FILTERING IN ROBOTICS
- Simultaneous Localization and Mapping (SLAM)
- Observations: laser rangefinder
- State variables: position, walls

SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM)
- Mobile robots
- Odometry: locally accurate, but drifts significantly over time
- Vision/ladar/sonar: inaccurate locally, but tied to a global reference frame
- Combine the two
- State: (robot pose, map); observations: (sensor input)

GENERAL PROBLEM
- $x_t \sim Bel(x_t)$ (arbitrary p.d.f.)
- $x_{t+1} = f(x_t, u, \varepsilon_p)$, with process noise $\varepsilon_p \sim$ an arbitrary p.d.f.
- $z_{t+1} = g(x_{t+1}, \varepsilon_o)$, with observation noise $\varepsilon_o \sim$ an arbitrary p.d.f.

SAMPLING IMPORTANCE RESAMPLING (SIR) VARIANT
- Predict
- Update
- Resample
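The slides don't specify how the resampling step is implemented; one common choice is systematic (low-variance) resampling, sketched here:

```python
import numpy as np

def systematic_resample(particles, weights, rng=np.random.default_rng()):
    """Low-variance resampling: one uniform draw, N evenly spaced pointers."""
    N = len(particles)
    positions = (rng.random() + np.arange(N)) / N   # stratified pointers in [0,1)
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                            # guard against rounding error
    idx = np.searchsorted(cumulative, positions)
    return particles[idx]
```

Compared with drawing N independent samples, this uses a single random number and keeps the particle set's variance lower.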

ADVANCED FILTERING TOPICS
- Mixing exact and approximate representations (e.g., mixture models)
- Multiple hypothesis tracking (the assignment problem)
- Model calibration
- Scaling up (e.g., 3D SLAM, huge maps)

NEXT TIME
- Putting it together: intelligent agents
- Read R&N 2