Kalman Filters Gaussian MNs


Kalman Filters Gaussian MNs
Graphical Models – 10708
Carlos Guestrin
Carnegie Mellon University
December 1st, 2008
Readings: K&F: 6.1, 6.2, 6.3, 14.1, 14.2, 14.3, 14.4

Multivariate Gaussian
Mean vector: μ
Covariance matrix: Σ
Density: p(x) = 1 / ((2π)^{d/2} |Σ|^{1/2}) · exp( −(1/2) (x − μ)ᵀ Σ⁻¹ (x − μ) )
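A minimal sketch (mine, not from the slides) of evaluating this density with NumPy; all names are illustrative:

import numpy as np

def gaussian_density(x, mu, Sigma):
    # Multivariate Gaussian density N(x; mu, Sigma)
    d = len(mu)
    diff = x - mu
    # Solve a linear system instead of forming Sigma^{-1} explicitly
    maha = diff @ np.linalg.solve(Sigma, diff)
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
    return np.exp(-0.5 * maha) / norm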

Conditioning a Gaussian
Joint Gaussian: p(X,Y) ~ N(μ; Σ)
Conditional linear Gaussian: p(Y|X) ~ N(μ_{Y|X}; σ²_{Y|X}), where
μ_{Y|X} = μ_Y + Σ_{YX} Σ_{XX}⁻¹ (x − μ_X)
σ²_{Y|X} = Σ_{YY} − Σ_{YX} Σ_{XX}⁻¹ Σ_{XY}

Gaussian is a “Linear Model”
Conditional linear Gaussian: p(Y|X) ~ N(β₀ + βX; σ²)

Conditioning a Gaussian (vector case)
Joint Gaussian: p(X,Y) ~ N(μ; Σ)
Conditional linear Gaussian: p(Y|X) ~ N(μ_{Y|X}; Σ_{YY|X})
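A hedged sketch of the conditioning formulas above (my own helper; ix and iy are illustrative index arrays selecting the X and Y blocks):

import numpy as np

def condition_gaussian(mu, Sigma, ix, iy, x):
    # Condition a joint Gaussian N(mu, Sigma) over (X, Y) on X = x
    mu_x, mu_y = mu[ix], mu[iy]
    Sxx = Sigma[np.ix_(ix, ix)]
    Syx = Sigma[np.ix_(iy, ix)]
    Syy = Sigma[np.ix_(iy, iy)]
    gain = Syx @ np.linalg.inv(Sxx)           # Sigma_YX Sigma_XX^{-1}
    mu_y_given_x = mu_y + gain @ (x - mu_x)   # mean is linear in x
    Sigma_y_given_x = Syy - gain @ Syx.T      # covariance doesn't depend on x
    return mu_y_given_x, Sigma_y_given_x

Note that the conditional mean is linear in x while the conditional covariance is constant, echoing the “Gaussian is a linear model” slide above.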

Conditional Linear Gaussian (CLG) – general case
p(Y|X) ~ N(β₀ + BX; Σ_{YY|X})

Understanding a linear Gaussian – the 2d case
Variance increases over time (motion noise adds up).
The object doesn’t necessarily move in a straight line.

Tracking with a Gaussian 1
p(X₀) ~ N(μ₀; Σ₀)
p(X_{i+1}|X_i) ~ N(B X_i + β; Σ_{X_{i+1}|X_i})
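A moment-form sketch of one prediction step under this transition model (my own names; Q stands in for the transition noise covariance Σ_{X_{i+1}|X_i}):

def kf_predict(mu, Sigma, B, beta, Q):
    # p(X_{i+1}) from p(X_i) ~ N(mu, Sigma) under the CLG transition
    # p(X_{i+1}|X_i) ~ N(B X_i + beta; Q)
    mu_pred = B @ mu + beta
    Sigma_pred = B @ Sigma @ B.T + Q   # motion noise adds up over time
    return mu_pred, Sigma_pred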

Tracking with Gaussians 2 – Making observations
We have p(X_i). The detector observes O_i = o_i. We want to compute p(X_i | O_i = o_i).
Use Bayes rule: p(X_i | O_i = o_i) ∝ p(O_i = o_i | X_i) p(X_i)
This requires a CLG observation model: p(O_i|X_i) ~ N(W X_i + v; Σ_{O_i|X_i})
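A sketch of this conditioning step in moment form — the standard Kalman update, with my own names and R standing in for Σ_{O_i|X_i}:

import numpy as np

def kf_update(mu, Sigma, W, v, R, o):
    # Condition p(X_i) ~ N(mu, Sigma) on O_i = o under the CLG model
    # p(O_i|X_i) ~ N(W X_i + v; R)
    S = W @ Sigma @ W.T + R                # predicted observation covariance
    K = Sigma @ W.T @ np.linalg.inv(S)     # Kalman gain
    mu_post = mu + K @ (o - (W @ mu + v))  # correct by the innovation
    Sigma_post = Sigma - K @ W @ Sigma
    return mu_post, Sigma_post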

Operations in Kalman filter
(Graphical model: a chain X₁ → X₂ → X₃ → X₄ → X₅, with an observation O_i hanging off each X_i.)
Compute p(X_t | o_{1:t}). Start with p(X₀). At each time step t:
- Condition on the observation
- Prediction (multiply in the transition model)
- Roll-up (marginalize out the previous time step)
I’ll describe one implementation of the KF; there are others, e.g., the information filter.
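Putting the three operations together, a minimal filtering loop in moment (standard) form, reusing the kf_predict and kf_update sketches above; in moment form, prediction and roll-up fuse into a single step:

def kalman_filter(mu0, Sigma0, B, beta, Q, W, v, R, observations):
    # Returns the filtered estimates p(X_t | o_1, ..., o_t)
    mu, Sigma = mu0, Sigma0
    estimates = []
    for o in observations:
        mu, Sigma = kf_update(mu, Sigma, W, v, R, o)    # condition on o_t
        estimates.append((mu, Sigma))
        mu, Sigma = kf_predict(mu, Sigma, B, beta, Q)   # predict + roll-up
    return estimates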

Exponential family representation of Gaussian: Canonical form
A Gaussian N(μ; Σ) can be written in canonical form with parameters (K, h):
p(x) ∝ exp( −(1/2) xᵀ K x + hᵀ x )
where K = Σ⁻¹ is the precision (information) matrix and h = Σ⁻¹ μ.

Canonical form
Standard and canonical forms are related: K = Σ⁻¹ and h = Σ⁻¹μ; conversely, Σ = K⁻¹ and μ = K⁻¹h.
Conditioning is easy in canonical form; marginalization is easy in standard form.
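The conversions are one line each; a small sketch, with K the precision matrix and h the potential vector:

import numpy as np

def to_canonical(mu, Sigma):
    # Standard form N(mu, Sigma) -> canonical form (K, h)
    K = np.linalg.inv(Sigma)
    return K, K @ mu

def to_standard(K, h):
    # Canonical form (K, h) -> standard form (mu, Sigma)
    Sigma = np.linalg.inv(K)
    return Sigma @ h, Sigma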

Conditioning in canonical form
First multiply: canonical forms multiply by simply adding their K matrices and h vectors.
Then, condition on the value B = y: for a canonical form over (A, B),
K′ = K_{AA},  h′ = h_A − K_{AB} y
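A sketch of the instantiation step (my own indexing convention: ia and ib select the A and B blocks). Note that no matrix inversion is needed, which is why conditioning is cheap in canonical form:

import numpy as np

def canonical_condition(K, h, ia, ib, y):
    # Condition a canonical form over (A, B) on B = y:
    #   K' = K_AA,  h' = h_A - K_AB y
    K_AA = K[np.ix_(ia, ia)]
    K_AB = K[np.ix_(ia, ib)]
    return K_AA, h[ia] - K_AB @ y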

Operations in Kalman filter (revisited)
Compute p(X_t | o_{1:t}). Start with p(X₀). At each time step t:
- Condition on the observation
- Prediction (multiply in the transition model)
- Roll-up (marginalize out the previous time step)

Prediction & roll-up in canonical form
First multiply: combine the belief over X_t with the transition model p(X_{t+1}|X_t), both in canonical form.
Then, marginalize out X_t: for a canonical form over (X, Y),
K′ = K_{YY} − K_{YX} K_{XX}⁻¹ K_{XY},  h′ = h_Y − K_{YX} K_{XX}⁻¹ h_X
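A sketch of the marginalization step (again my own names). The K_XX inverse here is exactly why marginalization is the expensive operation in canonical form, while it is trivial in standard form:

import numpy as np

def canonical_marginalize(K, h, ix, iy):
    # Marginalize X out of a canonical form over (X, Y):
    #   K' = K_YY - K_YX K_XX^{-1} K_XY
    #   h' = h_Y  - K_YX K_XX^{-1} h_X
    K_XX = K[np.ix_(ix, ix)]
    K_XY = K[np.ix_(ix, iy)]
    K_YX = K[np.ix_(iy, ix)]
    K_YY = K[np.ix_(iy, iy)]
    tmp = K_YX @ np.linalg.inv(K_XX)
    return K_YY - tmp @ K_XY, h[iy] - tmp @ h[ix]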

What if observations are not CLG?
Often observations are not CLG: the model is CLG only if O_i = β X_i + β₀ + ε.
Consider a motion detector: O_i = 1 if a person is likely to be in the region.
The posterior is then not Gaussian.

Linearization: incorporating non-linear evidence
p(O_i|X_i) is not CLG, but…
Find a Gaussian approximation of p(X_i, O_i) = p(X_i) p(O_i|X_i).
Instantiate the evidence O_i = o_i and obtain a Gaussian for p(X_i | O_i = o_i).
Why do we hope this would be any good? Locally, a Gaussian may be OK.

Linearization as integration
To build the Gaussian approximation of p(X_i, O_i) = p(X_i) p(O_i|X_i),
we need to compute the moments E[O_i], E[O_i²], and E[O_i X_i], e.g.,
E[O_i] = ∫ p(x_i) E[O_i | X_i = x_i] dx_i
Note: each integral is the product of a Gaussian with an arbitrary function.

Linearization as numerical integration
Product of a Gaussian with an arbitrary function.
Effective numerical integration with Gaussian quadrature:
- Approximate the integral as a weighted sum over integration points.
- Gaussian quadrature defines the locations of the points and their weights.
- Exact if the arbitrary function is a polynomial of bounded degree.
- The number of integration points is exponential in the number of dimensions d.
- Requiring exactness only on monomials needs exponentially fewer points.
- With 2d+1 points, this method is equivalent to the unscented Kalman filter, and it generalizes to many more points.
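A 1d sketch of this idea using Gauss-Hermite quadrature (my own helper; NumPy’s hermegauss gives the probabilists’ Hermite points and weights):

import numpy as np

def gauss_expectation_1d(g, mu, sigma, n_points=5):
    # E[g(X)] for scalar X ~ N(mu, sigma^2); exact when g is a
    # polynomial of degree <= 2*n_points - 1
    pts, wts = np.polynomial.hermite_e.hermegauss(n_points)
    # hermegauss integrates against exp(-x^2 / 2); rescale to the
    # standard normal density, then shift/scale to N(mu, sigma^2)
    return np.sum(wts * g(mu + sigma * pts)) / np.sqrt(2 * np.pi)

For example, gauss_expectation_1d(lambda x: x**2, 0.0, 1.0) returns ≈ 1.0, the second moment of a standard normal.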

Operations in non-linear Kalman filter
Compute p(X_t | o_{1:t}). Start with p(X₀). At each time step t:
- Condition on the observation (use numerical integration)
- Prediction (multiply in the transition model; use numerical integration)
- Roll-up (marginalize out the previous time step)

Canonical form & Markov nets
The canonical form of a Gaussian is a Gaussian Markov network: zeros in the precision matrix K correspond to missing edges in the undirected graph.

What you need to know about Gaussians, Kalman filters, Gaussian MNs
Kalman filter:
- Probably the most-used BN
- Assumes Gaussian distributions
- Equivalent to a linear system
- Simple matrix operations for computations
Non-linear Kalman filter:
- Usually the observation or motion model is not CLG
- Use numerical integration to find a Gaussian approximation
Gaussian Markov nets:
- Sparsity in the precision matrix is equivalent to the graph structure
Continuous and discrete (hybrid) models:
- Much harder, but doable and interesting (see book)
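A tiny sketch of the precision-sparsity point (my own example): a chain X1 – X2 – X3 has a tridiagonal precision matrix, even though its covariance is dense:

import numpy as np

# Gaussian MN for the chain X1 - X2 - X3: K[0,2] = 0 means no edge X1 - X3
K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  2.]])
Sigma = np.linalg.inv(K)

print(K)       # sparse: zeros exactly where the graph has no edge
print(Sigma)   # dense: X1 and X3 are marginally dependent, but
               # conditionally independent given X2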