Slide 1: Temporal Probabilistic Models Pt 2
Slide 2: Agenda
- Kalman filtering
- Dynamic Bayesian Networks
- Particle filtering
Slide 3: Kalman Filtering
In a nutshell:
- Efficient filtering in continuous state spaces
- Gaussian transition and observation models
- Ubiquitous for tracking with noisy sensors, e.g., radar, GPS, cameras
Slide 4: Hidden Markov Model for Robot Localization
Use observations + transition dynamics to get a better idea of where the robot is at time t.
[Figure: chain of hidden state variables $X_0, X_1, X_2, X_3$ with observed variables $z_1, z_2, z_3$]
Predict – observe – predict – observe…
Slide 5: Hidden Markov Model for Robot Localization
Use observations + transition dynamics to get a better idea of where the robot is at time t.
Maintain a belief state $b_t$ over time: $b_t(x) = P(X_t = x \mid z_{1:t})$.
[Figure: chain of hidden state variables $X_0, X_1, X_2, X_3$ with observed variables $z_1, z_2, z_3$]
Predict – observe – predict – observe…
Slide 6: Bayesian Filtering with Belief States
Slide 7: Bayesian Filtering with Belief States
- Predict: compute $P(X_t \mid z_{1:t-1})$ using the dynamics alone
- Update: condition on the observation $z_t$ (a code sketch of this cycle follows below)
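A minimal sketch of the predict/update cycle for a discrete state space, assuming a row-stochastic transition matrix and a vector of observation likelihoods; all names here are illustrative, not from the slides:

```python
import numpy as np

def bayes_filter_step(belief, T, likelihood):
    """One predict/update cycle of a discrete Bayes filter.

    belief:     length-n array, b_{t-1}(x) = P(X_{t-1}=x | z_{1:t-1})
    T:          n-by-n matrix, T[i, j] = P(X_t=j | X_{t-1}=i)
    likelihood: length-n array, P(z_t | X_t=x) for the new observation
    """
    predicted = belief @ T            # predict: P(X_t | z_{1:t-1})
    updated = predicted * likelihood  # update: weight by P(z_t | X_t)
    return updated / updated.sum()    # renormalize to get b_t
```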
Slide 8: In Continuous State Spaces…
Slide 9: General Bayesian Filtering in Continuous State Spaces
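The slide's equations did not survive extraction; the standard continuous-state recursion matching the predict/update steps of slide 7 is:

$$P(x_t \mid z_{1:t-1}) = \int P(x_t \mid x_{t-1})\, P(x_{t-1} \mid z_{1:t-1})\, dx_{t-1}$$
$$P(x_t \mid z_{1:t}) \;\propto\; P(z_t \mid x_t)\, P(x_t \mid z_{1:t-1})$$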
Slide 10: Key Representational Decisions
- Pick a method for representing distributions
  - Discrete: tables
  - Continuous: fixed parameterized classes vs. particle-based techniques
- Devise methods to perform key calculations (marginalization, conditioning) on the representation
  - Exact or approximate?
Slide 11: Gaussian Distribution
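The slide's figure did not survive extraction; for reference, the univariate Gaussian density used throughout the following slides is:

$$N(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$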
Slide 12: Linear Gaussian Transition Model for a Moving 1D Point
Consider position and velocity $x_t, v_t$ and time step $h$.
Without noise:
$x_{t+1} = x_t + h v_t$, $v_{t+1} = v_t$
With Gaussian noise of std $\sigma_1$:
$P(x_{t+1} \mid x_t) \propto \exp\!\left(-\frac{(x_{t+1} - (x_t + h v_t))^2}{2\sigma_1^2}\right)$
i.e., $X_{t+1} \sim N(x_t + h v_t,\ \sigma_1^2)$
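A small sketch of sampling this transition model; the values of `h` and `sigma1` are illustrative (the slide's $\sigma_1$ symbol was lost in extraction):

```python
import numpy as np

rng = np.random.default_rng(0)

def transition(x, v, h=0.1, sigma1=0.05):
    """Sample x_{t+1} ~ N(x + h*v, sigma1^2); the velocity carries over unchanged."""
    return x + h * v + sigma1 * rng.normal(), v
```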
Slide 13: Linear Gaussian Transition Model
If the prior on position is Gaussian, then the prediction through the dynamics is also Gaussian:
$N(\mu, \sigma^2) \;\rightarrow\; N(\mu + vh,\ \sigma^2 + \sigma_1^2)$
Slide 14: Linear Gaussian Observation Model
Position observation $z_t$ with Gaussian noise of std $\sigma_2$:
$z_t \sim N(x_t,\ \sigma_2^2)$
Slide 15: Linear Gaussian Observation Model
If the prior on position is Gaussian, then the posterior is also Gaussian:
- Posterior mean: $(\sigma_2^2 \mu + \sigma^2 z) / (\sigma^2 + \sigma_2^2)$
- Posterior variance: $\sigma^2 \sigma_2^2 / (\sigma^2 + \sigma_2^2)$
[Figure: position prior, observation probability, and posterior probability]
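A direct transcription of the slide's fusion formulas into code, assuming a scalar state; names are illustrative:

```python
def gaussian_update(mu, var, z, var_obs):
    """Fuse a Gaussian prior N(mu, var) with an observation z ~ N(x, var_obs):
    posterior mean = (var_obs*mu + var*z) / (var + var_obs)
    posterior var  = var*var_obs / (var + var_obs)
    """
    post_mu = (var_obs * mu + var * z) / (var + var_obs)
    post_var = (var * var_obs) / (var + var_obs)
    return post_mu, post_var
```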
Slide 16: Multivariate Gaussians
$X \sim N(\mu, \Sigma)$
Slide 17: Multivariate Linear Gaussian Process
A linear transformation plus multivariate Gaussian noise:
$y = A x + \varepsilon$, $\varepsilon \sim N(\mu, \Sigma)$
- If the prior state distribution is Gaussian, then the posterior state distribution is Gaussian
- If we observe one component of a Gaussian, then the posterior over the rest is also Gaussian
Slide 18: Multivariate Computations
Linear transformations of Gaussians:
- If $x \sim N(\mu, \Sigma)$ and $y = A x + b$, then $y \sim N(A\mu + b,\ A \Sigma A^T)$
Consequence:
- If $x \sim N(\mu_x, \Sigma_x)$, $y \sim N(\mu_y, \Sigma_y)$, and $z = x + y$, then $z \sim N(\mu_x + \mu_y,\ \Sigma_x + \Sigma_y)$
Conditionals of a Gaussian:
- If $[x_1, x_2] \sim N\!\left([\mu_1; \mu_2],\ [\Sigma_{11}, \Sigma_{12}; \Sigma_{21}, \Sigma_{22}]\right)$, then on observing $x_2 = z$:
  $x_1 \sim N\!\left(\mu_1 + \Sigma_{12}\Sigma_{22}^{-1}(z - \mu_2),\ \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}\right)$
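A sketch of the conditioning formula above, assuming numpy arrays for the block means and covariances; names are illustrative:

```python
import numpy as np

def condition_gaussian(mu1, mu2, S11, S12, S22, z):
    """Given [x1, x2] ~ N([mu1; mu2], [[S11, S12], [S12.T, S22]]), return the
    mean and covariance of x1 | x2 = z via the slide's conditional formula."""
    K = S12 @ np.linalg.inv(S22)   # Sigma_12 Sigma_22^{-1}
    mu = mu1 + K @ (z - mu2)
    cov = S11 - K @ S12.T
    return mu, cov
```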
Slide 19: Kalman Filter Assumptions
$x_t \sim N(\mu_x, \Sigma_x)$
$x_{t+1} = F x_t + g + v$, with dynamics noise $v \sim N(0, \Sigma_v)$
$z_{t+1} = H x_{t+1} + w$, with observation noise $w \sim N(0, \Sigma_w)$
Slide 20: Two Steps
Maintain $\mu_t, \Sigma_t$, the parameters of the Gaussian distribution over the state $x_t$.
- Predict: compute the distribution of $x_{t+1}$ using the dynamics model alone
- Update: on observing $z_{t+1}$, compute $P(x_{t+1} \mid z_{t+1})$ with Bayes' rule
Slide 21: Two Steps
Maintain $\mu_t, \Sigma_t$, the parameters of the Gaussian distribution over the state $x_t$.
- Predict: $x_{t+1} \sim N(F\mu_t + g,\ F\Sigma_t F^T + \Sigma_v)$; call this $N(\mu', \Sigma')$ (sketched in code below)
- Update: on observing $z_{t+1}$, compute $P(x_{t+1} \mid z_{t+1})$ with Bayes' rule
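The predict step written out directly from the formula above, as a sketch with illustrative names:

```python
import numpy as np

def kf_predict(mu, Sigma, F, g, Sv):
    """Push N(mu, Sigma) through x_{t+1} = F x_t + g + v with v ~ N(0, Sv):
    returns mu' = F mu + g and Sigma' = F Sigma F^T + Sv."""
    return F @ mu + g, F @ Sigma @ F.T + Sv
```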
Slide 22: Two Steps
Maintain $\mu_t, \Sigma_t$, the parameters of the Gaussian distribution over the state $x_t$.
- Predict: $x_{t+1} \sim N(F\mu_t + g,\ F\Sigma_t F^T + \Sigma_v)$; call this $N(\mu', \Sigma')$
- Update: on observing $z_{t+1}$, compute $P(x_{t+1} \mid z_{t+1})$ with Bayes' rule; the parameters of the final distribution, $\mu_{t+1}$ and $\Sigma_{t+1}$, follow from the conditional distribution formulas
Slide 23: Deriving the Update Rule
(1) Write the joint as a Gaussian with unknowns $a, B, C$:
$\begin{bmatrix} x_t \\ z_t \end{bmatrix} \sim N\!\left(\begin{bmatrix} \mu' \\ a \end{bmatrix},\ \begin{bmatrix} \Sigma' & B \\ B^T & C \end{bmatrix}\right)$
(2) Assumption: $x_t \sim N(\mu', \Sigma')$
(3) Assumption: $z_t \mid x_t \sim N(H x_t, \Sigma_w)$
(4) Conditioning (1): $z_t \mid x_t \sim N(a + B^T \Sigma'^{-1}(x_t - \mu'),\ C - B^T \Sigma'^{-1} B)$
(5) Set mean (4) = (3): $a + B^T \Sigma'^{-1}(x_t - \mu') = H x_t \Rightarrow a = H\mu',\ B^T = H\Sigma'$
(6) Set covariance (4) = (3): $C - B^T \Sigma'^{-1} B = \Sigma_w \Rightarrow C = H\Sigma'H^T + \Sigma_w$
(7) Conditioning (1): $x_t \mid z_t \sim N(\mu' + BC^{-1}(z_t - a),\ \Sigma' - BC^{-1}B^T)$
(8, 9) Kalman filter: $\mu_t = \mu' + \Sigma'H^T C^{-1}(z_t - H\mu')$, $\Sigma_t = \Sigma' - \Sigma'H^T C^{-1} H\Sigma'$
Slide 24: Putting It Together
Transition matrix $F$ with covariance $\Sigma_x$; observation matrix $H$ with covariance $\Sigma_z$:
$\mu_{t+1} = F\mu_t + K_{t+1}(z_{t+1} - HF\mu_t)$
$\Sigma_{t+1} = (I - K_{t+1}H)(F\Sigma_t F^T + \Sigma_x)$
where $K_{t+1} = (F\Sigma_t F^T + \Sigma_x)H^T \left(H(F\Sigma_t F^T + \Sigma_x)H^T + \Sigma_z\right)^{-1}$
Got that memorized?
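A minimal sketch of one full predict-plus-update step from these equations; names are illustrative:

```python
import numpy as np

def kalman_step(mu, Sigma, z, F, H, Sx, Sz):
    """One Kalman filter step with transition (F, Sx) and observation (H, Sz)."""
    # Predict: mu' = F mu, Sigma' = F Sigma F^T + Sx
    mu_p = F @ mu
    Sigma_p = F @ Sigma @ F.T + Sx
    # Gain: K = Sigma' H^T (H Sigma' H^T + Sz)^{-1}
    K = Sigma_p @ H.T @ np.linalg.inv(H @ Sigma_p @ H.T + Sz)
    # Update the mean with the innovation, shrink the covariance
    mu_new = mu_p + K @ (z - H @ mu_p)
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma_p
    return mu_new, Sigma_new
```

For the 1D moving point of slide 12, F = [[1, h], [0, 1]] and H = [[1, 0]].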
Slide 25: Properties of the Kalman Filter
- The optimal Bayesian estimate for linear Gaussian transition/observation models
- Needs estimates of the covariances, so model identification is necessary
- Extensions to nonlinear transition/observation models work as long as they aren't too nonlinear:
  - Extended Kalman Filter
  - Unscented Kalman Filter
Slide 26: Tracking the Velocity of a Braking Obstacle
[Figure: "learning that the road is slick". The velocity estimate is initially uninformed; as more distance measurements arrive and the obstacle slows, the estimated max deceleration approaches the actual max deceleration after braking begins. A second panel shows the stopping distance (95% confidence interval) from "braking initiated" to "gradual stop".]
Slide 27: Non-Gaussian Distributions
Gaussian distributions are a "lump".
[Figure: a multimodal distribution vs. the single-lump Kalman filter estimate]
Slide 28: Non-Gaussian Distributions
Integrating continuous and discrete states.
[Figure: a distribution splitting on a binary choice, "up" vs. "down"]
Slide 29: Example: Failure Detection
Consider a battery meter sensor:
- Battery = true level of the battery
- BMeter = sensor reading
- Transient failures: the sensor sends garbage at time t
- Persistent failures: the sensor sends garbage forever
Slide 30: Example: Failure Detection
Consider a battery meter sensor:
- Battery = true level of the battery
- BMeter = sensor reading
- Transient failure: 5555500555…
- Persistent failure (sensor is broken): 5555500000…
Slide 31: Dynamic Bayesian Network
[Network: Battery_{t-1} → Battery_t → BMeter_t]
$BMeter_t \sim N(Battery_t, \sigma^2)$
(Think of this structure "unrolled" forever…)
Slide 32: Dynamic Bayesian Network
[Network: Battery_{t-1} → Battery_t → BMeter_t]
Transient failure model:
$BMeter_t \sim N(Battery_t, \sigma^2)$
$P(BMeter_t = 0 \mid Battery_t = 5) = 0.03$
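The slide gives only the failure probability; one common way to realize such a transient-failure observation model is a mixture of a garbage (zero) reading and the Gaussian sensor model, sketched here with an assumed sensor std (the slide's $\sigma$ was lost in extraction):

```python
from scipy.stats import norm

P_FAIL = 0.03     # transient failure probability, from the slide
SENSOR_STD = 1.0  # assumed measurement noise std, not from the slide

def bmeter_likelihood(reading, battery):
    """P(BMeter_t = reading | Battery_t) under a transient-failure mixture:
    with probability P_FAIL the meter reads 0, otherwise N(battery, SENSOR_STD^2)."""
    gaussian = (1 - P_FAIL) * norm.pdf(reading, battery, SENSOR_STD)
    return gaussian + (P_FAIL if reading == 0 else 0.0)
```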
Slide 33: Results on Transient Failure
Meter reads 55555005555…
[Figure: $E(Battery_t)$ over time as the transient failure occurs, with and without the transient failure model]
Slide 34: Results on Persistent Failure
Meter reads 5555500000…
[Figure: $E(Battery_t)$ over time as the persistent failure occurs, using the transient failure model]
Slide 35: Persistent Failure Model
[Network: Battery_{t-1} → Battery_t and Broken_{t-1} → Broken_t, both feeding BMeter_t]
$BMeter_t \sim N(Battery_t, \sigma^2)$
$P(BMeter_t = 0 \mid Battery_t = 5) = 0.03$ (transient failures)
$P(BMeter_t = 0 \mid Broken_t) = 1$
An example of a Dynamic Bayesian Network (DBN).
Slide 36: Results on Persistent Failure
Meter reads 5555500000…
[Figure: $E(Battery_t)$ over time as the persistent failure occurs, with the transient model and with the persistent failure model]
Slide 37: How to Perform Inference on a DBN?
- Exact inference on the "unrolled" BN
  - Variable elimination: eliminate old time steps
  - After a few time steps, all variables in the state space become dependent!
  - The sparsity structure is lost
- Approximate inference
  - Particle filtering
Slide 38: Particle Filtering (aka Sequential Monte Carlo)
- Represent distributions as a set of particles
- Applicable to non-Gaussian, high-dimensional distributions
- Convenient implementations
- Widely used in vision and robotics
Slide 39: Particle Representation
$Bel(x_t) = \{(w_k, x_k)\}$, where the $w_k$ are weights and the $x_k$ are state hypotheses
- Weights sum to 1
- Approximates the underlying distribution
Slide 40: Particle Filtering
Represent the distribution at time t as a set of N "particles" $S_t^1, \dots, S_t^N$.
Repeat for t = 0, 1, 2, …:
- Sampling step: sample $S[i]$ from $P(X_{t+1} \mid X_t = S_t^i)$ for all $i$
- Compute the weight $w[i] = P(e \mid X_{t+1} = S[i])$ for all $i$, where $e$ is the new evidence
- Weighted resampling step: sample each $S_{t+1}^i$ from $S[\cdot]$ according to the weights $w[\cdot]$ (a generic code sketch follows below)
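A minimal generic sketch of this loop, assuming user-supplied `sample_transition` and `likelihood` functions; all names are illustrative:

```python
import numpy as np

def particle_filter_step(particles, z, sample_transition, likelihood, rng):
    """One step over N particles: sample, weight, then weighted resampling."""
    # 1. Sampling step: propagate each particle through the dynamics
    proposed = np.array([sample_transition(x) for x in particles])
    # 2. Compute weights from the observation model
    w = np.array([likelihood(z, x) for x in proposed])
    w = w / w.sum()
    # 3. Weighted resampling: draw N particles with probability proportional to w
    idx = rng.choice(len(proposed), size=len(proposed), p=w)
    return proposed[idx]
```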
Slide 41: Battery Example
[DBN: Battery_{t-1} → Battery_t and Broken_{t-1} → Broken_t, both feeding BMeter_t]
Sampling step.
Slide 42: Battery Example
Suppose we now observe BMeter = 0. What is P(BMeter = 0 | sample)? It is 0.03 for particles with the sensor working and 1 for particles with the sensor broken.
Slide 43: Battery Example
Compute weights (drawn as particle size): 0.03 for working-sensor particles, 1 for broken-sensor particles.
Slide 44: Battery Example
Weighted resampling.
Slide 45: Battery Example
Sampling step.
Slide 46: Battery Example
Now observe $BMeter_t = 5$.
Slide 47: Battery Example
Compute weights: 1 for working-sensor particles, 0 for broken-sensor particles.
Slide 48: Battery Example
Weighted resampling.
Slide 49: Applications of Particle Filtering in Robotics
Simultaneous Localization and Mapping (SLAM):
- Observations: laser rangefinder
- State variables: position, walls
Slide 50: Simultaneous Localization and Mapping (SLAM)
Mobile robots have two complementary sources of information:
- Odometry: locally accurate, but drifts significantly over time
- Vision/lidar/sonar: locally inaccurate, but tied to a global reference frame
Combine the two:
- State: (robot pose, map)
- Observations: sensor input
Slide 51: General Problem
$x_t \sim Bel(x_t)$ (an arbitrary p.d.f.)
$x_{t+1} = f(x_t, u, \varepsilon_p)$, with process noise $\varepsilon_p \sim$ an arbitrary p.d.f.
$z_{t+1} = g(x_{t+1}, \varepsilon_o)$, with observation noise $\varepsilon_o \sim$ an arbitrary p.d.f.
Slide 52: Sampling Importance Resampling (SIR) Variant
Predict – update – resample.
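The resample step admits several implementations; a common low-variance choice (not from the slides) is systematic resampling, sketched here:

```python
import numpy as np

def systematic_resample(particles, weights, rng):
    """Systematic (low-variance) resampling: one uniform draw, then N evenly
    spaced pointers through the cumulative weight distribution."""
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx]
```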
Slide 53: Advanced Filtering Topics
- Mixing exact and approximate representations (e.g., mixture models)
- Multiple hypothesis tracking (an assignment problem)
- Model calibration
- Scaling up (e.g., 3D SLAM, huge maps)
Slide 54: Next Time
Putting it together: intelligent agents. Read R&N Ch. 2.