Ch 13. Sequential Data
Pattern Recognition and Machine Learning, C. M. Bishop, 2006
Summarized by B.-W. Ku, Biointelligence Laboratory, Seoul National University
Contents
13.3. Linear Dynamical Systems
- Inference in LDS
- Learning in LDS
- Extensions of LDS
- Particle filters
A Stochastic Linear Dynamical System
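For reference, a sketch of the linear-Gaussian model assumed throughout this section, written in the notation of PRML §13.3 (the parameters A, Γ, C, Σ, μ0, V0 are the same ones listed later under the learning problem):

```latex
% Linear-Gaussian transition, emission, and initial distributions of the LDS
\begin{align}
  p(\mathbf{z}_n \mid \mathbf{z}_{n-1}) &= \mathcal{N}(\mathbf{z}_n \mid \mathbf{A}\,\mathbf{z}_{n-1}, \boldsymbol{\Gamma}) \\
  p(\mathbf{x}_n \mid \mathbf{z}_n)     &= \mathcal{N}(\mathbf{x}_n \mid \mathbf{C}\,\mathbf{z}_n, \boldsymbol{\Sigma}) \\
  p(\mathbf{z}_1)                       &= \mathcal{N}(\mathbf{z}_1 \mid \boldsymbol{\mu}_0, \mathbf{V}_0)
\end{align}
```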
Inference in LDS
Inference Problem
Finding the marginal distributions for the latent variables conditioned on the observation sequence.
An Example Application: Tracking a Moving Object
Figure: an illustration of a linear dynamical system being used to track a moving object. Blue: the true latent positions zn. Green: the noisy observations xn. Red: the inferred positions (posterior means). This is one of the most important applications of the Kalman filter.
Mean and Variance of p(zn|x1, …, xn)
Kalman filter equations
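A sketch of one filtering step in the book's notation (the prediction covariance P_{n-1} and the Kalman gain K_n follow PRML §13.3.1; treat the exact equation numbering as the book's):

```latex
% One step of the Kalman filter: predict with the dynamics, then correct with the new observation
\begin{align}
  \mathbf{P}_{n-1} &= \mathbf{A}\mathbf{V}_{n-1}\mathbf{A}^{\mathrm{T}} + \boldsymbol{\Gamma} \\
  \mathbf{K}_n &= \mathbf{P}_{n-1}\mathbf{C}^{\mathrm{T}}\left(\mathbf{C}\mathbf{P}_{n-1}\mathbf{C}^{\mathrm{T}} + \boldsymbol{\Sigma}\right)^{-1} \\
  \boldsymbol{\mu}_n &= \mathbf{A}\boldsymbol{\mu}_{n-1} + \mathbf{K}_n\left(\mathbf{x}_n - \mathbf{C}\mathbf{A}\boldsymbol{\mu}_{n-1}\right) \\
  \mathbf{V}_n &= \left(\mathbf{I} - \mathbf{K}_n\mathbf{C}\right)\mathbf{P}_{n-1}
\end{align}
```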
Interpretation of the Steps Involved
Figure: the Kalman filter viewed as a process of making successive predictions and then correcting those predictions using the new observations.
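A minimal NumPy sketch of this predict-then-correct cycle, assuming a generic LDS; the function name and the arguments A, C, Gamma, Sigma, mu0, V0 are illustrative placeholders, not values from the slides:

```python
import numpy as np

def kalman_filter(X, A, C, Gamma, Sigma, mu0, V0):
    """Forward (filtering) pass: means and covariances of p(z_n | x_1, ..., x_n)."""
    I = np.eye(len(mu0))
    mus, Vs = [], []
    for n, x in enumerate(X):
        if n == 0:
            # First step: the prior N(mu0, V0) plays the role of the prediction
            mu_pred, P = mu0, V0
        else:
            # Prediction: propagate the previous filtered estimate through the dynamics
            mu_pred, P = A @ mus[-1], A @ Vs[-1] @ A.T + Gamma
        # Correction: the Kalman gain weighs the new observation against the prediction
        K = P @ C.T @ np.linalg.inv(C @ P @ C.T + Sigma)
        mus.append(mu_pred + K @ (x - C @ mu_pred))
        Vs.append((I - K @ C) @ P)
    return np.array(mus), np.array(Vs)
```

For the tracking example above, A would encode the object's dynamics and C would pick out the observed components of the state.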
Derivation of Eqs. 13.89, 13.90, 13.94, 13.95: from the forward recursion (13.85) (the continuous analogue of (13.59)), using the Gaussian form (13.84) and (13.87).
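For reference, a sketch of the two relations the derivation combines (the Gaussian form of the scaled forward message, Eq. 13.84, and the forward recursion, Eq. 13.85):

```latex
% Scaled forward message and its recursion for the LDS
\begin{align}
  \widehat{\alpha}(\mathbf{z}_n) &= \mathcal{N}(\mathbf{z}_n \mid \boldsymbol{\mu}_n, \mathbf{V}_n) \\
  c_n\,\widehat{\alpha}(\mathbf{z}_n) &= p(\mathbf{x}_n \mid \mathbf{z}_n)
    \int \widehat{\alpha}(\mathbf{z}_{n-1})\, p(\mathbf{z}_n \mid \mathbf{z}_{n-1})\, \mathrm{d}\mathbf{z}_{n-1}
\end{align}
```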
Mean and Variance of p(zn|x1, …, xn, xn+1, …, xN)
(13.84) Kalman smoother equations.
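A sketch of the backward (Rauch-Tung-Striebel) recursion in the same notation, initialized with μ̂N = μN and V̂N = VN; the precise equation numbers are those of the book:

```latex
% Backward pass: smoothed posterior gamma-hat(z_n) = N(z_n | mu-hat_n, V-hat_n)
\begin{align}
  \mathbf{J}_n &= \mathbf{V}_n \mathbf{A}^{\mathrm{T}} \mathbf{P}_n^{-1}, \qquad
  \mathbf{P}_n = \mathbf{A}\mathbf{V}_n\mathbf{A}^{\mathrm{T}} + \boldsymbol{\Gamma} \\
  \widehat{\boldsymbol{\mu}}_n &= \boldsymbol{\mu}_n + \mathbf{J}_n\left(\widehat{\boldsymbol{\mu}}_{n+1} - \mathbf{A}\boldsymbol{\mu}_n\right) \\
  \widehat{\mathbf{V}}_n &= \mathbf{V}_n + \mathbf{J}_n\left(\widehat{\mathbf{V}}_{n+1} - \mathbf{P}_n\right)\mathbf{J}_n^{\mathrm{T}}
\end{align}
```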
Derivation: from the backward recursion (13.62), following the forward-backward algorithm.
Learning in LDS
Learning Problem
Determining the parameters θ = {A, Γ, C, Σ, μ0, V0} using the EM algorithm.
Expectation of Log Likelihood Function
- The complete-data ({X, Z}) log likelihood function (13.108)
- Its expectation with respect to p(Z | X, θ^old) (13.109)
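A sketch of these two quantities (cf. Eqs. 13.108 and 13.109):

```latex
% Complete-data log likelihood and its expectation under the old posterior
\begin{align}
  \ln p(\mathbf{X}, \mathbf{Z} \mid \boldsymbol{\theta})
    &= \ln p(\mathbf{z}_1 \mid \boldsymbol{\mu}_0, \mathbf{V}_0)
     + \sum_{n=2}^{N} \ln p(\mathbf{z}_n \mid \mathbf{z}_{n-1}, \mathbf{A}, \boldsymbol{\Gamma})
     + \sum_{n=1}^{N} \ln p(\mathbf{x}_n \mid \mathbf{z}_n, \mathbf{C}, \boldsymbol{\Sigma}) \\
  Q(\boldsymbol{\theta}, \boldsymbol{\theta}^{\mathrm{old}})
    &= \mathbb{E}_{\mathbf{Z} \mid \mathbf{X}, \boldsymbol{\theta}^{\mathrm{old}}}
       \!\left[\ln p(\mathbf{X}, \mathbf{Z} \mid \boldsymbol{\theta})\right]
\end{align}
```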
Maximizing the Expectation
Maximizing each term with respect to its parameters, using the standard maximum likelihood results for the Gaussian (Section 2.3.4).
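As one illustrative example (a sketch; the updates for the initial-state parameters take the familiar Gaussian maximum likelihood form, and the remaining parameters A, Γ, C, Σ follow analogously):

```latex
% M-step updates for the initial-state parameters
\begin{align}
  \boldsymbol{\mu}_0^{\mathrm{new}} &= \mathbb{E}[\mathbf{z}_1] \\
  \mathbf{V}_0^{\mathrm{new}} &= \mathbb{E}[\mathbf{z}_1 \mathbf{z}_1^{\mathrm{T}}]
    - \mathbb{E}[\mathbf{z}_1]\,\mathbb{E}[\mathbf{z}_1]^{\mathrm{T}}
\end{align}
```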
Evaluated θ^new
(13.105), (13.106), (13.107)
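A sketch of the smoothed posterior statistics required for these updates (my rendering of Eqs. 13.105–13.107; the cross term is written here via the smoothed cross-covariance V̂n J^T_{n-1}):

```latex
% Posterior expectations from the Kalman smoother, used in the M-step
\begin{align}
  \mathbb{E}[\mathbf{z}_n] &= \widehat{\boldsymbol{\mu}}_n \\
  \mathbb{E}[\mathbf{z}_n \mathbf{z}_{n-1}^{\mathrm{T}}] &= \widehat{\mathbf{V}}_n \mathbf{J}_{n-1}^{\mathrm{T}}
    + \widehat{\boldsymbol{\mu}}_n \widehat{\boldsymbol{\mu}}_{n-1}^{\mathrm{T}} \\
  \mathbb{E}[\mathbf{z}_n \mathbf{z}_n^{\mathrm{T}}] &= \widehat{\mathbf{V}}_n
    + \widehat{\boldsymbol{\mu}}_n \widehat{\boldsymbol{\mu}}_n^{\mathrm{T}}
\end{align}
```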
Extensions of LDS
Problem: going beyond the linear-Gaussian assumption. There is considerable interest in extending the basic linear dynamical system in order to increase its capabilities. The Gaussian form of p(zn | xn) is a significant limitation.
Some extensions:
- A Gaussian mixture for p(zn).
- A Gaussian mixture for p(xn | zn): impractical, because the number of mixture components in the posterior grows exponentially with the length of the sequence.
- The extended Kalman filter.
- The switching state space model / the switching hidden Markov model.
Particle filters
- A non-Gaussian emission density p(xn | zn) leads to a non-Gaussian posterior p(zn | x1, …, xn), and the integrals required for exact inference become mathematically intractable.
- Sampling-importance-resampling provides a sequential Monte Carlo approximation.
Figure: a schematic illustration of the operation of the particle filter.
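A minimal sketch of one sampling-importance-resampling step, assuming caller-supplied transition_sample and emission_logpdf functions (both names are illustrative, not from the slides or any particular library):

```python
import numpy as np

def particle_filter_step(particles, x_new, transition_sample, emission_logpdf, rng):
    """One SIR step: propagate particles through the transition model,
    weight them by the (possibly non-Gaussian) emission density, and resample.

    particles: (L, d) array of samples representing p(z_{n-1} | x_1..x_{n-1})
    transition_sample(z, rng): draws z_n ~ p(z_n | z_{n-1} = z)
    emission_logpdf(x, z): evaluates log p(x_n = x | z_n = z)
    """
    # Propagate each particle through the transition distribution (the proposal)
    propagated = np.array([transition_sample(z, rng) for z in particles])
    # Importance weights proportional to the emission density at the new observation
    log_w = np.array([emission_logpdf(x_new, z) for z in propagated])
    log_w -= log_w.max()                      # numerical stability
    w = np.exp(log_w)
    w /= w.sum()
    # Resample with replacement according to the normalized weights
    idx = rng.choice(len(propagated), size=len(propagated), p=w)
    return propagated[idx]
```

Iterating this step along the observation sequence maintains a sample-based approximation of p(zn | x1, …, xn) at every step.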