Hidden Markov Models: First Story!
Majid Hajiloo, Aria Khademi
Outline
This lecture is devoted to the problem of inference in probabilistic graphical models (aka Bayesian nets). The goal is for you to:
- Practice marginalization and conditioning
- Practice Bayes rule
- Learn the HMM representation (model)
- Learn how to do prediction and filtering using HMMs
Assistive Technology
Assume you have a little robot that is trying to estimate the posterior probability that you are happy or sad.
Assistive Technology
The robot has observed whether you are:
- watching Game of Thrones (w)
- sleeping (s)
- crying (c)
- facebooking (f)
Let the unknown state be X = h if you're happy and X = s if you're sad. Let Y denote the observation, which can be w, s, c, or f.
Assistive Technology
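The observation-probability table shown on this slide did not survive extraction. A minimal sketch of what such an observation model p(y | x) could look like, with purely illustrative numbers (not the slide's actual values):

```python
# Hypothetical observation model p(y | x) for the happy/sad robot.
# The probabilities below are illustrative assumptions only; the
# slide's actual table is not recoverable.
obs_model = {
    "h": {"w": 0.4, "s": 0.3, "c": 0.0, "f": 0.3},  # p(y | X = happy)
    "s": {"w": 0.1, "s": 0.3, "c": 0.4, "f": 0.2},  # p(y | X = sad)
}

# Each row must be a valid distribution over the observations w, s, c, f.
for x, dist in obs_model.items():
    assert abs(sum(dist.values()) - 1.0) < 1e-9
```

Note the zero entry: under these made-up numbers, observing crying would rule out the happy state entirely after a Bayes update.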
[Figure: graphical model with hidden state X generating observation Y]
Dynamic Model
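The transition probabilities on this slide did not survive extraction. A minimal sketch of a dynamic (transition) model p(x_t | x_{t-1}) for the two-state happy/sad chain, with illustrative numbers only:

```python
# Hypothetical transition model p(x_t | x_{t-1}); the slide's actual
# numbers are not recoverable, so these values are assumptions chosen
# to make mood somewhat "sticky" over time.
transition = {
    "h": {"h": 0.8, "s": 0.2},  # next-state distribution given happy
    "s": {"h": 0.3, "s": 0.7},  # next-state distribution given sad
}

# Each row is a distribution over the next state.
for x, dist in transition.items():
    assert abs(sum(dist.values()) - 1.0) < 1e-9
```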
Optimal Filtering
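The equations on this slide did not survive extraction. The standard HMM filtering recursion, which combines a one-step prediction with a Bayes update, is:

```latex
p(x_t \mid y_{1:t}) \;\propto\; p(y_t \mid x_t) \sum_{x_{t-1}} p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})
```

The proportionality constant is 1 / p(y_t | y_{1:t-1}), obtained by normalizing over x_t.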
Prediction
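The prediction-step formulas from this slide are not recoverable; a minimal sketch of the step, marginalizing the previous belief through a transition model (the `transition` numbers here are illustrative assumptions):

```python
# One-step prediction:
#   p(x_t | y_{1:t-1}) = sum over x_{t-1} of p(x_t | x_{t-1}) p(x_{t-1} | y_{1:t-1})
def predict(belief, transition):
    """Push a filtered belief through the transition model."""
    states = belief.keys()
    return {
        x_new: sum(transition[x_old][x_new] * belief[x_old] for x_old in states)
        for x_new in states
    }

# Illustrative two-state example (hypothetical numbers).
transition = {"h": {"h": 0.8, "s": 0.2}, "s": {"h": 0.3, "s": 0.7}}
predicted = predict({"h": 0.5, "s": 0.5}, transition)  # h: 0.55, s: 0.45
```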
Bayes Update
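The update-step formulas from this slide are also not recoverable; a minimal sketch of the Bayes update, reweighting a predicted belief by the observation likelihood and normalizing (the `obs_model` numbers are illustrative assumptions):

```python
# Bayes update: p(x_t | y_{1:t}) proportional to p(y_t | x_t) p(x_t | y_{1:t-1})
def update(predicted, obs_model, y):
    unnorm = {x: obs_model[x][y] * p for x, p in predicted.items()}
    z = sum(unnorm.values())  # normalizer, equals p(y_t | y_{1:t-1})
    return {x: p / z for x, p in unnorm.items()}

# Illustrative example with hypothetical numbers.
obs_model = {"h": {"w": 0.4, "s": 0.3, "c": 0.0, "f": 0.3},
             "s": {"w": 0.1, "s": 0.3, "c": 0.4, "f": 0.2}}
posterior = update({"h": 0.55, "s": 0.45}, obs_model, "w")
# Watching Game of Thrones (w) raises the probability of "happy".
```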
HMM Algorithm
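The algorithm listing from this slide did not survive extraction. A minimal sketch of the full forward (filtering) pass, alternating the prediction and Bayes-update steps over an observation sequence; all model numbers are illustrative assumptions, not the slide's values:

```python
# HMM forward (filtering) pass: alternate predict and update.
def hmm_filter(prior, transition, obs_model, observations):
    belief = dict(prior)
    for y in observations:
        # Predict: propagate the belief through the dynamics.
        predicted = {
            x: sum(transition[x_old][x] * belief[x_old] for x_old in belief)
            for x in belief
        }
        # Update: reweight by the observation likelihood and normalize.
        unnorm = {x: obs_model[x][y] * p for x, p in predicted.items()}
        z = sum(unnorm.values())
        belief = {x: p / z for x, p in unnorm.items()}
    return belief

# Illustrative run with hypothetical numbers.
prior = {"h": 0.5, "s": 0.5}
transition = {"h": {"h": 0.8, "s": 0.2}, "s": {"h": 0.3, "s": 0.7}}
obs_model = {"h": {"w": 0.4, "s": 0.3, "c": 0.0, "f": 0.3},
             "s": {"w": 0.1, "s": 0.3, "c": 0.4, "f": 0.2}}
final_belief = hmm_filter(prior, transition, obs_model, ["w", "w", "c"])
```

Because the hypothetical model gives crying zero likelihood under "happy", the final observation drives the posterior entirely onto "sad".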
The END