
1 CHAPTER 15 SECTION 1 – 2 Markov Models

2 Outline Probabilistic Inference Bayes' Rule Markov Chains

3 Probabilistic Inference
Probabilistic inference: compute a desired probability from other known probabilities (e.g. a conditional from a joint).
We generally compute conditional probabilities:
 - P(on time | no reported accidents) = 0.90
 - These represent the agent's beliefs given the evidence
Probabilities change with new evidence:
 - P(on time | no accidents, 5 a.m.) = 0.95
 - P(on time | no accidents, 5 a.m., raining) = 0.80
 - Observing new evidence causes beliefs to be updated
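As a concrete sketch of "conditional from joint": sum the joint entries consistent with the evidence, then normalize. The joint table below is assumed for illustration (it is not from the slides; the numbers are chosen so that P(on time | no reported accidents) = 0.90, matching the example above):

```python
# Hypothetical joint distribution P(Traffic, OnTime).
# These numbers are illustrative assumptions, not from the lecture.
joint = {
    ("no accidents", "on time"): 0.45,
    ("no accidents", "late"):    0.05,
    ("accidents",    "on time"): 0.20,
    ("accidents",    "late"):    0.30,
}

def conditional(joint, evidence):
    """P(OnTime | Traffic=evidence): select matching rows, then normalize."""
    matching = {row: p for row, p in joint.items() if row[0] == evidence}
    z = sum(matching.values())
    return {row[1]: p / z for row, p in matching.items()}

print(conditional(joint, "no accidents"))  # P(on time | no accidents) = 0.9
```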

4 Bayes’ Rule

5 Terminology

6 Inference by enumeration P(sun)?

7 Inference by enumeration P(sun | winter)? P(sun | winter, hot)?
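The queries on this slide can be answered mechanically once a joint table is given. A minimal sketch, using an assumed joint P(Season, Temp, Weather) (the table is illustrative, not taken from the slides):

```python
# Illustrative joint P(Season, Temp, Weather); the numbers are assumed.
joint = {
    ("summer", "hot",  "sun"):  0.30,
    ("summer", "hot",  "rain"): 0.05,
    ("summer", "cold", "sun"):  0.10,
    ("summer", "cold", "rain"): 0.05,
    ("winter", "hot",  "sun"):  0.10,
    ("winter", "hot",  "rain"): 0.05,
    ("winter", "cold", "sun"):  0.15,
    ("winter", "cold", "rain"): 0.20,
}

def enumerate_prob(query, evidence):
    """P(Weather=query | evidence) by summing consistent rows and normalizing."""
    def consistent(row):
        season, temp, _ = row
        return all(val in (season, temp) for val in evidence)
    num = sum(p for row, p in joint.items() if row[2] == query and consistent(row))
    den = sum(p for row, p in joint.items() if consistent(row))
    return num / den

print(enumerate_prob("sun", ()))                 # P(sun)
print(enumerate_prob("sun", ("winter",)))        # P(sun | winter)
print(enumerate_prob("sun", ("winter", "hot")))  # P(sun | winter, hot)
```

With this particular table the answers come out to 0.65, 0.5, and about 0.667 respectively.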

8 Inference by enumeration

9 The product rule
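Written out, the product rule relates joint and conditional probabilities:

```latex
P(x, y) = P(x \mid y)\,P(y)
```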

10

11 The chain rule
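The chain rule generalizes the product rule to any number of variables (with the empty conditioning set for $i=1$):

```latex
P(x_1, x_2, \dots, x_n) = \prod_{i=1}^{n} P(x_i \mid x_1, \dots, x_{i-1})
```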

12 Bayes’ Rule
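The rule named by this slide, written out; it follows from applying the product rule to $P(A, B)$ in both orders:

```latex
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
```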

13 Inference with Bayes’ Rule

14 Reasoning over Time or Space Often, we want to reason about a sequence of observations  Speech recognition  Robot localization  User attention  Medical monitoring Need to introduce time (or space) into our models

15 Markov Models (Markov Chains)

16

17 Conditional independence
Basic conditional independence:
 - The past and future are independent given the present
 - Each time step depends only on the previous one
 - This is called the (first-order) Markov property
Note that the chain is just a (growable) Bayes net: we can always use generic BN reasoning on it if we truncate the chain at a fixed length.
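The Markov property shows up directly when sampling a chain: each state is drawn using only the previous state, nothing earlier. A minimal sketch; the sun/rain transition probabilities below are assumed for illustration:

```python
import random

# Assumed transition model P(X_t | X_{t-1}); illustrative numbers only.
T = {"sun":  {"sun": 0.9, "rain": 0.1},
     "rain": {"sun": 0.3, "rain": 0.7}}

def sample_chain(start, steps, rng):
    """Sample X_1..X_{steps+1}: each draw conditions only on the last state."""
    states = [start]
    for _ in range(steps):
        prev = states[-1]
        nxt = rng.choices(list(T[prev]), weights=list(T[prev].values()))[0]
        states.append(nxt)
    return states

print(sample_chain("sun", 5, random.Random(0)))
```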

18 Example: Markov Chain

19 Markov Chain Inference

20 Joint distribution of a Markov Model
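By the first-order Markov property, the joint distribution named by this slide factors into the initial distribution and one transition factor per step:

```latex
P(X_1, X_2, \dots, X_T) = P(X_1)\prod_{t=2}^{T} P(X_t \mid X_{t-1})
```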

21 Markov Models Recap

22 Mini-Forward Algorithm

23 Example Run of Mini-Forward Algorithm
From initial observations of sun:
From initial observations of rain:

24 Example Run of Mini-Forward Algorithm From yet another initial distribution P(X1):
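The runs described above can be sketched in a few lines: the mini-forward update is P(x_t) = Σ_{x_{t-1}} P(x_t | x_{t-1}) P(x_{t-1}), repeated from each initial distribution. The sun/rain transition probabilities are assumed for illustration:

```python
# Assumed transition model P(X_t | X_{t-1}); illustrative numbers only.
T = {"sun":  {"sun": 0.9, "rain": 0.1},
     "rain": {"sun": 0.3, "rain": 0.7}}

def mini_forward(p, steps):
    """Push the distribution p over states forward `steps` time steps."""
    for _ in range(steps):
        p = {x: sum(T[prev][x] * p[prev] for prev in p) for x in p}
    return p

# From initial observations of sun:
print(mini_forward({"sun": 1.0, "rain": 0.0}, 100))
# From initial observations of rain:
print(mini_forward({"sun": 0.0, "rain": 1.0}, 100))
```

For this transition model, both runs converge to the same distribution regardless of the starting point, which motivates the next slides on stationary distributions.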

25 Stationary Distributions

26 Example: Stationary Distributions Question: What’s P(X) at time t = infinity?
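For a two-state chain, P(X) at t = infinity can be solved in closed form: the stationary distribution π satisfies π(x) = Σ_prev P(x | prev) π(prev) together with normalization. A sketch using assumed sun/rain transition probabilities:

```python
# Assumed transition probabilities; illustrative only.
p_sun_given_sun, p_sun_given_rain = 0.9, 0.3

# Stationarity: pi_sun = 0.9*pi_sun + 0.3*(1 - pi_sun)
# => pi_sun = p(sun|rain) / ((1 - p(sun|sun)) + p(sun|rain))
pi_sun = p_sun_given_rain / ((1 - p_sun_given_sun) + p_sun_given_rain)
print(pi_sun)  # approximately 0.75
```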

27 Application of Stationary Distributions: Web Link Analysis
PageRank over a web graph:
 - Each web page is a state
 - Initial distribution: uniform over pages
 - Transitions:
   - With prob. c, uniform jump to a random page (dotted lines, not all shown)
   - With prob. 1-c, follow a random outlink (solid lines)
Stationary distribution:
 - Will spend more time on highly reachable pages
 - E.g. there are many ways to get to the Acrobat Reader download page
 - Somewhat robust to link spam
 - Google 1.0 returned the set of pages containing all your keywords, in decreasing rank; now all search engines use link analysis along with many other factors (rank is actually becoming less important over time)
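A minimal PageRank sketch: repeatedly apply the mixed transition (random jump with probability c, random outlink with probability 1-c) until the distribution stabilizes. The tiny link graph below and the value of c are assumptions for illustration:

```python
# Tiny assumed link graph (every page has at least one outlink).
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
c = 0.15  # random-jump probability

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # uniform initial distribution

for _ in range(100):
    new = {p: c / len(pages) for p in pages}      # random jump to any page
    for p, outs in links.items():
        for q in outs:                            # follow a random outlink
            new[q] += (1 - c) * rank[p] / len(outs)
    rank = new

print(rank)  # pages reachable by more paths accumulate more mass
```

Here page C ends up ranked above B because C receives links from both A and B, while B receives only half of A's outgoing mass.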

28 References CSE473: Introduction to Artificial Intelligence, http://courses.cs.washington.edu/courses/cse473/

