CS 401R: Intro. to Probabilistic Graphical Models
Lecture #6: Useful Distributions; Reasoning with Joint Distributions
This work is licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License. Some later slides are adapted from slides originally created by Andrew W. Moore of CMU; see the Acknowledgments note below.
Announcements
- Assignment 0.1: due today
- Reading Report #2: due Wednesday
- Assignment 0.2 (mathematical exercises): early deadline Friday; due next Monday
Objectives
- Understand 4 important discrete distributions
- Describe uncertain worlds with joint probability distributions
- Reel with terror at the intractability of reasoning with joint distributions
- Prepare to build models of natural phenomena as Bayes Nets
Parametric Distributions
e.g., Normal Distribution
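The formula on this slide appears only in the slide image; for reference, the density of a normal distribution with mean \mu and variance \sigma^2 is

p(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)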
Bernoulli Distribution
2 possible outcomes
"What's the probability of a single binary event x, if a 'positive' event has probability p?"
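For reference (the slide's formula is not transcribed), the standard Bernoulli pmf, with x \in \{0, 1\} and P(x = 1) = p, is

P(x \mid p) = p^x (1-p)^{1-x}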
Categorical Distribution
Extension for m possible outcomes
"What's the probability of a single event x (containing a 1 in only one position), if outcomes 1, 2, …, m are specified by p = [p_1, p_2, …, p_m]?"
Note: the p_i must sum to 1
Great for language models, where each value corresponds to a word or an n-gram of words (e.g., value '1' corresponds to 'the').
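For reference (the slide's formulas are not transcribed), the standard categorical pmf, writing x as a one-hot vector with x_i \in \{0, 1\} and \sum_i x_i = 1, is

P(x \mid p) = \prod_{i=1}^{m} p_i^{x_i}

Equivalently, if the outcome is written simply as an index k, then P(x = k \mid p) = p_k.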
Binomial Distribution
2 possible outcomes; N trials
"What's the probability in N independent Bernoulli events that x of them will come up 'positive', if a 'positive' event has probability p?"
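For reference (the slide's formula is not transcribed), the standard binomial pmf, with x \in \{0, 1, \ldots, N\}, is

P(x \mid N, p) = \binom{N}{x} p^x (1-p)^{N-x}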
Multinomial Distribution
Extension for m possible outcomes; N trials
"What's the probability in N independent categorical events that value 1 will occur x_1 times … and that value m will occur x_m times, if the probabilities of each possible value are specified by p = [p_1, p_2, …, p_m]?"
Note: the p_i must sum to 1
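For reference (the slide's formula is not transcribed), the standard multinomial pmf, with \sum_i x_i = N, is

P(x_1, \ldots, x_m \mid N, p) = \frac{N!}{x_1! \cdots x_m!} \prod_{i=1}^{m} p_i^{x_i}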
Acknowledgments
Note to other teachers and users of the following slides: Andrew Moore would be delighted if you found this source material useful in giving your own lectures. Feel free to use these slides verbatim, or to modify them to fit your own needs. PowerPoint originals are available. If you make use of a significant portion of these slides in your own lecture, please include this message, or the following link to the source repository of Andrew's tutorials: http://www.cs.cmu.edu/~awm/tutorials. Comments and corrections gratefully received.
Why Bayes Nets Matter
Andrew Moore (Google, formerly CMU):
- One of the most important technologies to have emerged in the Machine Learning / AI field in the last 20 years
- A clean, clear, manageable language for expressing what you're certain and uncertain about
- Many practical applications in medicine, factories, helpdesks, robotics, and NLP!
  - Inference: P(diagnosis | these symptoms)
  - Anomaly detection: how anomalous is this observation?
  - Active data collection: which diagnostic test should come next, given the current observations?
The Joint Distribution
Recipe for making a joint distribution of M variables:
1. Make a truth table listing all combinations of values of your variables (if there are M Boolean variables, the table will have 2^M rows).
2. For each combination of values, indicate the probability.

Example: Boolean variables X, Y, Z

X  Y  Z  Prob
0  0  0  0.30
0  0  1  0.05
0  1  0  0.10
0  1  1  0.05
1  0  0  0.05
1  0  1  0.10
1  1  0  0.25
1  1  1  0.10

(The eight entries sum to 1; the entry for row 1 0 0, missing from the source, is the unique value consistent with that.)
The Joint Distribution
Example: Boolean variables X, Y, Z
[Figure: Venn diagram of the same table, with overlapping regions X=1, Y=1, and Z=1 and each of the eight regions labeled with its probability.]
Using the Joint Distribution
Once you have the joint distribution, you can ask for the probability of any logical expression E involving any of the "attributes". What is this summation called?
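The summation referred to appears only in the slide image; in standard form it is

P(E) = \sum_{\text{rows matching } E} P(\text{row})

and the usual name for this operation, summing the joint over every combination of the attributes not fixed by E, is marginalization.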
Using the Joint Distribution
P(Poor, Male) = 0.4654
Using the Joint Distribution
Try this: P(Poor) = ?
Using the Joint Distribution
P(Poor) = 0.7604
Inference with the Joint Dist.
Inference with the Joint Dist.
P(Male | Poor) = ?
Inference with the Joint Dist.
P(Male | Poor) = 0.4654 / 0.7604 = 0.612
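This computation is an instance of the general rule for inference from a joint distribution: for any events E_1 and E_2,

P(E_1 \mid E_2) = \frac{P(E_1, E_2)}{P(E_2)} = \frac{\sum_{\text{rows matching } E_1 \text{ and } E_2} P(\text{row})}{\sum_{\text{rows matching } E_2} P(\text{row})}

To make the mechanics concrete, here is a minimal Python sketch (not from the slides; the helper names prob and cond_prob are illustrative) that stores the Boolean X, Y, Z joint table from earlier and answers marginal and conditional queries by summing matching rows:

```python
# Inference by brute-force enumeration of a joint distribution table.
# The table is the Boolean X, Y, Z example from the slides.

# Joint distribution: (X, Y, Z) -> probability. The eight entries sum to 1.
JOINT = {
    (0, 0, 0): 0.30, (0, 0, 1): 0.05,
    (0, 1, 0): 0.10, (0, 1, 1): 0.05,
    (1, 0, 0): 0.05, (1, 0, 1): 0.10,
    (1, 1, 0): 0.25, (1, 1, 1): 0.10,
}

def prob(event):
    """P(event): sum the probabilities of every row matching the event.

    `event` is a predicate over (x, y, z). This is the "summation over
    rows matching E" from the marginalization formula above.
    """
    return sum(p for row, p in JOINT.items() if event(*row))

def cond_prob(event1, event2):
    """P(event1 | event2) = P(event1, event2) / P(event2)."""
    both = prob(lambda x, y, z: event1(x, y, z) and event2(x, y, z))
    return both / prob(event2)

# A marginal query and a conditional query, in the style of
# P(Poor) and P(Male | Poor) from the slides:
print(prob(lambda x, y, z: x == 1))       # P(X=1) = 0.50
print(cond_prob(lambda x, y, z: y == 1,
                lambda x, y, z: x == 1))  # P(Y=1 | X=1) = 0.70
```

Both queries touch every row of the table, which is exactly the scaling problem described on the next slide.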
News
Good: once you have a joint distribution, you can ask important questions about uncertain events.
Bad: it is impossible to create one for more than about ten attributes, because the table needs a number for every combination of values: M Boolean attributes require 2^M rows (about a thousand at M = 10, over a million at M = 20).
Next
- Address our efficiency problem by making independence assumptions!
- Use the Bayes Net methodology to build joint distributions in manageable chunks (previewed below)
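As a preview, the identity the Bayes Net methodology exploits (standard material, not shown on this slide) is the factored form of the joint: with each variable X_i given a set of parent variables,

P(X_1, \ldots, X_M) = \prod_{i=1}^{M} P(X_i \mid \text{parents}(X_i))

so we store one small conditional table per variable instead of a single table with 2^M rows.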