
1 10601 Machine Learning Recitation 2 Öznur Taştan September 2, 2009

2 Logistics Homework 2 is going to be out tomorrow. It is due on Wed, Sep 16. There is no class on Monday, Sep 7 (Labor Day). Those who have not returned Homework 1 yet, please do so. For details of the homework submission policy, please check: http://www.cs.cmu.edu/~ggordon/10601/hws.html

3 Outline We will review some probability and statistics and some graphical models. We will not go over Homework 1, since the grace period has not ended yet; solutions will be posted on the web page next week.

4 We’ll play a game: Catch the goof! I’ll be the sloppy TA… I will make ‘intentional’ mistakes. You’ll catch those mistakes and correct me! Slides with mistakes and corrected slides are each marked with their own icon.

5 Catch the goof!!

6-9 Given two discrete random variables X and Y, where X takes values in {x_1, …, x_m} and Y takes values in {y_1, …, y_n}. Law of total probability: P(X = x_i) = Σ_j P(X = x_i | Y = y_j) P(Y = y_j). [These four slides step through candidate formulas, some with intentional mistakes, before settling on the correct one.]

10 Given two discrete random variables X and Y. Law of total probability: P(X = x_i) = Σ_j P(X = x_i, Y = y_j) = Σ_j P(X = x_i | Y = y_j) P(Y = y_j), with the terms labeled: joint probability, marginal probability, conditional probability of X conditioned on Y

11 Given two discrete random variables X and Y. Law of total probability, same expansion. Formulas are fine. Anything wrong with the names?

12 Given two discrete random variables X and Y. Law of total probability: P(X = x_i) = Σ_j P(X = x_i, Y = y_j) = Σ_j P(X = x_i | Y = y_j) P(Y = y_j). Correct labels: P(X = x_i, Y = y_j) is the joint probability of X,Y; P(X = x_i | Y = y_j) is the conditional probability of X conditioned on Y; P(X = x_i) and P(Y = y_j) are marginal probabilities.
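
To make these definitions concrete, here is a minimal Python sketch (the 2x2 joint table is made up for illustration) that recovers marginals and conditionals from a joint distribution and checks the law of total probability:

```python
# A made-up joint distribution over binary X and Y: joint[(x, y)] = P(X=x, Y=y)
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}

# Marginals: sum the joint over the other variable
p_x = {x: sum(joint[(x, y)] for y in (0, 1)) for x in (0, 1)}
p_y = {y: sum(joint[(x, y)] for x in (0, 1)) for y in (0, 1)}

# Conditional P(X=x | Y=y) = P(X=x, Y=y) / P(Y=y)
p_x_given_y = {(x, y): joint[(x, y)] / p_y[y] for x in (0, 1) for y in (0, 1)}

# Law of total probability: P(X=x) = sum_y P(X=x | Y=y) P(Y=y)
for x in (0, 1):
    total = sum(p_x_given_y[(x, y)] * p_y[y] for y in (0, 1))
    assert abs(total - p_x[x]) < 1e-12

print(p_x, p_y)  # marginals of X and of Y
```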

13 In a strange world Two discrete random variables X and Y take binary values. Joint probabilities [table shown on the slide]

14 In a strange world Two discrete random variables X and Y take binary values. Joint probabilities: the entries should sum up to 1, but in the table shown they do not

15 The world seems fine Two discrete random variables X and Y take binary values. Joint probabilities [corrected table, with entries summing to 1]

16 What about the marginals? Joint probabilities and marginal probabilities [tables shown on the slide]

17 This is a strange world Joint probabilities and marginal probabilities [the slide flags a goof: the marginals shown do not follow from the joint]

18 In a strange world Joint probabilities and marginal probabilities [tables shown on the slide]

19 This is a strange world Joint probabilities and marginal probabilities [the slide flags a goof in the tables]

20 Let’s have a simple problem Joint probabilities and marginal probabilities [consistent tables shown on the slide]
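
A quick way to catch goofs like these mechanically: validate that a joint table sums to 1, and derive the marginals from it instead of trusting ones written down separately. A small sketch, with table values made up for illustration:

```python
def marginals(joint):
    """Derive the marginal distributions from a 2x2 joint table keyed by (x, y)."""
    total = sum(joint.values())
    assert abs(total - 1.0) < 1e-9, f"joint sums to {total}, not 1"
    p_x = {x: sum(p for (xx, y), p in joint.items() if xx == x) for x in (0, 1)}
    p_y = {y: sum(p for (x, yy), p in joint.items() if yy == y) for y in (0, 1)}
    return p_x, p_y

# A "strange world": the entries sum to 1.1, so the assertion fires
strange = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.3}
try:
    marginals(strange)
except AssertionError as e:
    print("caught the goof:", e)

# A fine world
fine = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(marginals(fine))
```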

21 Conditional probabilities What is the complementary event of P(X=0|Y=1)? P(X=1|Y=1) OR P(X=0|Y=0)?

22 Conditional probabilities What is the complementary event of P(X=0|Y=1)? Answer: P(X=1|Y=1). The conditioning event Y=1 stays fixed; only the outcome of X is complemented, so P(X=0|Y=1) + P(X=1|Y=1) = 1.
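
A quick numeric check of this fact on the made-up joint from before: the conditional distribution of X given Y=1 sums to 1, while P(X=0|Y=0) generally does not complement P(X=0|Y=1).

```python
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}
p_y1 = joint[(0, 1)] + joint[(1, 1)]    # P(Y=1) = 0.5
p_x0_y1 = joint[(0, 1)] / p_y1          # P(X=0|Y=1) = 0.2
p_x1_y1 = joint[(1, 1)] / p_y1          # P(X=1|Y=1) = 0.8
print(p_x0_y1 + p_x1_y1)                # ~1.0: complementary
p_x0_y0 = joint[(0, 0)] / (joint[(0, 0)] + joint[(1, 0)])
print(p_x0_y1 + p_x0_y0)                # ~0.6: not complementary
```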

23 The game ends here.

24 Number of independent parameters Assume X and Y take Boolean values {0,1}. How many independent parameters do you need to fully specify: the marginal probability of X? the joint probability P(X,Y)? the conditional probability P(X|Y)?

25 Number of independent parameters Assume X and Y take Boolean values {0,1}. The marginal probability of X: P(X=0), 1 parameter only [because P(X=1) + P(X=0) = 1]. The joint probability P(X,Y): P(X=0,Y=0), P(X=0,Y=1), P(X=1,Y=0), 3 parameters. The conditional probability P(X|Y): ?

26 Number of parameters Assume X and Y take Boolean values {0,1}. The marginal probability of X: P(X=0), 1 parameter only, since P(X=1) = 1 - P(X=0). The joint probability P(X,Y): P(X=0,Y=0), P(X=0,Y=1), P(X=1,Y=0), 3 parameters. The conditional probability P(X|Y): P(X=0|Y=0), P(X=0|Y=1), 2 parameters.

27 Number of parameters What about P(X|Y,Z): how many independent parameters do you need to be able to fully specify the probabilities? Assume X takes m values, Y takes n values, and Z takes q values.

28 Number of parameters What about P(X|Y,Z)? Assume X takes m values, Y takes n values, and Z takes q values. Number of independent parameters: (m-1)*nq, i.e. one distribution over X (with m-1 free parameters) for each of the n*q joint settings of (Y,Z).
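
A small helper makes the counting rule concrete; the function name is mine, not from the slides. It reproduces the binary cases above (1 for P(X), 3 for P(X,Y), 2 for P(X|Y)):

```python
def num_independent_params(m, *parent_cardinalities):
    """Independent parameters in P(X | parents): (m-1) free values per joint
    setting of the parents, since each conditional distribution sums to 1."""
    n_settings = 1
    for k in parent_cardinalities:
        n_settings *= k
    return (m - 1) * n_settings

print(num_independent_params(2))        # P(X), binary X          -> 1
print(num_independent_params(2, 2))     # P(X|Y), binary          -> 2
print(num_independent_params(4))        # joint P(X,Y) treated as
                                        # one 4-valued variable   -> 3
print(num_independent_params(3, 4, 5))  # P(X|Y,Z), m=3, n=4, q=5 -> 40
```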

29 Graphical models A graphical model is a way of representing probabilistic relationships between random variables. Variables are represented by nodes; edges indicate probabilistic relationships. Example: “You miss the bus” → “Arrive to class late”.

30 Serial connection X → Y → Z. Are X and Z independent?

31 Serial connection X → Y → Z. Are X and Z independent? X and Z are not independent.

32 Serial connection Is X conditionally independent of Z given Y?

33 Serial connection Is X conditionally independent of Z given Y? Yes, they are independent.

34 How can we show it? Is X conditionally independent of Z given Y? For the serial connection, P(X,Y,Z) = P(X) P(Y|X) P(Z|Y), so P(X,Z|Y) = P(X,Y,Z) / P(Y) = [P(X,Y) / P(Y)] P(Z|Y) = P(X|Y) P(Z|Y), which is exactly the definition of conditional independence.
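
We can also check this numerically. A sketch with a made-up binary chain X → Y → Z (all CPT values invented for illustration): build the joint from P(X) P(Y|X) P(Z|Y) and confirm that P(X,Z|Y) factors as P(X|Y) P(Z|Y).

```python
from itertools import product

# Made-up CPTs for a binary chain X -> Y -> Z
p_x = {0: 0.6, 1: 0.4}
p_y_given_x = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # keyed (y, x)
p_z_given_y = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}  # keyed (z, y)

joint = {(x, y, z): p_x[x] * p_y_given_x[(y, x)] * p_z_given_y[(z, y)]
         for x, y, z in product((0, 1), repeat=3)}

for y in (0, 1):
    p_y = sum(p for (x, yy, z), p in joint.items() if yy == y)
    for x, z in product((0, 1), repeat=2):
        p_xz_y = joint[(x, y, z)] / p_y
        p_x_y = sum(joint[(x, y, zz)] for zz in (0, 1)) / p_y
        p_z_y = sum(joint[(xx, y, z)] for xx in (0, 1)) / p_y
        assert abs(p_xz_y - p_x_y * p_z_y) < 1e-12  # conditional independence holds

print("X is conditionally independent of Z given Y")
```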

35 An example case Serial connection: Studied late last night → Wake up late → Arrive to class late.

36 Common cause Age (Z) is a common cause of Shoe Size (X) and Gray Hair (Y): Shoe Size ← Age → Gray Hair. X and Y are not marginally independent; X and Y are conditionally independent given Z.

37 Explaining away Flu (X) and Allergy (Z) are both causes of Sneeze (Y): Flu → Sneeze ← Allergy. X and Z are marginally independent; X and Z are conditionally dependent given Y.
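
A numeric sketch of explaining away, with an invented CPT: X and Z are independent a priori, but once we observe Y=1 (sneezing), learning Z=1 (allergy) lowers the probability of X=1 (flu).

```python
from itertools import product

# Made-up priors and CPT for Flu (X) -> Sneeze (Y) <- Allergy (Z)
p_x = {0: 0.9, 1: 0.1}                # P(Flu)
p_z = {0: 0.8, 1: 0.2}                # P(Allergy)
p_y1 = {(0, 0): 0.05, (0, 1): 0.8,    # P(Sneeze=1 | X, Z), keyed by (x, z)
        (1, 0): 0.9, (1, 1): 0.95}

joint = {}
for x, z, y in product((0, 1), repeat=3):
    py = p_y1[(x, z)] if y == 1 else 1 - p_y1[(x, z)]
    joint[(x, z, y)] = p_x[x] * p_z[z] * py

def cond(x_val, evidence):
    """P(X = x_val | evidence) by brute-force enumeration of the joint."""
    num = sum(p for (x, z, y), p in joint.items()
              if x == x_val and all(dict(x=x, z=z, y=y)[k] == v
                                    for k, v in evidence.items()))
    den = sum(p for (x, z, y), p in joint.items()
              if all(dict(x=x, z=z, y=y)[k] == v for k, v in evidence.items()))
    return num / den

print(cond(1, {}))                # P(X=1) = 0.1, regardless of Z: marginally independent
print(cond(1, {"y": 1}))          # P(X=1 | Y=1): flu becomes more likely
print(cond(1, {"y": 1, "z": 1}))  # P(X=1 | Y=1, Z=1): allergy explains the sneeze away
```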

38 D-separation X and Z are conditionally independent given Y if Y d-separates X and Z, i.e. every path between X and Z is blocked by Y. A serial or diverging connection on a path is blocked when its middle node is observed; a converging connection (v-structure) is blocked when neither its middle node nor any of its descendants is observed.

39 D-separation example Are B and C independent given A?

40 D-separation example Are B and C independent given A? Yes

41 D-separation example Are B and C independent given A? Yes. A is observed and blocks the path.

42 Are B and C independent given A? Yes. A is observed and blocks one path; the converging node on the other path is not observed, and neither are its descendants, so that path is blocked as well.

43 D-separation example Are A and F independent given E?

44 Yes

45 Are A and F independent given E? Yes

46 Are C and D independent given F?

47 Are C and D independent given F? No

48 Are A and G independent given B and F?

49 Yes
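
The blocking rules can be checked mechanically. Since the slides’ example graph exists only in the figures, here is a general-purpose sketch using the standard reduction (restrict to ancestors of the query variables, moralize, drop directions, delete the conditioning set, test reachability), run on a small stand-in DAG of my own:

```python
from collections import defaultdict

def d_separated(edges, xs, ys, zs):
    """True iff every path between xs and ys is blocked given zs.
    Classic reduction: keep ancestors of the query variables, moralize
    (marry co-parents), drop edge directions, delete zs, and test
    whether xs can still reach ys."""
    parents = defaultdict(set)
    for u, v in edges:                  # edge u -> v
        parents[v].add(u)

    # Ancestral subgraph of xs, ys, zs
    keep, stack = set(), list(xs | ys | zs)
    while stack:
        n = stack.pop()
        if n not in keep:
            keep.add(n)
            stack.extend(parents[n])

    # Moralize within the kept nodes, then drop directions
    undirected = defaultdict(set)
    for v in keep:
        ps = sorted(parents[v])
        for u in ps:
            undirected[u].add(v)
            undirected[v].add(u)
        for i in range(len(ps)):        # marry co-parents of v
            for j in range(i + 1, len(ps)):
                undirected[ps[i]].add(ps[j])
                undirected[ps[j]].add(ps[i])

    # Delete observed nodes, then test reachability from xs to ys
    seen, stack = set(), [n for n in xs if n not in zs]
    while stack:
        n = stack.pop()
        if n in ys:
            return False                # an active path exists
        if n not in seen:
            seen.add(n)
            stack.extend(m for m in undirected[n]
                         if m not in zs and m not in seen)
    return True

# A stand-in DAG (NOT the slides' graph, which is only in the figures)
edges = [("A", "B"), ("B", "C"), ("A", "D"), ("D", "C"), ("C", "E")]
print(d_separated(edges, {"B"}, {"D"}, {"A"}))       # True: A blocks; collider C unobserved
print(d_separated(edges, {"B"}, {"D"}, {"A", "E"}))  # False: E, a descendant of collider C, is observed
```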

50 Naïve Bayes Model J → D, C, R (J is the parent of D, C, and R). J: The person is a junior. D: The person knows calculus. C: The person lives on campus. R: Saw “The Return of the King” more than once.

51 Naïve Bayes Model J → D, C, R. J: The person is a junior. D: The person knows calculus. C: The person lives on campus. R: Saw “The Return of the King” more than once. What parameters are stored?

52-53 Naïve Bayes Model J → D, C, R. Parameters stored: P(J); P(R|J=1), P(R|J=0); P(D|J=1), P(D|J=0); P(C|J=1), P(C|J=0).

54 Are you a junior (J)? Do you know calculus (D)? Do you live on campus (C)? Have you seen 'Return of the King' more than once (R)?

Student      J D C R
Student 1    1 0 1 1
Student 2    1 1 1 0
Student 3    1 0 1 1
Student 4    1 0 1 1
Student 5    1 1 1 0
Student 6    1 0 1 1
Student 7    1 1 1 1
Student 8    1 1 1 1
Student 9    0 1 0 1
Student 10   1 1 1 1
Student 11   1 0 1 0
Student 12   1 0 1 1
Student 13   0 1 1 1
Student 14   1 1 1 1
Student 15   1 1 1 1
Student 16   1 1 1 1
Student 17   0 0 0 1
Student 18   1 0 1 0
Student 19   0 1 1 1
Student 20   0 0 1 1

We have the structure; how do we get the CPTs? Estimate them from the observed data.
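
A sketch of estimating the CPTs from this table by maximum likelihood (plain relative frequencies, no smoothing); the variable names are mine:

```python
# Rows are (J, D, C, R) for the 20 students above
data = [
    (1,0,1,1), (1,1,1,0), (1,0,1,1), (1,0,1,1), (1,1,1,0),
    (1,0,1,1), (1,1,1,1), (1,1,1,1), (0,1,0,1), (1,1,1,1),
    (1,0,1,0), (1,0,1,1), (0,1,1,1), (1,1,1,1), (1,1,1,1),
    (1,1,1,1), (0,0,0,1), (1,0,1,0), (0,1,1,1), (0,0,1,1),
]

n = len(data)
p_j = sum(r[0] for r in data) / n    # P(J=1) = 15/20 = 0.75

def p_given_j(col, j):
    """MLE of P(feature[col] = 1 | J = j): relative frequency among rows with J = j."""
    rows = [r for r in data if r[0] == j]
    return sum(r[col] for r in rows) / len(rows)

cpt = {(name, j): p_given_j(col, j)
       for col, name in ((1, "D"), (2, "C"), (3, "R")) for j in (0, 1)}

print(f"P(J=1) = {p_j}")
for (name, j), p in sorted(cpt.items()):
    print(f"P({name}=1 | J={j}) = {p}")
```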

55 Naïve Bayes Model J → D, C, R. Parameters: P(J); P(R|J), P(R|~J); P(D|J), P(D|~J); P(C|J), P(C|~J). Suppose a new person comes and says: I don’t know calculus. I live on campus. I have seen ‘The Return of the King’ five times. What is the probability that he is a junior?

56 Naïve Bayes Model J → D, C, R. Suppose a person says: I don’t know calculus → D=0. I live on campus → C=1. I have seen ‘The Return of the King’ five times → R=1 (more than once). What is the probability that he is a junior? P(J=1 | D=0, C=1, R=1)

57 What is the probability that he is a junior? Using the naïve Bayes factorization, P(J=1 | D=0, C=1, R=1) = P(J=1) P(D=0|J=1) P(C=1|J=1) P(R=1|J=1) / P(D=0, C=1, R=1). To calculate the denominator, marginalize over J: P(D=0, C=1, R=1) = Σ_j P(J=j) P(D=0|J=j) P(C=1|J=j) P(R=1|J=j).
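
Continuing the estimation sketch above, the posterior query is a few lines; with the counts from the table it comes out to roughly 0.81, though the exact value depends on the table parameters.

```python
def posterior_junior(d, c, r):
    """P(J=1 | D=d, C=c, R=r) for binary evidence, using the MLE CPTs above."""
    def likelihood(j):
        lik = p_j if j == 1 else 1 - p_j          # prior P(J=j)
        for name, val in (("D", d), ("C", c), ("R", r)):
            p1 = cpt[(name, j)]                   # P(name=1 | J=j)
            lik *= p1 if val == 1 else 1 - p1
        return lik
    num = likelihood(1)
    return num / (num + likelihood(0))            # denominator marginalizes over J

print(posterior_junior(d=0, c=1, r=1))            # roughly 0.81 with these counts
```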

58 Naïve Bayes Model

