
Probabilistic Inference
Reading: Chapter 13
Next time: How should we define artificial intelligence?
Reading for next time (see Links, Reading for Retrospective Class): Turing paper; Mind, Brain and Behavior, John Searle
Prepare discussion points by midnight Wednesday night (see end of slides)

2 Transition to empirical AI
- Add in:
  - Ability to infer new facts from old
  - Ability to generalize
  - Ability to learn based on past observation
- Key:
  - Observation of the world
  - Best decision given what is known

3 Overview of Probabilistic Inference
- Some terminology
- Inference by enumeration
- Bayesian Networks

9 Probability Basics
- Sample space
- Atomic event
- Probability model
- An event A
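
For reference, the standard definitions behind these four terms (consistent with AIMA Chapter 13; the formulas themselves are not on the slide as transcribed):

```latex
\begin{align*}
\Omega &= \{\omega_1, \omega_2, \ldots\} && \text{sample space: the set of all possible worlds}\\
0 \le P(\omega) &\le 1, \quad \sum_{\omega \in \Omega} P(\omega) = 1 && \text{probability model over atomic events } \omega\\
P(A) &= \sum_{\omega \in A} P(\omega) && \text{an event } A \text{ is any subset of } \Omega
\end{align*}
```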

11 Random Variables
- Random variable
- Probability for a random variable
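
Stated precisely (standard definitions, added here for reference):

```latex
\begin{align*}
X &: \Omega \to D_X && \text{a random variable maps sample points to values in a domain } D_X\\
P(X = x) &= \sum_{\omega :\, X(\omega) = x} P(\omega) && \text{the induced probability for each value } x \in D_X
\end{align*}
```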

17 Logical Propositions and Probability
- Proposition = event (set of sample points)
- Given Boolean random variables A and B:
  - Event a = set of sample points where A(ω) = true
  - Event ¬a = set of sample points where A(ω) = false
  - Event a∧b = points where A(ω) = true and B(ω) = true
- Often the sample space is the Cartesian product of the ranges of the variables
- A proposition is the disjunction of the atomic events in which it is true:
  (a∨b) = (¬a∧b) ∨ (a∧¬b) ∨ (a∧b)
  P(a∨b) = P(¬a∧b) + P(a∧¬b) + P(a∧b)
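
The identity above can be checked by brute-force enumeration; a minimal Python sketch, with the probabilities of the four sample points invented for illustration:

```python
import itertools

# Hypothetical probabilities for the four sample points over Booleans A, B;
# any assignment that sums to 1 would do.
p = {(True, True): 0.3, (True, False): 0.2,
     (False, True): 0.4, (False, False): 0.1}

def prob(event):
    """Sum the probabilities of the sample points where `event` holds."""
    return sum(p[w] for w in itertools.product([True, False], repeat=2)
               if event(*w))

lhs = prob(lambda a, b: a or b)                # P(a ∨ b)
rhs = (prob(lambda a, b: not a and b)          # P(¬a ∧ b)
       + prob(lambda a, b: a and not b)        # P(a ∧ ¬b)
       + prob(lambda a, b: a and b))           # P(a ∧ b)
assert abs(lhs - rhs) < 1e-12
print(lhs)  # 0.9
```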

25 Axioms of Probability
- All probabilities are between 0 and 1
- Necessarily true propositions have probability 1; necessarily false propositions have probability 0
- The probability of a disjunction is P(a∨b) = P(a) + P(b) - P(a∧b)
- P(¬a) = 1 - P(a)
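
The last bullet follows from the first three axioms; a one-line derivation for reference (a ∨ ¬a is necessarily true and a ∧ ¬a is necessarily false):

```latex
\[
1 = P(a \lor \lnot a) = P(a) + P(\lnot a) - P(a \land \lnot a) = P(a) + P(\lnot a)
\quad\Rightarrow\quad P(\lnot a) = 1 - P(a)
\]
```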

26 The definitions imply that certain logically related events must have related probabilities:
P(a∨b) = P(a) + P(b) - P(a∧b)

27 Prior Probability
- Prior or unconditional probabilities of propositions: P(female=true) = 0.5 corresponds to belief prior to the arrival of any new evidence
- A probability distribution gives values for all possible assignments: P(Color) assigns a probability to each of color=green, color=blue, color=purple (normalized: the values sum to 1)
- A joint probability distribution for a set of random variables gives the probability of every atomic event on those random variables (i.e., every sample point): P(Color, Gender) = a 3×2 matrix
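
As a concrete sketch, here is what a 3×2 joint distribution and its marginal might look like in code; the numbers are invented, and only the shape and the sums-to-1 constraint come from the slide:

```python
# Joint distribution P(Color, Gender) as a 3x2 table of atomic-event probabilities.
joint = {
    ("green",  "female"): 0.25, ("green",  "male"): 0.15,
    ("blue",   "female"): 0.10, ("blue",   "male"): 0.20,
    ("purple", "female"): 0.15, ("purple", "male"): 0.15,
}
assert abs(sum(joint.values()) - 1.0) < 1e-12  # a valid distribution sums to 1

# Marginal (unconditional) distribution P(Color), by summing out Gender.
p_color = {}
for (color, gender), pr in joint.items():
    p_color[color] = p_color.get(color, 0.0) + pr
print(p_color)  # {'green': 0.4, 'blue': 0.3, 'purple': 0.3}
```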

34 Inference by enumeration
- Start with the joint distribution

35 Inference by enumeration
- For any proposition, sum the probabilities of the atomic events where it holds:
  P(HasTeeth) = Σ_{ω: ω ⊨ HasTeeth} P(ω) = 0.2

36 Inference by enumeration
- P(HasTeeth ∨ Color=green) = 0.4

37 Conditional Probability
- Conditional or posterior probabilities, e.g.:
  P(PlayerWins | HostOpensDoor=1 ∧ PlayerPicksDoor=2 ∧ Door1=goat) = 0.5
- If we know more (e.g., HostOpensDoor=3 and Door3=goat): P(PlayerWins) = 1
- Note: the less specific belief remains valid after more evidence arrives, but is not always useful
- New evidence may be irrelevant, allowing simplification:
  P(PlayerWins | CaliforniaEarthquake) = P(PlayerWins) = 0.3

38 Conditional Probability
- A general version of the product rule holds for joint distributions:
  P(PlayerWins, HostOpensDoor1) = P(PlayerWins | HostOpensDoor1) × P(HostOpensDoor1)
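
For reference, the general definitions this rests on (standard AIMA Chapter 13 material, not shown on the slide as transcribed):

```latex
\begin{align*}
P(a \mid b) &= \frac{P(a \land b)}{P(b)} \qquad (P(b) > 0) && \text{definition of conditional probability}\\
P(a \land b) &= P(a \mid b)\,P(b) = P(b \mid a)\,P(a) && \text{product rule}\\
P(X_1,\ldots,X_n) &= \textstyle\prod_{i=1}^{n} P(X_i \mid X_1,\ldots,X_{i-1}) && \text{chain rule, by repeated application}
\end{align*}
```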

39 Inference by enumeration
- Compute conditional probabilities:
  P(¬HasTeeth | color=green) = P(¬HasTeeth ∧ color=green) / P(color=green) = 0.8

40 Normalization
- The denominator can be viewed as a normalization constant α:
  P(¬HasTeeth | color=green) = α P(¬HasTeeth, color=green)
  = α [P(¬HasTeeth, color=green, female) + P(¬HasTeeth, color=green, ¬female)]
- General idea: compute the distribution on the query variable by fixing the evidence variables and summing over the hidden variables
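
The general recipe in symbols, with query variable X, evidence e, and hidden variables Y (standard notation; α is just 1/P(e)):

```latex
\[
P(X \mid e) = \alpha\,P(X, e) = \alpha \sum_{y} P(X, e, y),
\qquad \alpha = \Big(\sum_{x}\sum_{y} P(x, e, y)\Big)^{-1}
\]
```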

41 Inference by enumeration
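
A minimal Python sketch of the whole procedure. The joint table uses invented counts, chosen so the queries below reproduce the slides' earlier numbers (P(HasTeeth) = 0.2 and P(¬HasTeeth | color=green) = 0.8); only the algorithm itself — fix the evidence, sum out the hidden variables, normalize — comes from the slides:

```python
from collections import defaultdict

# Hypothetical joint distribution P(HasTeeth, Color, Gender) built from counts.
counts = {
    (True,  "green",  "female"): 1, (True,  "green",  "male"): 1,
    (True,  "blue",   "female"): 2, (True,  "blue",   "male"): 1,
    (True,  "purple", "female"): 2, (True,  "purple", "male"): 1,
    (False, "green",  "female"): 4, (False, "green",  "male"): 4,
    (False, "blue",   "female"): 6, (False, "blue",   "male"): 6,
    (False, "purple", "female"): 6, (False, "purple", "male"): 6,
}
total = sum(counts.values())
joint = {event: n / total for event, n in counts.items()}

def enumerate_query(query_index, evidence):
    """Distribution over one variable given evidence: fix the evidence
    variables, sum out the hidden ones, then normalize."""
    dist = defaultdict(float)
    for event, p in joint.items():
        if all(event[i] == v for i, v in evidence.items()):
            dist[event[query_index]] += p       # sums out the other variables
    alpha = 1.0 / sum(dist.values())            # normalization constant
    return {value: alpha * p for value, p in dist.items()}

print(enumerate_query(0, {}))            # P(HasTeeth) -> {True: 0.2, False: 0.8}
print(enumerate_query(0, {1: "green"}))  # P(HasTeeth | color=green) -> {True: 0.2, False: 0.8}
```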

42 Independence
- A and B are independent iff P(A|B) = P(A), or P(B|A) = P(B), or P(A,B) = P(A)P(B)
- Example: 32 entries reduced to 12; for n independent biased coins, 2^n → n
- Absolute independence is powerful but rare
- Real domains typically involve hundreds of variables, none of which are independent
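
For example, in AIMA's dentistry domain, independence of Weather from the other three variables is what gives the "32 entries reduced to 12" figure: the 32-entry joint factors into an 8-entry table plus a 4-entry table:

```latex
\[
P(\mathit{Toothache}, \mathit{Catch}, \mathit{Cavity}, \mathit{Weather})
  = P(\mathit{Toothache}, \mathit{Catch}, \mathit{Cavity})\,P(\mathit{Weather})
\]
```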

44 Conditional Independence
- If my length is ≤ 0.2, the probability that I am female doesn't depend on whether or not I have teeth:
  P(female | length≤0.2, hasteeth) = P(female | length≤0.2)
- The same independence holds if my length is > 0.2:
  P(male | length>0.2, hasteeth) = P(male | length>0.2)
- Gender is conditionally independent of HasTeeth given Length

45 Conditional Independence (continued)
- In most cases, the use of conditional independence reduces the size of the representation of the joint distribution from exponential in n to linear in n
- Conditional independence is our most basic and robust form of knowledge about uncertain environments
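
Using the slide's own variables: conditional independence of Gender and HasTeeth given Length lets the full joint be factored as

```latex
\[
P(\mathit{Gender}, \mathit{HasTeeth}, \mathit{Length})
  = P(\mathit{Length})\,P(\mathit{Gender} \mid \mathit{Length})\,P(\mathit{HasTeeth} \mid \mathit{Length})
\]
```

With one conditioning variable and n conditionally independent variables, this representation needs O(n) numbers rather than O(2^n).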

46 Next Class: Turing Paper
- A discussion class
- Graduate students and non-degree students (anyone beyond a bachelor's): prepare a short statement on the paper. It can be your reaction, your position, a place where you disagree, or an explication of a point.
- Undergraduates: be prepared with questions for the graduate students
- All: submit your statement or your question by midnight Wednesday night
- All statements and questions will be printed and distributed in class on Wednesday