
Computer Science CPSC 322, Lecture 16: Logic Wrap-Up and Intro to Probability


1 Computer Science CPSC 322, Lecture 16: Logic Wrap-Up and Intro to Probability

2 Announcements: The deadline for Assignment 3 is moved to Tuesday, March 15, 11:59pm. Just for this week, Prof. Conati's office hours are moved to Thursday at 11am.

3 Lecture Overview: Recap of Lecture 15; TD resolution as search in the Deduction applet; SLD resolution in Datalog; Intro to Reasoning Under Uncertainty; Introduction to Probability; Random Variables and Possible World Semantics; Probability Distributions

4 We proved that the bottom-up proof procedure is sound and complete. BU is sound: it derives only atoms that logically follow from KB. BU is complete: it derives all atoms that logically follow from KB. Together: it derives exactly the atoms that logically follow from KB. And it is efficient: linear in the number of clauses in KB, since each clause is used at most once by BU.
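As a minimal illustration (not code from the course), the procedure fits in a few lines of Python; the clause encoding and the small KB below are made up for the example:

```python
# Bottom-up (forward-chaining) proof procedure for propositional
# definite clauses. A clause is (head, [body atoms]); a fact has an
# empty body. Naive fixpoint loop: the linear-time version would
# index clauses by body atom, but this version is easier to read.

def bottom_up(kb):
    """Return the set C of all atoms that logically follow from kb."""
    consequences = set()
    changed = True
    while changed:
        changed = False
        for head, body in kb:
            # Fire a clause once all its body atoms are already derived.
            if head not in consequences and all(b in consequences for b in body):
                consequences.add(head)   # each head is added at most once
                changed = True
    return consequences

kb = [("a", ["b", "c"]), ("b", []), ("c", ["b"]), ("d", ["e"])]
print(bottom_up(kb))  # {'a', 'b', 'c'} -- 'd' does not follow
```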

5 Bottom-up vs. Top-down. Key idea of top-down: search backward from a query g to determine if it can be derived from KB. Bottom-up derives from KB the set C of all its consequences; query g is proven if g ∈ C. BU derives the same C regardless of the query: the derivation process is not guided by the query. Top-down instead starts from the query g and works backward from KB to an answer.

6 SLD Resolution (Selective Linear Definite clause resolution). Rule of derivation: the SLD resolution of the answer clause yes ← a1 ∧ ... ∧ ai-1 ∧ ai ∧ ai+1 ∧ ... ∧ am on atom ai with the clause ai ← b1 ∧ ... ∧ bp is the answer clause yes ← a1 ∧ ... ∧ ai-1 ∧ b1 ∧ ... ∧ bp ∧ ai+1 ∧ ... ∧ am. Examples: resolving yes ← b ∧ c on b with b ← k ∧ f gives yes ← k ∧ f ∧ c; resolving yes ← e ∧ f on e with the fact e gives yes ← f.
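A single resolution step can also be written as code; a sketch using the slide's own example (list positions stand in for the selected atom ai):

```python
# One SLD resolution step: in the answer clause 'yes <- body',
# replace the selected atom body[i] by the body of a KB clause
# (head, clause_body) whose head matches it.

def sld_resolve(body, i, clause):
    head, clause_body = clause
    assert body[i] == head, "clause head must match the selected atom"
    return body[:i] + clause_body + body[i + 1:]

# yes <- b ∧ c  resolved on b with  b <- k ∧ f  gives  yes <- k ∧ f ∧ c
print(sld_resolve(["b", "c"], 0, ("b", ["k", "f"])))  # ['k', 'f', 'c']
# yes <- e ∧ f  resolved on e with the fact  e  gives  yes <- f
print(sld_resolve(["e", "f"], 0, ("e", [])))          # ['f']
```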

7 Derivations. An answer is an answer clause with m = 0, i.e. yes ←. A successful derivation from KB of query ?q1 ∧ ... ∧ qk is a sequence of answer clauses γ0, γ1, ..., γn such that: γ0 is the answer clause yes ← q1 ∧ ... ∧ qk; each γi is obtained by resolving γi-1 with a clause in KB; and γn is an answer, yes ←. An unsuccessful derivation from KB of query ?q1 ∧ ... ∧ qk: we get to something like yes ← b1 ∧ ... ∧ bk, and there is no clause in KB with any of the bi as its head.

8 Top-down Proof Procedure for PDCL. To solve the query ?q1 ∧ ... ∧ qk: set ac := "yes ← body", where body is q1 ∧ ... ∧ qk; repeat { select qi from body; choose a clause Cl from KB with qi as head, Cl is qi ← bc; replace qi in body by bc } until ac is an answer (fail if there is no clause with qi as head). select: any choice will work. choose: we have to pick the right one (this is the backtracking point). We showed soundness and completeness.
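A recursive sketch of this procedure with backtracking over the "choose" step; the clause encoding matches the bottom-up sketch above, and the KB is a fragment of the next slide's example:

```python
# Top-down (SLD) proof procedure for propositional definite clauses.
# 'select' = take the first atom; 'choose' = try each matching clause,
# backtracking on failure. Can loop forever on cyclic KBs (e.g. g <- g).

def top_down(kb, body):
    """True iff the answer clause 'yes <- body' reduces to 'yes <-'."""
    if not body:                        # empty answer clause: success
        return True
    q, rest = body[0], body[1:]         # select the first atom
    for head, clause_body in kb:        # choose a clause (backtrack point)
        if head == q and top_down(kb, clause_body + rest):
            return True
    return False                        # no clause with q as head works: fail

kb = [("a", ["b", "c"]), ("a", ["g"]), ("g", ["m"]), ("g", ["f"]),
      ("f", ["p"]), ("d", ["p"]), ("p", [])]
print(top_down(kb, ["a", "d"]))  # True, via a <- g, g <- f, f <- p, d <- p
```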

9 Top-down/SLD resolution as Search. State: an answer clause of the form yes ← a1 ∧ ... ∧ ak. Successor function: the state resulting from substituting the first atom a1 with b1 ∧ ... ∧ bm, if there is a clause a1 ← b1 ∧ ... ∧ bm. Goal test: is the answer clause empty (i.e. yes ←)? Solution: the proof, i.e. the sequence of SLD resolutions. Prove: ?← a ∧ d. KB: a ← b ∧ c. a ← g. a ← h. b ← j. b ← k. d ← m. d ← p. f ← m. f ← p. g ← m. g ← f. k ← m. h ← m. p.

10 Lecture Overview: Recap of Lecture 15; TD resolution as search in the Deduction applet; SLD resolution in Datalog; Intro to Reasoning Under Uncertainty; Introduction to Probability; Random Variables and Possible World Semantics; Probability Distributions

11 Top-down/SLD resolution as Search. State: an answer clause of the form yes ← a1 ∧ ... ∧ ak. Successor function: the state resulting from substituting the first atom a1 with b1 ∧ ... ∧ bm, if there is a clause a1 ← b1 ∧ ... ∧ bm. Goal test: is the answer clause empty (i.e. yes ←)? Solution: the proof, i.e. the sequence of SLD resolutions. Prove: ?← a ∧ d. KB: a ← b ∧ c. a ← g. a ← h. b ← j. b ← k. d ← m. d ← p. f ← m. f ← p. g ← m. g ← f. k ← m. h ← m. p. You can trace the example in the Deduction Applet at http://aispace.org/deduction/ using the file kb-for-top-down-search available in the course schedule.

12 Top-down/SLD resolution as Search. State: an answer clause of the form yes ← a1 ∧ ... ∧ ak. Successor function: the state resulting from substituting the first atom a1 with b1 ∧ ... ∧ bm, if there is a clause a1 ← b1 ∧ ... ∧ bm. Goal test: is the answer clause empty (i.e. yes ←)? Solution: the proof, i.e. the sequence of SLD resolutions. Prove: ?← a ∧ d. KB: a ← b ∧ c. a ← g. a ← h. b ← j. b ← k. d ← m. d ← p. f ← m. f ← p. g ← m. g ← f. k ← m. h ← m. p. Possible heuristic?

14 Search Graph. Possible heuristic: the number of atoms in the answer clause. Is it admissible? A. Yes B. No C. It depends. Prove: ?← a ∧ d. KB: a ← b ∧ c. a ← g. a ← h. b ← j. b ← k. d ← m. d ← p. f ← m. f ← p. g ← m. g ← f. k ← m. h ← m. p.

15 Search Graph. Possible heuristic: the number of atoms in the answer clause. Is it admissible? A. Yes: it takes at least that many steps to reduce all atoms in the body of the answer clause. Prove: ?← a ∧ d. KB: a ← b ∧ c. a ← g. a ← h. b ← j. b ← k. d ← m. d ← p. f ← m. f ← p. g ← m. g ← f. k ← m. h ← m. p.

16 Representation and Reasoning in Complex Domains. Expressing knowledge with propositions can be quite limiting. With propositions such as up_s2, up_s3, ok_cb1, ok_cb2, live_w1, connected_w1_w2, there is no notion that up_s2 and up_s3 are about the same property, or that w1 is the same in live_w1 and in connected_w1_w2. What we need is a more natural way to consider individuals and their properties. With relations such as up(s2), up(s3), ok(cb1), ok(cb2), live(w1), connected(w1, w2), there is now a notion that w1 is the same in live(w1) and in connected(w1, w2), and that up is the same in up(s2) and up(s3).

17 Lecture Overview: Recap of Lecture 15; TD resolution as search in the Deduction applet; SLD resolution in Datalog; Intro to Reasoning Under Uncertainty; Introduction to Probability; Random Variables and Possible World Semantics; Probability Distributions

18 Datalog. An extension of propositional definite clause (PDC) logic: we now have constants and variables, and relationships between them. We can express knowledge that holds for a set of individuals, writing more powerful clauses by introducing variables, such as: live(W) ← wire(W) ∧ connected_to(W, W1) ∧ wire(W1) ∧ live(W1). We can also ask generic queries, e.g. "which wires are connected to w1?": ?connected_to(W, w1).

19 Datalog: a relational rule language. Datalog expands the syntax of PDCL. A variable is a symbol starting with an upper-case letter (examples: X, Y). A constant is a symbol starting with a lower-case letter, or a sequence of digits (examples: alan, w1). A predicate symbol is a symbol starting with a lower-case letter (examples: live, connected, part-of, in). A term is either a variable or a constant (examples: X, Y, alan, w1).

20 Datalog Syntax (cont'd). An atom is a symbol of the form p or p(t1, ..., tn), where p is a predicate symbol and the ti are terms (examples: sunny, in(alan, X)). A definite clause is either an atom (a fact) or of the form h ← b1 ∧ ... ∧ bm, where h and the bi are atoms (read this as "h if b"); example: in(X,Z) ← in(X,Y) ∧ part-of(Y,Z). A knowledge base is a set of definite clauses.

21 Summary of Datalog Syntax. A Datalog expression is one of: a term (a constant or a variable); an atom (p or p(t1, ..., tn)); a definite clause (a ← b1 ∧ ... ∧ bm, where a and b1 ... bm are atoms); or a query ?body (an answer clause).

22 Datalog Semantics. The role of semantics is still to connect symbols and sentences in the language with the target domain. Main difference: we need to create a correspondence both between terms and individuals, and between predicate symbols and relations. We won't cover the formal definition of Datalog semantics; if you are interested, see Sections 12.3.1 and 12.3.2 in the textbook.

23 Datalog: Top-Down Proof Procedure. An extension of the top-down procedure for PDCL. How do we deal with variables? Idea: find a clause with a head that matches the query, and substitute variables in the clause with their matching constants. Example KB: in(alan, r123). part_of(r123, cs_building). in(X,Y) ← part_of(Z,Y) ∧ in(X,Z). Query: yes ← in(alan, cs_building) resolves, using the clause in(X,Y) ← part_of(Z,Y) ∧ in(X,Z) with Y = cs_building and X = alan, to yes ← part_of(Z, cs_building) ∧ in(alan, Z). We will not cover the formal details of this matching process, called unification; see textbook Section 12.4.2, p. 511, for the details.
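A minimal sketch of this idea in Python: matching over constants and variables only (no function symbols), with SLD resolution over renamed-apart clauses. This is an illustration under those simplifying assumptions, not the textbook's full unification algorithm; the KB is the one from this slide.

```python
# Minimal Datalog-style top-down prover. An atom is (predicate, args);
# variables are strings starting with an upper-case letter.

import itertools

def is_var(t):
    return t[0].isupper()

def walk(t, subst):
    """Follow variable bindings until a constant or an unbound variable."""
    while t in subst:
        t = subst[t]
    return t

def unify(atom1, atom2, subst):
    """Extend subst so the two atoms become equal, or return None."""
    (p1, args1), (p2, args2) = atom1, atom2
    if p1 != p2 or len(args1) != len(args2):
        return None
    subst = dict(subst)
    for t1, t2 in zip(args1, args2):
        t1, t2 = walk(t1, subst), walk(t2, subst)
        if t1 == t2:
            continue
        if is_var(t1):
            subst[t1] = t2
        elif is_var(t2):
            subst[t2] = t1
        else:
            return None           # two distinct constants: clash
    return subst

fresh = itertools.count()

def rename(atom, n):
    """Rename clause variables apart, e.g. X -> X_3."""
    p, args = atom
    return (p, tuple(a + "_" + str(n) if is_var(a) else a for a in args))

def solve(kb, goals, subst):
    """Yield every substitution that proves all goals (SLD resolution)."""
    if not goals:
        yield subst
        return
    first, rest = goals[0], goals[1:]
    for head, body in kb:
        n = next(fresh)
        s = unify(first, rename(head, n), subst)
        if s is not None:
            yield from solve(kb, [rename(b, n) for b in body] + rest, s)

kb = [(("in", ("alan", "r123")), []),
      (("part_of", ("r123", "cs_building")), []),
      (("in", ("X", "Y")), [("part_of", ("Z", "Y")), ("in", ("X", "Z"))])]

# Query ?in(alan, cs_building): succeeds via the trace on this slide.
print(next(solve(kb, [("in", ("alan", "cs_building"))], {}), None) is not None)  # True
```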

24 Example proof of a Datalog query. KB: in(alan, r123). part_of(r123, cs_building). in(X,Y) ← part_of(Z,Y) ∧ in(X,Z). Query: yes ← in(alan, cs_building). Using the clause in(X,Y) ← part_of(Z,Y) ∧ in(X,Z) with Y = cs_building and X = alan: yes ← part_of(Z, cs_building) ∧ in(alan, Z). Using the clause part_of(r123, cs_building) with Z = r123, what comes next? A. yes ← part_of(Z, r123) ∧ in(alan, Z). B. yes ← in(alan, r123). C. yes ←. D. None of the above.

25 Example proof of a Datalog query. KB: in(alan, r123). part_of(r123, cs_building). in(X,Y) ← part_of(Z,Y) ∧ in(X,Z). Query: yes ← in(alan, cs_building). Using the clause in(X,Y) ← part_of(Z,Y) ∧ in(X,Z) with Y = cs_building and X = alan: yes ← part_of(Z, cs_building) ∧ in(alan, Z). Using the clause part_of(r123, cs_building) with Z = r123: B. yes ← in(alan, r123).

26 Example proof of a Datalog query. KB: in(alan, r123). part_of(r123, cs_building). in(X,Y) ← part_of(Z,Y) ∧ in(X,Z). Query: yes ← in(alan, cs_building). Using in(X,Y) ← part_of(Z,Y) ∧ in(X,Z) with Y = cs_building, X = alan: yes ← part_of(Z, cs_building) ∧ in(alan, Z). Using part_of(r123, cs_building) with Z = r123: yes ← in(alan, r123). Using the fact in(alan, r123): yes ←. (The alternative branch that uses in(X,Y) ← part_of(Z,Y) ∧ in(X,Z) again, with X = alan and Y = r123, gives yes ← part_of(Z, r123) ∧ in(alan, Z) and fails: there is no clause with a head matching part_of(Z, r123).)

27 Tracing Datalog proofs in AIspace. You can trace the example from the last slide in the AIspace Deduction Applet at http://aispace.org/deduction/ using the file in-part-of available in the course schedule.

28 Datalog: queries with variables. KB: in(alan, r123). part_of(r123, cs_building). in(X,Y) ← part_of(Z,Y) ∧ in(X,Z). Query: ?in(alan, X1), i.e. the answer clause yes(X1) ← in(alan, X1). What would the answer(s) be?

29 Datalog: queries with variables. KB: in(alan, r123). part_of(r123, cs_building). in(X,Y) ← part_of(Z,Y) ∧ in(X,Z). Query: ?in(alan, X1), answer clause yes(X1) ← in(alan, X1). The answers are yes(r123) and yes(cs_building). Again, you can trace the SLD derivation for this query in the AIspace Deduction Applet.
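Reusing the solve/walk sketch from slide 23 above (assumed to be in scope), enumerating all substitutions rather than stopping at the first reproduces both answers:

```python
# Enumerate all answers to ?in(alan, X1), reusing solve, walk, and the
# kb from the slide-23 sketch above.
for s in solve(kb, [("in", ("alan", "X1"))], {}):
    print("yes({})".format(walk("X1", s)))
# prints: yes(r123) then yes(cs_building)
```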

30 Learning Goals for Logic. PDCL syntax & semantics: verify whether a logical statement belongs to the language of propositional definite clauses; verify whether an interpretation is a model of a PDCL KB; verify when a conjunction of atoms is a logical consequence of a KB. Bottom-up proof procedure: define/read/write/trace/debug the bottom-up (BU) proof procedure; prove that it is sound and complete. Top-down proof procedure: define/read/write/trace/debug the top-down (SLD) proof procedure; define it as a search problem; prove that it is sound and complete. Datalog: represent simple domains in Datalog; apply the top-down proof procedure in Datalog.

31 Logics: Big Picture. We only covered rather simple logics. There are much more powerful representation and reasoning systems based on logics, e.g. full first-order logic (with negation, disjunction and function symbols), second-order logics (predicates over predicates), non-monotonic logics, modal logics, and more. There are many important applications of logic; for example, software agents roaming the web on our behalf, based on a more structured representation: the semantic web. This is just one example of how logics are used.

32 Logics: Big Picture. (Concept map: Propositional Logics and First-Order Logics, with Propositional Definite Clause Logics as the fragment we studied, connected to semantics and proof theory, satisfiability testing (SAT), and Description Logics; applications include hardware verification, product configuration, cognitive architectures, video games, ontologies, the Semantic Web, information extraction, summarization, production systems, and tutoring systems.)

33 Semantic Web: Extracting data. Examples of typical Web queries: How much is a typical flight to Mexico for a given date? What's the cheapest vacation package to some place in the Caribbean in a given week? Plus, the hotel should have a white sandy beach and scuba diving. If webpages are based on basic HTML, humans need to scout for the information and integrate it; computers are not reliable enough (yet). Natural language processing (NLP) can be powerful (see Watson and Siri!), but some information may be in pictures (beach) or implicit in the text, so existing NLP techniques still don't get all the info.

34 More structured representation: the Semantic Web. Beyond HTML pages made only for humans: languages and formalisms based on description logics allow websites to include rich, explicit information on relevant concepts, individuals and their relationships. Goal: software agents that can roam the web and carry out sophisticated tasks on our behalf, based on these richer representations. This is different from searching content for keywords and popularity: infer meaning from content based on metadata and assertions that have already been made; automatically classify and integrate information. For further material, see the P&M text, Chapter 13, and the Introduction to the Semantic Web tutorial given at the 2011 Semantic Technology Conference: http://www.w3.org/People/Ivan/CorePresentations/SWTutorial/

35 Examples of ontologies for the Semantic Web. "Ontology": a logic-based representation of the world. eClassOwl: an e-business ontology for products and services, with 75,000 classes (types of individuals) and 5,500 properties. National Cancer Institute's ontology: 58,000 classes. Open Biomedical Ontologies Foundry: several ontologies, including the Gene Ontology, which describes gene and gene product attributes in any organism or protein sequence. OpenCyc project: a 150,000-concept ontology including a top-level ontology that describes general concepts such as numbers, time, space, etc., and many specific concepts such as "OLED display" and "iPhone".

36 A different example of applications of logic. Cognitive Tutors (http://pact.cs.cmu.edu/): computer tutors for a variety of domains (math, geometry, programming, etc.) that provide individualized support for problem-solving exercises, as good human tutors do. They rely on logic-based, detailed computational models (ACT-R) of the skills and misconceptions underlying a learning domain. Carnegie Learning (http://www.carnegielearning.com/): a company that commercializes these tutors, used by hundreds of thousands of high-school students in the USA.

37 Where are we? (Course map: for each environment and problem type, the representation and reasoning technique, in deterministic vs. stochastic settings. Static problems — Constraint Satisfaction: variables + constraints, with search and arc consistency; Query: logics with search (deterministic) vs. belief nets with variable elimination (stochastic). Sequential problems — Planning: STRIPS with search (deterministic) vs. decision nets with variable elimination, and Markov processes with value iteration (stochastic).) We are done with deterministic environments.

38 Where are we? (Same course map as the previous slide.) The stochastic column is the second part of the course.

39 Where are we? (Same course map.) We'll focus on belief nets.

40 Lecture Overview: Recap of Lecture 15; TD resolution as search in the Deduction applet; SLD resolution in Datalog; Intro to Reasoning Under Uncertainty; Introduction to Probability; Random Variables and Possible World Semantics; Probability Distributions

41 Two main sources of uncertainty (from Lecture 2). Sensing uncertainty: the agent cannot fully observe a state of interest. For example: right now, how many people are in this building? What disease does this patient have? Where is the soccer player behind me? Effect uncertainty: the agent cannot be certain about the effects of its actions. For example: if I work hard, will I get an A? Will this drug work for this patient? Where will the ball go when I kick it?

42 Motivation for uncertainty. To act in the real world, we almost always have to handle uncertainty (both effect and sensing uncertainty). Deterministic domains are an abstraction; sometimes this abstraction enables more powerful inference, but we no longer always make it. AI's main focus shifted from logic to probability in the 1980s: the language of probability is very expressive and general, and new representations enable efficient reasoning. We will see some of these, in particular Bayesian networks. Reasoning under uncertainty is part of the 'new' AI. This is not a dichotomy: the framework for probability is logical! A new frontier: combining logic and probability.

43 Interesting article about AI and uncertainty: "The machine age", by Peter Norvig (head of research at Google), New York Post, 12 February 2011. http://www.nypost.com/f/print/news/opinion/opedcolumnists/the_machine_age_tM7xPAv4pI4JslK0M1JtxI "The things we thought were hard turned out to be easier": playing grandmaster-level chess, or proving theorems in integral calculus. "Tasks that we at first thought were easy turned out to be hard": a toddler (or a dog) can distinguish hundreds of objects (ball, bottle, blanket, mother, ...) just by glancing at them; computer vision finds it very difficult to perform at this level. "Dealing with uncertainty turned out to be more important than thinking with logical precision": reasoning under uncertainty (and lots of data) are key to progress.

44 Probability as a measure of uncertainty/ignorance. Probability measures an agent's degree of belief in the truth of propositions about states of the world. It does not measure how true a proposition is: propositions are true or false; we simply may not know exactly which. Example: I roll a fair die. What is 'the' (my) probability that the result is a '6'?


46 Probability as a measure of uncertainty/ignorance. Probability measures an agent's degree of belief in the truth of propositions about states of the world. It does not measure how true a proposition is: propositions are true or false; we simply may not know exactly which. Example: I roll a fair die. What is 'the' (my) probability that the result is a '6'? It is 1/6 ≈ 16.7%. I now look at the die: what is 'the' (my) probability now? What is your probability (you have not looked at the die)? What if I tell some of you the result is even: what does their probability become? Different agents can have different degrees of belief in (probabilities for) a proposition, based on the evidence they have.

47 Probability as a measure of uncertainty/ignorance. Probability measures an agent's degree of belief in the truth of propositions about states of the world. It does not measure how true a proposition is: propositions are true or false; we simply may not know exactly which. Example: I roll a fair die. What is 'the' (my) probability that the result is a '6'? It is 1/6 ≈ 16.7%. I now look at the die: my probability is now either 1 or 0, depending on what I observed. Your probability hasn't changed: 1/6 ≈ 16.7%. What if I tell some of you the result is even? Their probability increases to 1/3 ≈ 33.3%, if they believe me. Different agents can have different degrees of belief in (probabilities for) a proposition, based on the evidence they have.

48 Probability as a measure of uncertainty/ignorance. Probability measures an agent's degree of belief in the truth of propositions about states of the world. Belief in a proposition f can be measured by a number between 0 and 1: this is the probability of f. E.g. P("roll of fair die came out as a 6") = 1/6 ≈ 16.7% = 0.167. Using probabilities between 0 and 1 is purely a convention. P(f) = 0 means that f is believed to be: A. Probably true B. Probably false C. Definitely false D. Definitely true

49 Probability as a measure of uncertainty/ignorance. Probability measures an agent's degree of belief in the truth of propositions about states of the world. Belief in a proposition f can be measured by a number between 0 and 1: this is the probability of f. E.g. P("roll of fair die came out as a 6") = 1/6 ≈ 16.7% = 0.167. Using probabilities between 0 and 1 is purely a convention. P(f) = 0 means that f is believed to be definitely false: the probability of f being true is zero. Likewise, P(f) = 1 means f is believed to be definitely true.

50 Lecture Overview: Recap of Lecture 15; TD resolution as search in the Deduction applet; SLD resolution in Datalog; Intro to Reasoning Under Uncertainty; Introduction to Probability; Random Variables and Possible World Semantics; Probability Distributions

51 Probability Theory and Random Variables. Probability theory: a system of logical axioms and formal operations for sound reasoning under uncertainty. Basic element: the random variable X. X is a variable like the ones we have seen in CSP/Planning/Logic, but the agent can be uncertain about the value of X. As usual, the domain of a random variable X, written dom(X), is the set of values X can take. Types of variables: Boolean, e.g. Cancer (does the patient have cancer or not?); categorical, e.g. CancerType could be one of {breastCancer, lungCancer, skinMelanomas}; numeric, e.g. Temperature (integer or real). We will focus on Boolean and categorical variables.
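As a convention for the small sketches later in this lecture, a set of random variables and their domains can be written as a plain mapping; the variable names are the slide's own examples:

```python
# Random variables as a mapping from name to domain dom(X).
# Boolean and categorical domains only, as in the lecture.
domains = {
    "Cancer": [True, False],                                       # Boolean
    "CancerType": ["breastCancer", "lungCancer", "skinMelanomas"], # categorical
    "Weather": ["sunny", "cloudy"],                                # categorical
}
```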

52 Random Variables (cont'd). The assignment X = x means X has value x. A proposition is a Boolean formula made from assignments of values to variables; example: Cavity = T; Weather = sunny. A tuple of random variables (X1, ..., Xn) is a complex random variable with domain Dom(X1) × Dom(X2) × ... × Dom(Xn).

53 Random Variables (cont'd). A tuple of random variables (X1, ..., Xn) is a complex random variable with domain Dom(X1) × Dom(X2) × ... × Dom(Xn). Example: Cavity = T; Weather = sunny.

54 Possible Worlds. E.g., if we model only two Boolean variables, Cavity and Toothache, then there are 4 distinct possible worlds: w1: Cavity = T ∧ Toothache = T; w2: Cavity = T ∧ Toothache = F; w3: Cavity = F ∧ Toothache = T; w4: Cavity = F ∧ Toothache = F. A possible world specifies an assignment to each random variable; possible worlds are mutually exclusive and exhaustive. w ⊨ f means that proposition f is true in world w. A probability measure µ(w) over possible worlds w is a nonnegative real number such that µ(w) sums to 1 over all possible worlds w. Why does this make sense?

55 Possible Worlds. E.g., if we model only two Boolean variables, Cavity and Toothache, then there are 4 distinct possible worlds: w1: Cavity = T ∧ Toothache = T; w2: Cavity = T ∧ Toothache = F; w3: Cavity = F ∧ Toothache = T; w4: Cavity = F ∧ Toothache = F. A possible world specifies an assignment to each random variable; possible worlds are mutually exclusive and exhaustive. w ⊨ f means that proposition f is true in world w. A probability measure µ(w) over possible worlds w is a nonnegative real number such that µ(w) sums to 1 over all possible worlds w. The probability of proposition f is defined by P(f) = Σ_{w ⊨ f} µ(w), i.e. the sum of the probabilities of the worlds w in which f is true. Why does this make sense? Because for sure we are in one of these worlds!
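A minimal executable sketch of this semantics (the µ values below are made up; only the two Boolean variables from the slide are modeled):

```python
# Possible-world semantics in miniature: a world assigns a value to
# every variable, mu gives each world a nonnegative weight summing to
# 1, and P(f) sums mu over the worlds where f holds.
from itertools import product

domains = {"Cavity": [True, False], "Toothache": [True, False]}
worlds = [dict(zip(domains, vals)) for vals in product(*domains.values())]
# 4 mutually exclusive and exhaustive worlds (w1..w4 on the slide)

mu = [0.12, 0.08, 0.08, 0.72]          # illustrative numbers; sum to 1
assert abs(sum(mu) - 1.0) < 1e-9

def prob(f):
    """P(f) = sum of mu(w) over the worlds w in which f is true."""
    return sum(m for w, m in zip(worlds, mu) if f(w))

print(prob(lambda w: w["Cavity"]))                    # P(Cavity = T) = 0.2
print(prob(lambda w: w["Cavity"] or w["Toothache"]))  # 0.12+0.08+0.08 = 0.28
```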

56 Possible Worlds Semantics. Example: weather in Vancouver. One variable: Weather, with domain {sunny, cloudy}. Possible worlds: w1: Weather = sunny; w2: Weather = cloudy. If P(Weather = sunny) = 0.4, what is P(Weather = cloudy)? (Recall: w ⊨ f means proposition f is true in world w; a probability measure µ(w) over possible worlds is a nonnegative real number summing to 1 over all worlds; P(f) = Σ_{w ⊨ f} µ(w).)

57 Possible Worlds Semantics. Example: weather in Vancouver; one variable Weather with domain {sunny, cloudy}; possible worlds w1: Weather = sunny and w2: Weather = cloudy. P(Weather = sunny) = 0.4 means that µ(w1) = 0.4. µ(w1) and µ(w2) have to sum to 1 (those are the only two possible worlds), so µ(w2) has to be 0.6, and thus P(Weather = cloudy) = 0.6.

58 One more example. Now we have an additional variable: Temperature, with domain {hot, mild, cold}. There are now 6 possible worlds:

Weather  Temperature  µ(w)
sunny    hot          0.10
sunny    mild         0.20
sunny    cold         0.10
cloudy   hot          0.05
cloudy   mild         0.35
cloudy   cold         ?

What's the probability of it being cloudy and cold? A. 0.1 B. 0.2 C. 0.3 D. 1 E. Not enough info

59 One more example. Now we have an additional variable: Temperature, with domain {hot, mild, cold}. There are now 6 possible worlds:

Weather  Temperature  µ(w)
sunny    hot          0.10
sunny    mild         0.20
sunny    cold         0.10
cloudy   hot          0.05
cloudy   mild         0.35
cloudy   cold         ?

What's the probability of it being cloudy and cold? It is 0.2: the probabilities have to sum to 1 over all possible worlds, and 0.10 + 0.20 + 0.10 + 0.05 + 0.35 = 0.8.

60 One more example. Now we have an additional variable: Temperature, with domain {hot, mild, cold}. There are now 6 possible worlds:

Weather  Temperature  µ(w)
sunny    hot          0.10
sunny    mild         0.20
sunny    cold         0.10
cloudy   hot          0.05
cloudy   mild         0.35
cloudy   cold         0.20

What's the probability of it being cloudy or cold? A. 1 B. 0.6 C. 0.3 D. 0.7. Remember: the probability of proposition f is defined by P(f) = Σ_{w ⊨ f} µ(w), the sum of the probabilities of the worlds w in which f is true.

61 One more example. Now we have an additional variable: Temperature, with domain {hot, mild, cold}. There are now 6 possible worlds:

     Weather  Temperature  µ(w)
w1   sunny    hot          0.10
w2   sunny    mild         0.20
w3   sunny    cold         0.10
w4   cloudy   hot          0.05
w5   cloudy   mild         0.35
w6   cloudy   cold         0.20

What's the probability of it being cloudy or cold? P(cloudy or cold) = µ(w3) + µ(w4) + µ(w5) + µ(w6) = 0.10 + 0.05 + 0.35 + 0.20 = 0.7.
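The same table checked numerically, a small standalone sketch using only the µ values from the slide:

```python
# The six weather/temperature worlds from the slide, with mu(w).
worlds = [("sunny",  "hot",  0.10), ("sunny",  "mild", 0.20),
          ("sunny",  "cold", 0.10), ("cloudy", "hot",  0.05),
          ("cloudy", "mild", 0.35), ("cloudy", "cold", 0.20)]

p_and = sum(m for w, t, m in worlds if w == "cloudy" and t == "cold")
p_or  = sum(m for w, t, m in worlds if w == "cloudy" or t == "cold")
print(p_and, p_or)  # 0.2 0.7 (up to float rounding)
```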

62 Probability Distributions. Consider the case where possible worlds are simply assignments to one random variable. Definition (probability distribution): a probability distribution P on a random variable X is a function dom(X) → [0, 1] such that x ↦ P(X = x). When dom(X) is infinite we need a probability density function; we will focus on the finite case.
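In the finite case such a distribution is just a table; a sketch with the weather numbers from the earlier example (the validity checks are my addition):

```python
# A probability distribution on one random variable: a mapping from
# dom(X) to [0, 1] whose values sum to 1 (finite case).
import math

P_weather = {"sunny": 0.4, "cloudy": 0.6}
assert all(0.0 <= p <= 1.0 for p in P_weather.values())
assert math.isclose(sum(P_weather.values()), 1.0)
print(P_weather["cloudy"])  # P(Weather = cloudy) = 0.6
```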

63 Learning Goals for Probability (so far). Define and give examples of random variables, their domains, and probability distributions. Calculate the probability of a proposition f given µ(w) for the set of possible worlds. Next: define a joint probability distribution (JPD); marginalize over specific variables to compute distributions over any subset of the variables; conditional probabilities.

