Answering Complex Questions and Performing Deep Reasoning in Advanced QA Systems – Reasoning with Logic and Probability
Chitta Baral, Arizona State University
Co-Investigators: Michael Gelfond and Richard Scherl. Other collaborator: Nelson Rushton.
Developing formalisms for reasoning with logic and probability
One of the research issues we are working on; a core component of our ``deep reasoning'' goals:
– What-if reasoning;
– Reasoning about counterfactuals;
– Reasoning about causes, etc.
In parallel, we are also using existing languages and formalisms (AnsProlog) to represent domain knowledge and do reasoning.
Reasoning with counterfactuals
Text (from Pearl's Causality):
– If the court orders the execution then the captain gives a signal.
– If the captain gives a signal then rifleman A shoots and rifleman B shoots.
– Rifleman A may shoot out of nervousness.
– If either rifleman A or B shoots then the prisoner dies.
– The probability that the court orders the execution is p.
– The probability that A shoots out of nervousness is q.
A counterfactual question:
– What is the probability that the prisoner would be alive if A were not to have shot, given that the prisoner is in fact dead?
Pictorial representation
U: court orders execution (p)
V: rifleman A is nervous (q)
C: captain gives the signal
A: rifleman A shoots
B: rifleman B shoots
D: the prisoner dies
Logic:
– U causes C
– C causes A
– V causes A
– C causes B
– A causes D
– B causes D
Probability:
– Pr(U) = p
– Pr(V) = q
(Diagram: U → C; C → A and C → B; V → A; A → D and B → D.)
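The counterfactual query on the previous slide can be checked by brute-force enumeration over the exogenous variables U and V of the structural model above. A minimal Python sketch (the function names are mine, for illustration only):

```python
from itertools import product

def dead(u, v, a_override=None):
    """Evaluate the structural equations of the firing-squad model.
    a_override fixes A (the intervention do(A = a)); None means no intervention."""
    c = u                        # U causes C
    a = c or v                   # C causes A, V causes A
    if a_override is not None:
        a = a_override
    b = c                        # C causes B
    return a or b                # A causes D, B causes D

def counterfactual_alive_given_dead(p, q):
    """P(prisoner would be alive had A not shot | prisoner is in fact dead),
    by Pearl's abduction-action-prediction recipe over the exogenous worlds."""
    num = den = 0.0
    for u, v in product([0, 1], repeat=2):
        w = (p if u else 1 - p) * (q if v else 1 - q)
        if dead(u, v):                              # evidence: D = 1
            den += w
            if not dead(u, v, a_override=0):        # counterfactual: do(A = 0)
                num += w
    return num / den

print(counterfactual_alive_given_dead(0.5, 0.1))
```

The enumeration agrees with Pearl's closed-form answer (1-p)q / (p + q - pq): the prisoner would be alive only in worlds where the court did not order the execution but A was nervous.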
Bayes nets, causal Bayes nets, and Pearl's structural causal models
– Bayes nets do not have to respect causality; they only succinctly represent the joint probability distribution.
– One cannot reason about causes and effects with Bayes nets. (One can with causal Bayes nets.)
– Even then, one needs to distinguish between doing and observing. This distinction is important for counterfactual reasoning.
– Pearl's structural causal models give an algorithm for counterfactual reasoning. But:
– The logical language is weak. (This is also true of other attempts.)
– We need a more general knowledge representation language that can express defaults, normative statements, etc.
– The variables with probabilities are assumed to be independent.
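The doing/observing distinction can be made concrete in the rifleman model: observing A = 1 is evidence that the captain probably signaled, while setting A = 1 by intervention tells us nothing about the captain. An illustrative Python sketch (the values of p and q are arbitrary):

```python
from itertools import product

p, q = 0.3, 0.2   # illustrative values for Pr(U) and Pr(V)

def weight(u, v):
    """Prior probability of the exogenous world (u, v)."""
    return (p if u else 1 - p) * (q if v else 1 - q)

# Observing A = 1: condition on worlds where A = C or V evaluates to 1.
num = den = 0.0
for u, v in product([0, 1], repeat=2):
    c = u
    a = c or v
    if a:
        den += weight(u, v)
        if not c:
            num += weight(u, v)
p_notC_given_obs_A = num / den   # evidence flows "backwards" from A to C

# Intervening do(A = 1): A is set by fiat, so C keeps its prior distribution.
p_notC_given_do_A = 1 - p

print(p_notC_given_obs_A, p_notC_given_do_A)
```

With these numbers P(~C | A = 1) ≈ 0.318 while P(~C | do(A = 1)) = 0.7, so conflating the two gives wrong answers.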
Further motivation for a richer language: the Monty Hall problem
http://www.remote.org/frederik/projects/ziege/beweis.html
– A player is given the opportunity to select one of three closed doors, behind one of which there is a prize; the other two doors hide nothing.
– Once the player has made a selection, Monty is obligated to open one of the remaining closed doors, revealing that it does not contain the prize.
– Monty then offers the player the choice of switching to the other unopened door.
Question: Does it matter if the player switches? That is, which unopened door has the higher probability of containing the prize?
Illustration-1
First, let us assume that the car is behind door no. 1. We can do this without reducing the validity of our proof: if the car were behind door no. 2, we would only have to exchange all occurrences of "door 1" with "door 2" and vice versa, and the proof would still hold.
The candidate has three doors to choose from. Because he has no additional information, he selects one at random; the probability of choosing each of doors 1, 2, and 3 is 1/3:

  Candidate chooses   p
  Door 1              1/3
  Door 2              1/3
  Door 3              1/3
  Sum                 1
Illustration-2
Going on from this table, we have to split each case according to the door opened by the host. Since we assume that the car is behind door no. 1, the host has a choice if and only if the candidate selects door 1 - otherwise there is only one "goat door" left! We assume that if the host has a choice, he selects the door to open at random.

  Candidate chooses   Host opens   p
  Door 1              Door 2       1/6
  Door 1              Door 3       1/6
  Door 2              Door 3       1/3
  Door 3              Door 2       1/3
  Sum                              1
Illustration-3
A candidate who always sticks to his original choice, no matter what happens:

  Candidate chooses   Host opens   Final choice   Win   p
  Door 1              Door 2       Door 1         yes   1/6
  Door 1              Door 3       Door 1         yes   1/6
  Door 2              Door 3       Door 2         no    1/3
  Door 3              Door 2       Door 3         no    1/3
  Sum                                                   1

Sum of cases where the candidate wins: 1/3.

A candidate who always switches to the other door whenever he gets the chance:

  Candidate chooses   Host opens   Final choice   Win   p
  Door 1              Door 2       Door 3         no    1/6
  Door 1              Door 3       Door 2         no    1/6
  Door 2              Door 3       Door 1         yes   1/3
  Door 3              Door 2       Door 1         yes   1/3
  Sum                                                   1

Sum of cases where the candidate wins: 2/3.
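The two tables can be reproduced mechanically. A short Python sketch that enumerates the same cases (car fixed behind door 1, without loss of generality, as in the illustrations):

```python
from fractions import Fraction

def win_probability(switch):
    """Exact win probability of the stick/switch strategies, by enumerating
    the candidate's pick and the host's allowed 'goat doors'."""
    car = 1
    total = Fraction(0)
    for pick in (1, 2, 3):                 # candidate picks uniformly: p = 1/3
        p_pick = Fraction(1, 3)
        goat_doors = [d for d in (1, 2, 3) if d != pick and d != car]
        for host in goat_doors:
            p_case = p_pick / len(goat_doors)   # host picks uniformly among his options
            if switch:
                final = next(d for d in (1, 2, 3) if d not in (pick, host))
            else:
                final = pick
            if final == car:
                total += p_case
    return total

print(win_probability(switch=False))   # 1/3
print(win_probability(switch=True))    # 2/3
```

The enumeration visits exactly the four rows of each table above and confirms that switching doubles the chance of winning.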
Key issues
– The existing languages of probability do not give us the syntax to express certain knowledge about the problem.
– A lot of the reasoning is done by the human.
– Our goal: develop a knowledge representation language and a reasoning system such that, once we express our knowledge in the language, the system can do the desired reasoning.
– P-log is such an attempt.
Representing the Monty Hall problem in P-log

  doors = {1, 2, 3}.
  open, selected, prize : doors.

  ~can_open(D) :- selected = D.
  ~can_open(D) :- prize = D.
  can_open(D) :- not ~can_open(D).

  random(prize). random(selected).
  random(open : {X : can_open(X)}).

  pr(prize = D) = 1/3.
  pr(selected = D) = 1/3.
  pr(open = D |c can_open(D), can_open(D1), D =/= D1) = 1/2.

By default, pr(open = D |c can_open(D)) = 1 when there is no D1 such that can_open(D1) and D =/= D1.

  obs(selected = 1). obs(open = 2). obs(~prize = 2).

Queries: P(prize = 1) = ? P(prize = 3) = ?
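The two queries can be checked outside P-log by direct Bayesian enumeration over the possible worlds, applying the same rule for the host: probability 1/2 when he has a choice, 1 otherwise. A Python sketch (not a P-log interpreter):

```python
from fractions import Fraction

selected = 1                                       # obs(selected = 1)
posterior = {}
for prize in (1, 2, 3):
    p_prize = Fraction(1, 3)                       # pr(prize = D) = 1/3
    can_open = [d for d in (1, 2, 3) if d != selected and d != prize]
    for opened in can_open:
        w = p_prize * Fraction(1, len(can_open))   # 1/2 if host has a choice, else 1
        if opened == 2:                            # obs(open = 2)
            posterior[prize] = posterior.get(prize, Fraction(0)) + w

total = sum(posterior.values())
for prize in sorted(posterior):
    print(f"P(prize = {prize}) = {posterior[prize] / total}")
```

This yields P(prize = 1) = 1/3 and P(prize = 3) = 2/3, the answers the P-log program should return for the queries above.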
General syntax of P-log
– Sorted signature: objects and function symbols (term-building functions and attributes).
– Declaration: definition of sorts and typing information for attributes.
  E.g. doors = {1, 2, 3}. open, selected, prize : doors.
– Regular part: a collection of AnsProlog rules.
– Random selection: [r] random(a(t) : {Y : p(Y)}) :- B.
– Probabilistic information: pr_r(a(t) = y |c B) = v.
– Observations and actions: obs(l). do(a(t) = y).
Semantics: the main ideas
– The logical part is translated into an AnsProlog program; the answer sets correspond to possible worlds.
– The probabilistic part is used to define a measure over the possible worlds, which is then used to define the probability of formulas and conditional probabilities.
– Consistency conditions, and sufficiency conditions for consistency.
– Bayes nets and Pearl's causal models are special cases of P-log programs.
Rifleman example -- various P-log encodings (Encoding 1)

  U, V, C, A, B, D : boolean.
  random(U). random(V). random(C).
  random(A). random(B). random(D).

  Pr(U) = p.          Pr(V) = q.
  Pr(C |c U) = 1.     Pr(~C |c ~U) = 1.
  Pr(A |c C) = 1.     Pr(A |c V) = 1.     Pr(~A |c ~V, ~C) = 1.
  Pr(B |c C) = 1.     Pr(~B |c ~C) = 1.
  Pr(D |c A) = 1.     Pr(D |c B) = 1.     Pr(~D |c ~A, ~B) = 1.

Prediction and explanation work as expected. Formulating counterfactual reasoning – work in progress.
Conclusion: highlights of progress
Modules and generalizations:
– Travel module – first milestone reached. (Dialog with PARC has started.)
– Generalization of the methodology – in progress.
– Developing a couple of other modules – about to start.
– Further generalizing the process – subsequent step.
AnsProlog enhancements and their use in various kinds of reasoning:
– P-log and reasoning with it – in progress.
– CR-Prolog (consistency-restoring Prolog) – in progress.
– GUIs, Modular AnsProlog, etc. – in progress.