Finish Logics… Reasoning under Uncertainty: Intro to Probability


Finish Logics… Reasoning under Uncertainty: Intro to Probability
Computer Science cpsc322, Lecture 24 (Textbook Chpt 6.1, 6.1.1)
June 13, 2017

Midterm review
120 midterms overall; average 71; best 101.5! 12 students scored above 90%, 12 students below 50%.
Score distribution:
  Greater than 100: 1, 90–100: 12, 80–89: 29, 70–79: 26, 60–69: 21, 50–59: 21, 40–49: 8, 30–39: 4
How to learn more from the midterm:
- Carefully examine your mistakes (and our feedback).
- If you still do not see the correct answer/solution, go back to your notes, the slides and the textbook.
- If you are still confused, come to office hours with specific questions.
- Solutions will be posted, but that should be your last resort (not much learning).

Tracing Datalog proofs in AIspace
You can trace the example from the last slide in the AIspace Deduction Applet at http://aispace.org/deduction/ using the file ex-Datalog, available in the course schedule.
One question of Assignment 3 asks you to use this applet.

Datalog: queries with variables
in(alan, r123).
part_of(r123, cs_building).
in(X,Y) ← part_of(Z,Y) & in(X,Z).
Query: in(alan, X1).
yes(X1) ← in(alan, X1).
What would the answer(s) be?

Datalog: queries with variables (cont')
Query: in(alan, X1).
yes(X1) ← in(alan, X1).
The answers are:
yes(r123).
yes(cs_building).
Again, you can trace the SLD derivation for this query in the AIspace Deduction Applet.
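
To see where those two answers come from, here is a minimal Python sketch of a top-down (SLD) proof procedure for this exact knowledge base. It is an illustration only, not the AIspace applet's implementation; the tuple representation of atoms, the dict substitutions, and all helper names are mine.

```python
import itertools

# Knowledge base: (head, body) pairs; facts have an empty body.
# Terms starting with an uppercase letter are variables.
KB = [
    (("in", "alan", "r123"), []),
    (("part_of", "r123", "cs_building"), []),
    (("in", "X", "Y"), [("part_of", "Z", "Y"), ("in", "X", "Z")]),
]

fresh = itertools.count()

def is_var(t):
    return t[0].isupper()

def walk(t, s):
    # Follow a chain of variable bindings in substitution s.
    while is_var(t) and t in s:
        t = s[t]
    return t

def unify(x, y, s):
    x, y = walk(x, s), walk(y, s)
    if x == y:
        return s
    if is_var(x):
        return {**s, x: y}
    if is_var(y):
        return {**s, y: x}
    return None  # two distinct constants

def unify_atoms(a, b, s):
    if a[0] != b[0] or len(a) != len(b):
        return None
    for x, y in zip(a[1:], b[1:]):
        s = unify(x, y, s)
        if s is None:
            return None
    return s

def rename(clause):
    # Rename clause variables apart so repeated rule uses don't clash.
    head, body = clause
    n = str(next(fresh))
    r = lambda atom: tuple(t + "_" + n if is_var(t) else t for t in atom)
    return r(head), [r(a) for a in body]

def solve(goals, s):
    # SLD resolution: resolve the first goal against each clause head.
    if not goals:
        yield s
        return
    for clause in KB:
        head, body = rename(clause)
        s2 = unify_atoms(goals[0], head, s)
        if s2 is not None:
            yield from solve(body + goals[1:], s2)

# Query: in(alan, X1).  Prints yes(r123). then yes(cs_building).
for s in solve([("in", "alan", "X1")], {}):
    print("yes(" + walk("X1", s) + ").")
```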

To complete your learning about Logics
- Review the textbook and the inked slides.
- Practice Exercises: 5.A, 12.A, 12.B, 12.C.
- Assignment 3: it will be out today and is due on the 20th. Make sure you start working on it soon. One question requires you to use Datalog (with top-down proof) in AIspace. To become familiar with this applet, download and play with the simple examples we saw in class (available on the course web page) and work on the Practice Exercises.

Logics in AI (similar slide to the one for planning)
Formalisms: Propositional Definite Clause Logics (semantics and proof theory), Satisfiability Testing (SAT), Propositional Logics, First-Order Logics, Description Logics.
Applications: Hardware Verification, Production Systems, Product Configuration, Ontologies, Cognitive Architectures, Semantic Web, Video Games, Summarization, Tutoring Systems, Information Extraction.

Paper published in the Artificial Intelligence journal by researchers from Oxford (2013)
Towards more expressive ontology languages: The query answering problem
Andrea Calì (c,b), Georg Gottlob (a,b), Andreas Pieris (a)
a Department of Computer Science, University of Oxford, UK
b Oxford-Man Institute of Quantitative Finance, University of Oxford, UK
c Department of Computer Science and Information Systems, Birkbeck University of London, UK
http://dx.doi.org/10.1016/j.artint.2012.08.002
Abstract: Ontology reasoning finds a relevant application in the so-called ontology-based data access, where a classical extensional database (EDB) is enhanced by an ontology, in the form of logical assertions, that generates new intensional knowledge which contributes to answering queries. In this setting, queries are therefore answered against a logical theory constituted by the EDB and the ontology; more specifically, query answering amounts to computing the answers to the query that are entailed by the EDB and the ontology. In this paper, we study novel relevant classes of ontological theories for which query answering is both decidable and of tractable data complexity, that is, the complexity with respect to the size of the data only. In particular, our new classes belong to the recently introduced family of Datalog-based languages, called Datalog±. The basic Datalog± rules are (function-free) Horn rules extended with existential quantification in the head, known as tuple-generating dependencies (TGDs). We propose the language of sticky sets of TGDs (or sticky Datalog±), which are sets of TGDs with a restriction on multiple occurrences of variables in the rule-bodies. We establish complexity results for answering conjunctive queries under sticky sets of TGDs, showing, in particular, that queries can be compiled into domain independent first-order (and thus translatable into SQL) queries over the given EDB. We also present several extensions of sticky sets of TGDs, and investigate the complexity of query answering under such classes. In summary, we obtain highly expressive and effective ontology languages that unify and generalize both classical database constraints, and important features of the most widespread tractable description logics; in particular, the DL-Lite family of description logics.

Lecture Overview
- Big Transition
- Intro to Probability…

Big Picture: R&R systems (Representation and reasoning Systems; each cell pairs a representation with a reasoning technique)
Environment: deterministic vs. stochastic; problems: static vs. sequential.
- Static, Constraint Satisfaction (Vars + Constraints): Search, Arc Consistency, SLS
- Static, Query: deterministic: Logics (Search); stochastic: Belief Nets (Var. Elimination)
- Sequential, Planning: deterministic: STRIPS, i.e., actions with preconditions and effects (Search); stochastic: Decision Nets (Var. Elimination), Markov Processes (Value Iteration)

Answering Query under Uncertainty
Probability Theory underlies:
- Static Bayesian Networks & Variable Elimination
- Dynamic Bayesian Networks, including Hidden Markov Models
Applications: Monitoring (e.g., credit cards), Student Tracing in Tutoring Systems, Natural Language Processing, Diagnostic Systems (e.g., medicine), Email spam filters.

Intro to Probability (Motivation)
Will it rain in 10 days? Was it raining 198 days ago?
Right now, how many people are in this room? In this building (DMP)? At UBC? … And yesterday?
AI agents (and humans) are not omniscient, and the problem is not only predicting the future or "remembering" the past.

Intro to Probability (Key points)
Are agents all ignorant/uncertain to the same degree?
Should an agent act only when it is certain about relevant knowledge? (Not acting usually has implications.)
So agents need to represent and reason about their ignorance/uncertainty.
This uncertainty is epistemological (about the agent's beliefs), not ontological (about how the world is), and it is subjective. Is it raining outside? Your uncertainty differs depending on whether you can look outside the window, or can only look at who is looking outside the window…

Probability as a formal measure of uncertainty/ignorance
Belief in a proposition f (e.g., it is raining outside, there are 31 people in this room) can be measured in terms of a number between 0 and 1: this is the probability of f.
A probability of 0 means that f is believed to be false.
A probability of 1 means that f is believed to be true.
Using 0 and 1 is purely a convention.

Random Variables
A random variable is a variable like the ones we have seen in CSP and Planning, but the agent can be uncertain about its value.
As usual, the domain of a random variable X, written dom(X), is the set of values X can take; the values are mutually exclusive and exhaustive.
Examples (Boolean and discrete):
- Boolean random variables, e.g., Cavity (do I have a cavity?)
- Discrete random variables, e.g., Weather is one of <sunny, rainy, cloudy, snow>

Random Variables (cont')
A tuple of random variables <X1, …, Xn> is a complex random variable with domain dom(X1) × … × dom(Xn).
The assignment X = x means X has value x.
A proposition is a Boolean formula made from assignments of values to variables, combined with AND, OR and NOT, e.g., (Cavity = T) ∨ (Weather = sunny).

Possible Worlds
A possible world specifies an assignment to each random variable.
E.g., if we model only two Boolean variables Cavity and Toothache, then there are 4 distinct possible worlds:
Cavity = T ∧ Toothache = T
Cavity = T ∧ Toothache = F
Cavity = F ∧ Toothache = T
Cavity = F ∧ Toothache = F
As usual, possible worlds are mutually exclusive and exhaustive. (Example: number the worlds above.)
w ╞ X = x means variable X is assigned value x in world w.

Semantics of Probability
The belief of being in each possible world w can be expressed as a probability µ(w).
For sure, I must be in one of them… so the µ(w) over all possible worlds must sum to 1.
µ(w) for the possible worlds generated by three Boolean variables, cavity, toothache, and catch (the probe catches in the tooth):

cavity  toothache  catch   µ(w)
T       T          T       .108
T       T          F       .012
T       F          T       .072
T       F          F       .008
F       T          T       .016
F       T          F       .064
F       F          T       .144
F       F          F       .576
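
As a sanity check, here is the same table as a small Python sketch; the dict representation and the name mu are mine, not from the slides.

```python
# The joint measure over possible worlds (cavity, toothache, catch).
mu = {
    (True,  True,  True):  .108, (True,  True,  False): .012,
    (True,  False, True):  .072, (True,  False, False): .008,
    (False, True,  True):  .016, (False, True,  False): .064,
    (False, False, True):  .144, (False, False, False): .576,
}
# "I must be in one of them": the measures of all worlds sum to 1.
assert abs(sum(mu.values()) - 1.0) < 1e-9
```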

Probability of proposition
What is the probability of a proposition f? (Using the µ(w) table from the previous slide.)
For any f, sum the probabilities of the worlds where it is true: P(f) = Σ_{w ╞ f} µ(w). This is inference by enumeration.
Ex: P(toothache = T) = .108 + .012 + .016 + .064 = 0.2

Probability of proposition (cont')
Inference by enumeration again, over the same table:
P(cavity = T ∧ toothache = F) = .072 + .008 = 0.08

Probability of proposition (cont')
P(cavity ∨ toothache) = 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.28
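
Inference by enumeration is a one-line sum over the table; a minimal Python sketch reusing the mu dict from the block above (the helper name prob is mine):

```python
def prob(f):
    # P(f) = sum of mu(w) over the worlds w in which proposition f is true.
    return sum(p for w, p in mu.items() if f(*w))

# The three examples from the slides (up to float rounding):
print(prob(lambda cavity, toothache, catch: toothache))                 # 0.2
print(prob(lambda cavity, toothache, catch: cavity and not toothache))  # 0.08
print(prob(lambda cavity, toothache, catch: cavity or toothache))       # 0.28
```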

One more example
Remember: Weather, with domain {sunny, cloudy}; Temperature, with domain {hot, mild, cold}. There are now 6 possible worlds:

Weather  Temperature  µ(w)
sunny    hot          0.10
sunny    mild         0.20
sunny    cold         0.10
cloudy   hot          0.05
cloudy   mild         0.35
cloudy   cold         0.20

What's the probability of it being cloudy or cold?
A. 1   B. 0.3   C. 0.6   D. 0.7
Remember: the probability of proposition f is defined by P(f) = Σ_{w ╞ f} µ(w), the sum of the probabilities of the worlds w in which f is true.

One more example (answer)
The same 6 possible worlds, now numbered:

     Weather  Temperature  µ(w)
w1   sunny    hot          0.10
w2   sunny    mild         0.20
w3   sunny    cold         0.10
w4   cloudy   hot          0.05
w5   cloudy   mild         0.35
w6   cloudy   cold         0.20

What's the probability of it being cloudy or cold?
µ(w3) + µ(w4) + µ(w5) + µ(w6) = 0.7
Remember: the probability of proposition f is defined by P(f) = Σ_{w ╞ f} µ(w), the sum of the probabilities of the worlds w in which f is true.
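
The same enumeration check in Python. Note that the two cold-row values in the table above are reconstructed: the transcript only fixes their sum (0.30) via the 0.7 answer, so the 0.10/0.20 split is an assumption.

```python
weather_mu = {
    ("sunny", "hot"): 0.10, ("sunny", "mild"): 0.20, ("sunny", "cold"): 0.10,
    ("cloudy", "hot"): 0.05, ("cloudy", "mild"): 0.35, ("cloudy", "cold"): 0.20,
}
total = sum(pr for (w, t), pr in weather_mu.items()
            if w == "cloudy" or t == "cold")
print(total)  # 0.7 (answer D), up to float rounding
```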

Probability Distributions
A probability distribution P on a random variable X is a function dom(X) → [0,1] such that x ↦ P(X = x).
Example: what is the distribution of cavity, given the µ(w) table above?
P(cavity = T) = 0.2, P(cavity = F) = 0.8
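
Computing such a distribution from the joint table is again enumeration, grouped by value; a sketch reusing the mu dict from above (the name P_cavity is mine):

```python
# Distribution of Cavity read off the joint table mu above.
P_cavity = {v: sum(p for (cavity, _, _), p in mu.items() if cavity == v)
            for v in (True, False)}
print(P_cavity)  # {True: 0.2, False: 0.8}, up to float rounding
```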

Probability distribution (non-binary)
A probability distribution P on a random variable X is a function dom(X) → [0,1] such that x ↦ P(X = x).
Example: the number of people in this room at this time. I believe that in this room…
A. There are exactly 79 people
B. There are more than 79 people
C. There are fewer than 79 people
D. None of the above

Probability distribution (non-binary)
A probability distribution P on a random variable X is a function dom(X) → [0,1] such that x ↦ P(X = x).
Example: the number of people in this room at this time.

Joint Probability Distributions
When we have multiple random variables, their joint distribution is a probability distribution over the Cartesian product of their domains, e.g., P(<X1, …, Xn>).
Think of a joint distribution over n variables as an n-dimensional table: each entry, indexed by X1 = x1, …, Xn = xn, corresponds to P(X1 = x1 ∧ … ∧ Xn = xn).
The sum of the entries across the whole table is 1.
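
To make the n-dimensional-table picture concrete, a tiny sketch; the uniform joint is just a placeholder choice of numbers, not from the slides.

```python
from itertools import product

# A joint over n Boolean variables is a table with 2**n entries summing to 1;
# here, the uniform joint over 3 variables, indexed by worlds (x1, x2, x3).
n = 3
joint = {w: 1 / 2 ** n for w in product([True, False], repeat=n)}
assert len(joint) == 2 ** n
assert abs(sum(joint.values()) - 1.0) < 1e-9
```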

Question
If you have the joint distribution of n variables, can you compute the probability distribution for each variable?
If you have the probability distribution for each of n variables, can you compute their joint?
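
A hint at the answers (treated properly next class): from the joint you can always recover each variable's distribution by summing, but the individual distributions do not determine the joint. A sketch with two different joints over a pair of fair coins that share the same marginals (names and numbers are mine):

```python
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
correlated = {(0, 0): 0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.5}

def dist_X(joint):
    # Distribution of the first coin, read off a joint over (X, Y).
    return {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}

print(dist_X(independent))  # {0: 0.5, 1: 0.5}
print(dist_X(correlated))   # {0: 0.5, 1: 0.5}: same marginals, different joints
```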

Learning Goals for today's class
You can:
- Define and give examples of random variables, their domains and probability distributions.
- Calculate the probability of a proposition f given µ(w) for the set of possible worlds.
- Define a joint probability distribution.

Next Class
More probability theory: Marginalization, Conditional Probability, Chain Rule, Bayes' Rule, Independence.
Assignment-3 (Logics) is out today.