
Our Status

We’re done with Part I, Search and Planning (for now at least). Next up: probabilistic reasoning. Recall the AI winter: logic-based systems struggled with exceptions and needed far more hand-coded knowledge than anyone could supply (knowledge that, it turns out, can be learned). Some of what probabilistic reasoning makes possible:

- Diagnosis: temperature, blood pressure, types of pain → disease
- Speech: a temporal signal of pressure variation → words and sentences
- Tracking objects: e.g. the helicopter you saw in the last lecture combines GPS data with accelerometers, gyroscopes, and a magnetic compass to infer its position and orientation using probabilistic reasoning
- Robot mapping: it turns out that if you put a camera on the helicopter, it can not only fly upside down but also build a map of the environment!
- Genetics: discovery of gene interactions from gene expression data
- Error-correcting codes: used on pretty much all our digital communication lines; they ensure that even in the presence of noise (which is everywhere) we can communicate reliably

OK, so how do we collect all that knowledge in general? That’s Part III.

Good news: we are going to learn about the tools that got us out of the AI winter. The other news: it’s going to get very mathematical, a total shift in gears compared to what we have looked at before. And yet other news: it’s time for a fresh start, which might help if you fell behind in the past weeks.

CS 188: Artificial Intelligence
Probability

Please retain proper attribution, including the reference to ai.berkeley.edu. Thanks!

Instructors: David Suter and Qince Li. Course delivered at Harbin Institute of Technology. [Many slides adapted from those created by Dan Klein and Pieter Abbeel for CS 188: Intro to AI at UC Berkeley. Some others from colleagues at Adelaide University.]

Topics

- Probability: random variables; joint and marginal distributions; conditional distributions; the product rule, chain rule, and Bayes’ rule
- Inference
- Independence

You’ll need all of this A LOT for the next lectures, so make sure you go over it now! Probabilistic inference will become as routine a computation as “sort this list” or “add these two numbers”.

Random Variables

A random variable is some aspect of the world about which we (may) have uncertainty:
- R = Is it raining?
- T = Is it hot or cold?
- D = How long will it take to drive to work?
- L = Where is the ghost?

We denote random variables with capital letters. Random variables have domains:
- R in {true, false} (often written as {+r, -r})
- T in {hot, cold}
- D in [0, ∞)
- L in possible locations, maybe {(0,0), (0,1), …}

Probability Distributions

Associate a probability with each value.

Weather:
W       P
sun     0.6
rain    0.1
fog     0.3
meteor  0.0

Temperature:
T       P
hot     0.5
cold    0.5

These are discrete distributions!

Probability Distributions

Unobserved random variables have distributions. A distribution is a TABLE of probabilities of values; a probability (a lower-case value) is a single number.

W       P
sun     0.6
rain    0.1
fog     0.3
meteor  0.0

T       P
hot     0.5
cold    0.5

Must have: P(x) ≥ 0 for every x, and Σ_x P(x) = 1.

Shorthand notation: P(hot) means P(T = hot); OK if all domain entries are unique.
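To make the table/number distinction concrete, here is a minimal sketch (not from the slides) representing the weather distribution as a plain Python dict; the name P_W is chosen for illustration:

```python
# The dict is the distribution (the whole table); each entry,
# e.g. P_W["sun"], is a single probability.
P_W = {"sun": 0.6, "rain": 0.1, "fog": 0.3, "meteor": 0.0}

assert all(0.0 <= p <= 1.0 for p in P_W.values())   # each entry in [0, 1]
assert abs(sum(P_W.values()) - 1.0) < 1e-9          # entries sum to 1

print(P_W["sun"])  # 0.6, i.e. the single number P(W = sun)
```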

Joint Distributions

A joint distribution over a set of random variables X_1, …, X_n specifies a real number for each assignment (or outcome): P(X_1 = x_1, …, X_n = x_n), written P(x_1, …, x_n).

Must obey: P(x_1, …, x_n) ≥ 0, and the probabilities of all assignments sum to 1.

Size of the distribution if there are n variables with domain sizes d? d^n. For all but the smallest distributions, impractical to write out!

T     W     P
hot   sun   0.4
hot   rain  0.1
cold  sun   0.2
cold  rain  0.3
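A hypothetical in-code representation of the joint above, keyed by assignment tuples (a sketch, not the course’s official data structure):

```python
from math import prod

P_TW = {
    ("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2, ("cold", "rain"): 0.3,
}
assert abs(sum(P_TW.values()) - 1.0) < 1e-9  # joint must sum to 1

# Table size is the product of the domain sizes (d**n when all are size d):
domains = {"T": ["hot", "cold"], "W": ["sun", "rain"]}
print(prod(len(dom) for dom in domains.values()))  # 4 rows here
```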

Probabilistic Models

A probabilistic model is a joint distribution over a set of random variables.

Probabilistic models:
- (Random) variables with domains
- Assignments are called outcomes
- Joint distributions: say whether assignments (outcomes) are likely
- Normalized: sum to 1.0
- Ideally: only certain variables directly interact

Distribution over T, W:
T     W     P
hot   sun   0.4
hot   rain  0.1
cold  sun   0.2
cold  rain  0.3

Events

An event is a set E of outcomes: P(E) = Σ_{(x_1,…,x_n) ∈ E} P(x_1, …, x_n).

From a joint distribution, we can calculate the probability of any event:
- Probability that it’s hot AND sunny?
- Probability that it’s hot?
- Probability that it’s hot OR sunny?

Typically, the events we care about are partial assignments, like P(T = hot).

T     W     P
hot   sun   0.4
hot   rain  0.1
cold  sun   0.2
cold  rain  0.3
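A small sketch of those three queries, summing outcome probabilities over each event (the helper p_event is invented here for illustration):

```python
P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def p_event(pred):
    """P(E) = sum of P(outcome) over the outcomes that `pred` selects."""
    return sum(p for outcome, p in P_TW.items() if pred(outcome))

print(p_event(lambda o: o == ("hot", "sun")))             # hot AND sunny: 0.4
print(p_event(lambda o: o[0] == "hot"))                   # hot: 0.5
print(p_event(lambda o: o[0] == "hot" or o[1] == "sun"))  # hot OR sunny: 0.7
```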

Marginal Distributions

Marginal distributions are sub-tables that eliminate variables. Marginalization (summing out): combine collapsed rows by adding, e.g. P(t) = Σ_s P(t, s).

Joint P(T, W):
T     W     P
hot   sun   0.4
hot   rain  0.1
cold  sun   0.2
cold  rain  0.3

Summing out W gives P(T):
T     P
hot   0.5
cold  0.5

Summing out T gives P(W):
W     P
sun   0.6
rain  0.4
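A minimal sketch of summing out, using the tuple-keyed joint from before (the function marginal is a name chosen here, not from the slides):

```python
from collections import defaultdict

P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def marginal(joint, keep):
    """Sum out every variable except the one at tuple position `keep`."""
    out = defaultdict(float)
    for outcome, p in joint.items():
        out[outcome[keep]] += p   # collapsed rows are combined by adding
    return dict(out)

print(marginal(P_TW, 0))  # {'hot': 0.5, 'cold': 0.5}  = P(T)
print(marginal(P_TW, 1))  # {'sun': 0.6, 'rain': 0.4}  = P(W)
```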

Quiz: Marginal Distributions

Given the joint P(X, Y), fill in the marginals P(X) and P(Y):

X    Y    P
+x   +y   0.2
+x   -y   0.3
-x   +y   0.4
-x   -y   0.1

X    P        Y    P
+x   ___      +y   ___
-x   ___      -y   ___

Conditional Probabilities

A simple relation between joint and conditional probabilities; in fact, this is taken as the definition of a conditional probability:

P(a | b) = P(a, b) / P(b)

T     W     P
hot   sun   0.4
hot   rain  0.1
cold  sun   0.2
cold  rain  0.3
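A one-line application of the definition to the table above (a sketch; the variable names are illustrative):

```python
P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

# P(W = sun | T = cold) = P(cold, sun) / P(cold), straight from the definition.
p_cold_sun = P_TW[("cold", "sun")]                            # 0.2
p_cold = sum(p for (t, _), p in P_TW.items() if t == "cold")  # 0.5
print(p_cold_sun / p_cold)                                    # 0.4
```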

Conditional Distributions

Conditional distributions are probability distributions over some variables, given fixed values of others.

Joint distribution P(T, W):
T     W     P
hot   sun   0.4
hot   rain  0.1
cold  sun   0.2
cold  rain  0.3

Conditional distributions:

P(W | T = hot):
W     P
sun   0.8
rain  0.2

P(W | T = cold):
W     P
sun   0.4
rain  0.6
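A sketch of building a whole conditional distribution, not just one number, by fixing T and rescaling (cond_W_given_T is a hypothetical helper):

```python
P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def cond_W_given_T(joint, t):
    """P(W | T = t): fix T = t in the joint, then rescale over W."""
    rows = {w: p for (tt, w), p in joint.items() if tt == t}
    z = sum(rows.values())          # this sum is P(T = t)
    return {w: p / z for w, p in rows.items()}

print(cond_W_given_T(P_TW, "hot"))   # {'sun': 0.8, 'rain': 0.2}
print(cond_W_given_T(P_TW, "cold"))  # {'sun': 0.4, 'rain': 0.6}
```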

Normalization Trick

To compute a conditional such as P(W | T = cold) from the joint, the definition says to divide by P(T = cold), which we get by marginalizing (summing the matching rows) to obtain the denominator:

P(W = sun | T = cold) = P(cold, sun) / P(cold) = 0.2 / (0.2 + 0.3) = 0.4

T     W     P
hot   sun   0.4
hot   rain  0.1
cold  sun   0.2
cold  rain  0.3

P(W | T = cold):
W     P
sun   0.4
rain  0.6

Normalization Trick

SELECT the joint probabilities matching the evidence, then NORMALIZE the selection (make it sum to one); see the sketch below.

Joint:
T     W     P
hot   sun   0.4
hot   rain  0.1
cold  sun   0.2
cold  rain  0.3

SELECT (T = cold):
T     W     P
cold  sun   0.2
cold  rain  0.3

NORMALIZE:
W     P
sun   0.4
rain  0.6

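The two steps as code (a sketch; the fixed variable-order convention in NAMES is an assumption of this representation):

```python
P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}
NAMES = ["T", "W"]  # tuple position of each variable

def select(joint, evidence):
    """Step 1: keep only the rows consistent with the evidence."""
    return {o: p for o, p in joint.items()
            if all(o[NAMES.index(v)] == val for v, val in evidence.items())}

def normalize(table):
    """Step 2: rescale the surviving rows so they sum to one."""
    z = sum(table.values())
    return {o: p / z for o, p in table.items()}

print(normalize(select(P_TW, {"T": "cold"})))
# {('cold', 'sun'): 0.4, ('cold', 'rain'): 0.6}, i.e. P(W | T = cold)
```

Note the trick: we never compute P(T = cold) explicitly; normalizing the selected rows supplies the denominator automatically.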

Probabilistic Inference

Probabilistic inference: compute a desired probability from other known probabilities (e.g. a conditional from the joint).

We generally compute conditional probabilities, e.g. P(on time | no reported accidents) = 0.90. These represent the agent’s beliefs given the evidence.

Probabilities change with new evidence:
P(on time | no accidents, 5 a.m.) = 0.95
P(on time | no accidents, 5 a.m., raining) = 0.80

Observing new evidence causes beliefs to be updated.

Inference by Enumeration

General case:
- Evidence variables: E_1, …, E_k = e_1, …, e_k
- Query* variable: Q
- Hidden variables: H_1, …, H_r
(together these are all the variables X_1, …, X_n)

We want: P(Q | e_1, …, e_k)

Step 1: Select the entries consistent with the evidence.
Step 2: Sum out H to get the joint of the query and the evidence: P(Q, e_1, …, e_k) = Σ_{h_1,…,h_r} P(Q, h_1, …, h_r, e_1, …, e_k)
Step 3: Normalize: P(Q | e_1, …, e_k) = P(Q, e_1, …, e_k) / P(e_1, …, e_k)

* Works fine with multiple query variables, too.
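The three steps as one generic routine (a sketch under the same tuple-keyed representation; enumerate_query is a hypothetical helper, not course-provided code):

```python
from collections import defaultdict

def enumerate_query(joint, names, query, evidence):
    """P(query | evidence) from a full joint table.

    joint    : dict mapping assignment tuples (ordered as `names`) to probs
    query    : name of the single query variable
    evidence : dict of observed variable -> value
    Hidden variables are whatever is neither query nor evidence.
    """
    qi = names.index(query)
    totals = defaultdict(float)
    for outcome, p in joint.items():
        # Step 1: keep entries consistent with the evidence.
        if all(outcome[names.index(v)] == val for v, val in evidence.items()):
            # Step 2: adding into `totals` sums out the hidden variables.
            totals[outcome[qi]] += p
    z = sum(totals.values())                       # Step 3: normalize.
    return {val: p / z for val, p in totals.items()}

P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}
print(enumerate_query(P_TW, ["T", "W"], "W", {"T": "cold"}))
# {'sun': 0.4, 'rain': 0.6}
```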

Inference by Enumeration

Obvious problems:
- Worst-case time complexity O(d^n)
- Space complexity O(d^n) to store the joint distribution

The Product Rule

Sometimes we have conditional distributions but want the joint:

P(x, y) = P(x | y) P(y)

The Product Rule

Example:

P(W):
W     P
sun   0.8
rain  0.2

P(D | W):
D     W     P
wet   sun   0.1
dry   sun   0.9
wet   rain  0.7
dry   rain  0.3

P(D, W) = P(D | W) P(W):
D     W     P
wet   sun   0.08
dry   sun   0.72
wet   rain  0.14
dry   rain  0.06
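The same multiplication, row by row, in code (a sketch of the example above):

```python
P_W = {"sun": 0.8, "rain": 0.2}                              # P(W)
P_D_given_W = {("wet", "sun"): 0.1, ("dry", "sun"): 0.9,
               ("wet", "rain"): 0.7, ("dry", "rain"): 0.3}   # P(D | W)

# Product rule: P(d, w) = P(d | w) * P(w)
P_DW = {(d, w): p * P_W[w] for (d, w), p in P_D_given_W.items()}
for row, p in sorted(P_DW.items()):
    print(row, round(p, 2))
# ('dry', 'rain') 0.06   ('dry', 'sun') 0.72
# ('wet', 'rain') 0.14   ('wet', 'sun') 0.08
```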

The Chain Rule

More generally, we can always write any joint distribution as an incremental product of conditional distributions:

P(x_1, x_2, …, x_n) = Π_i P(x_i | x_1, …, x_{i-1})

Why is this always true? By the product rule, P(x_2 | x_1) P(x_1) = P(x_1, x_2); this is just the rule applied to the compound event (x_1, x_2). Thus we can recurse:

P(x_1, …, x_n) = P(x_n | x_1, …, x_{n-1}) P(x_1, …, x_{n-1})
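A toy numerical check (a sketch on an arbitrary random joint over three binary variables; the product of conditionals telescopes back to the joint, which is exactly why the rule always holds):

```python
import itertools
import random

random.seed(0)
weights = [random.random() for _ in range(8)]
z = sum(weights)
P = {o: w / z for o, w in zip(itertools.product((0, 1), repeat=3), weights)}

def marg(pred):
    return sum(p for o, p in P.items() if pred(o))

x = (1, 0, 1)
p1 = marg(lambda o: o[0] == x[0])             # P(x1)
p2 = marg(lambda o: o[:2] == x[:2]) / p1      # P(x2 | x1)
p3 = P[x] / marg(lambda o: o[:2] == x[:2])    # P(x3 | x1, x2)
assert abs(p1 * p2 * p3 - P[x]) < 1e-12       # chain product equals the joint
```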

Bayes’ Rule

Bayes’ Rule

Two ways to factor a joint distribution over two variables:

P(x, y) = P(x | y) P(y) = P(y | x) P(x)

Dividing, we get:

P(x | y) = P(y | x) P(x) / P(y)

Why is this at all helpful?
- Lets us build one conditional from its reverse
- Often one conditional is tricky but the other one is simple
- Foundation of many systems we’ll see later (e.g. ASR)

In the running for most important AI equation!
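A quick sketch of building one conditional from its reverse, reusing numbers from the T, W example earlier (the bayes helper is illustrative):

```python
def bayes(p_y_given_x, p_x, p_y):
    """P(x | y) = P(y | x) P(x) / P(y)."""
    return p_y_given_x * p_x / p_y

# Recover P(T = cold | W = sun) from P(W = sun | T = cold) and the marginals:
p_sun_given_cold = 0.4        # from the conditional table earlier
p_cold, p_sun = 0.5, 0.6      # marginals of the joint P(T, W)
print(bayes(p_sun_given_cold, p_cold, p_sun))   # 0.2 / 0.6 ≈ 0.3333
```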

Inference with Bayes’ Rule

Example: diagnostic probability from causal probability:

P(cause | effect) = P(effect | cause) P(cause) / P(effect)

Example: M = meningitis, S = stiff neck:

P(+m | +s) = P(+s | +m) P(+m) / P(+s)

Note: the posterior probability of meningitis is still very small.
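A worked sketch of this calculation. The numeric givens are not in this transcript, so the values below are the ones commonly used with this example and should be treated as assumptions:

```python
# Assumed givens (commonly used with the meningitis / stiff-neck example):
p_m = 0.0001            # prior P(+m)
p_s_given_m = 0.8       # causal: P(+s | +m)
p_s_given_not_m = 0.01  # P(+s | -m)

# P(+s) by total probability, then Bayes' rule for the diagnostic direction:
p_s = p_s_given_m * p_m + p_s_given_not_m * (1 - p_m)
p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)  # ~0.008: the posterior is still very small
```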