Made by: Maor Levy, Temple University, 2012

• Probability expresses uncertainty.
• Pervasive in all of Artificial Intelligence:
  ◦ Machine learning
  ◦ Information Retrieval (e.g., Web)
  ◦ Computer Vision
  ◦ Robotics
• Based on a mathematical calculus.
• Probability of a fair coin: P(heads) = P(tails) = 1/2.
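
As a quick illustration of the fair-coin claim, here is a minimal simulation sketch (illustrative only, not from the slides): the empirical frequency of heads converges to 1/2.

```python
# Minimal sketch: estimate P(heads) for a fair coin by simulation.
# The true value is 1/2; the empirical frequency approaches it as the
# number of flips grows (law of large numbers).
import random

flips = 100_000
heads = sum(random.random() < 0.5 for _ in range(flips))
print(heads / flips)  # roughly 0.5
```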

• A belief network consists of:
  ◦ a directed acyclic graph with nodes labeled with random variables;
  ◦ a domain for each random variable;
  ◦ a conditional probability table for each variable given its parents (including prior probabilities for nodes with no parents).
• A belief network is a graph: the nodes are random variables, and there is an arc from the parents of each node into that node.
  ◦ A belief network is acyclic by construction.
  ◦ The parents of a node n are those variables on which n directly depends.
• A belief network is a graphical representation of dependence and independence: a variable is independent of its non-descendants given its parents.
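
These three components can be captured directly as data. A minimal sketch, using a hypothetical three-node network with made-up probabilities (not from the slides):

```python
# Hypothetical network A -> B, A -> C, B -> C, with made-up probabilities.
parents = {"A": [], "B": ["A"], "C": ["A", "B"]}   # the DAG, as parent lists
domains = {v: (True, False) for v in parents}      # a domain per variable

# One CPT per variable: tuple of parent values -> P(variable = True).
# A prior (no parents) is the entry for the empty tuple.
cpts = {
    "A": {(): 0.3},
    "B": {(True,): 0.9, (False,): 0.2},
    "C": {(True, True): 0.95, (True, False): 0.5,
          (False, True): 0.6,  (False, False): 0.01},
}
```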

• Whether light l1 is lit (L1_lit) depends only on the status of the light (L1_st) and whether there is power in wire w0. Thus, L1_lit is independent of the other variables given L1_st and W0.
• In a belief network, W0 and L1_st are parents of L1_lit.
• Similarly, W0 depends only on whether there is power in w1, whether there is power in w2, the position of switch s2 (S2_pos), and the status of switch s2 (S2_st).

• To represent a domain in a belief network, you need to consider:
  ◦ What are the relevant variables? What will you observe? What would you like to find out (query)? What other features make the model simpler?
  ◦ What values should these variables take?
  ◦ What is the relationship between them? This should be expressed in terms of local influence.
  ◦ How does the value of each variable depend on its parents? This is expressed in terms of conditional probabilities.

• The power network can be used in a number of ways:
  ◦ Conditioning on the status of the switches and circuit breakers, whether there is outside power, and the position of the switches, you can simulate the lighting.
  ◦ Given values for the switches, the outside power, and whether the lights are lit, you can determine the posterior probability that each switch or circuit breaker is OK or not.
  ◦ Given some switch positions, some outputs, and some intermediate values, you can determine the probability of any other variable in the network.
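
The first use, simulating the lighting from the switch settings, is forward (ancestral) sampling. A minimal sketch, with a hypothetical two-node fragment of the power network and assumed numbers:

```python
import random

# Hypothetical fragment: Power -> LightOn, with assumed probabilities.
parents = {"Power": [], "LightOn": ["Power"]}
cpts = {"Power": {(): 0.95},
        "LightOn": {(True,): 0.9, (False,): 0.0}}

def sample_once(order):
    """Draw one joint sample: visit variables in topological order,
    sampling each from its CPT given already-sampled parent values."""
    values = {}
    for var in order:
        pvals = tuple(values[p] for p in parents[var])
        values[var] = random.random() < cpts[var][pvals]
    return values

print(sample_once(["Power", "LightOn"]))
```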

• A Bayes network is a form of probabilistic graphical model. Specifically, a Bayes network is a directed acyclic graph of nodes representing variables and arcs representing dependence relations among the variables.
• It is a representation of the joint distribution over all the variables represented by nodes in the graph. Let the variables be X_1, ..., X_n, and let parents(X_i) be the parents of the node X_i.
• Then the joint distribution for X_1 through X_n is represented as the product of the local conditional distributions:
  P(X_1, ..., X_n) = ∏_{i=1}^{n} P(X_i | parents(X_i))
• If X has no parents, its probability distribution is said to be unconditional; otherwise it is conditional.
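
In code, this factorization is a single loop: multiply each variable's CPT entry given its parents' values. A sketch with a hypothetical chain A → B → C and made-up numbers:

```python
# Hypothetical chain A -> B -> C; all probabilities are made up.
parents = {"A": [], "B": ["A"], "C": ["B"]}
cpts = {               # tuple of parent values -> P(variable = True)
    "A": {(): 0.3},
    "B": {(True,): 0.9, (False,): 0.2},
    "C": {(True,): 0.7, (False,): 0.1},
}

def joint(assignment):
    """P(X_1 = x_1, ..., X_n = x_n) as the product of CPT entries."""
    p = 1.0
    for var, value in assignment.items():
        pvals = tuple(assignment[q] for q in parents[var])
        p_true = cpts[var][pvals]
        p *= p_true if value else 1.0 - p_true
    return p

print(joint({"A": True, "B": True, "C": False}))  # 0.3 * 0.9 * 0.3 = 0.081
```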

• Examples of Bayes networks (figures omitted).

• X and Y are conditionally independent given a third event Z precisely if the occurrence or non-occurrence of X and the occurrence or non-occurrence of Y are independent events in their conditional probability distribution given Z. We write:
  P(X, Y | Z) = P(X | Z) P(Y | Z), or equivalently P(X | Y, Z) = P(X | Z).
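
The definition can be checked numerically. The sketch below builds a small joint distribution that is conditionally independent by construction (all numbers are arbitrary) and verifies P(x, y | z) = P(x | z) P(y | z) for every assignment:

```python
import itertools

# Arbitrary numbers; the joint is built as P(z) P(x|z) P(y|z), so X and Y
# are conditionally independent given Z by construction.
pz   = {True: 0.4, False: 0.6}   # P(Z = z)
px_z = {True: 0.7, False: 0.2}   # P(X = True | Z = z)
py_z = {True: 0.1, False: 0.8}   # P(Y = True | Z = z)

def p(x, y, z):
    px = px_z[z] if x else 1 - px_z[z]
    py = py_z[z] if y else 1 - py_z[z]
    return pz[z] * px * py

for x, y, z in itertools.product((True, False), repeat=3):
    marg_z = sum(p(a, b, z) for a in (True, False) for b in (True, False))
    lhs = p(x, y, z) / marg_z                                  # P(x, y | z)
    rhs = ((sum(p(x, b, z) for b in (True, False)) / marg_z)   # P(x | z)
           * (sum(p(a, y, z) for a in (True, False)) / marg_z))  # P(y | z)
    assert abs(lhs - rhs) < 1e-12
print("P(X, Y | Z) = P(X | Z) P(Y | Z) holds for all assignments")
```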

• Are X and Y conditionally independent given evidence variables {Z}?
  ◦ Yes, if X and Y are "separated" by Z.
  ◦ Look for active paths from X to Y; no active paths means independence!
• A path is active if each triple along it is active:
  ◦ Causal chain A → B → C where B is unobserved (either direction)
  ◦ Common cause A ← B → C where B is unobserved
  ◦ Common effect (aka v-structure) A → B ← C where B or one of its descendants is observed
• All it takes to block a path is a single inactive segment.
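
These three triple rules are all an algorithm needs. The sketch below is a standard reachability-style d-separation check (not code from the slides): it walks edges while tracking the direction of arrival, passing through a node only when the triple it completes is active.

```python
from collections import deque

def d_separated(x, y, z, parents):
    """True iff x and y are d-separated given the set z in the DAG
    described by `parents` (every node must appear as a key)."""
    children = {v: [] for v in parents}
    for v, ps in parents.items():
        for p in ps:
            children[p].append(v)

    # Ancestors of z: the common-effect rule fires when the middle node
    # is observed or has an observed descendant, i.e. is an ancestor of z.
    anc = set()
    stack = list(z)
    while stack:
        v = stack.pop()
        for p in parents[v]:
            if p not in anc:
                anc.add(p)
                stack.append(p)

    # Search over (node, direction) states; "up" means the trail arrived
    # at the node from one of its children, "down" from one of its parents.
    visited = set()
    queue = deque([(x, "up")])
    while queue:
        node, direction = queue.popleft()
        if (node, direction) in visited:
            continue
        visited.add((node, direction))
        if node == y:
            return False  # an active path reaches y
        if direction == "up" and node not in z:
            # chain (continue to a parent) or common cause (to a child)
            for p in parents[node]:
                queue.append((p, "up"))
            for c in children[node]:
                queue.append((c, "down"))
        elif direction == "down":
            if node not in z:
                for c in children[node]:      # chain continues downward
                    queue.append((c, "down"))
            if node in z or node in anc:
                for p in parents[node]:       # v-structure is activated
                    queue.append((p, "up"))
    return True

# Classic v-structure R -> T <- B (say, Rain and Ballgame both cause Traffic):
g = {"R": [], "B": [], "T": ["R", "B"]}
print(d_separated("R", "B", set(), g))   # True: unobserved v-structure blocks
print(d_separated("R", "B", {"T"}, g))   # False: observing T activates it
```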

• Examples (d-separation example graphs over variables R, T, T′, B, D, L; figures omitted).

• Overview of Bayes networks:
  ◦ A graphical representation of joint distributions.
  ◦ Efficiently encode conditional independencies.
  ◦ Reduce the number of parameters from exponential in the number of variables to linear (in many cases, when each variable has few parents).
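
The parameter-count claim is easy to make concrete: for n binary variables, the full joint table needs 2^n − 1 numbers, while a chain-structured network needs only 1 + 2(n − 1). A tiny arithmetic sketch (chain structure assumed for illustration):

```python
# n binary variables: full joint table vs. a chain X1 -> X2 -> ... -> Xn.
n = 20
full_joint = 2**n - 1    # one probability per assignment: 1,048,575
chain = 1 + (n - 1) * 2  # 1 prior for X1, 2 CPT entries per later node: 39
print(full_joint, chain)
```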