Daphne Koller. Bayesian Networks: Semantics & Factorization. Probabilistic Graphical Models: Representation.

Presentation transcript:

Daphne Koller. Bayesian Networks: Semantics & Factorization. Probabilistic Graphical Models: Representation.

Daphne Koller The student example: Grade, Course Difficulty, Student Intelligence, Student SAT, Reference Letter.

Daphne Koller The Student Network. [Figure: the network over Difficulty, Intelligence, Grade, SAT, and Letter, annotated with its CPD tables P(D), P(I), P(G|I,D), P(S|I), and P(L|G).]

Defining a joint distribution. What is the joint distribution P(D,I,G,S,L)?
- P(D) P(I) P(G) P(S) P(L)
- P(D) P(I) P(G|I,D) P(S|I) P(L|G)
- P(D) P(I) P(G|I) P(G|D) P(S|I) P(L|G)
- P(D|G) P(I|D) P(S|I) P(G|L,I,D) P(L|G)
[Figure: the student network over Difficulty, Intelligence, Grade, SAT, and Letter.]

Daphne Koller The Chain Rule for Bayesian Nets. [Figure: the student network over Difficulty, Intelligence, Grade, SAT, and Letter.] P(D,I,G,S,L) = P(D) P(I) P(G | I,D) P(S | I) P(L | G)
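To make the factorization concrete, here is a minimal Python sketch of this chain rule for the student network. The CPD numbers below are illustrative placeholders (not read off this slide); the variable and value names (d0/d1, i0/i1, g1-g3, s0/s1, l0/l1) follow the slides.

# Chain rule for the student network:
# P(D, I, G, S, L) = P(D) * P(I) * P(G | I, D) * P(S | I) * P(L | G)
# The CPD numbers are illustrative placeholders, not the slide's exact values.

P_D = {"d0": 0.6, "d1": 0.4}                      # course difficulty
P_I = {"i0": 0.7, "i1": 0.3}                      # student intelligence
P_G_given_ID = {                                  # grade given intelligence, difficulty
    ("i0", "d0"): {"g1": 0.30, "g2": 0.40, "g3": 0.30},
    ("i0", "d1"): {"g1": 0.05, "g2": 0.25, "g3": 0.70},
    ("i1", "d0"): {"g1": 0.90, "g2": 0.08, "g3": 0.02},
    ("i1", "d1"): {"g1": 0.50, "g2": 0.30, "g3": 0.20},
}
P_S_given_I = {"i0": {"s0": 0.95, "s1": 0.05},    # SAT given intelligence
               "i1": {"s0": 0.20, "s1": 0.80}}
P_L_given_G = {"g1": {"l0": 0.10, "l1": 0.90},    # letter given grade
               "g2": {"l0": 0.40, "l1": 0.60},
               "g3": {"l0": 0.99, "l1": 0.01}}

def joint(d, i, g, s, l):
    """Chain rule for the student BN: product of the five CPD entries."""
    return (P_D[d] * P_I[i] * P_G_given_ID[(i, d)][g]
            * P_S_given_I[i][s] * P_L_given_G[g][l])

# Example: easy course, smart student, an A, high SAT score, strong letter.
print(joint("d0", "i1", "g1", "s1", "l1"))  # 0.6 * 0.3 * 0.9 * 0.8 * 0.9 ≈ 0.11664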

Daphne Koller Bayesian Network. A Bayesian network is:
- a directed acyclic graph (DAG) G whose nodes represent random variables X1,…,Xn
- for each node Xi, a CPD P(Xi | ParG(Xi)), where ParG(Xi) are Xi's parents in G
The BN represents a joint distribution via the chain rule for Bayesian networks: P(X1,…,Xn) = Π_i P(Xi | ParG(Xi)).
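A minimal sketch of the structural requirement in this definition (the graph must be a DAG), using the student network's edges and a standard topological-sort check; the parent-map representation is an illustrative choice, not anything prescribed by the slides.

from collections import deque

# The student network's DAG, as a map from each node to its parents.
parents = {
    "D": [], "I": [],
    "G": ["I", "D"],     # Grade depends on Intelligence and Difficulty
    "S": ["I"],          # SAT depends on Intelligence
    "L": ["G"],          # Letter depends on Grade
}

def is_dag(parents):
    """Kahn's algorithm: the graph is acyclic iff every node can be popped."""
    indegree = {v: len(ps) for v, ps in parents.items()}
    children = {v: [] for v in parents}
    for v, ps in parents.items():
        for p in ps:
            children[p].append(v)
    queue = deque(v for v, d in indegree.items() if d == 0)
    popped = 0
    while queue:
        u = queue.popleft()
        popped += 1
        for c in children[u]:
            indegree[c] -= 1
            if indegree[c] == 0:
                queue.append(c)
    return popped == len(parents)

print(is_dag(parents))  # True: the student network is a legal BN structure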

Daphne Koller BN Is a Legal Distribution: P ≥ 0, and the entries of P sum to 1.
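A sketch of why the second property holds for the student network's factorization: each CPD sums to 1 over its own variable, so the variables can be summed out one at a time, innermost first. The first property holds because every factor is a CPD entry and hence nonnegative.

\begin{align*}
\sum_{D,I,G,S,L} P(D)\,P(I)\,P(G \mid I,D)\,P(S \mid I)\,P(L \mid G)
 &= \sum_{D,I,G,S} P(D)\,P(I)\,P(G \mid I,D)\,P(S \mid I) \Big[\sum_{L} P(L \mid G)\Big] \\
 &= \sum_{D,I,G} P(D)\,P(I)\,P(G \mid I,D) \Big[\sum_{S} P(S \mid I)\Big] \\
 &= \sum_{D,I} P(D)\,P(I) \Big[\sum_{G} P(G \mid I,D)\Big] \\
 &= \Big[\sum_{D} P(D)\Big] \Big[\sum_{I} P(I)\Big] = 1,
\end{align*}

since each bracketed sum of a CPD over its own variable equals 1.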

What is the value of [expression shown only in the slide image]?
- 1
- P(L)
- P(G)
- None of the above

Daphne Koller P Factorizes over G. Let G be a graph over X1,…,Xn. P factorizes over G if P(X1,…,Xn) = Π_i P(Xi | ParG(Xi)).
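A small generic sketch of this definition: the graph given as a parent map plus one CPD per variable, and the joint evaluated as the product of the selected CPD entries. The dictionary representation (CPDs keyed by tuples of parent values) is just one convenient choice, not anything prescribed by the slides.

# A factorized distribution: for each variable X, parents[X] lists ParG(X) and
# cpd[X] maps (tuple of parent values) -> {value of X: probability}.
# This mirrors the definition P(X1,…,Xn) = Π_i P(Xi | ParG(Xi)).

def joint_probability(parents, cpd, assignment):
    """Evaluate P(assignment) as the product of the CPD entries it selects."""
    p = 1.0
    for var, par_list in parents.items():
        parent_values = tuple(assignment[u] for u in par_list)
        p *= cpd[var][parent_values][assignment[var]]
    return p

# Tiny two-node example (X -> Y) with illustrative numbers.
parents = {"X": [], "Y": ["X"]}
cpd = {
    "X": {(): {0: 0.4, 1: 0.6}},
    "Y": {(0,): {0: 0.9, 1: 0.1}, (1,): {0: 0.2, 1: 0.8}},
}
print(joint_probability(parents, cpd, {"X": 1, "Y": 1}))  # 0.6 * 0.8 ≈ 0.48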

Daphne Koller Genetic Inheritance. [Figure: a pedigree over Homer, Marge, Bart, Lisa, Maggie, Clancy, Jackie, and Selma.]

Daphne Koller BNs for Genetic Inheritance. [Figure: a BN with a genotype variable G_p and a blood-type variable B_p for each person p in {Homer, Marge, Bart, Lisa, Maggie, Clancy, Jackie, Selma}.]
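A structural sketch of this network in Python, assuming the usual Simpsons pedigree (Clancy and Jackie are the parents of Marge and Selma; Homer and Marge are the parents of Bart, Lisa, and Maggie); only the graph is built, and the CPDs are omitted.

# Bayesian network structure for genetic inheritance (graph only, no CPDs).
# Assumed pedigree: Clancy & Jackie -> Marge, Selma; Homer & Marge -> Bart, Lisa, Maggie.

pedigree = {                      # child -> (parent, parent); founders have no entry
    "Marge": ("Clancy", "Jackie"),
    "Selma": ("Clancy", "Jackie"),
    "Bart": ("Homer", "Marge"),
    "Lisa": ("Homer", "Marge"),
    "Maggie": ("Homer", "Marge"),
}
people = ["Clancy", "Jackie", "Homer", "Marge", "Selma", "Bart", "Lisa", "Maggie"]

# BN parents: a person's genotype depends on the parents' genotypes (if in the pedigree),
# and each blood type depends only on that person's own genotype.
bn_parents = {}
for p in people:
    mom_dad = pedigree.get(p, ())
    bn_parents[f"G_{p}"] = [f"G_{q}" for q in mom_dad]   # genotype node
    bn_parents[f"B_{p}"] = [f"G_{p}"]                    # blood-type (phenotype) node

for node, pars in bn_parents.items():
    print(node, "<-", pars)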

Daphne Koller END


The Chain Rule for Bayesian Nets. [Figure: the student network over Difficulty, Intelligence, Grade, SAT, and Letter, with its CPD tables.] P(D,I,G,S,L) = P(D) P(I) P(G | I,D) P(L | G) P(S | I)
