I-maps and perfect maps

I-maps and perfect maps
Probabilistic Graphical Models: Representation, Independencies

Capturing Independencies in P
If P factorizes over G, then G is an I-map for P: I(G) ⊆ I(P). But not always vice versa: there can be independencies in I(P) that are not in I(G).
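
To see the gap concretely, here is a minimal Python sketch (my own toy example, not from the slides): P makes X and Y independent fair coins, so the one-edge network X -> Y is an I-map of P because I(G) is empty, yet it fails to encode the independence X ⊥ Y that P actually satisfies.

from itertools import product

# Joint distribution P(X, Y): two independent fair coins.
P = {(x, y): 0.25 for x, y in product([0, 1], repeat=2)}

def marginal(index):
    """Marginal of the variable at the given position of the joint table."""
    m = {}
    for assignment, p in P.items():
        m[assignment[index]] = m.get(assignment[index], 0.0) + p
    return m

pX, pY = marginal(0), marginal(1)

# X ⊥ Y holds in P, so (X ⊥ Y) is in I(P) ...
print(all(abs(P[(x, y)] - pX[x] * pY[y]) < 1e-12 for x, y in P))  # True
# ... yet the one-edge BN X -> Y encodes no independencies (I(G) is empty),
# so it is an I-map of P, just not an informative one.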

Want a Sparse Graph
If the graph encodes more independencies, it is sparser (has fewer parameters) and more informative. We want a graph that captures as much of the structure in P as possible.
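
As a rough illustration of the parameter count (my own numbers, not on the slide): over n binary variables, a fully connected Bayesian network needs 2^n - 1 independent parameters, while one in which every variable has at most k parents needs at most n * 2^k. For n = 10 and k = 2 that is 1023 versus at most 40 parameters, which is why encoding more independencies (fewer edges) pays off.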

Minimal I-map
An I-map without redundant edges: removing any single edge would make the graph no longer an I-map of P.
[Figure: two graphs over the variables I, D, G.]
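
The construction behind minimal I-maps can be sketched in code. The brute-force Python snippet below follows the standard ordering-based procedure: for each variable in a fixed ordering, pick a smallest subset of its predecessors that screens it off from the remaining predecessors, and make that subset its parent set. The toy distribution, the noisy-AND style CPD, and the binary versions of D, I, G are my own choices, not from the slides.

from itertools import combinations, product

VARS = ["D", "I", "G"]  # Difficulty, Intelligence, Grade (binary here for simplicity)
IDX = {v: k for k, v in enumerate(VARS)}

# Toy joint P(D, I, G): D and I independent, G depends on both.
def joint():
    P = {}
    for d, i, g in product([0, 1], repeat=3):
        p_g1 = 0.9 if (d == 0 and i == 1) else 0.3
        P[(d, i, g)] = 0.5 * 0.5 * (p_g1 if g == 1 else 1 - p_g1)
    return P

P = joint()

def cond_indep(P, xs, ys, zs, tol=1e-9):
    """Check X ⊥ Y | Z by comparing P(x,y|z) with P(x|z)P(y|z) for all assignments.
    (Assumes every joint assignment has positive probability, as in this toy table.)"""
    def marg(keep):
        m = {}
        for a, p in P.items():
            key = tuple(a[IDX[v]] for v in keep)
            m[key] = m.get(key, 0.0) + p
        return m
    pz, pxz, pyz, pxyz = marg(zs), marg(xs + zs), marg(ys + zs), marg(xs + ys + zs)
    for a in pxyz:
        x, y, z = a[:len(xs)], a[len(xs):len(xs) + len(ys)], a[len(xs) + len(ys):]
        if pz.get(z, 0) == 0:
            continue
        lhs = pxyz[a] / pz[z]
        rhs = (pxz[x + z] / pz[z]) * (pyz[y + z] / pz[z])
        if abs(lhs - rhs) > tol:
            return False
    return True

def minimal_imap(ordering):
    """For each variable, the smallest predecessor subset that screens off the rest."""
    parents = {}
    for k, x in enumerate(ordering):
        preds = ordering[:k]
        for size in range(len(preds) + 1):
            found = None
            for U in combinations(preds, size):
                rest = [v for v in preds if v not in U]
                if not rest or cond_indep(P, [x], rest, list(U)):
                    found = list(U)
                    break
            if found is not None:
                parents[x] = found
                break
    return parents

print(minimal_imap(["D", "I", "G"]))  # expected: D: [], I: [], G: ['D', 'I']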

MN as a perfect map
A perfect map exactly captures the independencies in P: I(H) = I(P) for a Markov network H (analogously, I(G) = I(P) for a Bayesian network G).
[Figure: two graphs over the variables I, D, G.]
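
If the I, D, G figure is the usual v-structure example, the point it makes is this: take P with exactly the independencies of the BN I -> G <- D, so I ⊥ D holds but I ⊥ D | G does not. No Markov network over {I, D, G} is a perfect map of this P: the chain I - G - D would imply I ⊥ D | G, which is false, while adding the edge I - D yields a graph that cannot encode I ⊥ D at all.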

BN as a perfect map
Again the goal is to exactly capture the independencies in P: I(G) = I(P).
[Figure: four graphs over the variables A, B, C, D.]
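
The four A, B, C, D graphs are presumably the standard counterexample in the other direction: take P with exactly the independencies of the diamond Markov network A - B - C - D - A, namely A ⊥ C | {B, D} and B ⊥ D | {A, C}. Any Bayesian network that is an I-map of this P must add an edge that triangulates the loop (for example between B and D), and that added edge destroys one of the two independencies, so no BN is a perfect map of P.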

More imperfect maps
Sometimes neither I(G) = I(P) nor I(H) = I(P) is achievable for any graph. Example: X1 and X2 are independent fair bits and Y = X1 XOR X2, so each of the four consistent assignments has probability 0.25:

X1  X2  Y    Prob
0   0   0    0.25
0   1   1    0.25
1   0   1    0.25
1   1   0    0.25
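
A quick numeric check of the XOR example (a sketch I added; positions 0, 1, 2 stand for X1, X2, Y): every pair of variables is marginally independent, yet any two become dependent once the third is observed. The minimal I-map X1 -> Y <- X2 therefore encodes only X1 ⊥ X2, and no BN or MN over these three variables encodes I(P) exactly.

from itertools import product

# Joint table: X1, X2 independent fair bits, Y = X1 XOR X2, each row 0.25.
P = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product([0, 1], repeat=2)}

def marg(keep):
    """Marginal over the given positions (0 = X1, 1 = X2, 2 = Y)."""
    m = {}
    for a, p in P.items():
        key = tuple(a[i] for i in keep)
        m[key] = m.get(key, 0.0) + p
    return m

def indep(i, j, given=()):
    """Check variable i ⊥ variable j | variables in `given` on the full table."""
    pz, piz, pjz, pijz = marg(given), marg((i,) + given), marg((j,) + given), marg((i, j) + given)
    for vi, vj, vz in product([0, 1], [0, 1], pz):
        lhs = pijz.get((vi, vj) + vz, 0.0) / pz[vz]
        rhs = (piz.get((vi,) + vz, 0.0) / pz[vz]) * (pjz.get((vj,) + vz, 0.0) / pz[vz])
        if abs(lhs - rhs) > 1e-12:
            return False
    return True

print("X1 ⊥ X2     :", indep(0, 1))        # True
print("X1 ⊥ Y      :", indep(0, 2))        # True
print("X2 ⊥ Y      :", indep(1, 2))        # True
print("X1 ⊥ X2 | Y :", indep(0, 1, (2,)))  # False: observing Y couples X1 and X2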

Summary
Graphs that capture more of I(P) are more compact and provide more insight. But it may be impossible to capture I(P) perfectly (as a BN, as an MN, or at all). Converting a BN to an MN loses the independencies encoded by v-structures; converting an MN to a BN requires adding edges that triangulate undirected loops.
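
As a small illustration of the BN-to-MN direction (a sketch with my own variable names, not from the slides): moralization marries the parents of every node and then drops edge directions, so the v-structure X1 -> Y <- X2 gains an X1 - X2 edge and the marginal independence X1 ⊥ X2 can no longer be read off the resulting Markov network.

def moralize(dag):
    """dag: {node: list of parents}. Returns the undirected edges of the moral graph."""
    edges = set()
    for child, parents in dag.items():
        # original edges, made undirected
        for p in parents:
            edges.add(frozenset((p, child)))
        # marry the parents that share this child
        for a in parents:
            for b in parents:
                if a != b:
                    edges.add(frozenset((a, b)))
    return edges

# v-structure X1 -> Y <- X2: the BN encodes X1 ⊥ X2, but the moral graph
# contains the edge X1 - X2, so that independence is lost in the MN.
bn = {"X1": [], "X2": [], "Y": ["X1", "X2"]}
print(sorted(tuple(sorted(e)) for e in moralize(bn)))
# [('X1', 'X2'), ('X1', 'Y'), ('X2', 'Y')]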