Plate Models - Template Models - Representation - Probabilistic Graphical Models

Presentation transcript:

Plate Models - Template Models - Representation - Probabilistic Graphical Models

Modeling Repetition

[Figure: a plate over Students s containing Intelligence I(s) and Grade G(s), with I(s) -> G(s); grounding for students s1, s2 yields I(s1) -> G(s1) and I(s2) -> G(s2).]

Nested Plates [Figure: a Courses c plate containing Difficulty D(c) and, nested inside it, a Students s plate containing Intelligence I(s,c) and Grade G(s,c); grounding for c1, c2 and s1, s2 yields D(c1) with I(s1,c1) -> G(s1,c1) and I(s2,c1) -> G(s2,c1), and D(c2) with I(s1,c2) and I(s2,c2). Note the nesting makes intelligence course-specific.]

Overlapping Plates [Figure: Difficulty D(c) sits in the Courses c plate, Intelligence I(s) in the Students s plate, and Grade G(s,c) in their intersection; grounding yields D(c1), D(c2), I(s1), I(s2), and a grade G(s,c) for each of the four (s,c) pairs, where each G(s,c) has parents D(c) and I(s).]
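The grounding step for the overlapping-plates model can be sketched in plain Python. The object names (s1, c1, etc.) are the toy domain from the slides; the dictionary representation is an illustrative choice, not part of the lecture.

```python
from itertools import product

# Toy grounding of the overlapping-plates model: D(c) lives in the
# Courses plate, I(s) in the Students plate, G(s,c) in their overlap.
students = ["s1", "s2"]
courses = ["c1", "c2"]

ground_vars = (
    [f"D({c})" for c in courses]
    + [f"I({s})" for s in students]
    + [f"G({s},{c})" for s, c in product(students, courses)]
)

# In the ground network, every G(s,c) has parents D(c) and I(s).
parents = {f"G({s},{c})": [f"D({c})", f"I({s})"]
           for s, c in product(students, courses)}

print(len(ground_vars))  # 2 D's + 2 I's + 4 G's = 8 ground variables
```

Adding a third student or course only changes the object lists; the template structure that generates the ground network stays the same.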

Explicit Parameter Sharing D I G D(c1)‏ G(s1,c1)‏ I(s1)‏ D(c2)‏ I(s2)‏ G(s1,c2)‏ G(s2,c1)‏ G(s2,c2)‏

Collective Inference [Figure: ground network over two courses, CS101 and Geo101 (easy/hard), and several students (low/high intelligence, grades A-C).] This web of influence has interesting ramifications for the types of reasoning patterns it supports. Consider Forrest Gump. A priori, we believe he is fairly likely to be smart, and evidence about two classes that he took changes our probabilities only very slightly. However, we see that most people who took CS101 got A's; in fact, even people who did fairly poorly in other classes got an A in CS101. Therefore, we believe that CS101 is probably an easy class. Getting a C in an easy class is unlikely for a smart student, so our probability that Forrest Gump is smart goes down substantially.
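This qualitative argument can be checked numerically. The CPD values below are assumed purely for illustration; the point is only the direction of the update: learning that the course is easy makes a C grade stronger evidence against high intelligence.

```python
# Assumed toy CPDs: P(I=high), P(D=easy), and P(G=C | D, I).
P_I_HIGH = 0.3
P_D_EASY = 0.6
P_C = {("easy", "high"): 0.1, ("easy", "low"): 0.5,
       ("hard", "high"): 0.4, ("hard", "low"): 0.9}

def p_high_given_C(known_difficulty=None):
    """P(I=high | G=C), optionally also conditioning on the difficulty D."""
    diffs = [known_difficulty] if known_difficulty else ["easy", "hard"]
    p_d = {"easy": P_D_EASY, "hard": 1 - P_D_EASY}
    num = den = 0.0
    for d in diffs:
        w = 1.0 if known_difficulty else p_d[d]
        num += w * P_C[(d, "high")] * P_I_HIGH
        den += w * (P_C[(d, "high")] * P_I_HIGH
                    + P_C[(d, "low")] * (1 - P_I_HIGH))
    return num / den

# Evidence that the class was easy makes the C grade more damning:
assert p_high_given_C("easy") < p_high_given_C()
```

With these numbers the posterior drops from 0.125 to roughly 0.079 once the class is known to be easy, matching the Forrest Gump story above.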

Plate Dependency Model
For a template variable A(U1, …, Uk):
- Template parents B1(U1), …, Bm(Um), where each parent's argument set Ui is a subset of {U1, …, Uk}
- A template CPD P(A | B1, …, Bm), shared by all groundings of A
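One way to realize the shared template CPD P(A | B1, …, Bm) is a single table that every ground variable looks up. The variable names echo the student/course example; the probability values are made up for illustration.

```python
# A single template CPD P(G | D, I), stored once and reused by
# every ground variable G(s, c). Probability values are assumed.
cpd_grade_A = {("easy", "low"): 0.5, ("easy", "high"): 0.9,
               ("hard", "low"): 0.1, ("hard", "high"): 0.6}

def p_grade_A(course_difficulty, student_intelligence):
    """P(G(s,c)=A | D(c), I(s)) -- the same table for every (s, c) pair."""
    return cpd_grade_A[(course_difficulty, student_intelligence)]

# G(s1, c1) and G(s2, c2) consult identical parameters:
print(p_grade_A("easy", "high"))  # 0.9, regardless of which (s, c) asks
```

Because every grounding shares this one table, the number of parameters stays fixed no matter how many students and courses the model is instantiated over.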

Ground Network
For a template variable A(U1, …, Uk) with parents B1(U1), …, Bm(Um): for each assignment of objects u1, …, uk to U1, …, Uk, the ground variable A(u1, …, uk) has parents B1(u1), …, Bm(um), each instantiated with the corresponding subset of objects.
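The semantics of the ground network is the usual Bayesian-network product of local CPDs. Under assumed toy CPDs, the sketch below checks that the factorized joint over a 2-student, 2-course grounding sums to 1, i.e. is a valid distribution.

```python
from itertools import product

# Assumed binary toy CPDs; the ground joint factorizes as
# prod_c P(D_c) * prod_s P(I_s) * prod_{s,c} P(G_sc | D_c, I_s).
P_D = {"easy": 0.6, "hard": 0.4}
P_I = {"low": 0.7, "high": 0.3}
P_G_A = {("easy", "low"): 0.5, ("easy", "high"): 0.9,
         ("hard", "low"): 0.1, ("hard", "high"): 0.6}

students, courses = ["s1", "s2"], ["c1", "c2"]
pairs = list(product(students, courses))

def joint(d, i, g):
    """d: course->difficulty, i: student->intelligence, g: (s,c)->grade."""
    p = 1.0
    for c in courses:
        p *= P_D[d[c]]
    for s in students:
        p *= P_I[i[s]]
    for s, c in pairs:
        pA = P_G_A[(d[c], i[s])]
        p *= pA if g[(s, c)] == "A" else 1 - pA
    return p

total = sum(
    joint(dict(zip(courses, dv)), dict(zip(students, iv)),
          dict(zip(pairs, gv)))
    for dv in product(P_D, repeat=2)
    for iv in product(P_I, repeat=2)
    for gv in product(["A", "C"], repeat=4)
)
print(round(total, 6))  # 1.0 -- a valid joint distribution
```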

Summary
- A plate model is a template for an infinite set of BNs, each induced by a different set of domain objects
- Parameters and structure are reused within a BN and across different BNs
- Models encode correlations across multiple objects, allowing collective inference
- Multiple such "languages" exist, each with different tradeoffs in expressive power