Kansas State University, Department of Computing and Information Sciences. CIS 730 (Introduction to Artificial Intelligence): Introduction to Graphical Models.

Presentation transcript:

Introduction to Graphical Models, Part 2 of 2
Lecture 30 of 42: Friday, 04 November 2005
William H. Hsu
Laboratory for Knowledge Discovery in Databases
Department of Computing and Information Sciences, Kansas State University

Graphical Models Overview [1]: Bayesian Networks

Conditional Independence
– X is conditionally independent (CI) of Y given Z (sometimes written X ⊥ Y | Z) iff P(X | Y, Z) = P(X | Z) for all values of X, Y, and Z
– Example: P(Thunder | Rain, Lightning) = P(Thunder | Lightning), i.e., Thunder ⊥ Rain | Lightning

Bayesian (Belief) Network
– Acyclic directed graph model B = (V, E, Θ) representing CI assertions over its variables
– Vertices (nodes) V: denote events (each a random variable)
– Edges (arcs, links) E: denote conditional dependencies

Markov Condition for BBNs (Chain Rule):
P(X1, X2, …, Xn) = Π (i = 1 to n) P(Xi | parents(Xi))

Example BBN [Figure: X1 = Age, X2 = Gender, X3 = Exposure-To-Toxins, X4 = Smoking, X5 = Cancer, X6 = Serum Calcium, X7 = Lung Tumor]
P(20s, Female, Low, Non-Smoker, No-Cancer, Negative, Negative)
= P(20s) · P(Female) · P(Low | 20s) · P(Non-Smoker | 20s, Female) · P(No-Cancer | Low, Non-Smoker) · P(Negative | No-Cancer) · P(Negative | No-Cancer)
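The factorization above is mechanical once the conditional probability table (CPT) entries are known. Below is a minimal sketch in Python; all numeric CPT values are invented for illustration, and only the factorization structure mirrors the slide's example.

```python
# Chain-rule factorization for the example BBN: one factor per node,
# conditioned on that node's parents. All probabilities are made up.
cpt = {
    "P(Age=20s)":                               0.20,  # X1, no parents
    "P(Gender=Female)":                         0.50,  # X2, no parents
    "P(Toxins=Low | Age=20s)":                  0.70,  # X3 <- X1
    "P(Smoking=Non | Age=20s, Gender=Female)":  0.80,  # X4 <- X1, X2
    "P(Cancer=No | Toxins=Low, Smoking=Non)":   0.95,  # X5 <- X3, X4
    "P(SerumCa=Neg | Cancer=No)":               0.90,  # X6 <- X5
    "P(Tumor=Neg | Cancer=No)":                 0.90,  # X7 <- X5
}

# P(20s, Female, Low, Non-Smoker, No-Cancer, Negative, Negative)
joint = 1.0
for value in cpt.values():
    joint *= value
print(f"joint = {joint:.4f}")   # product of the seven factors
```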

Fusion, Propagation, and Structuring
Adapted from slides by S. Russell, UC Berkeley

Fusion
– Methods for combining multiple beliefs
– Theory more precise than for fuzzy or ANN inference
– Data and sensor fusion
– Resolving conflict (vote-taking, winner-take-all, mixture estimation)
– Paraconsistent reasoning

Propagation
– Models the process of evidential reasoning by updating beliefs
– Source of parallelism
– Natural object-oriented (message-passing) model
– Communication: asynchronous, dynamic workpool management problem
– Concurrency: known Petri net dualities

Structuring
– Learning graphical dependencies from scores and constraints
– Two parameter estimation problems: structure learning and belief revision

Bayesian Learning Framework: Interpretations of Probability [Cheeseman, 1985]
– Bayesian subjectivist view: a measure of an agent's belief in a proposition; the proposition is denoted by a random variable (sample space: its range), e.g., Pr(Outlook = Sunny) = 0.8
– Frequentist view: probability is the frequency of observations of an event
– Logicist view: probability is inferential evidence in favor of a proposition

Typical Applications
– HCI: learning natural language; intelligent displays; decision support
– Approaches: prediction; sensor and data fusion (e.g., bioinformatics)

Prediction: Examples
– Measure relevant parameters: temperature, barometric pressure, wind speed
– Make a statement of the form Pr(Tomorrow's-Weather = Rain) = 0.5
– College admissions: Pr(Acceptance) = p
  · Plain beliefs: unconditional acceptance (p = 1) or categorical rejection (p = 0)
  · Conditional beliefs: depend on the reviewer (use a probabilistic model)

Choosing Hypotheses

Bayes's Theorem:
P(h | D) = P(D | h) P(h) / P(D)

MAP Hypothesis
– Generally want the most probable hypothesis given the training data
– Define: argmax_{x ∈ Ω} f(x) ≡ the value of x in the sample space Ω with the highest f(x)
– Maximum a posteriori hypothesis, h_MAP:
  h_MAP = argmax_{h ∈ H} P(h | D) = argmax_{h ∈ H} P(D | h) P(h) / P(D) = argmax_{h ∈ H} P(D | h) P(h)

ML Hypothesis
– Assume P(h_i) = P(h_j) for all pairs i, j (uniform priors, i.e., P_H ~ Uniform)
– Can then simplify further and choose the maximum likelihood hypothesis, h_ML:
  h_ML = argmax_{h ∈ H} P(D | h)
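Since P(D) is constant across hypotheses, the MAP choice needs only the product P(D | h) P(h). A small hedged sketch (the hypotheses, priors, and likelihoods are invented) showing that h_MAP and h_ML can differ:

```python
import numpy as np

# Three hypotheses with invented priors P(h) and likelihoods P(D | h).
hypotheses = ["h1", "h2", "h3"]
prior      = np.array([0.70, 0.20, 0.10])   # P(h)
likelihood = np.array([0.10, 0.50, 0.90])   # P(D | h) for the observed D

# P(D) is the same for every h, so argmax of P(D|h) P(h) suffices for MAP.
h_map = hypotheses[int(np.argmax(likelihood * prior))]
h_ml  = hypotheses[int(np.argmax(likelihood))]   # uniform-prior special case

print("h_MAP =", h_map)   # h2: 0.50 * 0.20 = 0.10 beats 0.07 and 0.09
print("h_ML  =", h_ml)    # h3: largest likelihood, prior ignored
```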

Propagation Algorithm in Singly-Connected Bayesian Networks – Pearl (1983)
Adapted from Neapolitan (1990), Guo (2000)

[Figure: a singly connected (polytree) network over nodes C1 through C6]
– Upward (child-to-parent) λ messages: λ values are modified during the λ message-passing phase
– Downward π messages: the updated beliefs P'(C_i) are computed during the π message-passing phase
Multiply-connected case: exact and approximate inference are #P-complete (a counting problem is #P-complete iff the corresponding decision problem is NP-complete)
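A minimal illustration of the two message types on the three-node chain A → B → C with evidence C = 1. All CPT numbers are invented; on a chain, one λ sweep toward the root plus local combination with the π (prior) information suffices, which is the singly-connected case in miniature.

```python
import numpy as np

P_A   = np.array([0.6, 0.4])                 # P(A)
P_B_A = np.array([[0.7, 0.3],                # P(B | A); rows index A
                  [0.2, 0.8]])
P_C_B = np.array([[0.9, 0.1],                # P(C | B); rows index B
                  [0.4, 0.6]])

lam_C  = np.array([0.0, 1.0])                # evidence vector for C = 1
lam_CB = P_C_B @ lam_C                       # lambda message C -> B: P(C=1 | b)
lam_BA = P_B_A @ lam_CB                      # lambda message B -> A

bel_A = P_A * lam_BA                         # pi(A) = P(A) at the root
bel_A /= bel_A.sum()                         # posterior P(A | C=1)

pi_B  = P_A @ P_B_A                          # pi message A -> B: sum_a P(b|a) P(a)
bel_B = pi_B * lam_CB
bel_B /= bel_B.sum()                         # posterior P(B | C=1)

print("P(A | C=1) =", bel_A)
print("P(B | C=1) =", bel_B)
```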

Inference by Clustering [1]: Graph Operations (Moralization, Triangulation, Maximal Cliques)
Adapted from Neapolitan (1990), Guo (2000)

[Figure: a Bayesian network (acyclic digraph) over nodes A, B, C, D, E, F, G, H, transformed in three steps]
1. Moralize: connect ("marry") the parents of each common child and drop edge directions
2. Triangulate, using the node ordering A(1), B(2), E(3), C(4), G(5), F(6), H(7), D(8)
3. Find maximal cliques: Clq1 = {A, B}, Clq2 = {B, E, C}, Clq3 = {E, C, G}, Clq4 = {E, G, F}, Clq5 = {C, G, H}, Clq6 = {C, D}
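A hedged sketch of these operations with networkx. The directed edge list is reconstructed from the clique potentials listed two slides ahead (e.g., ψ(Clq2) = P(C|B,E) implies edges B → C and E → C), so treat it as an assumption made for illustration.

```python
import networkx as nx
from itertools import combinations

# Directed edges reconstructed from the CPT factors on the later slide.
G = nx.DiGraph([("A", "B"), ("B", "C"), ("E", "C"), ("F", "E"),
                ("F", "G"), ("C", "H"), ("G", "H"), ("C", "D")])

# Moralize: marry the co-parents of each node, then drop directions.
M = G.to_undirected()
for v in G:
    for p, q in combinations(G.predecessors(v), 2):
        M.add_edge(p, q)          # adds B-E (parents of C) and C-G (parents of H)

M.add_edge("E", "G")              # fill-in edge from the triangulation step
print("chordal:", nx.is_chordal(M))                    # True after the fill-in
print("maximal cliques:", list(nx.find_cliques(M)))    # Clq1 .. Clq6
```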

Inference by Clustering [2]: Junction Tree – Lauritzen & Spiegelhalter (1988)
Adapted from Neapolitan (1990), Guo (2000)

Input: the list of cliques of a triangulated, moralized graph G_u
Output: a tree of cliques, with separator nodes S_i, residual nodes R_i, and a potential probability ψ(Clq_i) for every clique

Algorithm:
1. S_i = Clq_i ∩ (Clq_1 ∪ Clq_2 ∪ … ∪ Clq_{i-1})
2. R_i = Clq_i − S_i
3. If i > 1, identify a j < i such that Clq_j is a parent of Clq_i (i.e., S_i ⊆ Clq_j)
4. Assign each node v to a unique clique Clq_i such that {v} ∪ c(v) ⊆ Clq_i, where c(v) denotes the parents of v
5. Compute ψ(Clq_i) = Π P(v | c(v)) over the nodes v assigned to Clq_i (1 if no v is assigned to Clq_i)
6. Store Clq_i, R_i, S_i, and ψ(Clq_i) at each vertex in the tree of cliques
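Steps 1 through 3 take only a few lines of code. The sketch below runs them on the clique list from the previous slide, in the order given there; the printed separators, residuals, and parents match the table on the next slide.

```python
cliques = [{"A", "B"}, {"B", "E", "C"}, {"E", "C", "G"},
           {"E", "G", "F"}, {"C", "G", "H"}, {"C", "D"}]

seen = set()                       # union of all earlier cliques
for i, clq in enumerate(cliques, start=1):
    S = clq & seen                 # step 1: separator
    R = clq - S                    # step 2: residual
    # step 3: parent = an earlier clique that contains the whole separator
    parent = next((j for j in range(i - 1, 0, -1) if S <= cliques[j - 1]), None)
    print(f"Clq{i}: S={sorted(S)} R={sorted(R)} parent=Clq{parent}")
    seen |= clq
```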

Inference by Clustering [3]: Clique-Tree Operations
Adapted from Neapolitan (1990), Guo (2000)

R_i: residual nodes; S_i: separator nodes; ψ(Clq_i): potential probability of clique i

Clq1 = {A, B}:    R1 = {A, B}, S1 = {},     ψ(Clq1) = P(B|A) P(A)
Clq2 = {B, E, C}: R2 = {C, E}, S2 = {B},    ψ(Clq2) = P(C|B,E)
Clq3 = {E, C, G}: R3 = {G},    S3 = {E, C}, ψ(Clq3) = 1
Clq4 = {E, G, F}: R4 = {F},    S4 = {E, G}, ψ(Clq4) = P(E|F) P(G|F) P(F)
Clq5 = {C, G, H}: R5 = {H},    S5 = {C, G}, ψ(Clq5) = P(H|C,G)
Clq6 = {C, D}:    R6 = {D},    S6 = {C},    ψ(Clq6) = P(D|C)

Clique tree, with separators on the edges:
{A,B} –B– {B,E,C} –E,C– {E,C,G} –E,G– {E,G,F}, and {E,C,G} –C,G– {C,G,H} –C– {C,D}

Inference by Loop Cutset Conditioning

[Figure: the cancer BBN with the Age node split into one instance per state value: X_{1,1}: Age = [0, 10), X_{1,2}: Age = [10, 20), …, X_{1,10}: Age = [100, ∞); remaining nodes X2 Gender, X3 Exposure-To-Toxins, X4 Smoking, X5 Cancer, X6 Serum Calcium, X7 Lung Tumor]

– Split a vertex in an undirected cycle; condition upon each of its state values
– Number of network instantiations: the product of the arities of the nodes in the minimal loop cutset
– Posterior: the marginal conditioned upon the cutset variable values
– Deciding the optimal cutset is NP-hard

Current Open Problems
– Bounded cutset conditioning: ordering heuristics
– Finding randomized algorithms for loop cutset optimization
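Written out (a standard identity, not specific to these slides): for loop cutset C, evidence e, and query variable X,

P(X | e) = Σ over c of P(X | e, C = c) · P(C = c | e),

where each term P(X | e, C = c) is computed by singly-connected propagation on the network with C clamped to c, and the weights P(C = c | e) are obtained by normalizing over the cutset instantiations. The sum has one term per instantiation, hence the arity-product cost above.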

Inference by Variable Elimination [1]: Intuition
Adapted from slides by S. Russell, UC Berkeley

Inference by Variable Elimination [2]: Factoring Operations
Adapted from slides by S. Russell, UC Berkeley
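Variable elimination rests on two factor operations: pointwise product and summing a variable out. A hedged sketch (binary variables only; factors are dicts keyed by assignment tuples; the demo numbers are invented):

```python
from itertools import product

def multiply(vars1, f1, vars2, f2):
    """Pointwise product over the union of the two variable lists (binary vars)."""
    out_vars = vars1 + [v for v in vars2 if v not in vars1]
    out = {}
    for assign in product([0, 1], repeat=len(out_vars)):
        a = dict(zip(out_vars, assign))
        out[assign] = (f1[tuple(a[v] for v in vars1)] *
                       f2[tuple(a[v] for v in vars2)])
    return out_vars, out

def sum_out(var, vars_, f):
    """Marginalize var out of factor f."""
    i = vars_.index(var)
    out = {}
    for assign, val in f.items():
        key = assign[:i] + assign[i + 1:]
        out[key] = out.get(key, 0.0) + val
    return vars_[:i] + vars_[i + 1:], out

# Demo: multiply P(B|A) by P(A), then sum out A to get P(B).
pA  = {(0,): 0.6, (1,): 0.4}
pBA = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}   # keyed (a, b)
v, f  = multiply(["A", "B"], pBA, ["A"], pA)
v, pB = sum_out("A", v, f)
print(pB)   # {(0,): 0.5, (1,): 0.5}
```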

Inference by Variable Elimination [3]: Example
Adapted from Dechter (1996), Joehanes (2002)

[Figure: network with A = Season, B = Sprinkler, C = Rain, D = Manual Watering, F = Wet, G = Slippery]
Factored joint: P(A), P(B|A), P(C|A), P(D|B,A), P(F|B,C), P(G|F)
Query: P(A | G = 1) = ?
Elimination ordering: d = (G, D, F, B, C, A)
First step (absorbing the evidence G = 1): λ_G(F) = Σ over G = 1 of P(G|F), i.e., λ_G(F) = P(G = 1 | F)
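An end-to-end check of the query on this network. For brevity the sketch below enumerates the factored joint directly rather than running bucket elimination; with random (invented) CPTs it yields the same P(A | G = 1) that eliminating in the order d would produce.

```python
from itertools import product
import random

random.seed(0)

def random_cpt(n_parents):
    """For each parent assignment, an invented Bernoulli over the child."""
    table = {}
    for pa in product([0, 1], repeat=n_parents):
        p = random.random()
        table[pa] = {0: 1.0 - p, 1: p}
    return table

pA, pB_A, pC_A = random_cpt(0), random_cpt(1), random_cpt(1)
pD_BA, pF_BC, pG_F = random_cpt(2), random_cpt(2), random_cpt(1)

def joint(a, b, c, d, f, g):
    """P(A) P(B|A) P(C|A) P(D|B,A) P(F|B,C) P(G|F), evaluated pointwise."""
    return (pA[()][a] * pB_A[(a,)][b] * pC_A[(a,)][c] *
            pD_BA[(b, a)][d] * pF_BC[(b, c)][f] * pG_F[(f,)][g])

# Sum the joint over everything but A, with the evidence G = 1 clamped.
post = [0.0, 0.0]
for a, b, c, d, f in product([0, 1], repeat=5):
    post[a] += joint(a, b, c, d, f, 1)
z = sum(post)
print("P(A | G=1) =", [round(p / z, 4) for p in post])
```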

Tools for Building Graphical Models

Commercial Tools: Ergo, Netica, TETRAD, Hugin

Bayes Net Toolbox (BNT) – Murphy (1997-present)
– Distribution page
– Development group

Bayesian Network tools in Java (BNJ) – Hsu et al. (1999-present)
– Distribution page
– Development group
– Current (re)implementation projects for the KSU KDD Lab:
  · Continuous state: Minka (2002) – Hsu, Guo, Perry, Boddhireddy
  · Formats: XML BNIF (MSBN), Netica – Guo, Hsu
  · Space-efficient DBN inference – Joehanes
  · Bounded cutset conditioning – Chandak

References [1]: Graphical Models and Inference Algorithms

Graphical Models
– Bayesian (belief) networks tutorial – Murphy (2001)
– Learning Bayesian networks – Heckerman (1996, 1999)

Inference Algorithms
– Junction tree (join tree, L-S, Hugin): Lauritzen & Spiegelhalter (1988)
– (Bounded) loop cutset conditioning: Horvitz & Cooper (1989)
– Variable elimination (bucket elimination, ElimBel): Dechter (1996)
– Recommended books: Neapolitan (1990, 2003); see also Pearl (1988), Jensen (2001); Castillo, Gutierrez & Hadi (1997); Cowell, Dawid, Lauritzen & Spiegelhalter (1999)
– Stochastic approximation

References [2]: Machine Learning, KDD, and Bioinformatics

Machine Learning, Data Mining, and Knowledge Discovery
– K-State KDD Lab: literature survey and resource catalog (1999-present)
– Bayesian Network tools in Java (BNJ): Hsu, Barber, King, Meyer, Thornton (2002-present)
– Machine Learning in Java (MLJ): Hsu, Louis, Plummer (2002)

Bioinformatics
– European Bioinformatics Institute tutorial: Brazma et al. (2001)
– Hebrew University: Friedman, Pe'er et al. (1999, 2000, 2002)
– K-State BMI Group: literature survey and resource catalog