Structure estimation for discrete graphical models: Generalized covariance matrices and their inverses. Menglong Li, Ph.D., Industrial Engineering. Dec 1st, 2016.

Outline: recap of the Gaussian graphical model; extension to general graphical models (model setting, results); application; statistical results.

Gaussian graphical model. Theorem: for a multivariate Gaussian with covariance $\Sigma$ and graph $G = (V, E)$, the precision matrix $\Theta = \Sigma^{-1}$ satisfies $\Theta_{st} = 0$ if and only if $X_s \perp X_t \mid X_{V \setminus \{s, t\}}$. Thus $\Theta$ encodes the pairwise Markov independencies, and we can recover the graph structure by adding an edge $(s, t)$ whenever $\Theta_{st} \neq 0$. However, whether a comparable relationship between conditional independence and the structure of the inverse covariance matrix holds for general (non-Gaussian) graphical models is much less clear.
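
As a quick sanity check of the recap (a minimal numerical sketch, not part of the slides; the three-node chain and its parameter values are hypothetical), the covariance of a Gaussian Markov chain is dense, but inverting it recovers the zero that encodes the missing edge:

```python
import numpy as np

# Tridiagonal precision matrix of a Gaussian Markov chain X1 - X2 - X3
# (hypothetical values); the zero in entry (0, 2) encodes X1 _||_ X3 | X2.
Theta = np.array([[ 2.0, -0.8,  0.0],
                  [-0.8,  2.0, -0.8],
                  [ 0.0, -0.8,  2.0]])

Sigma = np.linalg.inv(Theta)                 # covariance: dense, no zeros
print(np.round(Sigma, 3))
print(np.round(np.linalg.inv(Sigma), 3))     # inverting back recovers the zero pattern
```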

How to extend? Example 1 (Ising model): let the node potentials be $\theta_s$ for all $s \in V$ and the edge potentials be $\theta_{st}$ for all $(s, t) \in E$, so that $p_\theta(x) \propto \exp\big(\sum_{s \in V} \theta_s x_s + \sum_{(s,t) \in E} \theta_{st} x_s x_t\big)$.

Example 1, continued. The inverse of the covariance matrix (shown on the slide) is graph structured: its $(s, t)$ entry is zero whenever $(s, t)$ is not an edge of the graph.
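
A brute-force check of this claim for one concrete instance (a three-node Ising chain with hypothetical parameter values, not necessarily the exact example on the slide), computed with numpy by enumerating all configurations:

```python
import itertools
import numpy as np

# All 8 configurations of three +/-1 spins, weighted by the Ising model
# exp(theta_s * (x1+x2+x3) + theta_st * (x1*x2 + x2*x3)) on the chain 1 - 2 - 3.
states = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
energy = 0.1 * states.sum(axis=1)                  # node potentials (hypothetical)
for s, t in [(0, 1), (1, 2)]:                      # edges of the chain
    energy += 0.4 * states[:, s] * states[:, t]    # edge potentials (hypothetical)
p = np.exp(energy)
p /= p.sum()

mean = p @ states
centered = states - mean
Sigma = centered.T @ (p[:, None] * centered)       # exact 3x3 covariance
print(np.round(np.linalg.inv(Sigma), 4))           # entry (0, 2) is zero: {1, 3} is not an edge
```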

How to extend? Example 2: Ising model on the single cycle with vertices $\{1, 2, 3, 4\}$ and edges $(1,2), (2,3), (3,4), (1,4)$.

Example 2, continued. The inverse of the covariance matrix (shown on the slide) is not graph structured: it has nonzero entries in the positions corresponding to the non-edges $(1, 3)$ and $(2, 4)$.
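
The same brute-force computation illustrates the failure on the cycle (again with hypothetical parameters; a fairly strong edge potential is used so that the effect is visible): the non-edge entries of the inverse are small relative to the edge entries, but clearly nonzero.

```python
import itertools
import numpy as np

# Ising model on the 4-cycle 1 - 2 - 3 - 4 - 1 (hypothetical parameters).
states = np.array(list(itertools.product([-1, 1], repeat=4)), dtype=float)
energy = 0.1 * states.sum(axis=1)                  # node potentials
for s, t in [(0, 1), (1, 2), (2, 3), (3, 0)]:      # edges of the cycle
    energy += 2.0 * states[:, s] * states[:, t]    # edge potentials
p = np.exp(energy)
p /= p.sum()

mean = p @ states
centered = states - mean
Sigma = centered.T @ (p[:, None] * centered)       # exact 4x4 covariance
print(np.round(np.linalg.inv(Sigma), 2))           # (0, 2) and (1, 3) are nonzero: not graph structured
```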

How to extend? Example 3: the inverse of the covariance matrix is graph structured!

Example 2, continued. Instead of considering the original graph, we consider a triangulation of $G$, shown on the right: we compute the covariance matrix of the augmented random vector $(X_1, X_2, X_3, X_4, X_1 X_3)$, where the extra term corresponds to the dotted edge shown (the 2-clique $\{1, 3\}$).

Example 2, continued. The inverse of this generalized covariance matrix takes a graph-structured form with respect to the triangulated graph (shown on the slide). If instead we add sufficient statistics for all subsets in the power sets of the maximal cliques of the triangulated $G$, we compute the covariance of the following vector: $(X_1, X_2, X_3, X_4, X_1X_2, X_2X_3, X_1X_3, X_3X_4, X_1X_4, X_1X_2X_3, X_1X_3X_4)$.

Example 2, continued. Empirically, we find that the $11 \times 11$ inverse of this matrix continues to respect aspects of the graph structure: in particular, there are zeros in the positions corresponding to the sufficient statistics $X_A$ and $X_B$ whenever the subsets $A$ and $B$ do not lie within the same maximal clique.
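
A numerical check of the smaller 5-dimensional version of this construction (hypothetical parameters, same 4-cycle as in the previous sketch): augmenting the four spins with the chord statistic $X_1 X_3$ and inverting the resulting generalized covariance produces a zero between $X_2$ and $X_4$, the only pair whose index sets do not share a maximal clique of the triangulation.

```python
import itertools
import numpy as np

# Same 4-cycle Ising model as before (hypothetical parameters).
states = np.array(list(itertools.product([-1, 1], repeat=4)), dtype=float)
energy = 0.1 * states.sum(axis=1)
for s, t in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    energy += 2.0 * states[:, s] * states[:, t]
p = np.exp(energy)
p /= p.sum()

# Augmented statistics: the four spins plus X1*X3 for the chord {1, 3}.
phi = np.column_stack([states, states[:, 0] * states[:, 2]])
centered = phi - p @ phi
Gamma = centered.T @ (p[:, None] * centered)       # 5x5 generalized covariance
print(np.round(np.linalg.inv(Gamma), 2))           # entry (1, 3), coupling X2 and X4, is ~0
```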

Main results. Theorem: consider an arbitrary discrete graphical model of the form $p_\theta(x) \propto \exp\big(\sum_{C \in \mathcal{C}} \theta_C \phi_C(x_C)\big)$, an exponential family whose sufficient statistics are indexed by the cliques of $G$. For any triangulation of $G$, the inverse of the generalized covariance matrix of the sufficient statistics associated with the maximal cliques of the triangulation is graph structured: its entries are zero whenever the corresponding subsets do not lie within the same maximal clique.

Corollary

Statistical results. Suppose we are given $n$ i.i.d. samples drawn from a discrete graphical model. How can we recover the graph structure? Algorithm (graphical Lasso): solve the $\ell_1$-regularized log-determinant program $\hat{\Theta} \in \arg\min_{\Theta \succ 0} \big\{ \operatorname{trace}(\hat{\Gamma} \Theta) - \log \det(\Theta) + \lambda \sum_{s \neq t} |\Theta_{st}| \big\}$, where $\hat{\Gamma}$ is the empirical (generalized) covariance matrix of the samples. This program is called the graphical Lasso.
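
A minimal sketch of this estimation step (not the authors' code; it uses scikit-learn's GraphicalLasso, and the data-generating model is the hypothetical three-node Ising chain from the earlier sketch):

```python
import itertools
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Draw n i.i.d. samples from a three-node Ising chain (hypothetical parameters)
# by enumerating its 8 states and their exact probabilities.
states = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
energy = 0.1 * states.sum(axis=1)
for s, t in [(0, 1), (1, 2)]:
    energy += 0.4 * states[:, s] * states[:, t]
p = np.exp(energy)
p /= p.sum()
X = states[rng.choice(len(states), size=2000, p=p)]   # n = 2000 samples

# l1-regularized inverse covariance estimate (the graphical Lasso).
model = GraphicalLasso(alpha=0.05).fit(X)
Theta_hat = model.precision_
edges = np.argwhere(np.triu(np.abs(Theta_hat) > 1e-3, k=1))
print(np.round(Theta_hat, 3))
print(edges)      # should typically recover (0, 1) and (1, 2), but not (0, 2)
```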

Statistical results. Mutual incoherence condition: there exists $\alpha \in (0, 1]$ such that $\max_{e \in S^c} \| \Gamma^*_{e, S} (\Gamma^*_{S, S})^{-1} \|_1 \leq 1 - \alpha$, where $\Gamma^*$ is the Hessian of the log-determinant objective at the true model and $S$ is the support set of the true inverse covariance. Under this condition, and with a sufficiently large sample size, the graphical Lasso recovers the true support, and hence the edge structure, with high probability.

Reference Loh, P. L., & Wainwright, M. J. (2013). Structure estimation for discrete graphical models: Generalized covariance matrices and their inverses. The Annals of Statistics, 41(6), 3022-3049.