CS774. Markov Random Field : Theory and Application Lecture 08 Kyomin Jung KAIST Sep 29 2009.


1 CS774. Markov Random Field : Theory and Application Lecture 08 Kyomin Jung KAIST Sep 29 2009

2 Review: Exponential Family Parametrization of positive MRFs, i.e. P[x] > 0 for all x. Let φ = (φ_α) denote a collection of potential functions defined on the cliques of G, and let θ = (θ_α) be a vector of weights on these potential functions. The MRF with weight θ is defined by P[x; θ] = exp( ⟨θ, φ(x)⟩ − Φ(θ) ), where the log partition function is Φ(θ) = log Σ_x exp ⟨θ, φ(x)⟩.
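As a concrete illustration (a sketch with our own function names and indicator-feature encoding, not from the lecture), the log partition function of a small binary pairwise MRF can be computed by brute-force enumeration:

```python
import itertools
import math

def log_partition(n, theta_node, theta_edge, edges):
    """Brute-force Phi(theta) = log sum_x exp(<theta, phi(x)>) for a binary
    pairwise MRF: theta_node[s][x_s] weights the node indicator features and
    theta_edge[(s, t)][x_s][x_t] weights the edge indicator features."""
    total = 0.0
    for x in itertools.product([0, 1], repeat=n):
        score = sum(theta_node[s][x[s]] for s in range(n))
        score += sum(theta_edge[(s, t)][x[s]][x[t]] for (s, t) in edges)
        total += math.exp(score)
    return math.log(total)

# With no edges the distribution factorizes, so Phi is a sum of per-node terms.
theta_node = [[0.0, 1.0], [0.0, -0.5]]
phi_val = log_partition(2, theta_node, {}, [])
```

The factorized case gives a quick sanity check: Φ should equal log(1 + e^1) + log(1 + e^(−0.5)).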

3 Lemmas 1. ∂Φ(θ)/∂θ_α = E_θ[φ_α(x)]. 2. ∂²Φ(θ)/∂θ_α ∂θ_β = Cov_θ( φ_α(x), φ_β(x) ). So the Hessian of the log partition function Φ is equal to a covariance matrix, which is always positive semidefinite. Hence Φ is convex as a function of θ.
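Lemma 1 can be checked numerically on a toy two-node model (a sketch; the choice of sufficient statistics x_s, x_t, x_s·x_t is ours): the finite-difference gradient of Φ should match the exact moments.

```python
import math

def phi(th):
    # Phi(theta) for a two-node binary MRF with sufficient statistics
    # (x_s, x_t, x_s * x_t) and weights th = [th_s, th_t, th_st]
    return math.log(sum(math.exp(th[0]*xs + th[1]*xt + th[2]*xs*xt)
                        for xs in (0, 1) for xt in (0, 1)))

def moments(th):
    # E_theta[phi_alpha(x)] computed from the exact distribution
    Z = math.exp(phi(th))
    e = [0.0, 0.0, 0.0]
    for xs in (0, 1):
        for xt in (0, 1):
            p = math.exp(th[0]*xs + th[1]*xt + th[2]*xs*xt) / Z
            e[0] += p * xs
            e[1] += p * xt
            e[2] += p * xs * xt
    return e

th = [0.3, -0.7, 1.1]
eps = 1e-6
# central finite differences of Phi in each coordinate
grad = [(phi([v + eps*(i == j) for j, v in enumerate(th)]) -
         phi([v - eps*(i == j) for j, v in enumerate(th)])) / (2*eps)
        for i in range(3)]
```

The two vectors should agree to within the finite-difference error.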

4 Convex combinations Let T(G) denote the set of all spanning trees of G. Let θ(T) be an exponential parameter vector that respects tree T, i.e. θ_α(T) = 0 except for the components α associated with vertices and edges of T. Let ρ be a probability distribution over T(G): ρ(T) ≥ 0 and Σ_{T ∈ T(G)} ρ(T) = 1.

5 Example [Figure: a cycle graph G and its spanning trees; the labels 4/3, 1, and 0 appear to be rescaled edge weights: an edge belonging to a tree carries weight (1/ρ_e) times the original weight (here 4/3), and an edge outside the tree carries weight 0.]

6 Upper bound on the log partition function By Jensen's inequality (convexity of Φ) we obtain Φ(θ) = Φ( Σ_T ρ(T) θ(T) ) ≤ Σ_T ρ(T) Φ(θ(T)) for all ρ and {θ(T)} such that Σ_T ρ(T) θ(T) = θ. Then how do we choose ρ and {θ(T)} to minimize the bound?
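On a 3-cycle with uniform ρ over its three spanning trees, each edge has appearance probability ρ_e = 2/3, so tree edges are rescaled by 1/ρ_e = 3/2; the Jensen bound can then be verified by brute force. The agreement-potential parametrization below is our own illustrative choice.

```python
import itertools
import math

edges = [(0, 1), (1, 2), (0, 2)]

def phi(theta_e):
    # log partition of a binary MRF on 3 nodes with zero node weights and
    # weight theta_e[e] on the agreement feature 1[x_s == x_t] for each edge
    return math.log(sum(
        math.exp(sum(w * (x[s] == x[t]) for (s, t), w in theta_e.items()))
        for x in itertools.product([0, 1], repeat=3)))

theta = {e: 0.8 for e in edges}
exact = phi(theta)

# Each spanning tree drops one edge; with rho(T) = 1/3 the convex combination
# sum_T rho(T) theta(T) recovers theta when tree edges are scaled by 3/2.
bound = sum((1/3) * phi({e: (0.0 if e == drop else 1.5 * theta[e])
                         for e in edges})
            for drop in edges)
```

Since Φ is strictly convex and the tree parameters differ, the bound should come out strictly above the exact value here.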

7 Upper bound on the log partition function Optimizing over {θ(T)} with ρ fixed: since Φ is convex and the constraint Σ_T ρ(T) θ(T) = θ is linear, the problem has a global minimum and can be solved exactly by convex programming. Note: the number of spanning trees is large; e.g. Cayley's formula says that the number of spanning trees of the complete graph K_n is n^(n−2). Hence we will solve the dual problem, which has a smaller number of variables.
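Cayley's count n^(n−2) can be confirmed via Kirchhoff's matrix-tree theorem (not used in the lecture, but it makes the count concrete):

```python
import numpy as np

def count_spanning_trees_Kn(n):
    # Laplacian of K_n is n*I - J (degree n-1 on the diagonal, -1 elsewhere);
    # by the matrix-tree theorem any cofactor of L counts the spanning trees.
    L = n * np.eye(n) - np.ones((n, n))
    return round(np.linalg.det(L[1:, 1:]))
```

For example, K_4 has 4^2 = 16 spanning trees and K_5 has 5^3 = 125.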

8 Pseudo-marginals Consider a set of pseudo-marginals τ = { τ_s(x_s) for s ∈ V; τ_st(x_s, x_t) for (s,t) ∈ E }. We require the following constraints: τ_s(x_s) ≥ 0, Σ_{x_s} τ_s(x_s) = 1, and Σ_{x_t} τ_st(x_s, x_t) = τ_s(x_s). Let LOCAL(G) denote the set of τ satisfying these constraints. If G is a tree, LOCAL(G) is a complete description of the set of valid marginals.
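For a tree (here just a single edge) the exact marginals do land in LOCAL(G); a minimal sketch with our own agreement-potential choice:

```python
import math

# Exact joint of a binary MRF on one edge with agreement weight w,
# and its node marginal; these must satisfy the LOCAL(G) constraints:
# nonnegativity, normalisation, and marginalization consistency.
w = 0.6
scores = {(xs, xt): math.exp(w * (xs == xt)) for xs in (0, 1) for xt in (0, 1)}
Z = sum(scores.values())
tau_st = {x: v / Z for x, v in scores.items()}
tau_s = [sum(tau_st[(xs, xt)] for xt in (0, 1)) for xs in (0, 1)]
```

The checks are immediate: τ_s sums to one, and summing τ_st over x_t reproduces τ_s.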

9 Pseudo-marginals Let p^T(x; τ) denote the projection of τ onto the spanning tree T: p^T(x; τ) = Π_{s ∈ V} τ_s(x_s) · Π_{(s,t) ∈ T} [ τ_st(x_s, x_t) / ( τ_s(x_s) τ_t(x_t) ) ]. Then p^T(·; τ) defines an MRF on T.

10 Lagrangian dual Let {θ*(T)} be the optimal primal solution, and let τ* be the optimal dual solution. Then we have that, for any tree T, θ*(T) is the exponential parameter of the tree distribution p^T(x; τ*). Hence τ* fully expresses θ*(T) for all trees T. Note that τ has dimension on the order of |V| + |E|, which is small.

11 Optimal Upper Bound (for fixed ρ) Φ(θ) ≤ max_{τ ∈ LOCAL(G)} { ⟨θ, τ⟩ + Σ_{s ∈ V} H_s(τ_s) − Σ_{(s,t) ∈ E} ρ_st I_st(τ_st) } …(1), where H_s(τ_s) is the single-node entropy, I_st(τ_st) is the mutual information between x_s and x_t under τ_st, and ρ_st = Σ_{T ∋ (s,t)} ρ(T) is the edge appearance probability of the edge e = (s,t).

12 Optimal Upper Bound (for changing ρ) Note that for a fixed θ, only the vector of edge appearance probabilities ρ_e matters in (1). ρ has large dimension (the number of spanning trees of G), while ρ_e has small dimension (the number of edges of G). The bound is a convex function of ρ_e. Use the conditional gradient (Frank-Wolfe) method to compute the optimal ρ_e.

13 Tree-reweighted sum-product (for fixed ρ) Message-passing implementation of the dual problem (1). The message from vertex t to vertex s is defined as follows: M_ts(x_s) ∝ Σ_{x_t} exp( θ_st(x_s, x_t)/ρ_st + θ_t(x_t) ) · [ Π_{v ∈ N(t)\{s}} M_vt(x_t)^{ρ_vt} ] / M_st(x_t)^{1 − ρ_st}.

14 Tree-reweighted sum-product The pseudo-marginals are computed from the fixed-point messages by τ_s(x_s) ∝ exp( θ_s(x_s) ) Π_{v ∈ N(s)} M_vs(x_s)^{ρ_vs}, together with the analogous edge formula for τ_st, and this τ maximizes (1).
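A minimal sketch of the fixed-ρ tree-reweighted updates on a 3-cycle (implementation and naming are ours; with zero node potentials and symmetric couplings the pseudo-marginals should come out uniform by symmetry):

```python
import math

nodes = [0, 1, 2]
edges = [(0, 1), (1, 2), (0, 2)]
rho = {e: 2/3 for e in edges}          # edge appearance prob., uniform rho over trees
w = 0.8                                 # coupling on the agreement feature 1[x_s == x_t]
theta_node = {s: [0.0, 0.0] for s in nodes}

def nbrs(s):
    return [t for (a, b) in edges
            for t in ((b,) if a == s else (a,) if b == s else ())]

def edge_key(s, t):
    return (s, t) if (s, t) in rho else (t, s)

# messages M[(t, s)][x_s], initialised uniform
M = {(t, s): [1.0, 1.0] for s in nodes for t in nbrs(s)}

for _ in range(200):                    # synchronous fixed-point iteration
    new = {}
    for (t, s) in M:
        r = rho[edge_key(s, t)]
        m = []
        for xs in (0, 1):
            total = 0.0
            for xt in (0, 1):
                # reweighted potential, incoming messages raised to rho,
                # divided by the reverse message to the power 1 - rho
                val = math.exp(w * (xs == xt) / r + theta_node[t][xt])
                for v in nbrs(t):
                    if v != s:
                        val *= M[(v, t)][xt] ** rho[edge_key(v, t)]
                val /= M[(s, t)][xt] ** (1 - r)
                total += val
            m.append(total)
        z = sum(m)
        new[(t, s)] = [mi / z for mi in m]
    M = new

def tau(s):
    # node pseudo-marginal from the fixed-point messages
    t = [math.exp(theta_node[s][x]) *
         math.prod(M[(v, s)][x] ** rho[edge_key(v, s)] for v in nbrs(s))
         for x in (0, 1)]
    z = sum(t)
    return [ti / z for ti in t]
```

With these symmetric potentials every message stays uniform, so τ_s = (1/2, 1/2) at every node; biasing a node potential would break the tie.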

15 How the messages are defined The Lagrangian associated with (1) attaches a multiplier λ_ts(x_s) to each marginalization constraint Σ_{x_t} τ_st(x_s, x_t) = τ_s(x_s). Take derivatives with respect to τ_s and τ_st to obtain stationarity relations (the ones used in the message update). Then define the message M_ts(x_s) in terms of the multiplier λ_ts(x_s).

16 Self-Avoiding Walk tree The self-avoiding walk tree Tsaw(v) unrolls G from vertex v along walks that never revisit a vertex; compare with the computation tree of iterative message passing, which can grow without bound.

17 Self-Avoiding Walk tree Theorem: Consider any binary pairwise MRF on a graph G = (V, E). For any vertex v, the marginal probability computed at the root node of Tsaw(v) is equal to the marginal probability of v in the original MRF. The same theorem holds for MAP, i.e. for the max-marginals. Hence Tsaw can be used to compute exact marginals and MAP assignments for graphs with a small number of cycles.

