10 December, 2008 CIMCA2008 (Vienna) 1 Statistical Inferences by Gaussian Markov Random Fields on Complex Networks Kazuyuki Tanaka, Takafumi Usui, Muneki Yasuda Graduate School of Information Sciences, Tohoku University

10 December, 2008 CIMCA2008 (Vienna) 2 Bayesian Network and Graphical Model Bayesian networks are formulated as probabilistic models for statistical inference on various networks: regular graphs, random graphs, bipartite graphs, complete graphs, and hypergraphs. Applications include image processing, coding theory, data mining, machine learning, and probabilistic inference. The performance often depends on the statistical properties of the network structure.

10 December, 2008 CIMCA2008 (Vienna) 3 Complex Networks Recently, various kinds of networks, and models that generate them stochastically, have attracted interest in applications of statistical inference. A network is a graph G = (V, E), where V is the set of all vertices and E is the set of all edges; the example on the slide has V = {1, 2, 3, 4, 5} and E = {{1,2}, {2,3}, {2,4}, {3,4}, {3,5}}. The degree of a vertex (the number of edges connected to it) plays an important role in the statistical properties of network structures, and networks are classified by their degree distributions.
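The degree distribution P(d), i.e. the fraction of vertices with each degree d, can be computed directly from an edge list. A minimal sketch for the five-vertex example graph on the slide:

```python
from collections import Counter

def degree_distribution(vertices, edges):
    """Count edges incident to each vertex, then normalize the
    histogram of degrees into an empirical distribution P(d)."""
    degree = {v: 0 for v in vertices}
    for i, j in edges:
        degree[i] += 1
        degree[j] += 1
    counts = Counter(degree.values())
    n = len(vertices)
    return {d: c / n for d, c in sorted(counts.items())}

# The five-vertex example from the slide.
V = {1, 2, 3, 4, 5}
E = [(1, 2), (2, 3), (2, 4), (3, 4), (3, 5)]
print(degree_distribution(V, E))  # vertices 2 and 3 are the hubs
```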

10 December, 2008 CIMCA2008 (Vienna) 4 Degree Distribution in Complex Networks It is known that the degree distribution P(d) of a random network follows a Poisson distribution. Scale-free networks contain some hub vertices, and their degree distributions follow power-law distributions.
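The two distribution families named on the slide (whose formulas are images in this transcript) can be written out explicitly; the symbols λ (mean degree) and γ (power-law exponent) are standard notation supplied here, not taken from the slide:

```latex
\text{Random network:}\quad P(d) = \frac{\lambda^{d} e^{-\lambda}}{d!},
\qquad
\text{Scale-free network:}\quad P(d) \propto d^{-\gamma}.
```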

10 December, 2008 CIMCA2008 (Vienna) 5 Purpose of the Present Talk In the present paper, we analyze the statistical performance of Bayesian inference on some complex networks, including scale-free networks. We adopt the Gauss-Markov random field (GMRF) model as the probabilistic model for statistical inference. The statistical quantities of the GMRF model can be calculated by using multi-dimensional Gaussian integral formulas.

10 December, 2008 CIMCA2008 (Vienna) 6 Prior Probability in Bayesian Inference We adopt the Gauss-Markov random field model, defined on the edges of the network (e.g. {1,2}, {2,3}, {2,4}, {3,4}, {3,5} in the example graph), as the prior probability of Bayesian statistics, and the source signals are assumed to be generated according to this prior. (I denotes the identity matrix.)
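The prior itself appears only as an image in this transcript. A standard form for the GMRF prior in this line of work is given below as a hedged reconstruction, with α a hyperparameter and C the graph Laplacian of (V, E); the identity matrix I mentioned on the slide typically enters such expressions through the noise covariance or as a regularizer of the improper Laplacian prior:

```latex
P(\mathbf{x} \mid \alpha) \;\propto\;
\exp\!\Bigg( -\frac{\alpha}{2} \sum_{\{i,j\} \in E} (x_i - x_j)^2 \Bigg)
\;=\; \frac{1}{Z(\alpha)}
\exp\!\Big( -\tfrac{1}{2}\, \mathbf{x}^{\mathrm{T}} (\alpha C)\, \mathbf{x} \Big).
```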

10 December, 2008 CIMCA2008 (Vienna) 7 Data Generating Process in Bayesian Statistical Inference As the data generating process, we assume that each observed datum y_i is generated from the source signal x_i by adding white Gaussian noise.

10 December, 2008 CIMCA2008 (Vienna) 8 Bayesian Statistics The source signal x is drawn from the prior probability density function; the data y are produced from x by the data generating process; combining the two by Bayes' rule yields the posterior probability density function.
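Because prior and likelihood are both Gaussian, the posterior mean has a closed form as the solution of a linear system. The sketch below assumes the Laplacian-based GMRF prior with hyperparameter α and additive Gaussian noise of variance σ² (both assumptions, since the slide's formulas are images); under them the posterior mean is (I + σ²αC)⁻¹y:

```python
import numpy as np

def graph_laplacian(n, edges):
    """Combinatorial Laplacian C = D - A of an undirected graph."""
    C = np.zeros((n, n))
    for i, j in edges:
        C[i, i] += 1
        C[j, j] += 1
        C[i, j] -= 1
        C[j, i] -= 1
    return C

def posterior_mean(y, edges, alpha, sigma2):
    """With prior exp(-alpha/2 * x^T C x) and likelihood
    y = x + N(0, sigma2*I), minimizing the negative log-posterior
    gives (I + sigma2*alpha*C) x = y."""
    n = len(y)
    C = graph_laplacian(n, edges)
    return np.linalg.solve(np.eye(n) + sigma2 * alpha * C, y)

# Example on the five-vertex graph from the earlier slide (0-indexed).
edges = [(0, 1), (1, 2), (1, 3), (2, 3), (2, 4)]
y = np.array([1.0, 0.0, 2.0, 1.0, 0.5])
x_hat = posterior_mean(y, edges, alpha=1.0, sigma2=0.4)
```

The estimate smooths y along the edges of the graph while preserving its total sum, since the Laplacian annihilates the constant vector.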

10 December, 2008 CIMCA2008 (Vienna) 13 Statistical Performance Analysis The statistical performance is measured by the sample average of the estimation error, taken over source signals drawn from the prior probability density function and data drawn from the data generating process, with estimates computed from the posterior probability density function.

10 December, 2008 CIMCA2008 (Vienna) 14 Statistical Performance Analysis An exact expression can be derived for the mean square error averaged with respect to both the source signals (drawn from the prior probability density function) and the observable data (drawn from the data generating process).
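The expression itself is an image in this transcript. Under the Laplacian GMRF prior with hyperparameter α and noise variance σ² assumed above, the Gaussian integrals give the closed form below (a hedged reconstruction, with λ_k the eigenvalues of the graph Laplacian C):

```latex
\overline{\mathrm{MSE}}
= \frac{1}{|V|}\, \mathbb{E}\big[\|\hat{\mathbf{x}} - \mathbf{x}\|^2\big]
= \frac{1}{|V|}\, \mathrm{Tr}\big[(\alpha C + \sigma^{-2} I)^{-1}\big]
= \frac{1}{|V|} \sum_{k} \frac{\sigma^2}{1 + \sigma^2 \alpha \lambda_k}.
```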

10 December, 2008 CIMCA2008 (Vienna) 15 Erdős and Rényi (ER) Model (Random Network) The following procedure is repeated: choose a pair of vertices {i, j} at random, and add a new edge connecting them if no edge between them exists yet. The resulting degree distribution P(d) is a Poisson distribution.
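The ER procedure on the slide can be sketched in a few lines (the parameters n and m below are illustrative, not taken from the paper):

```python
import random

def erdos_renyi(n, m, seed=0):
    """Repeatedly pick a random vertex pair and add the edge if it is
    not already present, until the graph has m edges."""
    rng = random.Random(seed)
    edges = set()
    while len(edges) < m:
        i, j = rng.sample(range(n), 2)   # two distinct vertices
        edges.add((min(i, j), max(i, j)))
    return edges

G = erdos_renyi(100, 200)
```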

10 December, 2008 CIMCA2008 (Vienna) 16 Barabási and Albert (BA) Model (Scale-Free Network) The following procedure is repeated: choose an existing vertex i with probability proportional to its degree, then add a new vertex with a single edge connected to the selected vertex i. The resulting degree distribution P(d) follows a power law.
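A minimal sketch of preferential attachment with one edge per new vertex (the standard trick of keeping a list in which each vertex appears once per incident edge makes degree-proportional sampling a uniform choice):

```python
import random

def barabasi_albert(n, seed=0):
    """Grow a network one vertex at a time; each new vertex attaches
    one edge to an existing vertex chosen with probability
    proportional to its degree."""
    rng = random.Random(seed)
    edges = [(0, 1)]             # seed graph: a single edge
    targets = [0, 1]             # each vertex appears degree-many times
    for v in range(2, n):
        u = rng.choice(targets)  # uniform on targets = degree-proportional
        edges.append((u, v))
        targets.extend([u, v])
    return edges

G = barabasi_albert(1000)
```

With a single edge per new vertex the result is a tree of n-1 edges; early vertices accumulate large degrees and become hubs.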

10 December, 2008 CIMCA2008 (Vienna) 17 Ohkubo and Horiguchi (OH) Model (Scale-Free Network) Assign a fitness parameter η(i) to each vertex i, drawn from the uniform distribution on the interval [0, 1]. The following procedure is then repeated: select an edge {i, j} at random; select a vertex k preferentially, with probability proportional to (d_k + 1) η(k); rewire the edge {i, j} to {i, k} if {i, k} is not already an edge.
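The rewiring dynamics can be sketched as below; the initial random graph and the parameter values are illustrative assumptions, since the slide only specifies the rewiring rule:

```python
import random

def ohkubo_horiguchi(n, m, steps, seed=0):
    """Start from a random graph with m edges, then repeatedly rewire:
    pick a random edge {i, j} and a target vertex k chosen with
    probability proportional to (degree(k) + 1) * eta(k), where eta is
    a quenched uniform [0, 1] fitness; replace {i, j} by {i, k}."""
    rng = random.Random(seed)
    eta = [rng.random() for _ in range(n)]
    edges = set()
    while len(edges) < m:                 # initial random graph
        i, j = rng.sample(range(n), 2)
        edges.add((min(i, j), max(i, j)))
    degree = [0] * n
    for i, j in edges:
        degree[i] += 1
        degree[j] += 1
    for _ in range(steps):
        i, j = rng.choice(sorted(edges))  # random existing edge
        weights = [(degree[k] + 1) * eta[k] for k in range(n)]
        k = rng.choices(range(n), weights=weights)[0]
        new = (min(i, k), max(i, k))
        if k != i and k != j and new not in edges:
            edges.remove((i, j))          # rewire {i, j} -> {i, k}
            edges.add(new)
            degree[j] -= 1
            degree[k] += 1
    return edges

G = ohkubo_horiguchi(200, 400, 2000)
```

Rewiring conserves the number of edges, so only the degree distribution changes as high-fitness, high-degree vertices attract edges.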

10 December, 2008 CIMCA2008 (Vienna) 18 Statistical Performance for the GMRF Model on Complex Networks Results are compared for the random network generated by the ER model and the scale-free networks generated by the BA and OH models. All isolated vertices are removed before the analysis.

10 December, 2008 CIMCA2008 (Vienna) 19 Summary The statistical performance of probabilistic inference by Gauss-Markov random field models has been derived for various complex networks. We have presented numerical calculations of the statistical performance for various complex networks, including scale-free networks as well as random networks.