10 December 2008, CIMCA2008 (Vienna)

Statistical Inferences by Gaussian Markov Random Fields on Complex Networks

Kazuyuki Tanaka, Takafumi Usui, Muneki Yasuda
Graduate School of Information Sciences, Tohoku University
Bayesian Networks and Graphical Models

[Figure: applications (image processing, coding theory, data mining, machine learning, probabilistic inference) and graph classes (regular, random, bipartite, and complete graphs, hypergraphs).]

Bayesian networks are formulated as probabilistic models for statistical inference on various networks. Their performance sometimes depends on the statistical properties of the network structure.
Complex Networks

Recently, various kinds of networks, and stochastic models that generate them, have attracted interest in applications of statistical inference.

Degree Distribution

Example graph: vertex set V = {1, 2, 3, 4, 5} (the set of all vertices), edge set E = {{1,2}, {2,3}, {3,4}, {2,4}, {3,5}} (the set of all edges).

The degree of each vertex plays an important role in the statistical properties of network structures, and networks are classified by their degree distributions. (The degree of a vertex is the number of edges connected to it.)
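The degrees in the five-edge example graph above can be computed directly; a minimal sketch in Python, using the vertex labels from the slide:

```python
# Degrees of the example graph with edges {1,2}, {2,3}, {3,4}, {2,4}, {3,5}.
edges = [(1, 2), (2, 3), (3, 4), (2, 4), (3, 5)]
degree = {v: 0 for v in range(1, 6)}
for i, j in edges:
    # Each edge contributes one to the degree of both of its endpoints.
    degree[i] += 1
    degree[j] += 1
print(degree)  # {1: 1, 2: 3, 3: 3, 4: 2, 5: 1}
```

Vertices 2 and 3 have the highest degree (3), so they would act as the "hubs" of this small example.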
Degree Distribution in Complex Networks

Random Network: Poisson distribution. Scale-Free Network: power-law distribution.

It is known that the degree distributions of random networks follow Poisson distributions. Scale-free networks contain hub vertices, so their degree distributions follow power laws.
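The two distributions contrasted above can be written explicitly; a sketch, where λ denotes the mean degree and γ the power-law exponent (both symbols are assumptions, since the slide gives none):

```latex
P(d) = \frac{\lambda^{d} e^{-\lambda}}{d!}
\quad \text{(random network: Poisson)},
\qquad
P(d) \propto d^{-\gamma}
\quad \text{(scale-free network: power law)}.
```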
Purpose of the Present Talk

In this talk, we analyze the statistical performance of Bayesian inference on complex networks, including scale-free networks. We adopt the Gauss Markov random field model as the probabilistic model for statistical inference. The statistical quantities of the Gauss Markov random field model can be calculated using multi-dimensional Gaussian integral formulas.
Prior Probability in Bayesian Inference

Example graph: E = {{1,2}, {2,3}, {3,4}, {2,4}, {3,5}}.

We adopt the Gauss Markov random field model as the prior probability of Bayesian statistics, and the source signals are assumed to be generated according to this prior probability. (I: unit matrix.)
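A minimal sketch of sampling from such a prior, assuming the common Gauss Markov random field form p(x) ∝ exp(−(α/2) xᵀ(L + εI) x), where L is the graph Laplacian and the εI term corresponds to the unit matrix noted on the slide; α and ε are hypothetical hyperparameters:

```python
import numpy as np

def sample_gmrf_prior(edges, n, alpha=1.0, eps=0.1, seed=0):
    # Build the graph Laplacian L from the edge list.
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1
    # Prior precision alpha*(L + eps*I); eps > 0 makes it positive definite.
    prec = alpha * (L + eps * np.eye(n))
    cov = np.linalg.inv(prec)
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(np.zeros(n), cov)

# The five-edge example graph, relabeled to 0-indexed vertices.
edges = [(0, 1), (1, 2), (2, 3), (1, 3), (2, 4)]
x = sample_gmrf_prior(edges, 5)
```

Samples drawn this way are smooth across edges: the Laplacian term penalizes differences (x_i − x_j)² between connected vertices.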
Data-Generating Process in Bayesian Statistical Inference

We assume that the observed data y_i are generated from the source signals x_i by adding white Gaussian noise.
Bayesian Statistics

Source signal → data: the prior probability density function and the data-generating process together determine the posterior probability density function.
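Because both the prior and the noise are Gaussian, the posterior is Gaussian as well, and its mean is available in closed form. A sketch, assuming a prior with precision matrix A = α(L + εI), where L is the graph Laplacian, and noise variance σ²; α, ε, and σ are hypothetical parameter names:

```python
import numpy as np

# Graph Laplacian of the five-edge example graph (0-indexed vertices).
edges = [(0, 1), (1, 2), (2, 3), (1, 3), (2, 4)]
L = np.zeros((5, 5))
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

def posterior_mean(y, L, alpha=1.0, eps=0.1, sigma=0.5):
    # Gaussian conjugacy: posterior precision = prior precision + I/sigma^2,
    # and the posterior mean x_hat solves (A + I/sigma^2) x_hat = y/sigma^2.
    n = len(y)
    A = alpha * (L + eps * np.eye(n))
    return np.linalg.solve(A + np.eye(n) / sigma**2, y / sigma**2)

y = np.array([0.9, 1.1, -0.2, 0.3, 0.0])  # hypothetical observed data
x_hat = posterior_mean(y, L)
```

The posterior mean shrinks each observation toward its graph neighbors, with the balance controlled by α and σ.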
Statistical Performance by Sample Average

The statistical performance is estimated by sample averages: source signals are drawn from the prior probability density function, data are drawn from the data-generating process, and estimates are computed from the posterior probability density function.
Statistical Performance Analysis

From the prior probability density function and the data-generating process, the exact expression for the average of the mean-square error, taken with respect to both the source signals and the observable data, can be derived.
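In the matched case, where the data are generated from the same prior and noise model used for inference, the averaged mean-square error of the posterior mean equals the trace of the posterior covariance, which is a multi-dimensional Gaussian integral with a closed-form answer in terms of the eigenvalues of the prior precision. A sketch, under the same assumed parametrization A = α(L + εI) with noise variance σ²:

```python
import numpy as np

# Laplacian of the five-edge example graph (0-indexed vertices).
edges = [(0, 1), (1, 2), (2, 3), (1, 3), (2, 4)]
L = np.zeros((5, 5))
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

def exact_average_mse(L, alpha=1.0, eps=0.1, sigma=0.5):
    # Eigenvalues lam_k of the prior precision A = alpha*(L + eps*I) give
    # the averaged MSE per vertex in closed form, with no sampling:
    #   MSE = (1/n) * sum_k 1 / (lam_k + 1/sigma^2)
    n = L.shape[0]
    lam = np.linalg.eigvalsh(alpha * (L + eps * np.eye(n)))
    return float(np.mean(1.0 / (lam + 1.0 / sigma**2)))

mse = exact_average_mse(L)
```

Only the Laplacian spectrum enters this formula, which is why the performance depends on the degree structure of the underlying network.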
Erdos and Renyi (ER) model (Random Network)

The following procedure is repeated: choose a pair of vertices {i, j} at random; add a new edge connecting them if no edge between them exists yet. The resulting degree distribution P(d) is a Poisson distribution.
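The procedure above can be sketched directly; the number of edges to add, m, is an assumed parameter:

```python
import numpy as np

def er_graph(n, m, seed=0):
    # Repeat: pick a random vertex pair {i, j}; add the edge if the two
    # vertices are distinct and not yet connected.
    rng = np.random.default_rng(seed)
    edges = set()
    while len(edges) < m:
        i, j = rng.integers(n), rng.integers(n)
        if i != j:
            edges.add((min(i, j), max(i, j)))
    return edges

g = er_graph(50, 100)
```

For large n, each vertex ends up with a binomially distributed degree, which is well approximated by a Poisson distribution with mean 2m/n.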
Barabasi and Albert (BA) model (Scale-Free Network)

The following procedure is repeated: choose a vertex i with probability proportional to its degree; add a new vertex with one edge and connect it to the selected vertex. The resulting degree distribution P(d) follows a power law.
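A sketch of this growth process, using a stub list so that uniform sampling from the list is automatically proportional to degree; seeding the graph from a single edge is an assumption:

```python
import numpy as np

def ba_graph(n, seed=0):
    # Start from one edge; each new vertex v attaches one edge to an
    # existing vertex u chosen with probability proportional to deg(u).
    rng = np.random.default_rng(seed)
    edges = [(0, 1)]
    stubs = [0, 1]  # vertex v appears deg(v) times in this list
    for v in range(2, n):
        u = stubs[rng.integers(len(stubs))]  # degree-proportional choice
        edges.append((u, v))
        stubs += [u, v]
    return edges

g = ba_graph(100)
```

Because high-degree vertices occupy more stubs, they keep attracting new edges, which produces the hub vertices characteristic of scale-free networks.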
Ohkubo and Horiguchi (OH) model (Scale-Free Network)

Assign a fitness parameter ζ(i) to each vertex i using the uniform distribution on the interval [0, 1]. The following procedure is repeated: select an edge {i, j} at random; select a vertex k preferentially, with probability proportional to (d_k + 1) ζ(k); rewire the edge {i, j} to {i, k} if {i, k} is not already an edge.
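A sketch of the rewiring step, assuming the fitness ζ(i) is drawn uniformly from [0, 1] and the attachment weight is (d_k + 1)·ζ(k) as stated on the slide:

```python
import numpy as np

def oh_rewire(edges, zeta, steps, seed=0):
    # Repeat: pick a random edge {i, j}; pick vertex k with probability
    # proportional to (deg(k) + 1) * zeta(k); rewire {i, j} to {i, k}
    # if {i, k} is not already an edge.
    rng = np.random.default_rng(seed)
    edges = {tuple(sorted(e)) for e in edges}
    n = len(zeta)
    for _ in range(steps):
        edge_list = sorted(edges)
        i, j = edge_list[rng.integers(len(edge_list))]
        deg = np.zeros(n)
        for a, b in edges:
            deg[a] += 1
            deg[b] += 1
        w = (deg + 1) * zeta
        w[i] = 0.0  # exclude i itself to avoid a self-loop
        k = int(rng.choice(n, p=w / w.sum()))
        new = (min(i, k), max(i, k))
        if new not in edges:
            edges.discard((i, j))
            edges.add(new)
    return edges

rng = np.random.default_rng(0)
start = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)}
result = oh_rewire(start, rng.random(5), steps=20)
```

Rewiring conserves the total number of edges while concentrating them on high-fitness, high-degree vertices, which yields a scale-free degree distribution.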
Statistical Performance for the GMRF Model on Complex Networks

Results are shown for the random network generated by the ER model and the scale-free networks generated by the BA and OH models. (All isolated vertices are removed.)
Summary

The statistical performance of probabilistic inference by Gauss Markov random field models has been derived for various complex networks. We have presented numerical calculations of the statistical performance for various complex networks, including scale-free networks as well as random networks.