Fundamental weights of a graph
Piet Van Mieghem, in collaboration with Xiangrong Wang
Spectra of graphs and applications, May 18-20, 2016, Belgrade, Serbia

In honor of Dragos Cvetkovic

Motivation
Any simple graph on N nodes can be represented by N^2 - N bits. The "Big Data Age" asks for fewer bits (storage).
History in brief:
- Cvetkovic: introduction of "graph angles"
- Haemers and van Dam: the vector of eigenvalues, called the spectrum, is a good signature with aN bits (a = number of bits for a real number in a computer)
- Cospectral graphs exist (e.g. isomorphisms/automorphisms, Godsil-McKay switching)
Aim: search for the most compact representation that still allows reconstruction of the graph.

Eigenvalues and eigenvectors
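The equations of this slide are lost in the transcript. For a symmetric adjacency matrix A they are presumably the standard eigenvalue relations (a reconstruction, not slide-verbatim):

```latex
A x_k = \lambda_k x_k , \qquad \lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_N ,
\qquad A = X \Lambda X^T = \sum_{k=1}^{N} \lambda_k \, x_k x_k^T ,
```

where the columns of X = [x_1 x_2 ... x_N] are the eigenvectors of A and Λ = diag(λ_1, ..., λ_N).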

The orthogonal matrix X
Open problem: the properties of the orthogonal matrix X.
- Orthogonality of eigenvectors: the columns of X are orthonormal.
- A matrix and its inverse commute; since X^(-1) = X^T, both X^T X = I and X X^T = I hold.
- Double orthogonality: both the column vectors (= the eigenvectors of A) and the row vectors of X are orthogonal.
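The double orthogonality can be checked numerically. A minimal numpy sketch, using a hypothetical 5-node cycle as the graph (not an example from the slides):

```python
import numpy as np

# Hypothetical example graph: a 5-node cycle (adjacency matrix A).
N = 5
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0

# eigh returns an orthonormal set of eigenvectors as the columns of X.
eigenvalues, X = np.linalg.eigh(A)

# Double orthogonality: X^T X = X X^T = I, so the rows of X are
# orthonormal as well as the columns.
print(np.allclose(X.T @ X, np.eye(N)))  # True
print(np.allclose(X @ X.T, np.eye(N)))  # True
```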

The eigenvector matrix
All components of an eigenvector.

Total number of graphs: again O(N^2) bits
The count of non-isomorphic graphs on N nodes involves A_G(N), the average number of automorphisms among all graphs G(N) with N nodes. The number of bits in the representation of a graph is the base-2 logarithm of the total number of graphs.
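The counting formula itself is missing from the transcript. Consistent with the A_G(N) notation on the slide, it is presumably the classical orbit count (a hedged reconstruction):

```latex
|\mathcal{G}(N)| = A_{G(N)} \, \frac{2^{\binom{N}{2}}}{N!} ,
\qquad
\log_2 |\mathcal{G}(N)| = \binom{N}{2} - \log_2 N! + \log_2 A_{G(N)}
= \frac{N^2}{2} - O(N \log_2 N) .
```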

Contemplation: spectrum and, again, bits
- There are N^2 orthogonality conditions for the N^2 elements of X; unfortunately, they are quadratic equations in the X_ij.
- The importance of X is weighted by the diagonal matrix Λ containing the eigenvalues.
- Number of bits: O(N^2). Can X be represented by o(N^2) bits?

Fundamental weight vector and its dual
Van Mieghem, P., 2015, "Graph eigenvectors, fundamental weights and centrality metrics for nodes in networks", Delft University of Technology, report.
- Graph angle (Cvetkovic)
- Fundamental weight
- Dual fundamental weight
- Corresponding vectors
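The definitions themselves are lost in the transcript. Following the cited report, they are presumably as below (notation assumed, not slide-verbatim): Cvetkovic's graph angle α_{jk} is the cosine of the angle between the basis vector e_j and the eigenspace of λ_k, while

```latex
w_k = u^T x_k = \sum_{j=1}^{N} (x_k)_j
\quad \text{(fundamental weight)}, \qquad
\varphi_j = \sum_{k=1}^{N} (x_k)_j
\quad \text{(dual fundamental weight)},
```

with corresponding vectors w = X^T u and φ = X u, where u is the all-one vector.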

Properties of the adjacency matrix A
- Norm
- Scalar product
- Walks
- Regular graph
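The formulas behind these bullets are lost, but two of them (walks and regular graphs) admit a quick numpy illustration; the 5-node cycle below is a hypothetical example, not taken from the slides:

```python
import numpy as np

# Hypothetical example graph: a 5-node cycle, which is 2-regular.
N = 5
A = np.zeros((N, N), dtype=int)
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1

# Walks: (A^k)_{ij} counts the walks of length k from node i to node j.
walks3 = np.linalg.matrix_power(A, 3)
print(walks3.sum(axis=1))  # each row sums to d^3 = 8 walks of length 3

# Regular graph: the all-one vector u is an eigenvector, A u = d u.
u = np.ones(N)
print(np.allclose(A @ u, 2 * u))  # True
```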

Properties of the Laplacian matrix Q
Since the all-one vector u is an eigenvector of any Laplacian Q, only the dual fundamental weight vector φ contains graph information.
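A minimal numpy sketch of this claim; the definitions w = X^T u and φ = X u are taken from the cited report, and the 5-node cycle is a hypothetical example:

```python
import numpy as np

# Hypothetical example: Laplacian Q = D - A of a 5-node cycle.
N = 5
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0
Q = np.diag(A.sum(axis=1)) - A

mu, X = np.linalg.eigh(Q)  # columns of X: eigenvectors of Q
u = np.ones(N)

# The all-one vector u is an eigenvector of any Laplacian: Q u = 0.
print(np.allclose(Q @ u, 0))  # True

# The fundamental weights w_k = u^T x_k vanish for every eigenvector
# orthogonal to u, so w carries no graph information ...
w = X.T @ u
print(np.allclose(np.abs(w), np.sqrt(N) * np.eye(N)[0]))  # True

# ... while the dual weights phi_j = sum_k (x_k)_j depend on X,
# subject only to the norm constraint ||phi||^2 = N.
phi = X @ u
print(np.isclose(phi @ phi, N))  # True
```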

Dual fundamental Laplacian weight
In Erdos-Renyi random graphs, we study the distribution of a uniformly at random chosen component of the dual fundamental weight vector φ (which, as opposed to w, is not invariant under a node relabeling).
Wang, X. and P. Van Mieghem, 2015, "Orthogonal Eigenvector Matrix of the Laplacian", Fourth International IEEE Workshop on Complex Networks and their Applications, November 23-27, Bangkok, Thailand.
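A minimal simulation sketch in numpy (no plotting). It assumes, per the cited report, that the dual fundamental weight vector is φ = X u with X the Laplacian eigenvector matrix; the graph size, density, and seed are arbitrary choices:

```python
import numpy as np

# Sample the dual fundamental Laplacian weights phi = X u over
# Erdos-Renyi random graphs G(N, p).
rng = np.random.default_rng(0)
N, p, runs = 50, 0.5, 20
samples = []
for _ in range(runs):
    # Random symmetric 0/1 adjacency matrix with zero diagonal.
    A = np.triu(rng.random((N, N)) < p, 1).astype(float)
    A = A + A.T
    Q = np.diag(A.sum(axis=1)) - A
    _, X = np.linalg.eigh(Q)
    samples.extend(X @ np.ones(N))  # the N components of phi
samples = np.asarray(samples)

# Sanity check: ||phi||^2 = N for every graph, hence the mean square
# of a uniformly chosen component equals 1.
print(np.isclose((samples ** 2).mean(), 1.0))  # True
```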

Probability density function (simulations for N = 50).

An accurate fit: a super-Gaussian (simulations).

Conclusions
- The least-number-of-bits representation problem for graphs seems to be open.
- Four characteristic vectors (for the adjacency matrix A): if they suffice, the graph representation condenses to 4aN bits!

Plan: a second edition by 2020. Any help (your new papers, results that I missed, comments) will be acknowledged.

Thank you!
Piet Van Mieghem, NAS, TUDelft