Structure Preserving Embedding
Blake Shaw, Tony Jebara. ICML 2009 (Best Student Paper nominee).
Presented by Feng Chen.

Outline
– Motivation
– Solution
– Experiments
– Conclusion

Motivation
Graphs are everywhere: web link networks, social networks, molecular networks, k-nearest neighbor graphs, and more.
Graph Embedding
– Embed a graph in a low-dimensional space.
– What is it used for? Visualization, compression, and more.
– Existing methods: Planar Embedding (no crossing edges); Spectral Embedding (preserves local distances); Laplacian Eigenmaps (preserves local distances).

Problem Statement Given only the connectivity of a graph, can we efficiently recover low-dimensional point coordinates for each node such that these points can be easily used to reconstruct the original structure of the network?

Optimization Formulation
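The formulation itself appears to have been an image on the original slide and is missing from this transcript. Sketching it from the ICML 2009 paper, for the k-nearest-neighbor connectivity case (the b-matching and spanning-tree variants swap in different linear constraints):

\[
\begin{aligned}
\max_{K,\;\xi}\quad & \operatorname{tr}(KA) - C\,\xi \\
\text{s.t.}\quad & K \succeq 0,\quad \operatorname{tr}(K) \le 1,\quad \textstyle\sum_{ij} K_{ij} = 0,\\
& D_{ij} \ge (1 - A_{ij})\,\max_m \left(A_{im} D_{im}\right) - \xi \quad \forall\, i,j,\qquad \xi \ge 0.
\end{aligned}
\]

Here \(D_{ij} = K_{ii} + K_{jj} - 2K_{ij}\) is the squared distance induced by the Gram matrix \(K\), and the main constraint forces every non-neighbor of node i to sit farther away than its farthest neighbor, up to the slack \(\xi\). The final coordinates are read off from the leading eigenvectors of the optimal \(K\).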

Justifications
Why does the embedding preserve local distances? The objective tr(KA) = Σij Aij Kij rewards large inner products between connected nodes, while tr(K) ≤ 1 caps the overall scale, so connected nodes are pulled close together.
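A short identity makes the pull on connected nodes explicit. Writing \(\Delta = \operatorname{diag}(A\mathbf{1})\) for the degree matrix and \(L = \Delta - A\) for the Laplacian:

\[
\sum_{ij} A_{ij} D_{ij} \;=\; \sum_{ij} A_{ij}\left(K_{ii} + K_{jj} - 2K_{ij}\right) \;=\; 2\operatorname{tr}(\Delta K) - 2\operatorname{tr}(AK) \;=\; 2\operatorname{tr}(LK).
\]

Since \(\operatorname{tr}(\Delta K)\) is bounded by the maximum degree once \(\operatorname{tr}(K) \le 1\), maximizing \(\operatorname{tr}(AK)\) drives down the total squared length of the edges, which is exactly the "keep neighbors close" behavior claimed here.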

Justifications
Interpreting the slack variable ξ: the slack lets some connectivity constraints be violated at a linear cost Cξ in the objective, much like soft-margin slack in an SVM, so the program remains feasible even for graphs whose structure cannot be reproduced exactly at the given scale.

Implementation
The task is to find a positive semidefinite matrix that maximizes an objective function, subject to constraints on that matrix. This class of optimization problem is called semidefinite programming (SDP). Available MATLAB solvers include CSDP and SDPLR. A self-contained illustration follows below.
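As a concrete illustration, here is a minimal sketch of the SDP in Python with cvxpy rather than the MATLAB solvers named above; the function name and the exact constraint encoding are mine. It uses the simple per-node "every edge shorter than every non-edge" form with one shared slack, an illustrative simplification of the paper's max-based constraint, not the authors' code:

```python
import numpy as np
import cvxpy as cp

def spe_embed(A, d=2, C=1.0):
    """Illustrative Structure Preserving Embedding via SDP (toy-scale only)."""
    n = A.shape[0]
    K = cp.Variable((n, n), PSD=True)   # Gram matrix of the embedded points
    xi = cp.Variable(nonneg=True)       # shared slack for soft constraints

    def sqdist(i, j):
        # Squared distance induced by the Gram matrix K
        return K[i, i] + K[j, j] - 2 * K[i, j]

    constraints = [cp.trace(K) <= 1,    # bound the scale of the embedding
                   cp.sum(K) == 0]      # center the points at the origin
    for i in range(n):
        for j in range(n):
            if A[i, j]:                 # each edge at node i must be shorter
                for k in range(n):      # than each non-edge at node i
                    if k != i and not A[i, k]:
                        constraints.append(sqdist(i, j) <= sqdist(i, k) + xi)

    cp.Problem(cp.Maximize(cp.trace(A @ K) - C * xi), constraints).solve()

    # Recover d-dimensional coordinates from the top eigenvectors of K
    w, V = np.linalg.eigh(K.value)
    top = np.argsort(w)[::-1][:d]
    return V[:, top] * np.sqrt(np.maximum(w[top], 0))
```

For a small adjacency matrix A, spe_embed(A) returns n-by-2 coordinates. The O(n^3) constraint loop is fine for toy graphs; larger instances would need to add only violated constraints incrementally.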

Related Work
Spectral Embedding
– Find the d leading eigenvectors of the adjacency matrix A and use them as the coordinates.
Laplacian Eigenmaps
– Employ a spectral decomposition of the graph Laplacian L = D - A, where D = diag(A1), and use the d eigenvectors of L with the smallest nonzero eigenvalues.
– Looks odd? This approach has a firm foundation in manifold theory: the graph Laplacian is a discrete approximation of the Laplace-Beltrami operator on the underlying manifold.
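Both baselines reduce to a few lines of dense linear algebra; a minimal numpy sketch (function names are mine):

```python
import numpy as np

def spectral_embedding(A, d=2):
    # Coordinates = d leading eigenvectors of the adjacency matrix A
    w, V = np.linalg.eigh(A)
    return V[:, np.argsort(w)[::-1][:d]]

def laplacian_eigenmaps(A, d=2):
    L = np.diag(A.sum(axis=1)) - A        # graph Laplacian L = D - A
    w, V = np.linalg.eigh(L)              # eigenvalues in ascending order
    nonzero = np.where(w > 1e-9)[0][:d]   # skip the trivial zero eigenvalue(s)
    return V[:, nonzero]
```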

Experiments

Conclusion Innovative idea! Technically simple!

Thank You! Any Questions?