Potential Fields for Maintaining Connectivity of Dynamic Graphs
MEAM 620 Final Project
Michael M. Zavlanos

Problem Formulation

- n mobile agents in an obstacle-free workspace, with single-integrator dynamics dx_i/dt = u_i.
- State-dependent graph G(x): nodes correspond to the agents, and we draw an edge between two nodes if their pairwise distance is smaller than some threshold R.
- Adjacency matrix A(x); graph Laplacian L(x) = D(x) - A(x), where D(x) is the degree matrix.
- λ_1(L(x)) = 0, with corresponding eigenvector 1 (the all-ones vector).
- λ_2(L(x)) > 0 ⇒ G(x) is connected.
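A short sketch of this formulation (not from the slides; the agent positions and the radius R below are made-up values) builds the 0/1 adjacency A(x), the Laplacian L(x), and checks connectivity through λ_2:

import numpy as np

def graph_laplacian(positions, R):
    """Laplacian L(x) = D(x) - A(x) of the R-disk graph on the given positions."""
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    A = ((dist < R) & (dist > 0)).astype(float)   # edge iff 0 < ||x_i - x_j|| < R
    D = np.diag(A.sum(axis=1))                    # degree matrix
    return D - A

# Hypothetical configuration of n = 4 planar agents.
x = np.array([[0.0, 0.0], [1.0, 0.0], [1.5, 1.0], [2.2, 1.8]])
L = graph_laplacian(x, R=1.5)
lam = np.sort(np.linalg.eigvalsh(L))
print("lambda_1 =", lam[0])   # always 0, with eigenvector 1
print("lambda_2 =", lam[1])   # > 0 exactly when G(x) is connected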

Background

- Yoonsoo Kim and Mehran Mesbahi, "On Maximizing the Second Smallest Eigenvalue of a State-Dependent Graph Laplacian," IEEE Transactions on Automatic Control (to appear).
- Let P be an n×(n-1) projection matrix onto the subspace perpendicular to the vector 1.
- L(x) positive semidefinite ⇒ P^T L(x) P positive semidefinite.
- λ_2(P^T L(x) P) > 0 ⇔ P^T L(x) P > 0 (positive definite).
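A minimal sketch of this projection (the particular construction of P is an assumption; any matrix whose columns form an orthonormal basis of the subspace orthogonal to 1 will do):

import numpy as np

def ones_complement_basis(n):
    """Return an n x (n-1) matrix P with orthonormal columns and P^T 1 = 0."""
    ones = np.ones((n, 1))
    Q, _ = np.linalg.qr(np.hstack([ones, np.eye(n)]))   # first column of Q spans 1
    return Q[:, 1:]

# Path graph on 4 nodes (connected) as a stand-in for L(x).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
P = ones_complement_basis(4)
M = P.T @ L @ P                        # (n-1) x (n-1), positive semidefinite
print(np.linalg.eigvalsh(M))           # all eigenvalues > 0 here, so M > 0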

Potential Field Approach

- Eigenvalues of L(x): 0 = λ_1 ≤ λ_2 ≤ … ≤ λ_n.
- Eigenvalues of P^T L(x) P: 0 ≤ λ_2 ≤ … ≤ λ_n.
- λ_2(P^T L(x) P) > 0 ⇔ det(P^T L(x) P) > 0.
- Control law: connectivity is modeled as an obstacle, with repulsive potential 1/det(P^T L(x) P):
  u_i = d/dx_i ( 1/det(P^T L(x) P) )
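Below is an illustrative numerical sketch of this control law, not the authors' implementation. Two assumptions are flagged in the comments: the 0/1 adjacency is replaced by a smooth weight a_ij = exp(-||x_i - x_j||^2 / (2 σ^2)) so that det(P^T L(x) P) is differentiable in x, and the gradient is approximated by central finite differences. The function returns the gradient exactly as written on the slide; under the usual repulsive-potential convention the applied control would be its negative.

import numpy as np

def smooth_laplacian(x, sigma=1.0):
    """Laplacian with smooth weights a_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)).
    (Assumed smoothing; the 0/1 adjacency is not differentiable in x.)"""
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    A = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(A, 0.0)
    return np.diag(A.sum(axis=1)) - A

def potential(x, P):
    """phi(x) = 1 / det(P^T L(x) P), which blows up as the graph nears disconnection."""
    M = P.T @ smooth_laplacian(x) @ P
    return 1.0 / np.linalg.det(M)

def potential_gradient(x, P, eps=1e-6):
    """Central-difference approximation of d phi / d x_i for every agent coordinate."""
    g = np.zeros_like(x)
    for i in range(x.shape[0]):
        for k in range(x.shape[1]):
            dx = np.zeros_like(x)
            dx[i, k] = eps
            g[i, k] = (potential(x + dx, P) - potential(x - dx, P)) / (2.0 * eps)
    return g   # u_i as written on the slide; use -g to descend the repulsive potential

x = np.array([[0.0, 0.0], [1.0, 0.0], [1.5, 1.0], [2.2, 1.8]])
Q, _ = np.linalg.qr(np.hstack([np.ones((4, 1)), np.eye(4)]))
P = Q[:, 1:]                            # orthonormal basis of the subspace perpendicular to 1
print(potential_gradient(x, P))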

Gradient of det(P^T L(x) P)

Let M(x) = P^T L(x) P. By Jacobi's formula we can show that

  d det(M(x)) / dx_i = det(M(x)) tr( M(x)^{-1} dM(x)/dx_i ),

and hence

  d/dx_i ( 1/det(M(x)) ) = - tr( M(x)^{-1} dM(x)/dx_i ) / det(M(x)).
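A quick numerical check of these two identities (purely illustrative; M(t) below is a made-up smooth symmetric matrix family, standing in for M(x) along one coordinate):

import numpy as np

def M(t):
    """A made-up smooth, symmetric, positive-definite matrix family for testing."""
    return np.array([[2.0, t, 0.0],
                     [t, 3.0, np.sin(t)],
                     [0.0, np.sin(t), 4.0]])

t, eps = 0.7, 1e-6
dM = (M(t + eps) - M(t - eps)) / (2.0 * eps)           # dM/dt by central differences
Minv_dM = np.linalg.solve(M(t), dM)                    # M^{-1} dM/dt

# Jacobi's formula:  d det(M)/dt = det(M) tr(M^{-1} dM/dt)
lhs = (np.linalg.det(M(t + eps)) - np.linalg.det(M(t - eps))) / (2.0 * eps)
print(lhs, np.linalg.det(M(t)) * np.trace(Minv_dM))

# Hence:  d (1/det(M))/dt = -tr(M^{-1} dM/dt) / det(M)
lhs = (1.0 / np.linalg.det(M(t + eps)) - 1.0 / np.linalg.det(M(t - eps))) / (2.0 * eps)
print(lhs, -np.trace(Minv_dM) / np.linalg.det(M(t)))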