
Vicsek's System with Integer-Valued Measurement Delays

M. Cao (Yale University), A. S. Morse (Yale University), B. D. O. Anderson (Australian National University)

Napa Valley, August 3, 2005

Motivated by simulation results reported in the paper by Vicsek et al. {Physical Review Letters, 1995}:

θ_i = heading of agent i, s = speed (the same for all agents)

Vicsek et al. simulated a flock of n agents {particles}, all moving in the plane at the same speed s but with different headings θ_1, θ_2, …, θ_n. Each agent's heading is updated using a local rule based on the average of its own current heading plus the headings of its "neighbors." Vicsek's simulations demonstrated that these nearest-neighbor rules can cause all agents to eventually move in the same direction, despite the absence of a leader and/or centralized coordination, and despite the fact that each agent's set of neighbors changes with time. A theoretical explanation for this observed behavior can be found in Jadbabaie, Lin & Morse, IEEE TAC, June 2003.

[Figure: agent i and its sensing radius r_i; the neighbors of agent i are the agents within this radius. Each agent is a neighbor of itself.]

θ_i = heading, s = speed
N_i(t) = set of indices of agent i's "neighbors" at time t
n_i(t) = number of indices in N_i(t)

HEADING UPDATE EQUATIONS

θ_i(t+1) = (1/n_i(t)) Σ_{j ∈ N_i(t)} θ_j(t) = the average at time t of the headings of the neighbors of agent i.

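A minimal simulation sketch of the heading-update rule just described, assuming a common sensing radius r and the arithmetic averaging of headings used above; the function and parameter names (step, r, s) are illustrative and not from the slides.

```python
import numpy as np

def step(positions, headings, r=1.0, s=0.1):
    """One synchronous Vicsek update: each agent averages the headings of all
    agents within distance r of it (itself included), then moves at speed s."""
    n = len(headings)
    new_headings = np.empty(n)
    for i in range(n):
        dist = np.linalg.norm(positions - positions[i], axis=1)
        Ni = np.where(dist <= r)[0]              # N_i(t): neighbor indices, contains i
        new_headings[i] = headings[Ni].mean()    # average of the neighbors' headings
    velocity = s * np.column_stack((np.cos(new_headings), np.sin(new_headings)))
    return positions + velocity, new_headings

# Example run: 20 agents with random initial positions and headings.
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 3.0, (20, 2))
theta = rng.uniform(-np.pi, np.pi, 20)
for _ in range(50):
    pos, theta = step(pos, theta)
```
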
NEIGHBOR CONFIGURATIONS

V = agent index set = {1, 2, …, n}
P = index set of all possible neighbor configurations.
For each p ∈ P, G_p = {V, A_p} is a directed graph with vertex set V and arc set A_p, where (i, j) ∈ A_p if agent i is a neighbor of agent j in configuration p. All vertices have self-arcs.
G = set of all directed graphs with vertex set V
G* = set of all directed graphs with vertex set V and self-arcs at all vertices
[Figure: an example neighbor graph containing the arc (1,2).]

Matrix Representation of G_p = {V, A_p}

adjacency matrix A_p = [a_ij]_{n×n}, where a_ij(p) = 1 if i is a neighbor of j and a_ij(p) = 0 otherwise
d_i(p) = in-degree of vertex i
D_p = diagonal{d_1(p), d_2(p), …, d_n(p)}, an n×n matrix
[Figure: the example graph with arc (1,2); its vertices have in-degrees 1 and 2.]

State Space Equation

σ(t) = index in P of the neighbor configuration at time t.

Writing θ = [θ_1 θ_2 … θ_n]', the n heading update equations can be written compactly as

θ(t+1) = F_{σ(t)} θ(t),      (1)

where F_p = D_p^{-1} A_p' is the "flocking matrix" of configuration p.

A switched linear system

Each flocking matrix F_p has: 1. non-negative entries; 2. row sums all equal to 1. That is, F_p is stochastic: F_p 1 = 1.

The induced infinity norm of an n×n nonnegative matrix Q, written ||Q||, is the largest of Q's row sums, so ||F_p|| = 1.
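
A small numerical check of these two properties, using an assumed 3-agent neighbor graph (the arc pattern in A below is illustrative) and the flocking matrix F_p = D_p^{-1} A_p' from the state space equation:

```python
import numpy as np

# a_ij = 1 if i is a neighbor of j; every agent is a neighbor of itself (self-arcs).
A = np.array([[1, 1, 0],
              [0, 1, 1],
              [0, 1, 1]])
D = np.diag(A.sum(axis=0))        # d_i = in-degree of vertex i
F = np.linalg.inv(D) @ A.T        # row i of F averages the headings of i's neighbors

assert (F >= 0).all()                                  # 1. non-negative entries
assert np.allclose(F.sum(axis=1), 1.0)                 # 2. row sums all = 1, i.e. F 1 = 1
assert np.isclose(np.abs(F).sum(axis=1).max(), 1.0)    # induced infinity norm ||F|| = 1
```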

and so θ(t) = F_{σ(t-1)} ⋯ F_{σ(1)} F_{σ(0)} θ(0).

The problem reduces to determining conditions under which the product F_{σ(t)} ⋯ F_{σ(1)} F_{σ(0)} converges to 1c, where c is a constant 1×n row vector. For if this is so, then θ(t) converges to 1(c θ(0)); that is, all n headings converge to the common value c θ(0).

S = set of all n×n stochastic matrices.
S* = set of all n×n stochastic matrices with positive diagonal elements.
All n×n flocking matrices F_p are in S*. S and S* are each closed under multiplication.

Therefore it is sufficient to determine conditions on a sequence of matrices S_1, S_2, … in S under which the product S_j ⋯ S_2 S_1 converges to 1c.


This is a well-studied problem in the theory of non-homogeneous Markov chains. We will outline {de-mystify} old and new convergence results in terms of the graphs of the S_i.

For S ∈ S, graph(S) is the graph in G whose adjacency matrix is the transpose of the matrix which results when all non-zero entries in S are replaced by 1s. In particular, graph(F_p) = G_p, p ∈ P.

For any S ∈ S one can write S = 1⌊S⌋ + ⌈S⌉, where ⌊S⌋ = the largest 1×n row vector c for which S - 1c is non-negative, and ⌈S⌉ = S - 1⌊S⌋.

In particular, S_j ⋯ S_2 S_1 = 1⌊S_j ⋯ S_2 S_1⌋ + ⌈S_j ⋯ S_2 S_1⌉. Thus, for any sequence of matrices S_1, S_2, … in S, if ⌈S_j ⋯ S_1⌉ converges to 0, then the product S_j ⋯ S_1 converges to 1c, where c = lim_{j→∞} ⌊S_j ⋯ S_1⌋.

Can show that ||⌈S_j ⋯ S_1⌉|| ≤ ||⌈S_j⌉|| ⋯ ||⌈S_2⌉|| ||⌈S_1⌉||.
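
A short sketch of this decomposition and of the sub-multiplicative bound; floor_row, ceil_part and seminorm are names chosen here for ⌊S⌋, ⌈S⌉ and ||⌈S⌉||, and the random test at the end is only an illustration.

```python
import numpy as np

def floor_row(S):
    """⌊S⌋: the largest 1×n row vector c with S - 1c nonnegative (the column minima of S)."""
    return S.min(axis=0)

def ceil_part(S):
    """⌈S⌉ := S - 1⌊S⌋, so that S = 1⌊S⌋ + ⌈S⌉ with ⌈S⌉ nonnegative."""
    return S - np.ones((S.shape[0], 1)) * floor_row(S)

def seminorm(S):
    """||⌈S⌉||: the largest row sum of the nonnegative matrix ⌈S⌉."""
    return ceil_part(S).sum(axis=1).max()

# Illustration of ||⌈S2 S1⌉|| <= ||⌈S2⌉|| ||⌈S1⌉|| on random stochastic matrices.
rng = np.random.default_rng(0)
def random_stochastic(n):
    M = rng.random((n, n))
    return M / M.sum(axis=1, keepdims=True)

S1, S2 = random_stochastic(4), random_stochastic(4)
assert seminorm(S2 @ S1) <= seminorm(S2) * seminorm(S1) + 1e-12
```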

Since ||⌈S_j ⋯ S_1⌉|| ≤ ||⌈S_j⌉|| ⋯ ||⌈S_2⌉|| ||⌈S_1⌉||, the matrix ⌈S_j ⋯ S_1⌉ converges to 0, and therefore the product S_j ⋯ S_1 converges to 1c, if the ⌈S_i⌉ are each contractive, i.e., if the ||⌈S_i⌉|| are uniformly less than 1.


Call a graph in G strongly rooted if at least one vertex is a neighbor of every other vertex in the graph. Note that graph(S) is strongly rooted at vertex i iff the ith column of S is positive.

Since S = 1⌊S⌋ + ⌈S⌉, the row sums of ⌈S⌉ are all less than 1 iff graph(S) is strongly rooted; thus ||⌈S⌉|| < 1 iff graph(S) is strongly rooted.

Since the product S_j ⋯ S_1 converges to 1c if the ⌈S_i⌉ are each contractive, it follows that S_j ⋯ S_1 converges to 1c if, for all i, graph(S_i) is strongly rooted.
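
A sketch of both tests, under the adjacency convention used above (adj[i][j] = 1 iff i is a neighbor of j); the function names are ours.

```python
import numpy as np

def is_strongly_rooted(adj):
    """Strongly rooted iff some vertex v is a neighbor of every other vertex,
    i.e. row v of the adjacency matrix is positive everywhere off the diagonal."""
    n = adj.shape[0]
    full_row = ((adj > 0) | np.eye(n, dtype=bool)).all(axis=1)
    return bool(full_row.any())

def stochastic_graph_strongly_rooted(S):
    """Equivalent test on a stochastic matrix: graph(S) is strongly rooted at
    vertex i iff the i-th column of S is positive."""
    return bool((np.asarray(S) > 0).all(axis=0).any())
```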

Transitioning from Matrices to Graphs

Recall that S and S* are both closed under multiplication. Because of this, any infinite product of stochastic matrices can be decomposed into an infinite product of stochastic matrices which are themselves each products of a given finite number of stochastic matrices. Thus establishing convergence to 1c of an infinite product of stochastic matrices from either S or S* boils down to determining when the graph of a finite product is strongly rooted.

This issue can be decided completely in terms of graphs, without any reference at all to stochastic matrices, linearity or the like. This is especially important because it allows one to consider flocking {consensus} problems which are nonlinear – e.g., the elegant work of Tanner, Jadbabaie & Pappas which explains Reynolds' Academy Award-winning animation algorithms.


By the composition of a graph G_1 with a graph G_2, written G_2 ∘ G_1, is meant the graph which has an arc from i to j just in case there is a vertex k for which G_1 has an arc from i to k and G_2 has an arc from k to j. For stochastic matrices, graph(S_2 S_1) = graph(S_2) ∘ graph(S_1).

Thus deciding when a finite product of stochastic matrices has a strongly rooted graph is the same problem as deciding when a finite composition of graphs is strongly rooted.
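
A sketch of graph composition and of the identity graph(S_2 S_1) = graph(S_2) ∘ graph(S_1); the two stochastic matrices below are arbitrary examples.

```python
import numpy as np

def graph_of(S):
    """Adjacency matrix of graph(S): transpose of the 0/1 pattern of S
    (adj[i][j] = 1 iff i is a neighbor of j)."""
    return (np.asarray(S) > 0).astype(int).T

def compose(adj2, adj1):
    """Adjacency matrix of G2 ∘ G1: (i, j) is an arc iff there is a k with
    (i, k) an arc of G1 and (k, j) an arc of G2."""
    return ((adj1 @ adj2) > 0).astype(int)

S1 = np.array([[0.5, 0.5, 0.0], [0.0, 1.0, 0.0], [0.0, 0.5, 0.5]])
S2 = np.array([[1.0, 0.0, 0.0], [0.5, 0.5, 0.0], [0.0, 0.0, 1.0]])
assert (graph_of(S2 @ S1) == compose(graph_of(S2), graph_of(S1))).all()
```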

When is the composition of a finite number of graphs in either G or G* strongly rooted?

By a rooted graph is meant any graph G ∈ G which has at least one vertex v {called a root} for which, for each vertex i ∈ V, there is a directed path from v to i.

The set of rooted graphs in G* is closed under composition. Every composition of n² or more rooted graphs in G* is strongly rooted. The set of rooted graphs in G* is the largest set of graphs in G* for which every sufficiently long composition is strongly rooted.
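
A sketch of a rootedness test via reachability; the claim that sufficiently long compositions of rooted graphs become strongly rooted can be exercised with the compose and is_strongly_rooted sketches above.

```python
import numpy as np

def is_rooted(adj):
    """adj[i][j] = 1 iff (i, j) is an arc.  Rooted iff some vertex v has a
    directed path to every vertex: row v of (I + adj)^(n-1) is everywhere positive."""
    n = adj.shape[0]
    reach = np.linalg.matrix_power(np.eye(n, dtype=int) + adj, n - 1)
    return bool((reach > 0).all(axis=1).any())
```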

Reaching a Common Heading

Using the preceding, it is easy to see that any infinite sequence of rooted graphs in G* can be re-written as an infinite sequence of strongly rooted graphs (group the graphs into successive compositions of n² at a time).

If each graph in the sequence of graphs G_σ(0), G_σ(1), … encountered along a trajectory of (1) is rooted, then the headings θ_1(t), …, θ_n(t) converge to a common steady-state heading at an exponentially fast convergence rate.

The Vicsek Flocking Problem with Integer-Valued Sensing Delays

A modified version of the Vicsek flocking problem in which integer-valued delays occur in the sensing of each agent's heading. Specifically, at each time t ∈ {0, 1, 2, …}, the value of neighbor j's heading which agent i senses is θ_j(t - d_ij(t)), where d_ij(t) is a delay whose value at t is some integer between 0 and m_j - 1. Here m_j is a positive integer.

The Vicsek Model with Sensing Delays

There are n agents labeled 1 through n. θ_i = heading, s = speed.
N_i(t) = set of indices of agent i's neighbors, including itself, at time t
n_i(t) = number of indices in N_i(t)

At each time t, each agent i can sense θ_j(t - d_ij(t)), where j is a neighbor of i, d_ij(t) ∈ {0, 1, …, m_j - 1} if j ≠ i, and d_ij(t) = 0 if j = i.
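
A minimal simulation sketch of this delayed update rule, with fixed neighbor sets chosen for illustration and delays drawn at random within the stated bounds; none of the names or numbers below come from the slides.

```python
import numpy as np

def delayed_step(history, neighbor_sets, m, rng):
    """history[-1] = θ(t), history[-2] = θ(t-1), ...  Returns θ(t+1)."""
    n = len(history[-1])
    new = np.empty(n)
    for i in range(n):
        sensed = []
        for j in neighbor_sets[i]:
            d = 0 if j == i else int(rng.integers(0, m[j]))  # d_ij(t) in {0, ..., m_j - 1}
            d = min(d, len(history) - 1)                     # early steps: use oldest stored value
            sensed.append(history[-1 - d][j])                # θ_j(t - d_ij(t))
        new[i] = np.mean(sensed)                             # average of the sensed headings
    return new

rng = np.random.default_rng(1)
n, m = 4, [3, 2, 3, 2]                                       # m_j bounds the delay in sensing agent j
neighbor_sets = [{0, 1}, {0, 1, 2}, {1, 2, 3}, {2, 3}]       # each N_i contains i; this graph is rooted
history = [rng.uniform(-np.pi, np.pi, n)]
for t in range(200):
    history.append(delayed_step(history, neighbor_sets, m, rng))
# With a rooted neighbor graph and bounded delays the headings should approach a common value.
```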

State Space Modeling

There is a fairly straightforward way to model the preceding as a switched state space system. To do this, one needs to define suitable graphs which characterize the temporal and spatial relationships implicit in the update equations. What results is a switched linear system similar to the one encountered in the delay-free case, except for the type of graphs involved.

Graphs for Modeling Delays

Let G denote the set of all directed graphs with vertex set V = V_1 ∪ V_2 ∪ … ∪ V_n, where V_i = {v_i1, v_i2, …, v_i m_i}.

Let G* denote the subset of G consisting of those graphs which satisfy the following:
1. Each vertex in the agent vertex set {v_11, v_21, …, v_n1} has a self-arc.
2. For each i ∈ {1, 2, …, n}, each vertex v_ij ∈ V_i except the last has an arc to its successor v_i(j+1) ∈ V_i.
3. For each i ∈ {1, 2, …, n}, each vertex v_ij ∈ V_i except the first has in-degree 1.

State Space Model

G is the set of all directed graphs with vertex set V = V_1 ∪ … ∪ V_n, where V_i = {v_i1, …, v_i m_i}; G* is the subset of G consisting of those graphs needed to model the possible temporal and spatial relations implicit in the n update equations.

P = index set for G, P* = index set for G*. Fix the vertex ordering.
A_p = adjacency matrix of G_p, p ∈ P*
D_p = diagonal in-degree matrix of G_p, p ∈ P*
σ(t) = index in P* of the configuration at time t
m = m_1 + m_2 + … + m_n

If all vertices of graphs in G* had self-arcs, then existing results could be applied. For example, were this the case, then convergence to a common heading would result if all graphs encountered along a trajectory were rooted. But there are many vertices in graphs in G* which do not have self-arcs. In fact, G* is not even closed under composition.

Hierarchical Graphs

A rooted graph with vertex set {1, 2, …, n} is said to have a hierarchy {v_1, v_2, …, v_n} if it is possible to re-label the graph's vertices as v_1, v_2, …, v_n in such a way that v_1 is a root with a self-arc and, for i > 1, v_i has a neighbor v_j lower in the hierarchy, where by lower we mean j < i.
[Figure: an example graph with the two hierarchies {1, 2, 3, 4, 5} and {3, 1, 2, 4, 5}.]

Two graphs with the same hierarchy need not be equal. The set of graphs with the same hierarchy is closed under composition. The composition of n-1 or more graphs with the same hierarchy is strongly rooted.

(Although hierarchical graphs have been devised to deal with the Vicsek delay problem, they are also potentially applicable to leader-follower formation problems.)
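
A sketch of a check that a given vertex ordering is a hierarchy, using the same neighbor convention as before (adj[u][v] = 1 iff u is a neighbor of v); is_hierarchy is our name.

```python
import numpy as np

def is_hierarchy(adj, order):
    """True iff the vertex ordering `order` = (v_1, ..., v_n) is a hierarchy:
    v_1 has a self-arc and every later vertex has a neighbor earlier in the
    ordering (which in particular makes v_1 a root)."""
    v1 = order[0]
    if not adj[v1, v1]:
        return False
    earlier = {v1}
    for v in order[1:]:
        if not any(adj[u, v] for u in earlier):   # needs a neighbor lower in the hierarchy
            return False
        earlier.add(v)
    return True
```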

Delay Graphs

Recall that the sub-graph of a graph G induced by a subset S of G's vertices is the graph whose arcs (i, j) are those of G for which both i and j are in S.

By a delay graph G is meant any graph in G with the following properties for each i ∈ {1, 2, …, n}:
1. Every neighbor of V_i which is not in V_i is a neighbor of v_i1.
2. The sub-graph of G induced by V_i has {v_i1, v_i2, …, v_i m_i} as a hierarchy.

Every graph needed to model delays, namely every graph in G*, is a delay graph. The set of delay graphs in G is closed under composition; G* is not.

Graph for Modeling Delays
[Figure: an example graph for modeling delays; it has three hierarchies and it is a delay graph, as is a second example shown.]

By the quotient graph of G ∈ G is meant the graph with vertex set {1, 2, …, n} and arc set defined so that (i, j) is an arc just in case there is an arc from at least one vertex in V_i to at least one vertex in V_j.
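
A sketch of the quotient-graph construction; blocks[i] lists the vertices of V_i (the indexing scheme is illustrative).

```python
import numpy as np

def quotient_graph(adj, blocks):
    """adj: adjacency matrix of a graph on V_1 ∪ ... ∪ V_n, blocks[i] = vertex
    indices of V_i.  (i, j) is an arc of the quotient graph iff some vertex of
    V_i has an arc to some vertex of V_j."""
    n = len(blocks)
    Q = np.zeros((n, n), dtype=int)
    for i, Vi in enumerate(blocks):
        for j, Vj in enumerate(blocks):
            if adj[np.ix_(list(Vi), list(Vj))].any():
                Q[i, j] = 1
    return Q
```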

[Figure: an example graph for modeling delays and its quotient graph on the vertices {1, 2, 3}.]

Let m* = the largest integer in {m_1, m_2, …, m_n}.

The composition of any set of at least m*(n² + 1) delay graphs is strongly rooted if the quotient graph of each of the graphs in the composition is rooted.

Generalization

Call a finite sequence of delay graphs G_1, G_2, …, G_m jointly quotient rooted if the quotient graph of the composition G_m ∘ … ∘ G_1 is a rooted graph. Call an infinite sequence of delay graphs G_1, G_2, …, G_j, … repeatedly jointly quotient rooted if there is a finite integer q such that each successive subsequence G_{q(k-1)+1}, G_{q(k-1)+2}, …, G_{qk}, k ≥ 1, is jointly quotient rooted.

Using the preceding ideas, it is easy to see that any repeatedly jointly quotient rooted sequence of delay graphs can be re-written as an infinite sequence of strongly rooted graphs. If the sequence of graphs G_σ(0), G_σ(1), … encountered along a trajectory of the delayed state space system is repeatedly jointly quotient rooted, then all headings converge to a common heading at an exponentially fast convergence rate.

Concluding Remarks

Establishing that agents reach a common heading {i.e., consensus} amounts to establishing convergence of an infinite product of stochastic matrices to a rank-one matrix. In turn, establishing convergence of an infinite product of stochastic matrices boils down to determining when the graph of a finite product of such matrices is strongly rooted.

This issue can be decided completely in terms of graphs, without any reference at all to stochastic matrices, linearity or the like. The key point is that this allows us to consider flocking {consensus} problems which are nonlinear.

The central question reduces to: When is a finite composition of a sequence of directed graphs strongly rooted?

Generalization

Call a finite sequence of graphs G_1, G_2, …, G_m in G jointly rooted if the composition G_m ∘ … ∘ G_1 is a rooted graph. Call an infinite sequence of graphs G_1, G_2, …, G_j, … in G repeatedly jointly rooted if there is a finite integer m such that each successive subsequence G_{m(k-1)+1}, G_{m(k-1)+2}, …, G_{mk}, k ≥ 1, is jointly rooted.

Using the preceding ideas, it is easy to see that any repeatedly jointly rooted sequence of graphs in G* can be re-written as an infinite sequence of strongly rooted graphs. If the sequence of graphs G_σ(0), G_σ(1), … encountered along a trajectory of (1) is repeatedly jointly rooted, then all headings converge to a common heading at an exponentially fast convergence rate.
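
A sketch of the jointly-rooted test: compose the sequence and check rootedness of the result (composition order and adjacency conventions as in the earlier sketches).

```python
import numpy as np

def compose_sequence(adjs):
    """Adjacency matrix of G_m ∘ ... ∘ G_1 for adjs = [A_1, ..., A_m]."""
    out = adjs[0]
    for A in adjs[1:]:
        out = ((out @ A) > 0).astype(int)   # left-compose the next graph
    return out

def jointly_rooted(adjs):
    """A finite sequence of graphs is jointly rooted iff its composition is rooted."""
    C = compose_sequence(adjs)
    n = C.shape[0]
    reach = np.linalg.matrix_power(np.eye(n, dtype=int) + C, n - 1)
    return bool((reach > 0).all(axis=1).any())
```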

When is the composition of a finite number of graphs in either G or G* strongly rooted?

By a rooted graph is meant any graph G ∈ G which has at least one vertex v {called a root} for which, for each vertex i ∈ V, there is a directed path from v to i. The set of rooted graphs in G* is closed under composition. Every composition of n² or more rooted graphs in G* is strongly rooted.

By a neighbor of a subset S of the vertex set V in a given graph G is meant any vertex i in V for which there is an arc from i to at least one vertex in S. By a Sarymsakov graph is meant any graph G in G with the property that, for each pair of non-empty subsets S_1 and S_2 of V which have no neighbors in common, S_1 ∪ S_2 is a strictly proper subset of the union of S_1 ∪ S_2 with the set of neighbors of S_1 ∪ S_2. The set of Sarymsakov graphs is closed under composition. For m sufficiently large, any composition of m Sarymsakov graphs is strongly rooted. Every Sarymsakov graph in G is rooted. Every rooted graph in G* is a Sarymsakov graph.

By a neighbor-shared graph is meant any graph G ∈ G for which each distinct pair of vertices in the graph shares a common neighbor. The set of neighbor-shared graphs in G* is closed under composition. Every composition of n - 1 or more neighbor-shared graphs in G* is strongly rooted. Every neighbor-shared graph in G* is rooted. The graph of a scrambling stochastic matrix is neighbor-shared.
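
Brute-force sketches of the two graph properties just defined, following the definitions on this slide and practical only for small vertex sets; the helper names are ours.

```python
import numpy as np
from itertools import combinations

def neighbors_of_set(adj, S):
    """Vertices i having an arc from i to at least one vertex in S."""
    return {i for i in range(adj.shape[0]) if any(adj[i, j] for j in S)}

def is_neighbor_shared(adj):
    """Every distinct pair of vertices has a common neighbor."""
    n = adj.shape[0]
    return all(bool(neighbors_of_set(adj, {i}) & neighbors_of_set(adj, {j}))
               for i, j in combinations(range(n), 2))

def is_sarymsakov(adj):
    """For every pair of non-empty vertex sets S1, S2 with no common neighbor,
    S1 ∪ S2 must be strictly contained in S1 ∪ S2 together with its neighbors."""
    n = adj.shape[0]
    subsets = [set(c) for r in range(1, n + 1) for c in combinations(range(n), r)]
    for S1 in subsets:
        for S2 in subsets:
            if neighbors_of_set(adj, S1) & neighbors_of_set(adj, S2):
                continue                                    # they share a neighbor: no condition
            union = S1 | S2
            if not (neighbors_of_set(adj, union) - union):  # no neighbor outside S1 ∪ S2
                return False
    return True
```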

Features of the original Vicsek model considered in the above paper (Jadbabaie, Lin & Morse, IEEE TAC, June 2003):
- No leader
- All r_i = r
- Synchronous operation
- No delays in sensing of headings
- Interesting property: neighbor graphs change with time

Main technical tools exploited:
- Simple {undirected} graphs to describe neighbor relationships
- Algebraic graph theory
- A theorem of J. Wolfowitz (Proc. AMS, 1963) which gives conditions for an infinite product of stochastic matrices to converge to a rank-one matrix

Subsequent research by B. Francis, L. Moreau, V. Blondel, J. Tsitsiklis, D. Angeli, D. Spielman, M. Cao, B. Anderson, G. Tanner, G. Pappas, R. Beard, and others has relaxed the original assumptions (no leader, all r_i = r, synchronous operation, no delays in sensing of headings) to address:
- asynchronous operation
- r_i ≠ r
- sensing delays
- convergence rates
- directed graphs

Additional technical tools exploited include:
- a special partial Lyapunov function tailored for Markov chains – Doob {1953}, Seneta {1981}, Tsitsiklis {1984}, Bertsekas & Tsitsiklis {1989}
- scrambling matrices, Sarymsakov matrices, and random walks