Linear Programming 2011: Back to Cone


Back to Cone

Motivation: From the proof of Affine Minkowski, we can see that if we know the generators of a polyhedral cone, they can be used to describe the polyhedron in Rⁿ. Which generators are important for generating a polyhedral cone?

Def: Given a cone K, for a ∈ K \ {0}, the half line { ya : y ≥ 0 } is called a ray of cone K. We speak of a ≠ 0 as a ray of K, but think of the set { ya : y ≥ 0 }.

Def: A ray of cone K is called an extreme ray if it cannot be written as a proper (weights > 0) conical combination of two other distinct rays of K, i.e., a ∈ K \ {0} is an extreme ray when

a = y₁a₁ + y₂a₂, y₁, y₂ > 0, a₁, a₂ ∈ K \ {0}  ⇒  either ∃ z₁ > 0 s.t. a₁ = z₁a or ∃ z₂ > 0 s.t. a₂ = z₂a.

[figure: an example cone with several rays marked O or X]

Note that the cone in the figure is generated by its extreme rays. Can we then say that all cones are generated by extreme rays?

Ex) Consider the cone K = { (x₁, x₂) : x₂ ≥ 0 }. Consider the vector (2, 0): (2, 0) = 3(1, 0) + 1(-1, 0), where (2, 0) is a positive scalar multiple of (1, 0), and it is impossible to express (2, 0) as a proper conical combination without using a vector having the same direction as (2, 0); hence (2, 0) (and (1, 0)) is an extreme ray.

[figure: the half-plane K with the vectors (1, 0), (2, 0), (-1, 0) on the x₁-axis]

Note:
1. Not both of (1, 0) and (-1, 0) need to be positive multiples of (2, 0).
2. The extreme rays of K are (-1, 0) and (1, 0), but this K is not generated by these extreme rays.

Def: The lineality space of cone K is K ∩ (-K), where -K = { -a : a ∈ K }, i.e., K ∩ (-K) = { a : a ∈ K, -a ∈ K }. It is a subspace. Observe that for K = { x : Ax ≤ 0 }, the lineality space is { x : Ax = 0 }. The lineality of K is the dimension of K ∩ (-K). Cone K is said to be pointed provided K ∩ (-K) = {0}.

[figures: a cone K containing a line, with -K and the lineality space K ∩ (-K) shown; and a pointed cone, for which K ∩ (-K) = {0}]

Prop: For pointed cones, a ∈ K \ {0} is an extreme ray when

a = y₁a₁ + y₂a₂, y₁, y₂ > 0, a₁, a₂ ∈ K \ {0}  ⇒  both ∃ z₁ > 0 s.t. a₁ = z₁a and ∃ z₂ > 0 s.t. a₂ = z₂a.

Pf) Suppose a = y₁a₁ + y₂a₂ with y₁, y₂ > 0, a₁, a₂ ∈ K \ {0}, and ∃ z₁ > 0 with a₁ = z₁a. Then a = y₁z₁a + y₂a₂, i.e., a(1 - y₁z₁) = y₂a₂.
(1) 1 - y₁z₁ = 0 ⇒ either y₂ = 0 or a₂ = 0, contradiction.
(2) 1 - y₁z₁ < 0 ⇒ -a = { y₂ / (y₁z₁ - 1) } a₂ ⇒ -a ∈ K, contradicting pointedness.
From (1), (2): 1 - y₁z₁ > 0, so a₂ = { (1 - y₁z₁) / y₂ } a = z₂a with z₂ > 0. ∎

Thm: Suppose K is a pointed, nontrivial (≠ {0}, ≠ ∅) polyhedral cone. Then K has finitely many extreme rays, say Ā = { ā₁, …, āₘ }, and K is generated by Ā.

Pf) Minkowski's theorem guarantees that K is finitely generated, say by A = { a₁, …, aₗ }. Suppose this set of generators is minimal, in that A \ {aₖ} does not generate K for any k. We will show that A and Ā are the same set up to positive multiplication. When l = 1, K is a half line and the conclusion is clear. So suppose l > 1.

We first show Ā ⊆ A (up to positive multiples). Pick any extreme ray āₖ ∈ Ā. Since āₖ ∈ K and A generates K, ∃ y₁, …, yₗ ≥ 0 s.t. āₖ = y₁a₁ + … + yₗaₗ, and some yⱼ > 0 (since āₖ ∈ Ā ⇒ āₖ ≠ 0); hence āₖ = yⱼaⱼ + (Σ_{i≠j} yᵢaᵢ). Now, if Σ_{i≠j} yᵢaᵢ = 0, then āₖ = yⱼaⱼ; and if Σ_{i≠j} yᵢaᵢ ≠ 0, then āₖ = yⱼaⱼ + a with a ∈ K \ {0}. So, since āₖ is an extreme ray, aⱼ = zāₖ for some z > 0. Hence a positive multiple of āₖ appears in A. Thus Ā ⊆ A, and so Ā is a finite set.

(continued) Now we need to show A ⊆ Ā. Suppose not: ∃ aₖ ∈ A \ Ā (i.e., aₖ is a generator but not an extreme ray); we derive a contradiction. Since aₖ ∉ Ā, aₖ is not an extreme ray, so ∃ b₁, b₂ ∈ K \ {0} s.t. aₖ = z₁b₁ + z₂b₂, z₁, z₂ > 0, with neither b₁ nor b₂ a positive multiple of aₖ. Since b₁, b₂ ∈ K, ∃ sᵢ, tᵢ ≥ 0 s.t. b₁ = Σ_{i=1..l} sᵢaᵢ and b₂ = Σ_{i=1..l} tᵢaᵢ. Hence aₖ = Σ_{i=1..l} (z₁sᵢ + z₂tᵢ)aᵢ, so (1 - z₁sₖ - z₂tₖ)aₖ = Σ_{i≠k} (z₁sᵢ + z₂tᵢ)aᵢ.

Three cases:
(1) (1 - z₁sₖ - z₂tₖ) > 0 ⇒ aₖ is a nonnegative (conical) combination of { aᵢ : i ≠ k } ⇒ A \ {aₖ} generates K ⇒ A not minimal, contradiction.
(2) (1 - z₁sₖ - z₂tₖ) < 0 ⇒ -aₖ = Σ_{i≠k} { (z₁sᵢ + z₂tᵢ) / (z₁sₖ + z₂tₖ - 1) } aᵢ ⇒ -aₖ ∈ K ⇒ aₖ lies in the lineality space of K ⇒ K not pointed, contradiction.

(continued) Recall (1 - z₁sₖ - z₂tₖ)aₖ = Σ_{i≠k} (z₁sᵢ + z₂tᵢ)aᵢ.

(3) (1 - z₁sₖ - z₂tₖ) = 0. Observe that ∃ j ≠ k s.t. z₁sⱼ + z₂tⱼ > 0. Why? Otherwise sᵢ = tᵢ = 0 for all i ≠ k, so b₁ = sₖaₖ and b₂ = tₖaₖ, i.e., b₁, b₂ are positive multiples of aₖ, contradiction. If such j is unique, then 0 = (1 - z₁sₖ - z₂tₖ)aₖ = (z₁sⱼ + z₂tⱼ)aⱼ; but z₁sⱼ + z₂tⱼ > 0 and aⱼ ≠ 0, so (z₁sⱼ + z₂tⱼ)aⱼ ≠ 0, contradiction. Otherwise (j not unique),
(1 - z₁sₖ - z₂tₖ)aₖ - (z₁sⱼ + z₂tⱼ)aⱼ = Σ_{i≠j,k} (z₁sᵢ + z₂tᵢ)aᵢ ⇒ -aⱼ = Σ_{i≠j,k} { (z₁sᵢ + z₂tᵢ) / (z₁sⱼ + z₂tⱼ) } aᵢ.
Since z₁sᵢ + z₂tᵢ ≥ 0 and z₁sⱼ + z₂tⱼ > 0, we get -aⱼ ∈ K ⇒ aⱼ lies in the lineality space of K ⇒ K not pointed, contradiction. ∎

The proof shows that the set of extreme rays is the unique (up to positive scalar multiplication) minimal set of generators of a nontrivial, pointed, polyhedral cone K, i.e., we have shown: for a nontrivial polyhedral cone K, pointedness of K ⇒ {extreme rays} = {minimal set of generators}.

How about the converse? That is, for a nontrivial polyhedral cone K, does K not pointed ⇒ K not generated by its extreme rays? For example, consider a line through the origin, which is not a pointed cone. It has exactly 2 extreme rays, and these do generate all of K. Hence, the converse of the theorem is false.

But consider K nontrivial, polyhedral, and not a line. Then does K not pointed ⇒ K not generated by its extreme rays? (Yes.)

Pf) Since K is not pointed, ∃ a ≠ 0 in the lineality space of K, i.e., a, -a ∈ K. Consider any x ∈ K \ {0} not in { ya : y ∈ R }. Then x is not a scalar multiple of a, so neither x + a nor x - a is zero or a positive multiple of x. But x + a, x - a ∈ K and x = ½(x + a) + ½(x - a), so this x is not an extreme ray of K. Therefore a and -a are the only candidates for extreme rays of K, yet a and -a cannot generate K (since K is not a line). ∎

Prop (Cone decomposition): Let K be a convex cone with lineality space S. Then K = S + (K ∩ S°), and K ∩ S° is pointed. (Here S° is the orthogonal complement of S.)

Pf) x ∈ K ⇒ ∃ x' ∈ S, x'' ∈ S° s.t. x = x' + x'' (from HW). But x' ∈ S ⇒ -x' ∈ K ⇒ x + (-x') = x'' ∈ K. Hence x' ∈ S and x'' ∈ K ∩ S°. Conversely, if x* ∈ S (⊆ K) and x** ∈ K ∩ S°, then x* + x** ∈ K since both x*, x** ∈ K. Together, K = S + (K ∩ S°).
To see that K ∩ S° is pointed, suppose a ∈ (K ∩ S°) and -a ∈ (K ∩ S°). Then a, -a ∈ K ⇒ a ∈ S, hence a ∈ S ∩ S° ⇒ a = 0. Hence K ∩ S° is pointed. ∎
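As an illustration of the split x = x' + x'' used in the proof, here is a minimal numerical sketch (Python/NumPy; the helper name and the example data are not from the slides, they are only illustrative):

```python
import numpy as np

def decompose(x, S_rows):
    """Split x as x = x' + x'' with x' in S and x'' in S° (orthogonal complement).

    S_rows: matrix whose rows form a basis of the lineality space S."""
    B = np.atleast_2d(S_rows).astype(float)
    # Orthogonal projector onto S (rows of B assumed linearly independent).
    P = B.T @ np.linalg.solve(B @ B.T, B)
    x = np.asarray(x, dtype=float)
    x_S = P @ x                  # component in S
    return x_S, x - x_S          # (x', x'')

# Hypothetical data: S spanned by (1, -2, 1), as in the worked example later on.
x_S, x_perp = decompose([1.0, 0.0, 0.0], [[1.0, -2.0, 1.0]])
print(x_S, x_perp, x_perp @ np.array([1.0, -2.0, 1.0]))   # last value ~ 0
```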

Note that K = S + (K ∩ S°) is not the only decomposition of this kind.

[figure: a cone K in R² with S, S°, K ∩ S°, and another ray a for which K = S + { ya : y ≥ 0 } also holds]

Finding Generators of Cone K

Suppose we are given the polyhedral cone K = { x : Ax ≤ 0 } and want to find generators of K. The lineality space of K is S = { x : Ax = 0 }.
First find a basis (the rows of a matrix) B for S, using Gauss-Jordan elimination. (Recall that if the rows of A generate a subspace, then S = A° = { x : Ax = 0 }. Suppose the columns of A are permuted so that AP = [ B : N ], where B is m × m and nonsingular. By elementary row operations obtain EAP = [ Iₘ : EN ], E = B⁻¹. Then the columns of the matrix D, obtained by stacking -EN on top of I_{n-m} and permuting the rows back by P, constitute a basis for S.) Here the basis matrix for S is B = D' (its rows span S); this B is distinct from the nonsingular block above.
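A small sketch of this null-space computation (Python/NumPy). It uses an SVD rather than hand Gauss-Jordan, but it produces the same subspace; the function name and tolerance are illustrative, not from the slides:

```python
import numpy as np

def lineality_basis(A, tol=1e-10):
    """Rows of the returned matrix form a basis of S = {x : Ax = 0},
    the lineality space of K = {x : Ax <= 0}."""
    A = np.atleast_2d(A).astype(float)
    _, sing, Vt = np.linalg.svd(A)
    rank = int(np.sum(sing > tol))
    return Vt[rank:]            # orthonormal rows spanning null(A)

A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 2.0]])        # the example cone that follows
print(lineality_basis(A))              # one row, spanning the line through (1, -2, 1)
```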

(continued) Then find the extreme rays of K ∩ S°, say the rows of a matrix C. We then have K = { y'B + z'C : y free, z ≥ 0 }, where the rows of B are a basis for S and the rows of C are the extreme rays of the pointed cone K ∩ S°. Once B is known, S° = { x : Bx = 0 }, so K ∩ S° = { x : Ax ≤ 0, Bx ≤ 0, -Bx ≤ 0 }. (Recall that for S = { y'A : y ∈ Rᵐ } and T = { x ∈ Rⁿ : Ax = 0 }, we have S° = T and T° = S.)
Finding generators of K thus reduces to finding extreme rays of a pointed cone. Observe that { x : Ax = 0, Bx = 0, -Bx = 0 } = {0} because K ∩ S° is pointed; hence the stacked constraint matrix has rank n (full column rank).

Ex) Consider the cone K = { x ∈ R³ : x₁ + x₂ + x₃ ≤ 0, x₂ + 2x₃ ≤ 0 }.
A basis for S (= { x : Ax = 0 }, the null space of A) is (1, -2, 1)'. Then S° = { x : x₁ - 2x₂ + x₃ = 0 } (S° is the row space of A). Hence
K ∩ S° = { x ∈ R³ : x₁ + x₂ + x₃ ≤ 0, x₂ + 2x₃ ≤ 0, x₁ - 2x₂ + x₃ ≤ 0, -x₁ + 2x₂ - x₃ ≤ 0 }.
At this point we do not yet know how to identify the generators (extreme rays) of the pointed cone K ∩ S°.

Geometric view

[figure: hyperplanes H₁, …, H₄ through 0 and vectors a₁, …, a₄]

Characterizing Extreme Rays

Thm: Suppose K = { x : Ax ≤ 0 }, where A is m × n with rank n. Let x ∈ K and reorder the rows of A as A = [ U ; V ], where Ux = 0 and Vx < 0. Then x generates an extreme ray of K ⇔ U has rank n-1.

Pf) Prove the contrapositive in both directions.
(⇒) Suppose rank(U) ≠ n-1. If rank(U) = n, then x = 0 (since Ux = 0), so x is not an extreme ray. If rank(U) < n-1, then the matrix obtained by appending the row x' to U still has rank < n, so ∃ a ≠ 0 s.t. Ua = 0 and x'a = 0 (the latter implies a is not a multiple of x). Consider x + εa and x - εa for ε > 0 small. Then U(x ± εa) = 0; also Vx < 0, so V(x ± εa) < 0 for ε sufficiently small, hence x ± εa ∈ K \ {0}.

(continued) But x = ½(x + εa) + ½(x - εa), and neither ½(x + εa) nor ½(x - εa) is a positive multiple of x. Hence x does not generate an extreme ray of K.

(⇐) Suppose x does not generate an extreme ray of K. If x = 0, then U = A (by the definition of U) and rank(A) = n, so rank(U) = n ≠ n-1. So suppose x ≠ 0 and x not extreme in K; then x = y₁a₁ + y₂a₂ with y₁, y₂ > 0, a₁, a₂ ∈ K \ {0}, and neither a₁ nor a₂ a positive multiple of x. Observe that a₁ and a₂ are linearly independent, else contradiction. (Why? Suppose a₁ = αa₂, so x = (y₁α + y₂)a₂. If y₁α + y₂ > 0, then a₂ is a positive multiple of x, contradicting the above; if y₁α + y₂ = 0, then x = 0, contradiction; if y₁α + y₂ < 0, then -x ∈ K, so K is not pointed, contradicting rank(A) = n.)

(continued) Also 0 = Ux = y₁(Ua₁) + y₂(Ua₂). Since y₁, y₂ > 0 and Ua₁ ≤ 0, Ua₂ ≤ 0, this forces Ua₁ = Ua₂ = 0. As a₁ and a₂ are linearly independent, rank(U) ≤ n-2, i.e., rank(U) ≠ n-1. ∎
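The rank condition in the theorem gives a direct computational test. A sketch under the theorem's assumptions (K = { x : Ax ≤ 0 } with rank(A) = n); the function name and tolerance are illustrative:

```python
import numpy as np

def generates_extreme_ray(A, x, tol=1e-9):
    """True iff nonzero x in K = {x : Ax <= 0} generates an extreme ray of K.

    Per the theorem: let U be the rows of A that are tight at x (Ux = 0);
    then x generates an extreme ray iff rank(U) = n - 1.  Assumes rank(A) = n."""
    A = np.atleast_2d(A).astype(float)
    x = np.asarray(x, dtype=float)
    n = A.shape[1]
    Ax = A @ x
    if np.allclose(x, 0) or np.any(Ax > tol):
        return False                              # x = 0, or x not in K
    U = A[np.abs(Ax) <= tol]                      # tight (active) rows
    return np.linalg.matrix_rank(U) == n - 1
```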

So to find the extreme rays of a pointed cone K = { x : Ax ≤ 0 }, list all submatrices of A that have rank n-1. Suppose those submatrices are Bᵢ, i ∈ I, with |I| < +∞. Now rank(Bᵢ) = n-1 ⇒ dim{ x : Bᵢx = 0 } = 1, so the solution set is a line, i.e., of the form Lᵢ = { ybᵢ : y ∈ R } for some bᵢ ∈ Rⁿ (find bᵢ from Bᵢ by Gauss-Jordan elimination). Now 3 things can happen (observe that Lᵢ ⊄ K because K is pointed). Write Lᵢ = Lᵢ⁺ ∪ Lᵢ⁻, where Lᵢ⁺ = { ybᵢ : y ≥ 0 } and Lᵢ⁻ = { y(-bᵢ) : y ≥ 0 }:
(1) Lᵢ⁺ ⊆ K: then bᵢ generates an extreme ray of K.
(2) Lᵢ⁻ ⊆ K: then -bᵢ generates an extreme ray of K.
(3) neither (1) nor (2): then neither bᵢ nor -bᵢ generates an extreme ray of K.

Ex) [figure: hyperplanes H₁, …, H₄ through 0 and vectors a₁, …, a₄] H₁ and H₂ generate a₂, and H₁ and H₄ generate a₁. But H₁ and H₃ intersect outside of K, hence they fail to generate an extreme ray of K.

In summary, given K = { x : Ax ≤ 0 }:
- K = S + (K ∩ S°), where S = { x : Ax = 0 }.
- Find a basis matrix B of the lineality space S using Gauss-Jordan; then S° = { x : Bx = 0 } (S° is the subspace generated by the rows of A).
- Let K' = K ∩ S°. Then K' = { x : Ax ≤ 0, Bx ≤ 0, -Bx ≤ 0 } = { x : Ax ≤ 0, Bx = 0 }.
- To find the extreme rays of K', choose n-1 independent rows of the above constraints to obtain a matrix Bᵢ, then find a basis vector (and its negative) of the null space of Bᵢ, i.e. of { x : Bᵢx = 0 }, using Gauss-Jordan.
- Plug the + and - basis vectors into K' to check whether the vector lies in K'; if it does, it generates an extreme ray.
A small computational sketch of this recipe follows below.
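The sketch below (Python/NumPy) follows the recipe for the pointed part by brute-force enumeration of row subsets, so it is only suitable for small instances; the function name is illustrative. It returns extreme-ray generators of K' = { x : A_ineq x ≤ 0, B_eq x = 0 }:

```python
import numpy as np
from itertools import combinations

def extreme_rays_pointed(A_ineq, B_eq=None, tol=1e-9):
    """Extreme rays of K' = {x : A_ineq x <= 0, B_eq x = 0}, assumed pointed.

    For every choice of n-1 rows of the stacked system with rank n-1, take
    the line {x : Bi x = 0} and keep +b or -b if it satisfies all constraints."""
    M = [np.atleast_2d(A_ineq).astype(float)]
    if B_eq is not None:
        B = np.atleast_2d(B_eq).astype(float)
        M += [B, -B]
    M = np.vstack(M)                    # every constraint written as "<= 0"
    m, n = M.shape
    rays = []
    for idx in combinations(range(m), n - 1):
        Bi = M[list(idx)]
        if np.linalg.matrix_rank(Bi) != n - 1:
            continue                    # null space is not a line
        b = np.linalg.svd(Bi)[2][-1]    # unit vector spanning {x : Bi x = 0}
        for cand in (b, -b):
            if np.all(M @ cand <= tol) and not any(
                    np.allclose(cand, r) for r in rays):
                rays.append(cand)       # keep one representative per direction
    return rays
```

The full cone is then recovered as K = { y'B + z'C : y free, z ≥ 0 }, with the rows of C the returned rays.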

(Ex continued) Consider the cone K = { x ∈ R³ : x₁ + x₂ + x₃ ≤ 0, x₂ + 2x₃ ≤ 0 }. A basis for S (= { x : Ax = 0 }, the null space of A) is (1, -2, 1)', and S° = { x : x₁ - 2x₂ + x₃ = 0 } (the row space of A). Then
K ∩ S° = { x ∈ R³ : x₁ + x₂ + x₃ ≤ 0, x₂ + 2x₃ ≤ 0, x₁ - 2x₂ + x₃ = 0 }.
Consider the combinations of the 1st-2nd (unnecessary), 1st-3rd, and 2nd-3rd constraints.
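Using the extreme_rays_pointed sketch from above on this example (the directions stated in the comment follow from the row combinations just listed and are given only up to positive scaling):

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 2.0]])     # K = {x : Ax <= 0}
B = np.array([[1.0, -2.0, 1.0]])    # basis of S, so S° = {x : Bx = 0}

rays = extreme_rays_pointed(A, B)   # extreme rays of K ∩ S°
print(np.round(rays, 3))
# The two directions should be proportional to (1, 0, -1) and (-5, -2, 1),
# giving K = { y(1,-2,1) + z1(1,0,-1) + z2(-5,-2,1) : y in R, z1, z2 >= 0 }.
```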


Conversely, how can we find the constrained (inequality) form of a cone when generators of the cone are known? Use K = K⁺⁺, which holds iff K is a finitely generated nonempty cone. Recall that for K = { y'A : y ≥ 0 } and L = { x : Ax ≤ 0 }, we have K⁺ = L and L⁺ = K. Hence, given K = { y'A : y ≥ 0 }, we first construct K⁺, which is given by K⁺ = L = { x : Ax ≤ 0 }. We then find generators of K⁺ using the previous results. Since K⁺ is now described by generators, we take the polar cone again to get K⁺⁺ (= K), which is now described as a constrained system. (Note that in K = S + (K ∩ S°), S is described by linear combinations of a basis. By taking ± the basis vectors as generators of S, we can describe K by conical combinations of the ± basis vectors of S and the generators of the pointed cone K ∩ S°.)

Example) [figure: K = K⁺⁺ generated by (1, 2) and (2, 1); its polar K⁺ generated by (-2, 1) and (1, -2)]

(Ex continued) Cone K is generated by the two vectors (1, 2) and (2, 1). Hence its polar is K⁺ = { x : x₁ + 2x₂ ≤ 0, 2x₁ + x₂ ≤ 0 }. The lineality space of K⁺ is {0}, so K⁺ is a pointed cone and its extreme rays are its generators. To find the generators of K⁺, we set n-1 = 1 of its constraints to equality and find the one-dimensional line satisfying the equality. From x₁ + 2x₂ = 0 we get the two vectors (2, -1) and (-2, 1); of these, (-2, 1) is in the cone, hence it is an extreme ray of K⁺. Similarly, we get the extreme ray (1, -2) from 2x₁ + x₂ = 0. These two vectors are all the extreme rays of K⁺. Hence its polar cone is described as K⁺⁺ = K = { x : -2x₁ + x₂ ≤ 0, x₁ - 2x₂ ≤ 0 }.
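The double-polar construction for this example can be checked with the same extreme_rays_pointed sketch (K⁺ is pointed here, so no equality block is needed; names as in the earlier sketches):

```python
import numpy as np

G = np.array([[1.0, 2.0],
              [2.0, 1.0]])                   # generators of K, as rows

rays_K_plus = extreme_rays_pointed(G)        # K⁺ = {x : Gx <= 0}
print(np.round(rays_K_plus, 3))              # directions proportional to (-2, 1) and (1, -2)

# K⁺⁺ = K is then { x : r'x <= 0 for each extreme ray r of K⁺ },
# i.e. -2x1 + x2 <= 0 and x1 - 2x2 <= 0, matching the text.
```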

Back to Projection

Consider the projection of P = { (x, y) ∈ R^{n+p} : Ax + Gy ≤ b } onto the x-space: Pr_x(P) = { x ∈ Rⁿ : (x, y) ∈ P for some y ∈ Rᵖ }.

Prop: Let C = { p ∈ Rᵐ : p'G = 0, p ≥ 0 } and let E be the set of extreme rays of C. Then Pr_x(P) = { x : (p'A)x ≤ p'b for all p ∈ E }.
Pf) Note that C is a pointed cone, hence its extreme rays are generators.
x ∈ Pr_x(P) ⇔ Gy ≤ b - Ax is feasible for the given x ⇔ there is no p ≥ 0 with p'G = 0 and p'(b - Ax) < 0 certifying infeasibility ⇔ for all p ≥ 0 with p'G = 0 we have p'(b - Ax) ≥ 0 ⇔ (p'A)x ≤ p'b for all p ∈ E. ∎
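A sketch of the projection construction, again reusing the hypothetical extreme_rays_pointed helper; the cone C = { p : p'G = 0, p ≥ 0 } is passed in the "≤ 0 plus equalities" form that the helper expects (function name illustrative, degenerate cases ignored):

```python
import numpy as np

def project_out_y(A, G, b):
    """Inequalities (p'A) x <= p'b describing Pr_x(P), P = {(x, y) : Ax + Gy <= b}.

    Enumerates the extreme rays p of C = {p : p'G = 0, p >= 0} and returns
    the list of pairs (p'A, p'b)."""
    m = G.shape[0]
    # C = {p : -p <= 0, G'p = 0}: inequality block -I, equality block G'.
    rays = extreme_rays_pointed(-np.eye(m), G.T)
    return [(p @ A, float(p @ b)) for p in rays]
```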

Note: Another form of the theorem of the alternatives. Exactly one of the following holds:
(I) there exists x such that Ax ≤ b;
(II) there exists p ≥ 0 such that p'A = 0, p'b < 0.
Pf) Express system (I) as A(x⁺ - x⁻) + Iy = b, x⁺, x⁻, y ≥ 0. By Farkas' lemma, the alternative system is p'A ≥ 0', -p'A ≥ 0', p' ≥ 0', p'b < 0, which is the same as system (II) above. This may also be proved using LP duality (later). ∎
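As a quick computational companion, feasibility of system (I) can be checked with a zero-objective LP; if it is infeasible, (II) must hold by the theorem. A sketch using SciPy (the variables in (I) are free, so the default nonnegative bounds must be overridden; the function name is illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def which_alternative(A, b):
    """Return 'I' if Ax <= b has a solution, else 'II' (so some p >= 0 with
    p'A = 0, p'b < 0 exists).  Exactly one of the two systems holds."""
    n = A.shape[1]
    res = linprog(c=np.zeros(n), A_ub=A, b_ub=b, bounds=[(None, None)] * n)
    return "I" if res.status == 0 else "II"
```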