OR-1 2011 Backgrounds: Convexity


Backgrounds - Convexity
• Def: The line segment joining two points x, y ∈ R^n is the collection of points { λx + (1-λ)y : 0 ≤ λ ≤ 1 }.

OR  Def: is called convex set iff whenever Convex sets Nonconvex set

OR  Def: The convex hull of a set S is the set of all points that are convex combinations of points in S, i.e. conv(S)={x: x =  i = 1 k i x i, k  1, x 1,…, x k  S, 1,..., k  0,  i = 1 k i = 1}  Picture: 1 x + 2 y + 3 z, i  0,  i = 1 3 i = 1 1 x + 2 y + 3 z = ( ){ 1 /( )x + 2 /( )y} + 3 z (assuming  0) x y z

• Proposition: Let C ⊆ R^n be a convex set and, for k ∈ R, define kC = { kx : x ∈ C }. Then kC is a convex set.
Pf) If k = 0, kC = {0}, which is convex. Suppose k ≠ 0. For any u, v ∈ kC, we have u = kx, v = ky for some x, y ∈ C. For 0 ≤ λ ≤ 1, λu + (1-λ)v = k(λx + (1-λ)y) ∈ kC, since λx + (1-λ)y ∈ C by convexity of C.
Hence the property of convexity of a set is preserved under scalar multiplication. Consider other operations that preserve convexity.

Convex function
• Def: A function f: R^n → R is called a convex function if for all x1, x2 and all 0 ≤ λ ≤ 1, f satisfies f(λx1 + (1-λ)x2) ≤ λf(x1) + (1-λ)f(x2).
f is called a strictly convex function if f(λx1 + (1-λ)x2) < λf(x1) + (1-λ)f(x2) for all x1 ≠ x2 and 0 < λ < 1.

• Meaning: the line segment joining (x1, f(x1)) and (x2, f(x2)) lies above or on the graph of f.

• Def: For f: R^n → R, define the epigraph of f as epi(f) = { (x, μ) ∈ R^(n+1) : μ ≥ f(x) }.
• Equivalent definition: f: R^n → R is a convex function if and only if epi(f) is a convex set.
• Def: f is a concave function if -f is a convex function.
• Def: x ∈ C is an extreme point of a convex set C if x cannot be expressed as λy + (1-λ)z, 0 < λ < 1, for distinct y, z ∈ C (x ≠ y, z); equivalently, x does not lie on any line segment that joins two other points in the set C. (Picture: extreme points.)
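The convexity inequality can be probed numerically. This small Python sketch (the sample grid and function choices are mine, not part of the slides) tests f(λx1 + (1-λ)x2) ≤ λf(x1) + (1-λ)f(x2) over sample pairs:

```python
# Numerical check of the convexity inequality on sample points.
def is_convex_on_samples(f, xs, lambdas):
    """Check f(l*x1 + (1-l)*x2) <= l*f(x1) + (1-l)*f(x2) for all sampled pairs."""
    for x1 in xs:
        for x2 in xs:
            for l in lambdas:
                lhs = f(l * x1 + (1 - l) * x2)
                rhs = l * f(x1) + (1 - l) * f(x2)
                if lhs > rhs + 1e-12:   # tolerance guards against rounding noise
                    return False
    return True

xs = [-2.0, -0.5, 0.0, 1.0, 3.0]
lambdas = [0.0, 0.25, 0.5, 0.75, 1.0]
print(is_convex_on_samples(lambda x: x * x, xs, lambdas))   # True: x^2 is convex
print(is_convex_on_samples(lambda x: -x * x, xs, lambdas))  # False: -x^2 is concave
```

Passing such a sampled check does not prove convexity, but a single violated pair disproves it, as for -x^2 above.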

Review - Linear Algebra
• Inner product of two column vectors x, y ∈ R^n: x′y = x1y1 + … + xnyn.
If x′y = 0 and x, y ≠ 0, then x and y are said to be orthogonal; in 3-D, the angle between the two vectors is 90 degrees.
(Vectors are column vectors unless specified otherwise, although our text does not make the distinction explicit.)
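A direct Python sketch of the inner product and the orthogonality test (the example vectors are illustrative, not from the slides):

```python
# Inner product of two column vectors, and an orthogonality check.
def inner(x, y):
    """x'y = sum_i x_i * y_i."""
    assert len(x) == len(y), "vectors must have the same dimension"
    return sum(xi * yi for xi, yi in zip(x, y))

x = [1.0, 2.0, -1.0]
y = [2.0, 0.0, 2.0]
print(inner(x, y))  # 0.0 -> x and y are orthogonal
print(inner(x, x))  # 6.0, the squared length ||x||^2
```

Note that inner(x, x) is the squared norm, which is why the length of a vector is also called its norm.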

OR  Submatrices multiplication

OR  submatrices multiplications which will be frequently used.

OR  Def: is said to be linearly dependent if , not all equal to 0, such that ( i.e., there exists a vector in which can be expressed as a linear combination of the other vectors. )  Def: linearly independent if not linearly dependent. In other words, (i.e., none of the vectors in can be expressed as a linear combination of the remaining vectors.)  Def: Rank of a set of vectors : maximum number of linearly independent vectors in the set.  Def: Basis for a set of vectors : collection of linearly independent vectors from the set such that every vector in the set can be expressed as a linear combination of them. (maximal linearly independent subset, minimal generator of the set)

OR  Thm) r linearly independent vectors form a basis if and only if the set has rank r.  Def: row rank of a matrix : rank of its set of row vectors column rank of a matrix : rank of its set of column vectors  Thm) for a matrix A, row rank = column rank  Def : nonsingular matrix : rank = number of rows = number of columns. Otherwise, called singular  Thm) If A is nonsingular, then unique inverse exists.

Simultaneous Linear Equations
• Thm: Ax = b has at least one solution iff rank(A) = rank([A, b]).
Pf) (⇒) rank([A, b]) ≥ rank(A) always holds. Suppose rank([A, b]) > rank(A). Then b is linearly independent of the column vectors of A, i.e., b cannot be expressed as a linear combination of the columns of A, so Ax = b does not have a solution. By contraposition, if a solution exists, the ranks are equal.
(⇐) If the ranks are equal, there exists a basis among the columns of A which generates b. So Ax = b has a solution.
• Suppose A: m × n and rank(A) = rank([A, b]) = r. Then Ax = b has a unique solution if r = n.
Pf) Let y, z be any two solutions of Ax = b. Then Ay = Az = b, so A(y - z) = 0, i.e., A1(y1 - z1) + … + An(yn - zn) = 0, where Aj denotes the j-th column of A. Since the column vectors of A are linearly independent (rank n), yj - zj = 0 for all j. Hence y = z. (Note that m may be greater than n.)
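The rank condition can be checked numerically. The sketch below is a standard Gaussian-elimination rank computation, not code from the course, and the example system is mine; it compares rank(A) with rank([A, b]):

```python
def rank(M, tol=1e-9):
    """Rank of matrix M (list of rows) via Gaussian elimination with partial pivoting."""
    A = [row[:] for row in M]          # work on a copy
    m, n = len(A), len(A[0])
    r = 0                              # number of pivots found so far
    for col in range(n):
        if r == m:
            break
        pivot = max(range(r, m), key=lambda i: abs(A[i][col]))
        if abs(A[pivot][col]) < tol:   # no usable pivot in this column
            continue
        A[r], A[pivot] = A[pivot], A[r]
        for i in range(r + 1, m):      # eliminate entries below the pivot
            factor = A[i][col] / A[r][col]
            for j in range(col, n):
                A[i][j] -= factor * A[r][j]
        r += 1
    return r

A = [[1.0, 2.0], [2.0, 4.0], [0.0, 1.0]]
b = [3.0, 6.0, 1.0]
Ab = [row + [bi] for row, bi in zip(A, b)]
print(rank(A), rank(Ab))  # 2 2 -> equal ranks, so Ax = b has at least one solution
```

Here rank(A) = rank([A, b]) = 2 = n, so by the second theorem the solution (x = (1, 1)) is also unique.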

OR  Operations that do not change the solution set of the linear equations (Elementary row operations)  Change the position of the equations  Multiply a nonzero scalar k to both sides of an equation  Multiply a scalar k to an equation and add it to another equation Hence X = Y. Solution sets are same.  The operations can be performed only on the coefficient matrix [A, b], for Ax = b.

Solving systems of linear equations (Gauss-Jordan elimination, i.e., successive substitution of variables), which will be used in the simplex method to solve LP problems.

OR  Infinitely many solutions

OR  Elementary row operations are equivalent to premultiplying a nonsingular square matrix to both sides of the equations Ax = b

OR  So if we multiply all elementary row operation matrices, we get the matrix having the information about the elementary row operations we performed

OR  Finding inverse of a nonsingular matrix A. Perform elementary row operations (premultiply elementary row operation matrices) to make [A : I ] to [ I : B ] Let the product of the elementary row operations matrices be C. Then C [ A : I ] = [ CA : C ] = [ I : B] Hence CA = I  C = A -1 and B = A -1.
