CSE 203B: Convex Optimization Week 2 Discussion Session

Xinyuan Wang, 01/16/2019

Contents
- Preparation for HW1: review and exercise
- Properties of the dual cone

Q1: Range and Null Space

- The range of a matrix $A \in \mathbb{R}^{m \times n}$ is $\mathcal{R}(A) = \{Ax \mid x \in \mathbb{R}^n\}$, i.e. the column space $\mathrm{col}(A)$.
- The row space is $\mathrm{row}(A) = \{A^T y \mid y \in \mathbb{R}^m\}$.
- The null space of $A$ is defined as $\mathcal{N}(A) = \{x \mid Ax = 0\}$.
- $\mathcal{N}(A) \perp \mathrm{row}(A)$ and $\dim(\mathcal{N}(A)) + \dim(\mathrm{row}(A)) = n$; the analogous statements hold for the column space and $\mathcal{N}(A^T)$.
- How do we find the range and null space of a matrix? (A quick numerical check is sketched below; the next slides answer this with row reduction.)
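
In practice these subspaces can also be checked numerically. A minimal sketch using NumPy and SciPy on a small made-up matrix (not one from the slides), verifying the dimension relation above:

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical 3x2 matrix of rank 1, so the null space has dimension n - 1 = 1.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
m, n = A.shape

rank = np.linalg.matrix_rank(A)   # dim row(A) = dim col(A)
N = null_space(A)                 # orthonormal basis of N(A), shape (n, n - rank)

print(rank + N.shape[1] == n)     # dim(row(A)) + dim(N(A)) = n
print(np.allclose(A @ N, 0.0))    # every null space basis vector satisfies Ax = 0
```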

Q1: Range and Null Space

Reduced row echelon form (RREF):
- Determines whether $Ax = b$ is solvable and describes all the solutions.
- Obtained by Gaussian elimination on rows.
- Pivot: the leading coefficient (first nonzero element from the left) of each row is scaled to 1.
- Each column containing a leading 1 has zeros in all its other entries.

(Figure: operation at column k)
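
As a rough illustration of this procedure, here is a minimal RREF sketch with partial pivoting in Python (my own illustrative implementation, not code from the course):

```python
import numpy as np

def rref(A, tol=1e-12):
    """Reduce A to reduced row echelon form (RREF) with partial pivoting.

    A minimal sketch of the procedure described above: at each column,
    pick a pivot, scale its row so the pivot becomes 1, then eliminate
    every other entry in that column.
    """
    R = np.array(A, dtype=float)
    m, n = R.shape
    pivot_row = 0
    pivot_cols = []
    for k in range(n):                               # operation at column k
        if pivot_row >= m:
            break
        # partial pivoting: pick the largest remaining entry in column k
        i = pivot_row + int(np.argmax(np.abs(R[pivot_row:, k])))
        if abs(R[i, k]) < tol:
            continue                                 # no pivot in this column
        R[[pivot_row, i]] = R[[i, pivot_row]]        # row exchange
        R[pivot_row] /= R[pivot_row, k]              # scale the pivot to 1
        for j in range(m):                           # zero out the rest of column k
            if j != pivot_row:
                R[j] -= R[j, k] * R[pivot_row]
        pivot_cols.append(k)
        pivot_row += 1
    return R, pivot_cols

# Tiny usage example (hypothetical matrix):
R, pivots = rref([[1, 2, 1],
                  [2, 4, 0]])
print(R)        # [[1, 2, 0], [0, 0, 1]]
print(pivots)   # [0, 2]
```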

Q1: Range and Null Space

An example:
$$A_1 = \begin{bmatrix} 1 & 0 & 2 & 1 \\ 1 & 1 & 5 & 2 \\ 1 & 2 & 8 & 4 \end{bmatrix}$$

At column 1, choose $a_{11}$ as the pivot ($R_2 - R_1$, $R_3 - R_1$):
$$A_1 \to A_2 = \begin{bmatrix} 1 & 0 & 2 & 1 \\ 0 & 1 & 3 & 1 \\ 0 & 2 & 6 & 3 \end{bmatrix}$$

At column 2, choose $a_{32}$ as the pivot ($R_2 \leftrightarrow R_3$, then $R_2 / 2$, then $R_3 - R_2$):
$$A_2 \to \begin{bmatrix} 1 & 0 & 2 & 1 \\ 0 & 2 & 6 & 3 \\ 0 & 1 & 3 & 1 \end{bmatrix}
\to \begin{bmatrix} 1 & 0 & 2 & 1 \\ 0 & 1 & 3 & 3/2 \\ 0 & 1 & 3 & 1 \end{bmatrix}
\to A_3 = \begin{bmatrix} 1 & 0 & 2 & 1 \\ 0 & 1 & 3 & 3/2 \\ 0 & 0 & 0 & -1/2 \end{bmatrix}$$

Q1: Range and Null Space

No pivot at column 3. At column 4, choose $a_{34}$ as the pivot ($R_3 / (-1/2)$, then $R_1 - R_3$, $R_2 - \tfrac{3}{2} R_3$):
$$A_3 \to \begin{bmatrix} 1 & 0 & 2 & 1 \\ 0 & 1 & 3 & 3/2 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\to A_4 = \begin{bmatrix} 1 & 0 & 2 & 0 \\ 0 & 1 & 3 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

Columns 1, 2 and 4 are independent. Since $A_1 x = x_1 \cdot \mathrm{Col}_1 + \dots + x_4 \cdot \mathrm{Col}_4$, the range is spanned by the pivot columns of $A_1$:
$$\mathcal{R}(A_1) = \operatorname{span}\left\{ \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix}, \begin{bmatrix} 1 \\ 2 \\ 4 \end{bmatrix} \right\}$$

Note that the column spaces of $A_1$ and $A_4$ can be different, but their dimensions must be the same.
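
The hand computation above can be double-checked symbolically; here is a small sketch using SymPy (assuming SymPy is available in the environment):

```python
from sympy import Matrix

A1 = Matrix([[1, 0, 2, 1],
             [1, 1, 5, 2],
             [1, 2, 8, 4]])

R, pivots = A1.rref()
print(R)                  # the reduced matrix A4 above
print(pivots)             # (0, 1, 3): pivots in columns 1, 2 and 4
print(A1.columnspace())   # basis of R(A1): columns 1, 2 and 4 of A1
```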

Q1: Range and Null Space

Proposition: given a matrix $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$, if $(A', b') = P(A, b)$ is obtained by any sequence of elementary row operations, then the solutions of $Ax = b$ are the same as those of $A'x = b'$.

The null space of $A$ is defined as $\mathcal{N}(A) = \{x \mid Ax = 0\}$. With
$$A_4 = \begin{bmatrix} 1 & 0 & 2 & 0 \\ 0 & 1 & 3 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix},$$
we have $A_1 x = 0 \iff A_4 x = 0$, so we solve
$$x_1 + 2x_3 = 0, \qquad x_2 + 3x_3 = 0, \qquad x_4 = 0.$$
For example, the nonzero solution $x = (-2, -3, 1, 0)^T \in \mathcal{N}(A_1)$ spans the null space, because $\dim(\mathcal{N}(A_1)) = 1$.

Q1: Range and Null Space

What if there are many variables? The null space $\mathcal{N}(A)$ can be written in terms of the constrained (pivot) variables and the free variables. From
$$A_4 = \begin{bmatrix} 1 & 0 & 2 & 0 \\ 0 & 1 & 3 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad x_1 + 2x_3 = 0, \quad x_2 + 3x_3 = 0, \quad x_4 = 0,$$
the only free variable is $x_3$:
$$x_1 = -2x_3, \quad x_2 = -3x_3, \quad x_3 = x_3, \quad x_4 = 0
\;\Longrightarrow\;
x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix} = x_3 \begin{bmatrix} -2 \\ -3 \\ 1 \\ 0 \end{bmatrix},
\qquad \text{so } \mathcal{N}(A_1) = \operatorname{span}\left\{ \begin{bmatrix} -2 \\ -3 \\ 1 \\ 0 \end{bmatrix} \right\}.$$
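
A quick symbolic check of this null space computation (again a SymPy sketch, assuming it is available):

```python
from sympy import Matrix

A1 = Matrix([[1, 0, 2, 1],
             [1, 1, 5, 2],
             [1, 2, 8, 4]])

basis = A1.nullspace()      # one basis vector, proportional to (-2, -3, 1, 0)^T
print(len(basis))           # 1, so dim N(A1) = 1
print(basis[0].T)
print(A1 * basis[0])        # the zero vector, as expected
```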

Q2: Eigenvalue Decomposition

Let $A \in \mathbb{R}^{n \times n}$. The eigenvalue equation $Ax = \lambda x$ leads to $\det(A - \lambda I) = 0$. If $A$ has $n$ independent eigenvectors, it can be factorized as
$$A = Q \Lambda Q^{-1}, \qquad Q = \begin{bmatrix} q_1 & \cdots & q_n \end{bmatrix}, \qquad \Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n),$$
where the columns of $Q \in \mathbb{R}^{n \times n}$ are the eigenvectors $q_i$ of $A$ and the diagonal elements of $\Lambda$ are the corresponding eigenvalues. If such an $A$ has rank $r$, the eigenvalues can be ordered so that $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_r, 0, \dots, 0)$.
- The eigenvectors are usually normalized.
- If $A$ is symmetric, then $Q$ is orthogonal.
- Defective matrix: a matrix without $n$ independent eigenvectors, hence not diagonalizable (e.g. a nonzero nilpotent matrix).
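
A small numerical illustration with NumPy, using a made-up symmetric matrix (not one from the slides) just to verify the factorization:

```python
import numpy as np

# Hypothetical symmetric 3x3 matrix, used only for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

lam, Q = np.linalg.eig(A)        # columns of Q are normalized eigenvectors
Lambda = np.diag(lam)

print(np.allclose(A, Q @ Lambda @ np.linalg.inv(Q)))   # A = Q Lambda Q^{-1}
print(np.allclose(Q.T @ Q, np.eye(3)))                 # A symmetric => Q orthogonal
```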

Q2: Eigenvalue Decomposition

Can we tell the range and null space of a matrix from its eigenvalue decomposition? From $A q_i = \lambda_i q_i$:
- If $\lambda_i \neq 0$, then $q_i = A(q_i / \lambda_i)$, so $q_i \in \mathcal{R}(A)$.
- If $\lambda_i = 0$, then $A q_i = 0$, so $q_i \in \mathcal{N}(A)$.

Example:
$$A = \begin{bmatrix} 1 & 0 & 2 \\ 1 & 1 & 5 \\ 1 & 2 & 8 \end{bmatrix}
\;\to\;
Q = \begin{bmatrix} 0.1969 & 0.9123 & -0.5345 \\ 0.5154 & 0.3484 & -0.8018 \\ 0.8240 & -0.2154 & 0.2673 \end{bmatrix}, \qquad
\Lambda = \operatorname{diag}(9.4721,\ 0.5279,\ 0.0000).$$
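
The example above can be reproduced numerically (a sketch; NumPy may return the eigenvalues in a different order and with flipped eigenvector signs):

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [1.0, 1.0, 5.0],
              [1.0, 2.0, 8.0]])

lam, Q = np.linalg.eig(A)
lam = np.real_if_close(lam)        # the eigenvalues are real here
print(np.round(lam, 4))            # approximately 9.4721, 0.5279, 0.0 (in some order)

# The eigenvector for the (numerically) zero eigenvalue lies in N(A): A q = 0.
i = np.argmin(np.abs(lam))
print(np.allclose(A @ Q[:, i], 0.0, atol=1e-10))
```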

Q3: Singular Value Decomposition

Let $A \in \mathbb{R}^{m \times n}$ (not necessarily square). It can always be factorized as
$$A = U \Sigma V^*,$$
where $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ are unitary matrices (orthogonal here, since $A$ is real), and $\Sigma \in \mathbb{R}^{m \times n}$ is a diagonal matrix with non-negative real entries, the singular values.
- The columns of $V$ are eigenvectors of $A^* A$: $\ A^* A = V \Sigma^T U^* U \Sigma V^* = V (\Sigma^T \Sigma) V^*$.
- The columns of $U$ are eigenvectors of $A A^*$: $\ A A^* = U \Sigma V^* V \Sigma^T U^* = U (\Sigma \Sigma^T) U^*$.
- The nonzero entries of $\Sigma$ are the square roots of the nonzero eigenvalues of $A^* A$ and $A A^*$.

By analogy with the eigenvalue decomposition, can we figure out the range and null space of a matrix from its SVD?
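
A sketch with NumPy, using the matrix $A_1$ from the Q1 example, connecting the SVD to the range/null-space question:

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0, 1.0],
              [1.0, 1.0, 5.0, 2.0],
              [1.0, 2.0, 8.0, 4.0]])

U, s, Vt = np.linalg.svd(A)            # A = U Sigma V^T (V^* = V^T for real A)

# Squared singular values match the nonzero eigenvalues of A A^* (and A^* A).
print(np.round(s**2, 4))
print(np.round(np.sort(np.linalg.eigvalsh(A @ A.T))[::-1], 4))

# Columns of U with nonzero singular values span R(A); the rows of Vt beyond
# the numerical rank span N(A). Here rank(A) = 3, so the last row of Vt is a
# basis of the null space (proportional to (-2, -3, 1, 0)).
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]                 # shape (n - rank, n)
print(np.allclose(A @ null_basis.T, 0.0, atol=1e-10))
```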

Review: Dual Cone Properties

Definition: let $K$ be a cone. The set $K^* = \{\, y \mid x^T y \ge 0 \ \text{for all } x \in K \,\}$ is called the dual cone of $K$.

A convex cone $K$ is a proper cone if
- $K$ is closed (contains its boundary),
- $K$ is solid (has nonempty interior),
- $K$ is pointed (contains no line).

Let $K^*$ be the dual cone of a convex cone $K$ (see exercise 2.31):
- $K^*$ is indeed a convex cone.
- $K_1 \subseteq K_2$ implies $K_2^* \subseteq K_1^*$.
- $K^*$ is closed.
- The interior of $K^*$ is given by $\operatorname{int} K^* = \{\, y \mid y^T x > 0 \ \text{for all nonzero } x \in \operatorname{cl} K \,\}$.
- If $K$ has nonempty interior, then $K^*$ is pointed.
- $K^{**}$ is the closure of $K$. (Hence if $K$ is closed, $K^{**} = K$.)
- If the closure of $K$ is pointed, then $K^*$ has nonempty interior.
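
As a concrete instance of the definition (a standard example, not taken from these slides), the nonnegative orthant is self-dual:

```latex
% Claim: for K = \mathbb{R}^n_+ we have K^* = \mathbb{R}^n_+.
K = \mathbb{R}^n_+, \qquad
K^* = \{\, y \mid x^T y \ge 0 \ \text{for all } x \ge 0 \,\}.
% If y \ge 0 componentwise, then x^T y = \sum_i x_i y_i \ge 0 for every x \ge 0,
% so y \in K^*.
% If instead y_k < 0 for some k, take x = e_k \in K; then x^T y = y_k < 0,
% so y \notin K^*.
% Hence K^* = \mathbb{R}^n_+: the nonnegative orthant is self-dual. It is also a
% proper cone: closed, solid, and pointed.
```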

Review: Dual Cone Properties

How to prove that $K^*$ is indeed a convex cone? From the definition $K^* = \{\, y \mid x^T y \ge 0 \ \text{for all } x \in K \,\}$: $K^*$ is the intersection of a family of homogeneous halfspaces $\{\, y \mid x^T y \ge 0 \,\}$, each of which includes the origin as a boundary point. So $K^*$ is a closed convex cone.

$K_1 \subseteq K_2$ implies $K_2^* \subseteq K_1^*$: from the definition of the dual cone, if $y \in K_2^*$ then $x^T y \ge 0$ for all $x \in K_2$. In particular $x^T y \ge 0$ for all $x \in K_1 \subseteq K_2$, so $y$ is also in $K_1^*$.

$K^{**}$ is the closure of $K$ (hence if $K$ is closed, $K^{**} = K$): by the definition of $K^*$, a vector $y \neq 0$ is the normal vector of a homogeneous halfspace containing $K$ if and only if $y \in K^*$. The intersection of all homogeneous halfspaces containing a convex cone $K$ is the closure of $K$. Therefore
$$\operatorname{cl} K = \bigcap_{y \in K^*} \{\, x \mid y^T x \ge 0 \,\} = \{\, x \mid y^T x \ge 0 \ \text{for all } y \in K^* \,\} = K^{**}.$$

Try to prove the other properties (a small worked instance of the second one is sketched below).
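
A small worked instance of the second property (my own illustration), with a ray contained in the nonnegative orthant of $\mathbb{R}^2$:

```latex
% K_1 is a ray inside K_2 = \mathbb{R}^2_+, so K_1 \subseteq K_2.
K_1 = \{\, t(1,1) \mid t \ge 0 \,\}, \qquad K_2 = \mathbb{R}^2_+ .
% Their dual cones:
K_1^* = \{\, y \mid t(y_1 + y_2) \ge 0 \ \text{for all } t \ge 0 \,\}
      = \{\, y \mid y_1 + y_2 \ge 0 \,\}, \qquad
K_2^* = \mathbb{R}^2_+ .
% Every y \ge 0 satisfies y_1 + y_2 \ge 0, so K_2^* \subseteq K_1^*,
% as predicted by K_1 \subseteq K_2 \implies K_2^* \subseteq K_1^*.
```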