Orthogonality and Least Squares
THE GRAM-SCHMIDT PROCESS
© 2016 Pearson Education, Inc.
Theorem 11: The Gram-Schmidt Process

Given a basis {x1, …, xp} for a nonzero subspace W of ℝⁿ, define

  v1 = x1
  v2 = x2 − (x2·v1)/(v1·v1) v1
  v3 = x3 − (x3·v1)/(v1·v1) v1 − (x3·v2)/(v2·v2) v2
  ⋮
  vp = xp − (xp·v1)/(v1·v1) v1 − (xp·v2)/(v2·v2) v2 − ⋯ − (xp·v_{p-1})/(v_{p-1}·v_{p-1}) v_{p-1}

Then {v1, …, vp} is an orthogonal basis for W. In addition,

  Span{v1, …, vk} = Span{x1, …, xk}   for 1 ≤ k ≤ p.   (1)
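The recursion above translates directly into code. A minimal NumPy sketch (the function name and the test vectors at the end are illustrative assumptions, not part of the text):

```python
import numpy as np

def gram_schmidt(X):
    """Orthogonalize the columns x1, ..., xp of X as in Theorem 11.

    Returns V whose columns v1, ..., vp form an orthogonal basis
    for Col X; assumes the columns of X are linearly independent.
    """
    X = np.asarray(X, dtype=float)
    V = np.zeros_like(X)
    for k in range(X.shape[1]):
        v = X[:, k].copy()
        for j in range(k):
            vj = V[:, j]
            # subtract the projection of x_k onto the earlier v_j
            v -= (X[:, k] @ vj) / (vj @ vj) * vj
        V[:, k] = v
    return V
```

For instance, applying it to the columns (3, 6, 0) and (1, 2, 2) yields the orthogonal pair (3, 6, 0) and (0, 0, 2).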
Proof  For 1 ≤ k ≤ p, let Wk = Span{x1, …, xk}. Set v1 = x1, so that Span{v1} = Span{x1}. Suppose, for some k < p, we have constructed v1, …, vk so that {v1, …, vk} is an orthogonal basis for Wk. Define

  vk+1 = xk+1 − proj_Wk xk+1   (2)

By the Orthogonal Decomposition Theorem, vk+1 is orthogonal to Wk. Furthermore, vk+1 ≠ 0 because xk+1 is not in Wk = Span{x1, …, xk}. Hence {v1, …, vk+1} is an orthogonal set of nonzero vectors in the (k + 1)-dimensional space Wk+1. By the Basis Theorem in Section 4.5, this set is an orthogonal basis for Wk+1. Hence Wk+1 = Span{v1, …, vk+1}. When k + 1 = p, the process stops.
ORTHONORMAL BASES

Example 3: Example 1 constructed the orthogonal basis

  v1 = (3, 6, 0),   v2 = (0, 0, 2)

An orthonormal basis is

  u1 = (1/‖v1‖) v1 = (1/√5, 2/√5, 0)
  u2 = (1/‖v2‖) v2 = (0, 0, 1)
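The normalization step can be checked numerically. A minimal sketch (v1 = (3, 6, 0) is the Example 1 vector as recovered here, so treat it as an assumption):

```python
import numpy as np

# v1 from Example 1 (assumed value); its norm is sqrt(45) = 3*sqrt(5)
v1 = np.array([3.0, 6.0, 0.0])
u1 = v1 / np.linalg.norm(v1)   # scale to unit length
# u1 equals (1/sqrt(5), 2/sqrt(5), 0)
```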
QR FACTORIZATION OF MATRICES
Theorem 12: The QR Factorization

If A is an m × n matrix with linearly independent columns, then A can be factored as A = QR, where Q is an m × n matrix whose columns form an orthonormal basis for Col A and R is an n × n upper triangular invertible matrix with positive entries on its diagonal.

Proof  The columns of A form a basis {x1, …, xn} for Col A. Construct an orthonormal basis {u1, …, un} for W = Col A with property (1) in Theorem 11. This basis may be constructed by the Gram-Schmidt process or some other means.
Let Q = [u1 u2 ⋯ un]. For k = 1, …, n, xk is in Span{x1, …, xk} = Span{u1, …, uk}. So there are constants r1k, …, rkk such that

  xk = r1k u1 + ⋯ + rkk uk + 0·uk+1 + ⋯ + 0·un

We may assume that rkk ≥ 0. This shows that xk is a linear combination of the columns of Q using as weights the entries in the vector
  rk = (r1k, …, rkk, 0, …, 0)

That is, xk = Qrk for k = 1, …, n. Let R = [r1 r2 ⋯ rn]. Then

  A = [x1 x2 ⋯ xn] = [Qr1 Qr2 ⋯ Qrn] = QR

The fact that R is invertible follows easily from the fact that the columns of A are linearly independent. Since R is clearly upper triangular, its nonnegative diagonal entries must be positive.
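In practice the factorization is usually obtained from a library routine rather than by hand. A minimal NumPy sketch (`np.linalg.qr` may return negative diagonal entries in R; the sign flip below restores the positive-diagonal convention of Theorem 12 — the sample matrix A is an assumption for illustration):

```python
import numpy as np

def qr_positive(A):
    """QR factorization with positive diagonal in R, as in Theorem 12.

    np.linalg.qr is free to negate columns of Q (and matching rows
    of R); flipping paired signs leaves the product Q @ R unchanged.
    """
    Q, R = np.linalg.qr(A)
    signs = np.sign(np.diag(R))
    signs[signs == 0] = 1.0
    return Q * signs, signs[:, None] * R

# sample 4x3 matrix with linearly independent columns
A = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])
Q, R = qr_positive(A)
```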
Example 4: Find a QR factorization of

  A = [ 1 0 0
        1 1 0
        1 1 1
        1 1 1 ]

Solution: The columns of A are the vectors x1, x2, and x3 in Example 2. An orthogonal basis for Col A = Span{x1, x2, x3} was found in that example:

  v1 = (1, 1, 1, 1),   v2 = (−3, 1, 1, 1),   v3 = (0, −2/3, 1/3, 1/3)
To simplify the arithmetic that follows, scale v3 by letting v3′ = 3v3 = (0, −2, 1, 1). Then normalize the three vectors to obtain u1, u2, and u3, and use these vectors as the columns of Q:

  Q = [ 1/2   −3/√12    0
        1/2    1/√12   −2/√6
        1/2    1/√12    1/√6
        1/2    1/√12    1/√6 ]

By construction, the first k columns of Q are an orthonormal basis of Span{x1, …, xk}.
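The normalization step amounts to dividing each basis vector by its length. A minimal sketch (the orthogonal basis entries are the Example 2 values as recovered here, with v3 already scaled by 3, so treat the literals as assumptions):

```python
import numpy as np

# orthogonal basis v1, v2, 3*v3 as columns (assumed Example 2 values)
V = np.column_stack([[1.0, 1.0, 1.0, 1.0],
                     [-3.0, 1.0, 1.0, 1.0],
                     [0.0, -2.0, 1.0, 1.0]])
Q = V / np.linalg.norm(V, axis=0)   # divide each column by its length
```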
From the proof of Theorem 12, A = QR for some R. To find R, observe that QᵀQ = I, because the columns of Q are orthonormal. Hence

  QᵀA = Qᵀ(QR) = IR = R

and

  R = [ 1/2     1/2     1/2     1/2
        −3/√12  1/√12   1/√12   1/√12
        0       −2/√6   1/√6    1/√6 ] A

    = [ 2   3/2     1
        0   3/√12   2/√12
        0   0       2/√6 ]
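The factorization can be verified numerically: with the Q and R of Example 4 (the literal entries below are assumptions recovered from the example), the product QR reproduces A and the columns of Q are orthonormal. A minimal sketch:

```python
import numpy as np

s12, s6 = np.sqrt(12.0), np.sqrt(6.0)
# Q and R from Example 4 (literal entries are assumptions)
Q = np.array([[0.5, -3 / s12,  0.0     ],
              [0.5,  1 / s12, -2 / s6],
              [0.5,  1 / s12,  1 / s6],
              [0.5,  1 / s12,  1 / s6]])
R = np.array([[2.0, 1.5,      1.0     ],
              [0.0, 3 / s12,  2 / s12],
              [0.0, 0.0,      2 / s6 ]])
A = Q @ R   # should reproduce the original matrix of Example 4
```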