Numerical Analysis
EE, NCKU
Tien-Hao Chang (Darby Chang)
In the previous slide
Error estimation in systems of equations
– vector/matrix norms
LU decomposition
– split a matrix into the product of a lower and an upper triangular matrix
– efficient when dealing with many right-hand-side vectors
Direct factorization
– as a system of n²+n equations
– Crout decomposition
– Doolittle decomposition
In this slide
Special matrices
– strictly diagonally dominant matrix
– symmetric positive definite matrix (Cholesky decomposition)
– tridiagonal matrix
Iterative techniques
– Jacobi, Gauss-Seidel and SOR methods
– conjugate gradient method
Nonlinear systems of equations
Exercise 3
3.7 Special Matrices
Special matrices
Linear systems that arise in practice and/or in numerical methods often have coefficient matrices with special properties or structure:
– strictly diagonally dominant matrix
– symmetric positive definite matrix
– tridiagonal matrix
Strictly diagonally dominant
Symmetric positive definite
Symmetric positive definite: theorems for verification
Symmetric positive definite
– relations to eigenvalues
– leading principal sub-matrices
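The two verification routes mentioned here (all eigenvalues positive; all leading principal minors positive) can be checked mechanically. A minimal Python/NumPy sketch, with an illustrative function name and test matrices not taken from the slides:

import numpy as np

def is_symmetric_positive_definite(A, tol=1e-12):
    """Check symmetry, then test positive definiteness two equivalent ways."""
    A = np.asarray(A, dtype=float)
    if not np.allclose(A, A.T, atol=tol):
        return False
    # Test 1: every eigenvalue of the symmetric matrix must be positive.
    eigenvalues_positive = np.all(np.linalg.eigvalsh(A) > tol)
    # Test 2: every leading principal minor det(A[:k, :k]) must be positive.
    minors_positive = all(np.linalg.det(A[:k, :k]) > tol for k in range(1, A.shape[0] + 1))
    return bool(eigenvalues_positive and minors_positive)

print(is_symmetric_positive_definite([[4, 1], [1, 3]]))   # True
print(is_symmetric_positive_definite([[1, 2], [2, 1]]))   # False (has eigenvalue -1)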
Cholesky decomposition
For symmetric positive definite matrices
– greater efficiency can be obtained
– exploit the symmetry of the matrix
Rather than the LU form, we factor the matrix into the form A = LLᵀ
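A minimal sketch of the LLᵀ factorization in Python/NumPy (not the course's reference code; the column-by-column loop ordering is one of several equivalent variants, and the test matrix is illustrative):

import numpy as np

def cholesky(A):
    """Factor a symmetric positive definite matrix A as L @ L.T, L lower triangular."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.zeros_like(A)
    for j in range(n):
        # Diagonal entry: l_jj = sqrt(a_jj - sum over k < j of l_jk^2).
        L[j, j] = np.sqrt(A[j, j] - np.dot(L[j, :j], L[j, :j]))
        for i in range(j + 1, n):
            # Entries below the diagonal: l_ij = (a_ij - sum_k l_ik * l_jk) / l_jj.
            L[i, j] = (A[i, j] - np.dot(L[i, :j], L[j, :j])) / L[j, j]
    return L

A = np.array([[4.0, 2.0], [2.0, 3.0]])
L = cholesky(A)
print(np.allclose(L @ L.T, A))  # True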
Tridiagonal
Only 8n-7 operations
– factor step: 3n-3
– solve step: 5n-4
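For reference, a minimal sketch of the tridiagonal factor-and-solve pass (often called the Thomas algorithm) in Python/NumPy; the function name and the 3×3 test system are illustrative, not from the slides:

import numpy as np

def solve_tridiagonal(lower, diag, upper, b):
    """Thomas algorithm: solve a tridiagonal system in O(n) operations.

    lower[i] multiplies x[i-1] in row i, diag[i] multiplies x[i],
    upper[i] multiplies x[i+1]; lower[0] and upper[-1] are unused.
    """
    n = len(diag)
    d = np.array(diag, dtype=float)
    rhs = np.array(b, dtype=float)
    # Factor / forward-elimination step: eliminate the sub-diagonal.
    for i in range(1, n):
        m = lower[i] / d[i - 1]
        d[i] -= m * upper[i - 1]
        rhs[i] -= m * rhs[i - 1]
    # Back-substitution step.
    x = np.zeros(n)
    x[-1] = rhs[-1] / d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (rhs[i] - upper[i] * x[i + 1]) / d[i]
    return x

# 3x3 example: [2 -1 0; -1 2 -1; 0 -1 2] x = [1, 0, 1]  ->  x = [1, 1, 1]
print(solve_tridiagonal([0, -1, -1], [2, 2, 2], [-1, -1, 0], [1, 0, 1]))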
Any Questions? (3.7 Special Matrices)
Before entering 3.8
So far, we have learnt three methods in Chapter 3
– Gaussian elimination
– LU decomposition
– direct factorization
Question: are they algorithms?
Further question: what is the difference from the algorithms in Chapter 2?
Answer: they report exact solutions rather than approximate solutions
3.8 Iterative Techniques for Linear Systems
Iterative techniques
Analytic techniques are slow: O(n³)
Especially for systems with large but sparse coefficient matrices
As an added bonus, iterative techniques are less sensitive to roundoff error
Iterative techniques: basic idea
Iteration matrix
Immediate questions
– When does T guarantee a unique solution?
– When does T guarantee convergence?
– How quickly does {x^(k)} converge?
– How do we generate T?
Assume that I - T is singular
– then there exists a nonzero vector x such that (T - 1·I)x = 0
– so 1 is an eigenvalue of T
– but ρ(T) < 1, a contradiction
Recall that (in Section 2.3, with proof)
Iteration matrix
For these questions: we know that when ρ(T) < 1, the sequence {x^(k)} generated by x^(k+1) = Tx^(k) + c converges linearly to a unique solution for any initial vector x^(0)
Question: what is missing?
Hint: remember that the problem is to solve Ax = b
Answer: how to generate T; like f(x) and g(x) in fixed-point iteration, different algorithms construct different iteration matrices
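The convergence criterion ρ(T) < 1 is straightforward to check numerically. A tiny Python/NumPy sketch (the example matrix T is illustrative, not from the slides):

import numpy as np

def spectral_radius(T):
    """rho(T): the largest eigenvalue magnitude of the iteration matrix T."""
    return float(np.max(np.abs(np.linalg.eigvals(T))))

# The fixed-point iteration x^(k+1) = T x^(k) + c converges for any x^(0)
# exactly when rho(T) < 1.
T = np.array([[0.0, 0.5], [0.25, 0.0]])
print(spectral_radius(T))  # about 0.354 < 1, so the iteration converges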
Splitting Methods
Splitting methods
Split A = M - N, so that
Ax = b  ⇔  (M - N)x = b  ⇔  Mx = Nx + b  ⇔  x = M⁻¹Nx + M⁻¹b
i.e. the iteration matrix is T = M⁻¹N and c = M⁻¹b
A class of iteration methods (see the sketch below)
– Jacobi method
– Gauss-Seidel method
– SOR method
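As one concrete instance of the splitting idea, here is a minimal Jacobi sketch in Python/NumPy, using the splitting M = D (the diagonal of A); the function name, tolerance, and the 2×2 example are illustrative, not from the slides:

import numpy as np

def jacobi(A, b, x0=None, tol=1e-10, max_iter=500):
    """Jacobi iteration: the splitting A = M - N with M = D, the diagonal of A."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    D = np.diag(np.diag(A))
    N = D - A                      # off-diagonal part of A, with sign flipped
    T = np.linalg.solve(D, N)      # iteration matrix T = D^{-1} N
    c = np.linalg.solve(D, b)      # constant vector c = D^{-1} b
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = T @ x + c
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new
        x = x_new
    return x

A = [[4.0, 1.0], [2.0, 5.0]]       # strictly diagonally dominant, so Jacobi converges
b = [6.0, 12.0]
print(jacobi(A, b))                # approximately [1., 2.]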
Gauss-Seidel method
Gauss-Seidel method: iteration matrix
The SOR method (successive over-relaxation)
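A minimal component-wise sketch of SOR in Python/NumPy; with ω = 1 the update reduces to Gauss-Seidel, so the same routine illustrates both methods. The function name, the relaxation values, and the test system are illustrative, not from the slides:

import numpy as np

def sor(A, b, omega=1.0, x0=None, tol=1e-10, max_iter=500):
    """SOR sweep in component form; omega = 1 reduces to Gauss-Seidel."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(b)
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Gauss-Seidel update: use already-updated components x[:i].
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            gs_value = (b[i] - sigma) / A[i, i]
            # Over-relaxation: blend the old value with the Gauss-Seidel value.
            x[i] = (1 - omega) * x_old[i] + omega * gs_value
        if np.linalg.norm(x - x_old, ord=np.inf) < tol:
            break
    return x

A = [[4.0, 1.0], [2.0, 5.0]]
b = [6.0, 12.0]
print(sor(A, b, omega=1.0))   # Gauss-Seidel: converges to about [1., 2.]
print(sor(A, b, omega=1.1))   # SOR with a modest relaxation factor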
Any Questions? (Iterative Techniques for Linear Systems)
3.9 Conjugate Gradient Method
Conjugate gradient method
Not all iterative methods are based on the splitting concept
The conjugate gradient method is instead based on the minimization of an associated quadratic functional
Conjugate gradient method: the quadratic functional
(figure: parabola, http://fuzzy.cs.uni-magdeburg.de/~borgelt/doc/somd/parabola.gif)
Minimizing the quadratic functional
Choose the search direction d^(m)
– (plays the role of the tangent line in Newton's method)
– the gradient of f at x^(m)
Choose the step size
– (plays the role of the root of the tangent line)
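Putting the pieces together, here is a minimal conjugate gradient sketch in Python/NumPy for symmetric positive definite A. It assumes the quadratic functional in the common form f(x) = ½xᵀAx - xᵀb, whose unique minimizer solves Ax = b (the slides' exact definition may differ by a constant factor); the function name, tolerance, and the 2×2 example are illustrative:

import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Conjugate gradient for symmetric positive definite A."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(b)
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    r = b - A @ x                 # residual = negative gradient of f at x
    d = r.copy()                  # first search direction: steepest descent
    max_iter = n if max_iter is None else max_iter
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)      # exact step size along d
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d            # next direction, A-conjugate to the previous one
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))  # about [0.0909, 0.6364]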
Global optimization problem (figure: http://www.mathworks.com/cmsimages/op_main_wl_3250.jpg)
Any Questions? (Conjugate Gradient Method)
3.10 Nonlinear Systems of Equations
Nonlinear systems of equations
Generalization of root-finding
Generalization of Newton's method
Generalization of Newton's method: the Jacobian matrix
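In the multivariate setting, each Newton step solves a linear system with the Jacobian: J(x^(m)) Δx = -F(x^(m)), then x^(m+1) = x^(m) + Δx. A minimal Python/NumPy sketch; the test system and its Jacobian are hypothetical examples, not from the slides:

import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    """Newton's method for F(x) = 0: solve J(x) dx = -F(x), then x <- x + dx."""
    x = np.array(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx, ord=np.inf) < tol:
            break
        dx = np.linalg.solve(J(x), -Fx)   # one linear solve per iteration
        x = x + dx
    return x

# Hypothetical test system: x^2 + y^2 = 4 and x*y = 1.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] * v[1] - 1.0])
J = lambda v: np.array([[2 * v[0], 2 * v[1]],     # Jacobian of F: partial derivatives
                        [v[1],      v[0]]])
print(newton_system(F, J, x0=[2.0, 0.5]))  # a root near [1.93, 0.52]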
A lot of equations bypassed… (figure: http://www.math.ucdavis.edu/~tuffley/sammy/LinAlgDEs1.jpg)
And this is a friendly textbook :)
Any Questions? (Nonlinear Systems of Equations)
Exercise 3
Due 2010/5/5, 9:00 am. Email to darby@ee.ncku.edu.tw or hand it in in class.
You may pick any one problem among the first three, which means this exercise contains only five problems.
Implement LU decomposition