1 Solving the algebraic equations: A x = B
2 Direct solution
x = A^{-1} B
Applicable only to small problems.
Used for the vertical in the spectral technique, where x is a one-column vector (decoupled equations in the horizontal).
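A minimal NumPy sketch of the direct solution; the 3×3 system is made up for illustration, and np.linalg.solve is used in place of forming A^{-1} explicitly (it gives the same x = A^{-1} B more stably).

```python
import numpy as np

# Illustrative 3x3 system A x = B (the values are made up for the example).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
B = np.array([1.0, 2.0, 3.0])

x = np.linalg.solve(A, B)   # same result as x = A^{-1} B, without forming A^{-1}
print(x, np.allclose(A @ x, B))
```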
3 Gauss elimination
Tridiagonal matrices: large one-dimensional problems.
Extract x_1 from the 1st equation and substitute in the 2nd equation; extract x_2 and substitute in the 3rd equation, and so on, until we arrive at a single equation for x_n.
Solve it and substitute in the (n-1)th equation, solve for x_{n-1} and substitute in the (n-2)th equation, etc.
Pivots: a_11, a_22 - a_21 a_12 / a_11, … must not be too small (might need to rearrange the order of the equations).
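A sketch of Gauss elimination specialized to a tridiagonal matrix (the Thomas algorithm), assuming the pivots stay away from zero so no reordering is needed; the array layout (sub-, main and super-diagonal) is an illustrative choice.

```python
import numpy as np

def solve_tridiagonal(a, b, c, d):
    """Solve a tridiagonal system by Gauss elimination (Thomas algorithm).
    a: sub-diagonal (length n, a[0] unused), b: main diagonal (n),
    c: super-diagonal (n, c[-1] unused), d: right-hand side (n).
    Assumes the pivots never become too small (no reordering is done here)."""
    n = len(b)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward elimination: each step removes x_{i-1} from equation i.
    for i in range(1, n):
        pivot = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / pivot if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / pivot
    # Back-substitution: solve for x_n, then x_{n-1}, and so on.
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Small check against a dense solve.
n = 5
a = np.ones(n); b = np.full(n, -4.0); c = np.ones(n)
d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(solve_tridiagonal(a, b, c, d), np.linalg.solve(A, d)))
```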
4 Iterative methods
Guess a solution x^{(0)}.
Correct it from the value of the residual r^{(n)} = B - A x^{(n)}.
Continue until the correction (or the residual) is small enough.
The method converges if the spectral radius of the iteration matrix is less than 1.
5 General iterative procedure
Precondition the system: instead of inverting A directly, iterate with an operator P that approximates A but is easy to invert.
Add and subtract to obtain the iteration (*): P x^{(n+1)} = B + (P - A) x^{(n)}; if x^{(n)} is the true solution, then x^{(n+1)} = x^{(n)}.
Convergence: the continuous equivalent of (*) is P dx/dτ = B - A x; the general solution of this equation is x(τ) = x_stationary + Σ_k c_k v_k e^{λ_k τ}, where the λ's are the eigenvalues of the matrix -P^{-1} A.
It approaches the stationary solution (A x = B) if Re(λ) < 0 (elliptic problem).
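A sketch of such a preconditioned iteration; the helper name P_solve and the diagonal choice of P in the example are assumptions made for illustration, not part of the slide.

```python
import numpy as np

def preconditioned_iteration(A, B, P_solve, x0, tol=1e-10, max_iter=1000):
    """Iterate P x^{n+1} = B + (P - A) x^n, i.e. x^{n+1} = x^n + P^{-1}(B - A x^n).
    P_solve(r) applies P^{-1} to a vector; P approximates A but is cheap to invert.
    The discrete iteration converges when the spectral radius of I - P^{-1} A
    is below 1 (the continuous analogue: Re(λ) < 0 for the eigenvalues of -P^{-1} A)."""
    x = x0.copy()
    for _ in range(max_iter):
        r = B - A @ x          # residual
        x = x + P_solve(r)     # correction
        if np.linalg.norm(r) < tol:
            break
    return x

# Example with P = diag(A) (a simple, illustrative preconditioner).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.array([1.0, 2.0])
x = preconditioned_iteration(A, B, lambda r: r / np.diag(A), np.zeros(2))
print(x, np.allclose(A @ x, B))
```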
6 Example of iterative procedure
Helmholtz equation (∇² - λ)x = f in finite differences: x_{i+1,j} + x_{i-1,j} + x_{i,j+1} + x_{i,j-1} - (4 + λ) x_{i,j} = f_{i,j} (we have taken Δx = 1 for simplicity).
Take as iteration (*): x_{i,j}^{(n+1)} = [x_{i+1,j}^{(n)} + x_{i-1,j}^{(n)} + x_{i,j+1}^{(n)} + x_{i,j-1}^{(n)} - f_{i,j}] / (4 + λ), i.e. all x from iteration n except x_{i,j} from iteration n+1: this is the Jacobi method.
If we take x_{i-1,j} and x_{i,j-1} from iteration n+1 (already updated when sweeping through the grid), we have the Gauss-Seidel method.
Multiplying the correction in (*) by a factor μ > 1, we have the overrelaxation method.
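A sketch of the Jacobi and Gauss-Seidel sweeps for the Helmholtz problem above, assuming the (∇² - λ)x = f sign convention, Δx = 1, λ ≥ 0, and Dirichlet boundary values held fixed in the outer ring of the arrays.

```python
import numpy as np

def jacobi_step(x, f, lam):
    """One Jacobi sweep for (nabla^2 - lam) x = f with Delta x = 1.
    All neighbours are taken from iteration n; boundary ring is left untouched."""
    x_new = x.copy()
    x_new[1:-1, 1:-1] = (x[2:, 1:-1] + x[:-2, 1:-1] +
                         x[1:-1, 2:] + x[1:-1, :-2] -
                         f[1:-1, 1:-1]) / (4.0 + lam)
    return x_new

def gauss_seidel_step(x, f, lam):
    """One Gauss-Seidel sweep: x[i-1,j] and x[i,j-1] are already the
    iteration-(n+1) values because they were updated earlier in the sweep."""
    ni, nj = x.shape
    for i in range(1, ni - 1):
        for j in range(1, nj - 1):
            x[i, j] = (x[i + 1, j] + x[i - 1, j] +
                       x[i, j + 1] + x[i, j - 1] - f[i, j]) / (4.0 + lam)
    return x
```

Iterating either sweep until the change between iterations (or the residual) is small enough gives the corresponding method.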
7 The overrelaxation method [figure]
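Continuing the previous sketch, a possible overrelaxation sweep: the Gauss-Seidel correction is multiplied by a factor μ > 1 (the value 1.5 is only an illustrative choice; the optimal factor is problem-dependent).

```python
def sor_step(x, f, lam, mu=1.5):
    """One successive-overrelaxation sweep on the same 2-D array as above:
    compute the Gauss-Seidel value, then apply mu times the correction
    (mu = 1 recovers Gauss-Seidel)."""
    ni, nj = x.shape
    for i in range(1, ni - 1):
        for j in range(1, nj - 1):
            gs = (x[i + 1, j] + x[i - 1, j] +
                  x[i, j + 1] + x[i, j - 1] - f[i, j]) / (4.0 + lam)
            x[i, j] += mu * (gs - x[i, j])
    return x
```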
8 Multigrid methods
An iterative scheme is slow if the corrections to the initial guess are long-range, but very fast if they are local.
Multigrid methods first relax on a subset of the grid (long-range corrections then span fewer grid points and are seen as more local) and then refine, relaxing on the original grid (or an intermediate one), and the switching between grids is iterated.
This procedure is much more efficient than straightforward relaxation and can compete with direct methods; it is even more efficient on multiprocessor machines.
Adaptive multigrid methods only refine in the areas where the error is larger than a given threshold.
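A deliberately small two-grid sketch for the 1-D Poisson problem -u'' = f with Dirichlet boundaries, showing the relax / restrict / correct / relax pattern; the grid alignment (2^k + 1 points), injection restriction, linear-interpolation prolongation and sweep counts are illustrative assumptions, not the scheme on the slide.

```python
import numpy as np

def gs_sweep(u, f, h):
    """Gauss-Seidel relaxation for -u'' = f on a uniform 1-D grid of spacing h."""
    for i in range(1, len(u) - 1):
        u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
    return u

def residual(u, f, h):
    """Residual r = f - A u for the second-order finite-difference operator."""
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] + (u[2:] - 2.0 * u[1:-1] + u[:-2]) / (h * h)
    return r

def two_grid(u, f, h, n_pre=3, n_post=3):
    """One two-grid cycle: relax on the fine grid, transfer the residual to a
    grid with half the points (long-range errors look more local there),
    relax the coarse correction, interpolate it back, relax again."""
    for _ in range(n_pre):
        gs_sweep(u, f, h)
    r = residual(u, f, h)
    r_c = r[::2].copy()                      # restriction by injection
    e_c = np.zeros_like(r_c)
    for _ in range(50):                      # "solve" the coarse problem by relaxation
        gs_sweep(e_c, r_c, 2.0 * h)
    fine = np.arange(len(u))
    e = np.interp(fine, fine[::2], e_c)      # prolongation by linear interpolation
    u += e                                   # coarse-grid correction
    for _ in range(n_post):
        gs_sweep(u, f, h)
    return u
```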
9 Multigrid methods (2) [figures: short-range errors; long-range errors and sampling]
10 Decoupling the equations
Assume we have a 3-D problem A x = B.
Simplest case: A is a tensor product, A = A_1 ⊗ A_2 ⊗ A_3.
Introduce auxiliary vectors and solve dimension by dimension: solve along the first index for each (m,n), then along the second for each (i,n), and finally along the third for each (i,j), as sketched below.
Total: 3 × O(I·J·K) operations.
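A NumPy sketch of the tensor-product decoupling, assuming dense A_1, A_2, A_3 and C-ordered storage of the right-hand side; in practice each set of one-dimensional solves would use the tridiagonal Gauss elimination of slide 3 rather than a dense solve.

```python
import numpy as np

def solve_tensor_product(A1, A2, A3, B):
    """Solve (A1 (x) A2 (x) A3) x = b when b is stored as a 3-D array B of
    shape (I, J, K) in C order.  Each sweep is a set of independent
    one-dimensional solves, matching the auxiliary-vector steps on the slide."""
    I, J, K = B.shape
    # Sweep 1: solve along the first dimension for each (m, n).
    Y = np.linalg.solve(A1, B.reshape(I, J * K)).reshape(I, J, K)
    # Sweep 2: solve along the second dimension for each (i, n).
    Z = np.linalg.solve(A2, Y.transpose(1, 0, 2).reshape(J, I * K))
    Z = Z.reshape(J, I, K).transpose(1, 0, 2)
    # Sweep 3: solve along the third dimension for each (i, j).
    X = np.linalg.solve(A3, Z.transpose(2, 0, 1).reshape(K, I * J))
    return X.reshape(K, I, J).transpose(1, 2, 0)

# Check against the full Kronecker-product matrix on a tiny problem.
rng = np.random.default_rng(0)
A1, A2, A3 = (rng.standard_normal((n, n)) + 3.0 * np.eye(n) for n in (3, 4, 5))
B = rng.standard_normal((3, 4, 5))
X = solve_tensor_product(A1, A2, A3, B)
A_full = np.kron(A1, np.kron(A2, A3))
print(np.allclose(A_full @ X.ravel(), B.ravel()))
```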
11 Decoupling the equations (cont)
Use of the eigenvector matrix.
Consider the Poisson equation in 3 dimensions: ∇²φ = R.
Using centred finite differences in the vertical: ∂²φ/∂x² + ∂²φ/∂y² + A_z φ = R, where A_z is a matrix of order K (the number of levels).
12 Decoupling the equations (cont)
Let ψ_k be the eigenvectors of A_z. Calling E the matrix formed by the eigenvectors and Λ the diagonal matrix of eigenvalues, A_z = E Λ E^{-1}.
The discretized equation can then be written as K decoupled equations: ∂²φ̃_k/∂x² + ∂²φ̃_k/∂y² + λ_k φ̃_k = R̃_k, where the φ̃_k = (E^{-1}φ)_k are the projections of φ along the eigenvectors (and similarly for R̃).
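A sketch of this eigenvector decoupling, assuming the vertical matrix (called A_z here) is diagonalizable with real eigenvalues, and that a horizontal solver solve_horizontal(lam, rhs) for (L_h + lam) u = rhs is available; that solver is a hypothetical placeholder, since it depends on the horizontal discretization.

```python
import numpy as np

def solve_decoupled_vertical(A_z, solve_horizontal, R):
    """Decouple the vertical direction of (L_h + A_z) phi = R.
    A_z: K x K vertical matrix (assumed diagonalizable with real eigenvalues);
    R: array with the vertical index first, shape (K, ...);
    solve_horizontal(lam, rhs): user-supplied solver for one vertical mode."""
    lam, E = np.linalg.eig(A_z)                       # A_z = E diag(lam) E^{-1}
    K = A_z.shape[0]
    R_tilde = np.linalg.solve(E, R.reshape(K, -1))    # project R on the eigenvectors
    phi_tilde = np.empty_like(R_tilde)
    for k in range(K):                                # K decoupled horizontal problems
        rhs_k = R_tilde[k].reshape(R.shape[1:])
        phi_tilde[k] = solve_horizontal(lam[k], rhs_k).ravel()
    return (E @ phi_tilde).reshape(R.shape)           # back to physical space
```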
13 Fourier transform method
Consider the 2-dimensional Poisson equation in finite differences: U_{i+1,n} + U_{i-1,n} + U_{i,n+1} + U_{i,n-1} - 4 U_{i,n} = ρ_{i,n},
or, row by row, U_{n+1} + A U_n + U_{n-1} = ρ_n, where U_n is the vector of grid-point values of U in row n.
14 Fourier transform method (cont)
A is a tridiagonal symmetric matrix (1 on the off-diagonals, -4 on the diagonal) whose eigenvalues λ_j, j = 1, 2, …, M, are known analytically and whose eigenvectors are the Fourier basis functions.
The same holds for any other matrix of the same form with a different constant on the diagonal (Helmholtz equation).
15 Fourier transform method (cont2)
Call Û_k the discrete Fourier transform of the vector of grid-point values at row k (and ρ̂_k that of ρ_k).
The original system may then be written as Û_{j,k+1} + λ_j Û_{j,k} + Û_{j,k-1} = ρ̂_{j,k}: a decoupled system of equations for each Fourier component j (k = row number).
The projection to Fourier space and back can be done by FFT.
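A sketch of the Fourier transform method, assuming periodicity in the transformed direction i and homogeneous Dirichlet rows just outside the array in k; each per-mode system is solved densely here for brevity, where a real code would use the tridiagonal elimination of slide 3.

```python
import numpy as np

def fft_poisson(rho):
    """Solve U[i+1,k] + U[i-1,k] + U[i,k+1] + U[i,k-1] - 4 U[i,k] = rho[i,k]
    (Delta x = 1), periodic in i, with U = 0 on the rows bounding the array in k.
    i is the Fourier-transformed direction; k is the row number."""
    M, N = rho.shape
    rho_hat = np.fft.fft(rho, axis=0)                 # forward FFT, row by row
    lam = -4.0 + 2.0 * np.cos(2.0 * np.pi * np.arange(M) / M)  # eigenvalues of A
    U_hat = np.empty_like(rho_hat)
    for j in range(M):                                # one small system per Fourier mode
        T = (np.diag(np.full(N, lam[j])) +
             np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1))
        U_hat[j] = np.linalg.solve(T, rho_hat[j])
    return np.fft.ifft(U_hat, axis=0).real            # back to grid-point space
```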
16 Thank you. Any questions?