Introduction to Scientific Computing II – Relaxation Methods
Scientific Computing in Computer Science, Institut für Informatik
Dr. Miriam Mehl
Gauss-Seidel – Convergence
twice as fast as Jacobi (for our model problem!)
number of iterations: O((1/h)^2)
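The factor of two can be checked numerically. Below is a minimal Python sketch, assuming the 1D Poisson model problem (tridiagonal matrix with 2 on the diagonal and -1 off it, homogeneous Dirichlet boundary); the function names, grid size, and the stopping criterion (maximum norm of the increment) are illustrative choices, not taken from the slides:

```python
import numpy as np

def jacobi(A, b, tol=1e-8, maxit=20000):
    # Jacobi: update all unknowns from the previous iterate (needs a copy)
    D = np.diag(A)
    x = np.zeros_like(b)
    for k in range(1, maxit + 1):
        x_new = (b - A @ x + D * x) / D   # x_i <- (b_i - sum_{j!=i} a_ij x_j) / a_ii
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new, k
        x = x_new
    return x, maxit

def gauss_seidel(A, b, tol=1e-8, maxit=20000):
    # Gauss-Seidel: overwrite in place, so new values are used immediately
    n = len(b)
    x = np.zeros_like(b)
    for k in range(1, maxit + 1):
        diff = 0.0
        for i in range(n):
            xi = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
            diff = max(diff, abs(xi - x[i]))
            x[i] = xi
        if diff < tol:
            return x, k
    return x, maxit

# 1D Poisson model problem: -u'' = 1 with n interior points
n = 50
h = 1.0 / (n + 1)
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = h * h * np.ones(n)

xj, it_j = jacobi(A, b)
xg, it_gs = gauss_seidel(A, b)
print(it_j, it_gs)   # Gauss-Seidel needs roughly half as many iterations
```

Both counts grow like (1/h)^2 as the grid is refined; only the constant differs by the factor of two.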
Jacobi/GS – Costs per Iteration
A is sparse: O(1) nonzero entries per row
O((1/h)^2) operations in 2D
O((1/h)^3) operations in 3D
Jacobi/GS – Costs
number of iterations: O((1/h)^2) (both 2D and 3D)
costs per iteration: O((1/h)^2) in 2D, O((1/h)^3) in 3D
total: O((1/h)^4) in 2D, O((1/h)^5) in 3D
Implementational Aspects
Gauss-Seidel: no memory overhead, but inherently sequential
Jacobi: memory overhead (second copy of the iterate), but fully parallel
Red-Black GS: combines both advantages
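Red-black Gauss-Seidel colours the grid points like a checkerboard so that each colour depends only on the other: every half-sweep becomes a fully parallel (here: vectorized) update, with no extra copy of the iterate. A minimal Python sketch for the 1D model problem -u'' = f with homogeneous Dirichlet values; colouring by odd/even index and all names are illustrative:

```python
import numpy as np

def red_black_gs_sweep(u, f, h):
    # One red-black Gauss-Seidel sweep for -u'' = f, u(0) = u(1) = 0.
    # "Red" points (odd interior indices) depend only on "black" neighbours
    # and vice versa, so each half-sweep is one conflict-free vector update.
    n = len(u) - 1
    for start in (1, 2):                 # 1: red half-sweep, 2: black half-sweep
        idx = np.arange(start, n, 2)
        u[idx] = 0.5 * (u[idx - 1] + u[idx + 1] + h * h * f[idx])
    return u

# usage: -u'' = 1 has the discrete solution u_j = x_j (1 - x_j) / 2
n = 32
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
u = np.zeros(n + 1)
f = np.ones(n + 1)
for _ in range(3000):
    red_black_gs_sweep(u, f, h)
print(np.max(np.abs(u - x * (1 - x) / 2)))
```

In 2D the same idea applies with the checkerboard colouring of the five-point stencil; the in-place update keeps the Gauss-Seidel memory footprint.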
Successive Overrelaxation (SOR)
start from Gauss-Seidel
introduce a relaxation parameter ω:
x_i^(new) = (1 - ω) x_i^(old) + ω x_i^(GS)
Successive Overrelaxation (SOR)
optimal ω? convergence?
for the model problem: ω_opt = 2 / (1 + sin(πh))
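The effect of the relaxation parameter can be seen directly by comparing ω = 1 (plain Gauss-Seidel) with the classical optimum ω_opt = 2/(1 + sin(πh)) for the Poisson model problem. A minimal Python sketch; the 1D test matrix, tolerance, and helper names are illustrative assumptions:

```python
import numpy as np

def sor(A, b, omega, tol=1e-8, maxit=20000):
    # SOR sweep: blend the Gauss-Seidel update with the old value via omega
    n = len(b)
    x = np.zeros_like(b)
    for k in range(1, maxit + 1):
        diff = 0.0
        for i in range(n):
            x_gs = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
            x_new = (1 - omega) * x[i] + omega * x_gs   # omega = 1: plain GS
            diff = max(diff, abs(x_new - x[i]))
            x[i] = x_new
        if diff < tol:
            return x, k
    return x, maxit

# 1D Poisson model problem
n = 50
h = 1.0 / (n + 1)
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = h * h * np.ones(n)

omega_opt = 2.0 / (1.0 + np.sin(np.pi * h))   # classical optimum for this problem
x1, it_gs = sor(A, b, 1.0)
x2, it_sor = sor(A, b, omega_opt)
print(it_gs, it_sor)   # SOR with omega_opt needs far fewer iterations
```

With ω_opt the iteration count drops from O((1/h)^2) to O(1/h), which is exactly the gain claimed on the next slide.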
SOR – Convergence
still gets worse for small h
number of iterations: O(1/h)
SOR – Costs
number of iterations: O(1/h) (both 2D and 3D)
costs per iteration: O((1/h)^2) in 2D, O((1/h)^3) in 3D
total: O((1/h)^3) in 2D, O((1/h)^4) in 3D
Towards Multigrid
optimal costs: O((1/h)^2) in 2D, O((1/h)^3) in 3D (proportional to the number of unknowns)
Gauss-Seidel: reduces different error frequencies at different rates (high frequencies fast, low frequencies slowly)
multigrid idea: high frequencies => fine grids, low frequencies => coarse grids
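The smoothing behaviour behind the multigrid idea can be observed directly: on the homogeneous problem -u'' = 0 the exact solution is zero, so the iterate itself is the error. A few Gauss-Seidel sweeps damp a high-frequency error mode strongly but barely touch a low-frequency one. A minimal Python sketch; the grid size, mode numbers, and sweep count are illustrative choices:

```python
import numpy as np

def gs_sweep(u, f, h):
    # one lexicographic Gauss-Seidel sweep for -u'' = f, u(0) = u(1) = 0
    for i in range(1, len(u) - 1):
        u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])

n = 64
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = np.zeros(n + 1)           # -u'' = 0, so the iterate equals the error

norms = {}
for k in (1, 32):             # low-frequency vs high-frequency Fourier mode
    u = np.sin(k * np.pi * x)     # single sine mode as initial error
    for _ in range(5):
        gs_sweep(u, f, h)
    norms[k] = np.linalg.norm(u, np.inf)
print(norms)                  # k = 32 is damped strongly, k = 1 hardly at all
```

The slowly decaying k = 1 mode, however, looks high-frequency again on a grid with few points, which is exactly why multigrid hands the smooth error components to coarser grids.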