Review on Iterative Methods, Preconditioning, Multigrid and Krylov Methods
Ecole des Mines de Paris, Sophia Antipolis
[email protected]
References used:
Prof. Gilbert Strang, MIT Course Number 18.086, http://ocw.mit.edu/courses/mathematics/18-086-mathematical-methods-for-engineers-ii-spring-2006/
Nab Raj Roshyara, Krylov Subspace Iteration, 2005.
M. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Stand., 49 (1952), pp. 409-436.
W. Arnoldi, The principle of minimized iterations in the solution of the matrix eigenvalue problem, Quart. Appl. Math., 9 (1951), pp. 17-29.
Jonathan Richard Shewchuk, An Introduction to the Conjugate Gradient Method Without the Agonizing Pain, http://www.cs.ucdavis.edu/~bai/ECS231/schewchukcgpaper.pdf
Yousef Saad, Iterative Methods for Sparse Linear Systems.
Henk A. van der Vorst, Krylov Subspace Iteration, Computing in Science and Engineering, 2000.
Common choices of the preconditioner P:
P = diag(A) (Jacobi)
P = lower triangular part of A = D + L (Gauss-Seidel)
P = L_app * U_app (incomplete LU, with approximate triangular factors)
How to decide? The iteration error satisfies e_k = M^k e_0 with M = I - P^{-1}A,
so a good P makes the spectral radius of M small while staying cheap to invert;
the Jacobi choice P = diag(A) is the simplest example.
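A minimal MATLAB sketch of this preconditioned iteration with the Jacobi choice
(the function name and iteration count are illustrative, not from the slides):

% Jacobi iteration sketch: x_{k+1} = x_k + P\(b - A*x_k), with P = diag(A).
% Converges when the spectral radius of M = I - P\A is below 1,
% e.g. for diagonally dominant A.
function x = jacobi_sketch(A, b, x0, niter)
  P = diag(diag(A));          % Jacobi preconditioner
  x = x0;
  for k = 1:niter
    r = b - A*x;              % current residual r_k = b - A*x_k
    x = x + P \ r;            % equivalently, e_{k+1} = M * e_k
  end
end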
Multigrid
Multigrid V-cycle for Ax = b (a sketch follows below):
descend by smoothing and restricting the residual to coarser grids (fine to coarse),
solve on the coarsest grid, then interpolate the correction and smooth again on the
way back up (coarse to fine).
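A minimal recursive V-cycle sketch in MATLAB for a 1-D model problem with
n = 2^k - 1 interior points, weighted-Jacobi smoothing, full-weighting
restriction and a Galerkin coarse operator; all names are illustrative:

% One V-cycle for A*x = b on a 1-D grid (n = 2^k - 1 points assumed).
function x = vcycle_sketch(A, b, x, nsmooth)
  n = length(b);
  if n <= 3
    x = A \ b;                        % coarsest grid: solve directly
    return
  end
  d = diag(A);
  for k = 1:nsmooth                   % pre-smoothing (weighted Jacobi)
    x = x + (2/3) * (b - A*x) ./ d;
  end
  nc = (n - 1) / 2;                   % coarse grid size
  R = sparse(nc, n);                  % full-weighting restriction
  for i = 1:nc
    R(i, 2*i-1:2*i+1) = [1 2 1] / 4;
  end
  T = 2 * R';                         % interpolation (prolongation)
  rc = R * (b - A*x);                 % restrict residual (fine to coarse)
  ec = vcycle_sketch(R*A*T, rc, zeros(nc,1), nsmooth);  % coarse correction
  x = x + T * ec;                     % add correction (coarse to fine)
  for k = 1:nsmooth                   % post-smoothing
    x = x + (2/3) * (b - A*x) ./ d;
  end
end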
GMRES in MATLAB: x = gmres(A,b)
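MATLAB's gmres also accepts a restart length, a tolerance and an outer-iteration
cap (the values below are illustrative):

x = gmres(A, b, 20, 1e-8, 100);   % restarted GMRES(20), tol 1e-8, 100 outer iterations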
Step 1:
We start by inputting the matrix A, the vector b, and the initial guess x0, then
compute the initial residual r0 = b - A*x0; the goal of the algorithm is to
minimize this residual. beta is the 2-norm of this residual and v1 = r0/beta is
the normalized residual vector. This is the vector from which we build the
Krylov subspace.
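In MATLAB, this first step is just (variable names illustrative):

r0   = b - A*x0;        % initial residual
beta = norm(r0, 2);     % beta = ||r0||_2
v1   = r0 / beta;       % first Krylov basis vector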
Step 2:
Initialize an upper Hessenberg matrix Hm. This matrix has a direct relation to A:
if m equals n and the last row of Hm is removed, the square matrix satisfies
Hm = Vm' * A * Vm, a similarity transformation, which preserves the eigenvalues
of A. The transformation matrix Vm is the Krylov basis composed of the vj
vectors.
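The vj vectors and the entries of Hm come from the Arnoldi process; a minimal
MATLAB sketch continuing from Step 1 (m and the variable names are illustrative):

% Arnoldi: builds an orthonormal Krylov basis V and the (m+1)-by-m
% Hessenberg matrix H, so that A*V(:,1:m) = V*H.
m = 10;                           % Krylov subspace dimension
n = length(b);
V = zeros(n, m+1);  H = zeros(m+1, m);
V(:,1) = r0 / norm(r0);           % v1 from Step 1
for j = 1:m
  w = A * V(:,j);                 % expand the subspace
  for i = 1:j                     % modified Gram-Schmidt orthogonalization
    H(i,j) = V(:,i)' * w;
    w = w - H(i,j) * V(:,i);
  end
  H(j+1,j) = norm(w);
  V(:,j+1) = w / H(j+1,j);        % next basis vector v_{j+1}
end
% For m = n: H(1:n,1:n) equals V(:,1:n)' * A * V(:,1:n), similar to A.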