
Conjugate Gradient Squared Method


In the biconjugate gradient method, the residual vector $r^{(i)}$ can be regarded as the product of $r^{(0)}$ and an $i$th-degree polynomial in $A$, i.e.,

 $r^{(i)} = P_i(A)\,r^{(0)}.$
(1)

This same polynomial satisfies

 $\tilde{r}^{(i)} = P_i(A^T)\,\tilde{r}^{(0)},$
(2)

so that

 $\rho_i = (\tilde{r}^{(i)}, r^{(i)})$
(3)
 $= (P_i(A^T)\,\tilde{r}^{(0)}, P_i(A)\,r^{(0)})$
(4)
 $= (\tilde{r}^{(0)}, P_i^2(A)\,r^{(0)}),$
(5)

where the last step uses the adjoint identity $(A^T x, y) = (x, A y)$, applied repeatedly, to move $P_i(A^T)$ across the inner product.

This suggests that if $P_i(A)$ reduces $r^{(0)}$ to a smaller vector $r^{(i)}$, then it might be advantageous to apply this "contraction" operator twice and compute $P_i^2(A)\,r^{(0)}$. The iteration coefficients can still be recovered from these vectors (as shown above), and it turns out to be easy to find the corresponding approximations for $x$. This approach is the conjugate gradient squared (CGS) method (Sonneveld 1989).
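
As a concrete illustration, the following is a minimal sketch of unpreconditioned CGS in Python, following the recurrences given in Barrett et al. (1994); the function name cgs, the NumPy interface, and the stopping rule are illustrative choices, not part of the original entry.

import numpy as np

def cgs(A, b, x0=None, tol=1e-8, max_iter=500):
    """Unpreconditioned conjugate gradient squared (CGS) iteration.

    A sketch following the Templates recurrences; A is a square matrix
    (dense or sparse) supporting the @ operator, b the right-hand side.
    """
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    r = b - A @ x
    r_tilde = r.copy()           # shadow residual; r~ = r^(0) is the customary choice
    p = q = np.zeros(n)
    rho_prev = 1.0
    for i in range(max_iter):
        rho = r_tilde @ r        # rho_i = (r~, r); the method breaks down if this is zero
        if rho == 0.0:
            raise RuntimeError("CGS breakdown: rho = 0")
        if i == 0:
            u = r.copy()
            p = u.copy()
        else:
            beta = rho / rho_prev
            u = r + beta * q
            p = u + beta * (q + beta * p)
        v = A @ p
        alpha = rho / (r_tilde @ v)
        q = u - alpha * v
        uq = u + q               # the "squared" correction direction
        x = x + alpha * uq
        r = r - alpha * (A @ uq) # two products with A per step, none with A^T
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        rho_prev = rho
    return x, i + 1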

Often one observes a speed of convergence for CGS that is about twice as fast as for the biconjugate gradient method, which is in agreement with the observation that the same "contraction" operator is applied twice. However, there is no reason that the contraction operator, even if it really reduces the initial residual $r^{(0)}$, should also reduce the once-reduced vector $r^{(k)} = P_k(A)\,r^{(0)}$. This is evidenced by the often highly irregular convergence behavior of CGS. One should be aware that local corrections to the current solution may be so large that cancellation effects occur. This may lead to a less accurate solution than suggested by the updated residual (van der Vorst 1992). The method tends to diverge if the starting guess is close to the solution.
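
The gap between the recursively updated residual and the true residual $b - Ax$ can be monitored directly. The snippet below runs the illustrative cgs sketch above on a hypothetical random nonsymmetric system (the test matrix and seed are assumptions for demonstration only) and reports the true relative residual, which may be larger than the updated residual suggests.

rng = np.random.default_rng(0)
n = 200
# Nonsymmetric, reasonably well-conditioned test matrix (illustrative choice).
A = np.eye(n) + rng.standard_normal((n, n)) / (2.0 * np.sqrt(n))
b = rng.standard_normal(n)
x, iters = cgs(A, b)
print(iters, np.linalg.norm(b - A @ x) / np.linalg.norm(b))  # true relative residual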

CGS requires about the same number of operations per iteration as the biconjugate gradient method, but does not involve computations with $A^T$. Hence, in circumstances where computation with $A^T$ is impractical, CGS may be attractive.


See also

Biconjugate Gradient Method, Chebyshev Iteration, Conjugate Gradient Method on the Normal Equations, Conjugate Gradient Method, Flexible Generalized Minimal Residual Method, Generalized Minimal Residual Method, Linear System of Equations, Minimal Residual Method, Nonstationary Iterative Method, Preconditioner, Quasi-Minimal Residual Method, Stationary Iterative Method, Symmetric LQ Method, Transpose-Free Quasi-Minimal Residual Method

This entry contributed by Noel Black and Shirley Moore, adapted from Barrett et al. (1994)

References

Barrett, R.; Berry, M.; Chan, T. F.; Demmel, J.; Donato, J.; Dongarra, J.; Eijkhout, V.; Pozo, R.; Romine, C.; and van der Vorst, H. Templates for the Solution of Linear Systems: Building Blocks for Iterative Methods, 2nd ed. Philadelphia, PA: SIAM, 1994. http://www.netlib.org/linalg/html_templates/Templates.html.

Sonneveld, P. "CGS: A Fast Lanczos-Type Solver for Nonsymmetric Linear Systems." SIAM J. Sci. Statist. Comput. 10, 36-52, 1989.

van der Vorst, H. "Bi-CGSTAB: A Fast and Smoothly Converging Variant of Bi-CG for the Solution of Nonsymmetric Linear Systems." SIAM J. Sci. Statist. Comput. 13, 631-644, 1992.

Cite this as:

Black, Noel and Moore, Shirley. "Conjugate Gradient Squared Method." From MathWorld--A Wolfram Web Resource, created by Eric W. Weisstein. https://mathworld.wolfram.com/ConjugateGradientSquaredMethod.html
