Accession Number:
AD0728474
Title:
On the Convergence and Rate of Convergence of the Conjugate Gradient Method.
Descriptive Note:
Technical summary rept.
Corporate Author:
WISCONSIN UNIV MADISON MATHEMATICS RESEARCH CENTER
Personal Author(s):
Report Date:
1971-06-01
Pagination or Media Count:
27
Abstract:
For the problem of minimizing an unconstrained function, the Conjugate Gradient Algorithm is shown to be convergent. If the function is uniformly strictly convex, the ultimate rate of convergence is shown to be n-step superlinear. If the Hessian matrix is Lipschitz continuous, the rate of convergence is shown to be nearly n-step quadratic. A comparison with other known results is given. (Author)
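The abstract does not state which conjugate gradient variant is analyzed; as an illustration only, the following is a minimal sketch of a nonlinear conjugate gradient iteration using the Fletcher-Reeves coefficient and a backtracking line search, applied to a uniformly strictly convex quadratic. The variant, line-search rule, and parameters here are assumptions for illustration, not the report's exact algorithm.
```python
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=200):
    # Illustrative sketch: Fletcher-Reeves nonlinear CG with Armijo backtracking.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # start with the steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                       # safeguard: restart if d is not a descent direction
            d = -g
        # Backtracking (Armijo) line search along d
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize f(x) = 0.5 x'Ax - b'x, a uniformly strictly convex quadratic
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(conjugate_gradient(f, grad, np.zeros(2)))  # converges to the solution of Ax = b
```
For a strictly convex quadratic in n variables with exact line searches, the method terminates in at most n steps; the n-step superlinear and nearly n-step quadratic rates cited in the abstract concern the general nonlinear setting.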
Subject Categories:
- Theoretical Mathematics
- Operations Research