Accession Number:

AD0744260

Title:

On the Convergence and Rate of Convergence of the Conjugate Gradient Method

Descriptive Note:

Corporate Author:

GEORGE WASHINGTON UNIV WASHINGTON D C PROGRAM IN LOGISTICS

Personal Author(s):

Report Date:

1972-05-16

Pagination or Media Count:

36

Abstract:

For the problem of minimizing an unconstrained function, the Conjugate Gradient Method is shown to be convergent. If the function is uniformly strictly convex, the ultimate rate of convergence is shown to be n-step superlinear. If the Hessian matrix is Lipschitz continuous, the rate of convergence is shown to be n-step quadratic. All results are obtained for the reset version of the method and with a relaxed requirement on the solution of the step-size problem. A comparison with other published results is made. (Author)
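The "reset version" analyzed in the report restarts the search direction at steepest descent every n iterations (n = problem dimension). A minimal sketch of such a restarted nonlinear conjugate gradient method is given below; the Fletcher-Reeves beta and the Armijo backtracking line search are illustrative assumptions standing in for the report's relaxed step-size rule, not the report's exact procedure.

```python
import numpy as np

def reset_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Restarted nonlinear conjugate gradient for unconstrained minimization.

    Sketch only: uses the Fletcher-Reeves beta and an Armijo backtracking
    line search as stand-ins for the step-size rule in the report. The
    search direction is reset to steepest descent every n steps, where
    n is the dimension of the problem.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    g = grad(x)
    d = -g                                   # start with steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: shrink t until sufficient decrease holds
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * g.dot(d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        if (k + 1) % n == 0:
            d = -g_new                       # periodic reset to steepest descent
        else:
            beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves coefficient
            d = -g_new + beta * d
            if g_new.dot(d) >= 0:            # safeguard: force a descent direction
                d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic, iterates of this kind approach the unique minimizer, consistent with the convergence results stated in the abstract.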

Subject Categories:

  • Operations Research

Distribution Statement:

APPROVED FOR PUBLIC RELEASE