Accession Number:

AD0774248

Title:

Practical Convergence Conditions for Restarted Conjugate Gradient Methods.

Descriptive Note:

Technical summary rept.

Corporate Author:

WISCONSIN UNIV MADISON MATHEMATICS RESEARCH CENTER

Personal Author(s):

Report Date:

1973-12-01

Pagination or Media Count:

29

Abstract:

Convergence properties of restarted conjugate gradient methods are investigated for the case where the usual requirement that an exact line search be performed at each iteration is relaxed. The objective function is assumed to have continuous second derivatives, and the eigenvalues of the Hessian are assumed to be bounded above and below by positive constants. It is further assumed that the second derivatives satisfy a Lipschitz condition at the minimum. A class of descent methods is described that exhibits n-step quadratic convergence when restarted, even though errors are permitted in the line search. Two conjugate gradient methods are then shown to belong to this class. (Author)

Descriptors:

  • Theorems
  • Optimization
  • Conjugate gradient methods
  • Eigenvalues
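The report itself contains the precise algorithm statements and proofs; as a rough illustration of the technique the abstract describes, the sketch below implements a restarted Fletcher-Reeves conjugate gradient iteration in which the exact line search is replaced by an inexact Armijo backtracking search and the search direction is reset to steepest descent every n steps. The function restarted_cg, its parameters, and the Armijo constants are illustrative assumptions, not taken from the report.

```python
import numpy as np

def restarted_cg(f, grad, x0, restart=None, tol=1e-8, max_iter=1000,
                 armijo_c=1e-4, shrink=0.5):
    """Restarted Fletcher-Reeves conjugate gradient with an inexact
    (Armijo backtracking) line search.  Illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    restart = restart or n            # restart every n iterations by default
    g = grad(x)
    d = -g                            # initial direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        slope = g @ d
        if slope >= 0:                # safeguard: reset if not a descent direction
            d, slope = -g, -(g @ g)
        # Inexact line search: backtrack until the Armijo condition holds.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + armijo_c * t * slope:
            t *= shrink
        x_new = x + t * d
        g_new = grad(x_new)
        if (k + 1) % restart == 0:
            d = -g_new                # restart: discard the old direction
        else:
            beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Example: a strictly convex quadratic, for which the Hessian
    # eigenvalue bounds assumed in the report hold trivially.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    x_star = restarted_cg(lambda x: 0.5 * x @ A @ x, lambda x: A @ x,
                          x0=np.ones(2))
    print(x_star)                     # close to the minimizer at the origin
```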

Subject Categories:

  • Numerical Mathematics
  • Operations Research

Distribution Statement:

APPROVED FOR PUBLIC RELEASE