Accession Number:

ADA158150

Title:

Global Optimization Using Automatic Differentiation and Interval Iteration.

Descriptive Note:

Technical summary rept.

Corporate Author:

WISCONSIN UNIV-MADISON MATHEMATICS RESEARCH CENTER

Personal Author(s):

Report Date:

1985-06-01

Pagination or Media Count:

34

Abstract:

A standard problem in scientific computation is the optimization of a smooth (at least twice differentiable) function, possibly subject to smooth constraints. Here, optimization means finding the maximum or minimum value of the function in some region, or perhaps its stationary points. The algorithm presented in this paper solves this problem for regions which are rectangular in form. The method used is a version of interval iteration which finds one or all of the critical points of the function in the given region (the point or points at which its gradient vector vanishes), or just the critical points at which the function has its maximum or minimum values. The required values of the gradient vector and Hessian matrix of the function are obtained by automatic differentiation, which involves neither symbolic differentiation nor numerical approximation of the derivatives of the function being investigated. Unlike algorithms which sample function values only at a discrete set of points, the ones given here use interval computation to furnish guaranteed bounds for all quantities of interest over the required subregions of the initial region. The results given by these optimization algorithms are thus self-validating. The algorithms presented here have been programmed using standard implementations of automatic differentiation and interval computation. Numerical examples are given to illustrate the effectiveness of this approach to the type of optimization problem considered.
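
The abstract describes two ingredients: derivatives obtained by automatic differentiation and enclosures obtained by interval computation. The Python sketch below is an illustrative reconstruction of how those ingredients can combine in a one-dimensional interval Newton iteration on the gradient; it is not the report's program, and the objective f(x) = exp(x) - 2x, the starting box [0, 2], and all identifiers are assumptions made for the example.

    # Illustrative sketch, not the report's code: forward-mode automatic
    # differentiation over intervals plus an interval Newton iteration on the
    # gradient, to enclose a critical point with guaranteed bounds.  Rounding
    # is ordinary floating point here; a rigorous implementation would use
    # outward (directed) rounding.
    import math

    class Interval:
        """Closed interval [lo, hi]."""
        def __init__(self, lo, hi=None):
            self.lo, self.hi = lo, (lo if hi is None else hi)
        def __add__(self, o):
            o = _iv(o); return Interval(self.lo + o.lo, self.hi + o.hi)
        def __sub__(self, o):
            o = _iv(o); return Interval(self.lo - o.hi, self.hi - o.lo)
        def __mul__(self, o):
            o = _iv(o)
            p = (self.lo*o.lo, self.lo*o.hi, self.hi*o.lo, self.hi*o.hi)
            return Interval(min(p), max(p))
        def __truediv__(self, o):
            o = _iv(o)
            assert not (o.lo <= 0.0 <= o.hi), "divisor interval contains zero"
            return self * Interval(1.0 / o.hi, 1.0 / o.lo)
        __radd__, __rmul__ = __add__, __mul__
        def exp(self):                       # exp is monotone increasing
            return Interval(math.exp(self.lo), math.exp(self.hi))
        def mid(self):   return 0.5 * (self.lo + self.hi)
        def width(self): return self.hi - self.lo
        def intersect(self, o):
            return Interval(max(self.lo, o.lo), min(self.hi, o.hi))

    def _iv(x):
        return x if isinstance(x, Interval) else Interval(float(x))

    class Dual2:
        """Forward-mode AD value carrying f, f', f'' as intervals."""
        def __init__(self, v, d1, d2):
            self.v, self.d1, self.d2 = v, d1, d2
        @staticmethod
        def variable(x):
            return Dual2(_iv(x), Interval(1.0), Interval(0.0))
        def __sub__(self, o):
            o = _d2(o); return Dual2(self.v - o.v, self.d1 - o.d1, self.d2 - o.d2)
        def __mul__(self, o):
            o = _d2(o)
            return Dual2(self.v * o.v,
                         self.d1 * o.v + self.v * o.d1,
                         self.d2 * o.v + Interval(2.0) * self.d1 * o.d1 + self.v * o.d2)
        __rmul__ = __mul__
        def exp(self):                       # (e^u)' = e^u u',  (e^u)'' = e^u (u'' + u'^2)
            e = self.v.exp()
            return Dual2(e, e * self.d1, e * (self.d2 + self.d1 * self.d1))

    def _d2(x):
        return x if isinstance(x, Dual2) else Dual2(_iv(x), Interval(0.0), Interval(0.0))

    def f(x):                                # example objective: f(x) = exp(x) - 2x
        return x.exp() - 2.0 * x

    # Interval Newton iteration on g = f' over the box X = [0, 2]; the unique
    # critical point there is x = ln 2, and f''(x) = exp(x) > 0 on all of X.
    X = Interval(0.0, 2.0)
    for _ in range(20):
        m = X.mid()
        grad_at_mid = f(Dual2.variable(Interval(m))).d1    # f'(m), a thin interval
        hess_on_box = f(Dual2.variable(X)).d2              # enclosure of f'' over X
        X = X.intersect(Interval(m) - grad_at_mid / hess_on_box)
        if X.width() < 1e-12:
            break

    print("critical point enclosed in [%.12f, %.12f]" % (X.lo, X.hi))
    print("ln 2 =", math.log(2.0))

Running this sketch shrinks the enclosure toward ln 2 ≈ 0.6931, with the derivatives supplied by the AD overloads rather than by symbolic or finite-difference differentiation. The self-validating character the abstract claims additionally depends on outward rounding of every interval operation, which this sketch omits for brevity.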

Subject Categories:

  • Theoretical Mathematics

Distribution Statement:

APPROVED FOR PUBLIC RELEASE