Global Optimization Using Automatic Differentiation and Interval Iteration.
Technical summary report.
WISCONSIN UNIV-MADISON MATHEMATICS RESEARCH CENTER
A standard problem in scientific computation is the optimization of a smooth (at least twice differentiable) function, possibly subject to smooth constraints. Here, optimization means finding the maximum or minimum value of the function in some region, or perhaps its stationary points. The algorithm presented in this paper solves this problem for rectangular regions. The method used is a version of interval iteration which finds one or all of the critical points of the function in the given region (the points at which its gradient vector vanishes), or only those critical points at which the function attains its maximum or minimum values. The required values of the gradient vector and Hessian matrix of the function are obtained by automatic differentiation, which involves neither symbolic differentiation nor numerical approximation of the derivatives of the function being investigated. Unlike algorithms which sample function values only at a discrete set of points, the ones given here use interval computation to furnish guaranteed bounds for all quantities of interest over the required subregions of the initial region. The results given by these optimization algorithms are thus self-validating. The algorithms presented here have been programmed using standard implementations of automatic differentiation and interval computation. Numerical examples are given to illustrate the effectiveness of this approach to the type of optimization problem considered.
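The report itself contains no code; as an illustration only, the following is a minimal one-dimensional sketch of the two ingredients the abstract describes: forward-mode automatic differentiation (here via second-order Taylor arithmetic carrying f, f', and f'' together) and an interval Newton iteration on f' that encloses a critical point. All class and function names are hypothetical, and true outward rounding of the interval endpoints is omitted for brevity, so the enclosures here are only approximate rather than rigorously guaranteed.

```python
class Interval:
    """Closed interval [lo, hi] with naive (not outward-rounded) arithmetic."""
    def __init__(self, lo, hi=None):
        self.lo, self.hi = lo, lo if hi is None else hi

    @staticmethod
    def _coerce(x):
        return x if isinstance(x, Interval) else Interval(x)

    def __add__(self, o):
        o = Interval._coerce(o)
        return Interval(self.lo + o.lo, self.hi + o.hi)
    __radd__ = __add__

    def __sub__(self, o):
        o = Interval._coerce(o)
        return Interval(self.lo - o.hi, self.hi - o.lo)

    def __rsub__(self, o):
        return Interval._coerce(o) - self

    def __mul__(self, o):
        o = Interval._coerce(o)
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))
    __rmul__ = __mul__

    def __truediv__(self, o):
        o = Interval._coerce(o)
        assert not (o.lo <= 0.0 <= o.hi), "divisor interval contains zero"
        return self * Interval(1.0 / o.hi, 1.0 / o.lo)

    def mid(self):
        return 0.5 * (self.lo + self.hi)

    def width(self):
        return self.hi - self.lo

    def intersect(self, o):
        lo, hi = max(self.lo, o.lo), min(self.hi, o.hi)
        return Interval(lo, hi) if lo <= hi else None


class Taylor:
    """Second-order forward AD: propagates (f, f', f'') through +, -, *."""
    def __init__(self, val, der=0.0, dd=0.0):
        self.val, self.der, self.dd = val, der, dd

    @staticmethod
    def variable(x):  # x may be a float (point value) or an Interval (box)
        return Taylor(x, 1.0, 0.0)

    @staticmethod
    def _coerce(x):
        return x if isinstance(x, Taylor) else Taylor(x)

    def __add__(self, o):
        o = Taylor._coerce(o)
        return Taylor(self.val + o.val, self.der + o.der, self.dd + o.dd)
    __radd__ = __add__

    def __sub__(self, o):
        o = Taylor._coerce(o)
        return Taylor(self.val - o.val, self.der - o.der, self.dd - o.dd)

    def __mul__(self, o):  # Leibniz rule applied once and twice
        o = Taylor._coerce(o)
        return Taylor(self.val * o.val,
                      self.der * o.val + self.val * o.der,
                      self.dd * o.val + 2.0 * self.der * o.der + self.val * o.dd)
    __rmul__ = __mul__


def interval_newton_critical(f, X, tol=1e-12, max_iter=100):
    """Enclose a critical point of f inside X by interval Newton on f'."""
    for _ in range(max_iter):
        m = X.mid()
        fm = f(Taylor.variable(m))   # f'(m) at the midpoint, via AD
        fX = f(Taylor.variable(X))   # enclosure of f''(X) over the box, via AD
        N = Interval(m) - Interval(fm.der) / fX.dd   # Newton operator N(X)
        X = X.intersect(N)           # any critical point in X also lies in N
        if X is None or X.width() < tol:
            return X
    return X


# f(x) = x^3 - 3x has critical points at x = +/-1 (f'(x) = 3x^2 - 3).
f = lambda x: x * x * x - 3.0 * x
box = interval_newton_critical(f, Interval(0.5, 2.0))
print(box.lo, box.hi)  # a tight enclosure of the critical point x = 1
```

The division by `fX.dd` is only valid when the Hessian enclosure excludes zero; a full implementation along the lines the abstract describes would also bisect boxes where that test fails and filter critical points by interval bounds on the function values.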
- Theoretical Mathematics