FUNCTION MINIMIZATION WITHOUT DERIVATIVES BY A SEQUENCE OF QUADRATIC PROGRAMMING PROBLEMS.

Technical Report | Accession Number: AD0659294

Abstract:

An algorithm is described for minimizing an arbitrary scalar cost function c(x) with respect to an n-vector x. At each stage of the minimization, the cost function is approximated by a quadratic form in the region about the current lowest-cost point. The next trial point is taken as the minimum of this quadratic form within a hypercube in n-space centered at the current lowest-cost point. The procedure has quadratic convergence, but differs from other quadratically convergent minimization algorithms in that (1) minimization is over a sequence of n-dimensional regions rather than over a sequence of one-dimensional straight lines, (2) the local approximation to the cost surface need not be positive definite, (3) each approximation depends only on true cost values and is independent of prior approximations, and (4) after each evaluation of cost at a trial point, the trial point is added to, and a point distant from the current lowest-cost point is deleted from, the set of points to which the next quadratic form will interpolate. The algorithm takes relatively large steps, and is forced by (4) to learn from its failures. Test results are presented for n = 2 using Rosenbrock's parabolic valley as the cost function. (Author)
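The scheme in the abstract can be sketched in code for the n = 2 case the report tests. This is a minimal illustration, not the report's actual implementation: the point-replacement rule, the cube-shrinking factor, the least-squares model fit, and all function names here are assumptions, and the model minimum inside the hypercube is found by a simple grid search rather than an exact quadratic solve.

```python
import numpy as np

def rosenbrock(x):
    """Rosenbrock's parabolic valley, the test function named in the report."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def quad_features(x):
    """Monomial basis of a general quadratic form in two variables."""
    return np.array([1.0, x[0], x[1], x[0] ** 2, x[0] * x[1], x[1] ** 2])

def fit_quadratic(points, costs):
    """Fit a quadratic form to true cost values only (no derivatives,
    no dependence on prior approximations)."""
    A = np.array([quad_features(p) for p in points])
    coef, *_ = np.linalg.lstsq(A, costs, rcond=None)
    return coef

def minimize_in_cube(coef, center, h, grid=21):
    """Grid-search the quadratic model over the hypercube of half-width h
    centered at the current lowest-cost point.  Works even when the model
    is not positive definite, since the search is confined to the cube."""
    best_val, best_x = np.inf, center
    for u in np.linspace(-h, h, grid):
        for v in np.linspace(-h, h, grid):
            x = center + np.array([u, v])
            val = coef @ quad_features(x)
            if val < best_val:
                best_val, best_x = val, x
    return best_x

def minimize_df(f, x0, h=0.5, iters=60, seed=0):
    """Derivative-free minimization loop following the abstract's outline."""
    rng = np.random.default_rng(seed)
    x0 = np.asarray(x0, float)
    # Initial interpolation set: x0 plus random points inside the cube.
    pts = [x0] + [x0 + rng.uniform(-h, h, size=2) for _ in range(7)]
    costs = [f(p) for p in pts]
    for _ in range(iters):
        ib = int(np.argmin(costs))
        coef = fit_quadratic(pts, np.array(costs))
        trial = minimize_in_cube(coef, pts[ib], h)
        ftrial = f(trial)
        if ftrial >= costs[ib]:
            h *= 0.7  # assumed shrink rule when the trial fails to improve
        # Add the trial point; delete the point farthest from the current
        # lowest-cost point (the best point itself is protected).
        pts.append(trial)
        costs.append(ftrial)
        ib = int(np.argmin(costs))
        dists = [np.linalg.norm(p - pts[ib]) for p in pts]
        dists[ib] = -1.0
        j = int(np.argmax(dists))
        pts.pop(j)
        costs.pop(j)
    ib = int(np.argmin(costs))
    return pts[ib], costs[ib]
```

Because failed trial points stay in the interpolation set until they are displaced, the next model is fit to those failures too, which is one way to read the abstract's remark that the algorithm "learns from its failures."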

Distribution:
Approved For Public Release

Collection: TR