Rapidly Convergent Algorithms for Nonsmooth Optimization.
Final scientific report, 15 Jul 1983 - 14 Jun 1988
WASHINGTON STATE UNIV PULLMAN
The research supported by this grant has continued the development of efficient methods for solving optimization problems involving implicitly defined functions that are not everywhere differentiable. Research on a rapidly convergent algorithm for the constrained single-variable case, where generalized derivatives are known, has been completed. Significant progress has been made in extending this work to the n-variable case via the definition of better-than-linear convergence. Safeguarding techniques have been developed that ensure first-order convergence on problems with semismooth functions but do not prevent better-than-linear convergence on piecewise second-order smooth functions. For the constrained case, a scale-free automatic penalty technique has been devised. A new stable method for solving certain quadratic programming problems has been developed, including a technique for resolving degeneracy. JHD
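The abstract does not give the algorithm itself, but the core idea of Newton-type iterations driven by generalized derivatives can be illustrated with a minimal sketch. The following is a hypothetical single-variable example (the function `g`, the iteration bounds, and the tolerance are illustrative assumptions, not the report's method): a Newton step using any element of the generalized derivative still converges rapidly on a piecewise linear function, even though the classical derivative fails to exist at the kink.

```python
def semismooth_newton(g, dg, x0, tol=1e-10, max_iter=50):
    """Newton iteration for g(x) = 0 using an element dg(x) of the
    generalized (Clarke) derivative in place of the classical derivative.
    Illustrative sketch only; not the report's safeguarded algorithm."""
    x = x0
    for _ in range(max_iter):
        gx = g(x)
        if abs(gx) <= tol:
            return x
        x = x - gx / dg(x)  # generalized-derivative Newton step
    return x

# Example: g(x) = x - max(0, 2 - x) is piecewise linear with a kink at
# x = 2 and a unique root at x = 1.
g = lambda x: x - max(0.0, 2.0 - x)
# One valid generalized-derivative selection: slope 2 left of the kink,
# slope 1 to its right.
dg = lambda x: 2.0 if x < 2.0 else 1.0

root = semismooth_newton(g, dg, x0=5.0)
print(root)  # converges to 1.0 in two steps
```

For piecewise linear functions the iteration terminates finitely once it lands on the piece containing the root, which is a simple instance of the better-than-linear behavior the abstract refers to.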
- Operations Research