Rapidly Convergent Algorithms for Nonsmooth Optimization.
Annual scientific report, 15 Jul 84 - 14 Jul 85
Washington State University, Pullman
This research has led to new methods for solving nonlinear optimization problems involving functions that are not everywhere differentiable and/or are implicitly defined, such as those arising from dual formulations of optimization models. An algorithm that is rapidly convergent, in both the theoretical and the practical sense, has been developed for the single-variable case where generalized derivatives are available; it is being extended to the case where only function values are known. Some of the single-variable results, including the concept of better-than-linear convergence, have been extended to the multivariable case. To solve efficiently the particular quadratic programming (QP) subproblems generated by the n-variable method, a specialized QP algorithm has been developed.

Additional keywords: Nondifferentiable programming; FORTRAN. (Author)
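To make the single-variable setting concrete, the following is a minimal sketch of minimizing a convex, not-everywhere-differentiable function of one variable when a generalized derivative (a subgradient selection) is available, as the abstract describes. The function names and the bisection-on-the-subgradient-sign scheme are illustrative assumptions, not the report's algorithm: the report's method is rapidly (better-than-linearly) convergent, whereas plain bisection converges only linearly.

```python
def minimize_1d(subgrad, a, b, tol=1e-10, max_iter=200):
    """Locate a minimizer of a convex function f on [a, b], given a
    selection subgrad(x) from the subdifferential of f at x.
    (Illustrative sketch only; not the report's rapidly convergent method.)"""
    for _ in range(max_iter):
        if b - a < tol:
            break
        m = 0.5 * (a + b)
        g = subgrad(m)
        if g > 0:        # f increasing at m: a minimizer lies to the left
            b = m
        elif g < 0:      # f decreasing at m: a minimizer lies to the right
            a = m
        else:            # 0 is a subgradient: m itself is a minimizer
            return m
    return 0.5 * (a + b)

# Example: f(x) = |x - 1| + (x - 1)**2 is nonsmooth at its minimizer x = 1.
# A valid subgradient selection:
def sg(x):
    return (1.0 if x > 1 else -1.0 if x < 1 else 0.0) + 2.0 * (x - 1)

x_star = minimize_1d(sg, -10.0, 10.0)
```

The scheme needs only the sign of one generalized derivative per iteration, which is why availability of generalized derivatives (rather than ordinary derivatives) is the key hypothesis in the single-variable case.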
- Statistics and Probability