A STOCHASTIC APPROXIMATION SCHEME WITH ACCELERATED CONVERGENCE PROPERTIES.
HARVARD UNIV CAMBRIDGE MASS DIV OF ENGINEERING AND APPLIED PHYSICS
An examination of the first-order steepest descent method for minimizing deterministic functions leads to the specification of sufficient conditions assuring convergence with probability one of a new stochastic approximation algorithm. The algorithm is interpreted as a stochastic descent method using a constant, as opposed to a continually decreasing, step length. The crucial convergence condition is that the variance of the gradient estimate should be zero, and attain its minimum, at the solution point. The rather narrow class of problems to which the new algorithm is applicable can be extended to include many useful problems by employing acceleration methods adapted from the field of Monte Carlo techniques, both to satisfy this condition and to speed convergence. Simple illustrative examples demonstrating the effectiveness of the new algorithm are presented. (Author)
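The idea in the abstract can be illustrated with a minimal sketch (not taken from the report; the quadratic objective, noise model, and function names are illustrative assumptions): a constant-step stochastic descent on f(x) = x^2, where the gradient estimate carries multiplicative noise whose variance vanishes exactly at the minimizer, so a fixed step length still yields convergence.

```python
import random

def noisy_grad(x, sigma=0.5):
    # Illustrative gradient estimate of f(x) = x^2 with multiplicative noise.
    # Its variance, (2*x*sigma)**2, is zero at the minimizer x* = 0 --
    # a toy instance of the abstract's key convergence condition.
    return 2.0 * x * (1.0 + random.gauss(0.0, sigma))

def constant_step_descent(x0, step=0.1, iters=2000):
    # Stochastic descent with a CONSTANT step length, in contrast to the
    # classical Robbins-Monro schedule of continually decreasing steps.
    x = x0
    for _ in range(iters):
        x -= step * noisy_grad(x)
    return x

random.seed(0)
x_final = constant_step_descent(5.0)
```

Because the noise variance shrinks with the gradient itself, the iterates contract toward 0 almost surely despite the fixed step; with noise of constant variance, a fixed step would leave the iterates fluctuating around the solution.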
- Statistics and Probability