Accession Number:

ADP007159

Title:

Note on Learning Rate Schedules for Stochastic Optimization

Personal Author(s):

Corporate Author:

YALE UNIV NEW HAVEN CT DEPT OF COMPUTER SCIENCE

Report Date:

1992-01-01

Abstract:

We present and compare learning rate schedules for stochastic gradient descent, a general algorithm which includes LMS, on-line back-propagation, and k-means clustering as special cases. We introduce search-then-converge type schedules which outperform the classical constant and running-average (1/t) schedules both in speed of convergence and quality of solution.
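As a rough illustration of the schedules described above (a sketch under assumed settings, not code from the report), the following Python snippet runs stochastic gradient descent on an LMS-style least-squares problem under a constant schedule, the running-average (1/t) schedule, and a search-then-converge schedule of the assumed form eta0 / (1 + t/tau); all function names, constants, and the synthetic data are illustrative assumptions.

import numpy as np

def constant(eta0):
    # Classical constant schedule: eta(t) = eta0.
    return lambda t: eta0

def running_average(eta0):
    # Classical "running average" schedule: eta(t) = eta0 / (t + 1).
    return lambda t: eta0 / (t + 1.0)

def search_then_converge(eta0, tau):
    # Assumed search-then-converge form: roughly constant for t << tau
    # ("search" phase), decaying like eta0 * tau / t for t >> tau
    # ("converge" phase).
    return lambda t: eta0 / (1.0 + t / tau)

def sgd_least_squares(X, y, schedule, epochs=10, seed=0):
    # Stochastic gradient descent on 0.5 * (x.w - y)^2, updating on one
    # randomly drawn example per step (the LMS special case from the abstract).
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            err = X[i] @ w - y[i]           # prediction error on one sample
            w -= schedule(t) * err * X[i]   # stochastic gradient step
            t += 1
    return w

# Synthetic comparison of the three schedules (illustrative constants).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=500)

for name, sched in [("constant", constant(0.01)),
                    ("running average 1/t", running_average(0.5)),
                    ("search-then-converge", search_then_converge(0.1, tau=100))]:
    w_hat = sgd_least_squares(X, y, sched)
    print(f"{name:>22}: ||w_hat - w_true|| = {np.linalg.norm(w_hat - w_true):.4f}")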

Supplementary Note:

This article is from 'Computing Science and Statistics: Proceedings of the Symposium on the Interface. Critical Applications of Scientific Computing: Biology, Engineering, Medicine, Speech, Held in Seattle, Washington on 21-24 April 1991,' AD-A252 938, p. 313-317.

Pages:

0005

Identifiers:

Subject Categories:

File Size:

0.00MB

Full text not available.