Accession Number:

ADA202682

Title:

Learning Algorithms for the Multilayer Perceptron

Descriptive Note:

Technical rept.

Corporate Author:

MASSACHUSETTS INST OF TECH LEXINGTON LINCOLN LAB

Personal Author(s):

Report Date:

1988-10-28

Pagination or Media Count:

34

Abstract:

Central to the development of adaptive pattern-processing algorithms (adaptive filters) for random problems - problems where statistics are unknown a priori and/or explicit rules governing behavior cannot be extracted in a reductionist manner - is the pursuit of adaptive architectures for associating arbitrary inputs with outputs. Such associative memories are important for providing the mathematical mapping (transfer function) relating inputs to outputs that arises from implicit relationships found in a given training ensemble. The adaptation of these filters or architectures during training is guided by a learning algorithm, mathematically derived from an objective function to ensure good association properties. The subject of this paper is an investigation of a class of learning algorithms for the highly parallel multilayer perceptron architecture used in an associative memory context. By controlling the scheduling of patterns presented during training, a generalized class of learning algorithms is shown to result. Specific realizations of the generalized algorithm include steepest descent (parameters adapted following presentation of all training patterns), Rumelhart back-propagation (parameters adapted following presentation of each pattern), and a new algorithm which captures in part the benefits of both - less parameter adaptation and faster convergence - by gradually varying the number of patterns presented per parameter adaptation.
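The scheduling idea in the abstract can be sketched in code. The following is an illustrative toy only, not the report's algorithm: it trains a single linear unit (rather than a multilayer perceptron) with squared-error gradient descent, where a `schedule` function sets how many training patterns are presented per parameter update. A schedule returning the full training-set size recovers steepest descent, a schedule returning 1 recovers per-pattern (back-propagation-style) updates, and a gradually growing schedule interpolates between the two. All function and variable names here are hypothetical.

```python
import numpy as np

def train(X, y, schedule, epochs=50, lr=0.1, seed=0):
    """Gradient descent on one linear unit with squared error.

    schedule(epoch) gives the number of patterns presented per
    parameter update: len(X) -> batch steepest descent, 1 ->
    per-pattern updates, a growing value -> the hybrid schedule.
    (Sketch only; the report treats multilayer perceptrons.)
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])  # initial weights
    for epoch in range(epochs):
        k = schedule(epoch)                     # patterns per update
        for start in range(0, len(X), k):
            Xb, yb = X[start:start + k], y[start:start + k]
            grad = Xb.T @ (Xb @ w - yb) / len(Xb)
            w -= lr * grad
    return w

# Toy, noiseless training ensemble with known target weights.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0]])
y = X @ np.array([2.0, -3.0])

w_batch  = train(X, y, schedule=lambda e: len(X))                    # steepest descent
w_online = train(X, y, schedule=lambda e: 1)                         # per-pattern
w_hybrid = train(X, y, schedule=lambda e: min(1 + e // 10, len(X)))  # growing batch
```

On this consistent toy problem all three schedules recover the target weights; the hybrid schedule is meant only to show how varying the patterns-per-update count moves smoothly between the two extremes named in the abstract.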

Subject Categories:

  • Electrical and Electronic Equipment
  • Operations Research

Distribution Statement:

APPROVED FOR PUBLIC RELEASE