A Probabilistic Computational Framework for Neural Network Models

Technical Report | Accession Number: ADA218969

Abstract:

Information retrieval in a connectionist or neural network is viewed as computing the most probable value of the information to be retrieved with respect to a probability density function, P. With a minimal number of assumptions, the energy function that a neural network minimizes during information retrieval is shown to uniquely specify P. Inspection of the form of P indicates the class of probabilistic environments that can be learned. Learning algorithms can be analyzed and designed by using maximum likelihood estimation techniques to estimate the parameters of P. The large class of nonlinear auto-associative networks analyzed by Cohen and Grossberg (1983), nonlinear associative multi-layer back-propagation networks (Rumelhart, Hinton, & Williams, 1986), and certain classes of nonlinear multi-stage networks are analyzed within the proposed computational framework.

Keywords: Artificial intelligence, Connectionism, Non-linear associator.
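
For reference, the standard way an energy function determines a density is through the Gibbs relation; the following is a minimal sketch, assuming that standard form and a generic parameter vector w (the report's own normalization and parameterization may differ):

    P(x; w) = Z(w)^{-1} \exp(-E(x; w)),   where   Z(w) = \int \exp(-E(x; w)) \, dx.

Retrieval then corresponds to finding the x that minimizes E (equivalently, maximizes P), and learning by maximum likelihood corresponds to choosing w to maximize the log-likelihood of the observed patterns x_1, ..., x_N:

    L(w) = \sum_{i=1}^{N} \log P(x_i; w) = -\sum_{i=1}^{N} E(x_i; w) - N \log Z(w).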

Security Markings

Distribution:
Approved For Public Release
Distribution Statement:
Approved For Public Release; Distribution Is Unlimited.

RECORD

Collection: TR