Computation and Generalization in Neural Networks
Final report, 1 Jul 1988 - 30 Jun 1991
BROWN UNIV PROVIDENCE RI INST FOR BRAIN AND NEURAL SYSTEMS
During this contract period, our research on backward propagation led to a number of new theoretical and empirical results. We developed a generalized version of backward propagation in which both gains and synaptic weights are modified by the backward propagation procedure. Synaptic weights are modified in proportion to the negative gradient of the energy with respect to the weight, as in ordinary backward propagation, and gains are modified in proportion to the negative partial derivative of the energy with respect to the gain. Since the resulting error signals for the gains and the synaptic weights are proportional to one another, the computational complexity of the generalized network is comparable to that of the original backward propagation model. ...

Keywords: Back propagation, Gain modification, Multilayer perceptrons.
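The gain-modification idea described above can be illustrated with a minimal sketch. This is not the report's implementation; it is a hypothetical single-layer example in which each unit computes a_i = f(g_i * net_i) for a sigmoid f, and both the weights W and the per-unit gains g descend the gradient of a squared-error energy. Note that the two gradients share the common factor (dE/da_i) * f'(g_i * net_i), which is why the error signals are proportional and the extra cost of training the gains is small.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy single-layer network (illustrative data, not from the report):
# unit activation a_i = sigmoid(g_i * net_i), net_i = sum_j w_ij x_j.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))                # 8 samples, 3 inputs
T = sigmoid(X @ rng.normal(size=(3, 2)))   # synthetic targets in (0, 1)

W = rng.normal(size=(3, 2)) * 0.1          # synaptic weights
g = np.ones(2)                             # per-unit gains
lr = 0.5

losses = []
for _ in range(200):
    net = X @ W
    a = sigmoid(g * net)
    err = a - T                            # dE/da for E = 0.5 * sum(err^2)
    common = err * a * (1.0 - a)           # shared factor dE/da * f'(g*net)
    grad_W = X.T @ (common * g)            # dE/dw_ij = common_i * g_i * x_j
    grad_g = (common * net).sum(axis=0)    # dE/dg_i  = common_i * net_i
    W -= lr * grad_W                       # descend energy gradient (weights)
    g -= lr * grad_g                       # descend energy gradient (gains)
    losses.append(0.5 * np.sum(err ** 2))

print(losses[0], "->", losses[-1])         # loss should decrease
```

Both update rules reuse the same backward-propagated quantity `common`, so training the gains adds only one extra elementwise product and sum per unit.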