Neural Networks for Localized Function Approximation.
Final report, 15 Feb 1993 - 14 Jul 1996
California State University, Los Angeles
We studied the complexity problem for neural networks used in function approximation, i.e., the problem of estimating the number of neurons needed to achieve a given accuracy of approximation for any function that is unknown except for a few a priori assumptions. We developed a unified theory applicable to traditional neural networks, radial basis function networks, and generalized regularization networks. While our main objective was to provide a solid theoretical foundation for the subject, we also developed new training paradigms in which no optimization-based technique, such as back-propagation, is required. The training of our networks is therefore very simple and entirely free of the traditional shortcomings, such as local minima. We tested our ideas by developing neural networks for time-series prediction and for beamforming in phased-array antennas; in both cases, we obtained spectacular improvements over previously known results. Our work has resulted in 14 publications. In addition, the grant facilitated the completion of our book on weighted approximation, as well as the fulfillment of our obligations as invited guest editors for a special issue of Advances in Computational Mathematics on Mathematical Aspects of Neural Networks.
Subject Categories: Statistics and Probability