Accession Number : ADA260121
Title : When Networks Disagree: Ensemble Methods for Hybrid Neural Networks
Descriptive Note : Technical rept.
Corporate Author : BROWN UNIV PROVIDENCE RI INST FOR BRAIN AND NEURAL SYSTEMS
Personal Author(s) : Perrone, Michael P ; Cooper, Leon N
Report Date : 27 Oct 1992
Pagination or Media Count : 18
Abstract : This paper presents a general theoretical framework for ensemble methods of constructing significantly improved regression estimates. Given a population of regression estimators, the authors construct a hybrid estimator that is as good as or better, in the mean square error sense, than any estimator in the population. They argue that the ensemble method presented has several properties: (1) it efficiently uses all the networks of a population -- none of the networks needs to be discarded; (2) it efficiently uses all of the available data for training without over-fitting; (3) it inherently performs regularization by smoothing in functional space, which helps to avoid over-fitting; (4) it utilizes local minima to construct improved estimates, whereas other neural network algorithms are hindered by local minima; (5) it is ideally suited for parallel computation; (6) it leads to a very useful and natural measure of the number of distinct estimators in a population; and (7) the optimal parameters of the ensemble estimator are given in closed form. Experimental results show that the ensemble method dramatically improves neural network performance on difficult real-world optical character recognition tasks.
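The closed-form optimal parameters mentioned in point (7) can be illustrated with a minimal sketch of a generalized ensemble estimator: the combination weights are derived from the inverse of the misfit correlation matrix of the individual estimators, normalized to sum to one. Function and variable names below are illustrative assumptions, not taken from the report.

```python
import numpy as np

def gem_weights(predictions, targets):
    """Closed-form ensemble weights from the misfit correlation matrix.

    predictions: array of shape (n_estimators, n_samples), one row per
                 regression estimator; targets: array of shape (n_samples,).
    """
    misfits = predictions - targets               # m_i(x) = f_i(x) - f(x)
    C = misfits @ misfits.T / misfits.shape[1]    # C_ij = E[m_i(x) m_j(x)]
    Cinv = np.linalg.pinv(C)                      # pseudo-inverse for numerical stability
    # alpha_i = sum_j (C^-1)_ij / sum_kl (C^-1)_kl, so the weights sum to 1
    return Cinv.sum(axis=1) / Cinv.sum()

def gem_predict(predictions, alpha):
    """Weighted average of the individual estimators."""
    return alpha @ predictions

# Synthetic example: three noisy estimators of the same target function.
rng = np.random.default_rng(0)
y = rng.normal(size=200)
preds = np.stack([y + rng.normal(scale=s, size=200) for s in (0.3, 0.5, 0.8)])
alpha = gem_weights(preds, y)
combined = gem_predict(preds, alpha)
mse_each = ((preds - y) ** 2).mean(axis=1)
mse_combined = ((combined - y) ** 2).mean()
```

On the data used to estimate the correlation matrix, the combined estimator's mean square error can be no worse than that of any single estimator, since each single estimator is itself a unit-weight combination.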
Descriptors : *ESTIMATES , *HYBRID SYSTEMS , *NEURAL NETS , *OPTICAL CHARACTER RECOGNITION , *REGRESSION ANALYSIS , *STATISTICAL SAMPLES , ALGORITHMS , COMPUTATIONS , EFFICIENCY , EXPERIMENTAL DATA , FUNCTIONS(MATHEMATICS) , IMAGE PROCESSING , SMOOTHING(MATHEMATICS) , TRAINING , VALIDATION
Subject Categories : Information Science ; Statistics and Probability
Distribution Statement : APPROVED FOR PUBLIC RELEASE