Accession Number:
ADA297408
Title:
The Mathematics of Measuring Capabilities of Artificial Neural Networks.
Descriptive Note:
Doctoral thesis,
Corporate Author:
AIR FORCE INST OF TECH WRIGHT-PATTERSON AFB OH
Personal Author(s):
Report Date:
1995-06-01
Pagination or Media Count:
121
Abstract:
Researchers rely on the mathematics of Vapnik and Chervonenkis to capture quantitatively the capabilities of specific artificial neural network (ANN) architectures. The quantifier is known as the V-C dimension and is defined on a set of functions or a family of sets. Its value is the largest cardinality l such that there exists at least one set of l vectors in R^d for which every dichotomy of that set into two subsets can be implemented by a function (or set) in the collection. Stated another way, the V-C dimension of a set of functions is the largest cardinality of a set such that there exists one set of that cardinality which can be shattered by the set of functions; a set of functions is said to shatter a set if each dichotomy of that set can be implemented by a function in the set. There is an abundance of research on determining the value of V-C dimensions of ANNs. In this document, research on the V-C dimension is refined and extended, yielding formulas for evaluating the V-C dimension of the set of functions representable by a feed-forward, single hidden-layer perceptron artificial neural network. The fundamental thesis of this research is that the V-C dimension is not an appropriate quantifier of ANN capabilities.
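
To make the shattering definition above concrete, the following is a minimal sketch (not taken from the report) that tests whether a small point set in R^d is shattered by single linear threshold units, i.e., perceptrons with no hidden layer, whose V-C dimension is d + 1. The function names and the use of SciPy's linear-programming feasibility check are illustrative assumptions, not the report's method.

from itertools import product

import numpy as np
from scipy.optimize import linprog


def linearly_separable(points, labels):
    """Return True if some hyperplane w.x + b realizes the +/-1 labeling.

    Feasibility LP: find (w, b) with y_i * (w . x_i + b) >= 1 for every i.
    """
    X = np.asarray(points, dtype=float)
    y = np.asarray(labels, dtype=float)
    n, d = X.shape
    # Variables are [w_1, ..., w_d, b]; the objective is irrelevant (pure feasibility).
    c = np.zeros(d + 1)
    # Encode -y_i * (w . x_i + b) <= -1, i.e., y_i * (w . x_i + b) >= 1.
    A_ub = -y[:, None] * np.hstack([X, np.ones((n, 1))])
    b_ub = -np.ones(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (d + 1), method="highs")
    return res.success


def shattered(points):
    """True if every dichotomy (+/-1 labeling) of the points is implementable."""
    n = len(points)
    return all(linearly_separable(points, labels)
               for labels in product([-1.0, 1.0], repeat=n))


if __name__ == "__main__":
    # Three points in general position in R^2 are shattered (consistent with
    # V-C dimension d + 1 = 3 for linear threshold units in the plane) ...
    print(shattered([(0, 0), (1, 0), (0, 1)]))          # expected: True
    # ... but the four XOR-style points cannot be, since the XOR dichotomy
    # is not linearly separable.
    print(shattered([(0, 0), (1, 1), (0, 1), (1, 0)]))  # expected: False

The hidden-layer architectures analyzed in the thesis would require a richer separability test, but the enumeration over all dichotomies is exactly the shattering criterion defined in the abstract.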
Descriptors:
Subject Categories:
- Numerical Mathematics
- Computer Systems