Accession Number : ADA547793


Title :   Automatic Target Recognition: Statistical Feature Selection of Non-Gaussian Distributed Target Classes


Descriptive Note : Master's thesis


Corporate Author : NAVAL POSTGRADUATE SCHOOL MONTEREY CA DEPT OF ELECTRICAL AND COMPUTER ENGINEERING


Personal Author(s) : Wilder, Matthew J


Full Text : https://apps.dtic.mil/dtic/tr/fulltext/u2/a547793.pdf


Report Date : Jun 2011


Pagination or Media Count : 149


Abstract : Target and pattern recognition systems are in widespread use. Efforts have been made in all areas of pattern recognition to increase the performance of these systems. Feature extraction, feature selection, and classification are the major aspects of a target recognition system. This research proposes algorithms for selecting useful statistical features in pattern/target classification problems in which the features are non-Gaussian distributed. In engineering practice, it is common either to skip feature selection altogether or to use a feature selection algorithm that assumes the features are Gaussian distributed. The results can be far from optimal when the features are non-Gaussian distributed, as they often are. This research aims to mitigate that problem by creating algorithms that are useful in practice. This work focuses on the performance of three common feature selection algorithms: the Branch and Bound, Sequential Forward Selection, and Exhaustive Search algorithms. Ordinarily, the performance index used to measure class separation in feature space is derived by assuming the data are Gaussian, which yields tractable indices that can be calculated without estimating the probability density functions of the class data. The advantage of this approach is that it produces feature selection algorithms that have low computational complexity and do not require knowledge of the data densities. The disadvantage is that these algorithms may perform poorly when the data are non-Gaussian. This research examines the use of information-theoretic class separability measures that can deal with the non-Gaussian case. In particular, this work shows that the Hellinger Distance (a type of divergence) has very desirable mathematical properties and can be useful for feature selection when accompanied by a suitable density estimator.
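The approach the abstract describes — scoring features by an information-theoretic class separability measure (the Hellinger distance) computed from estimated densities, inside a Sequential Forward Selection loop — can be sketched as follows. This is an illustrative reconstruction, not the thesis's actual implementation: the histogram density estimator, bin count, and the use of a per-feature (marginal) criterion in the greedy step are simplifying assumptions (the full algorithm would evaluate the separability of the candidate feature subset jointly).

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete densities p and q
    (each a nonnegative vector summing to 1). Ranges from 0
    (identical) to 1 (disjoint support)."""
    return np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)) / np.sqrt(2)

def hist_density(x, bins):
    """Simple histogram density estimate on a fixed bin grid
    (stand-in for whatever density estimator accompanies the measure)."""
    counts, _ = np.histogram(x, bins=bins)
    return counts / counts.sum()

def forward_select(X0, X1, k, n_bins=20):
    """Sequential Forward Selection sketch: greedily add the feature
    whose marginal Hellinger distance between class 0 (X0) and
    class 1 (X1) samples is largest among features not yet chosen.
    X0, X1: (n_samples, n_features) arrays; returns k feature indices."""
    n_feat = X0.shape[1]
    chosen = []
    for _ in range(k):
        best, best_d = None, -1.0
        for j in range(n_feat):
            if j in chosen:
                continue
            # Shared bin grid so the two class densities are comparable.
            lo = min(X0[:, j].min(), X1[:, j].min())
            hi = max(X0[:, j].max(), X1[:, j].max())
            bins = np.linspace(lo, hi, n_bins + 1)
            d = hellinger(hist_density(X0[:, j], bins),
                          hist_density(X1[:, j], bins))
            if d > best_d:
                best, best_d = j, d
        chosen.append(best)
    return chosen
```

Because the criterion works directly on estimated densities, it makes no Gaussian assumption: a feature whose classes differ only in shape (e.g., exponential vs. shifted exponential) is scored correctly, whereas a Gaussian-derived index could miss the separation.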


Descriptors :   *PATTERN RECOGNITION , *TARGET CLASSIFICATION , *TARGET RECOGNITION , ALGORITHMS , AUTOMATIC , CLASSIFICATION , COMPUTATIONS , DATA STORAGE SYSTEMS , DENSITY , DISTRIBUTION , ENGINEERING , ESTIMATES , FEATURE EXTRACTION , FORWARD AREAS , INDEXES , INDEXES(RATIOS) , MATHEMATICS , PATTERNS , PROBABILITY DENSITY FUNCTIONS , SEARCHING , SELECTION , SEPARATION , SEQUENCES , STATISTICS , TARGETS , THESES , TRACTABLE


Subject Categories : Target Direction, Range and Position Finding


Distribution Statement : APPROVED FOR PUBLIC RELEASE