Fusion of Images from Dissimilar Sensor Systems
NAVAL POSTGRADUATE SCHOOL MONTEREY CA
Different sensors exploit different regions of the electromagnetic spectrum; a multi-sensor image fusion system can therefore take full advantage of the complementary capabilities of the individual sensors in the suite to produce information that cannot be obtained by viewing the images separately. This thesis presents a framework for the multiresolution fusion of night-vision-device and thermal infrared imagery. It encompasses a wavelet-based approach that supports both pixel-level and region-based fusion and aims to maximize scene content by incorporating spectral information from both source images. In pixel-level fusion, the source images are decomposed into different scales, and salient directional features are extracted and selectively fused by comparing the corresponding wavelet coefficients. To increase the degree of subject relevance in the fusion process, a region-based approach is proposed that uses a multiresolution segmentation algorithm to partition the image domain at different scales. The characteristics of the resulting regions are then determined and used to guide the fusion process. The experimental results obtained demonstrate the feasibility of the approach. Potential applications of this work include improvements in night piloting, navigation, target discrimination, and law enforcement.
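The pixel-level scheme described above — decompose each source image into scales, compare corresponding wavelet coefficients, and keep the more salient one — can be sketched as below. This is a minimal illustration, not the thesis's implementation: it assumes a single-level Haar decomposition and a max-magnitude selection rule for the detail bands, with the approximation bands averaged.

```python
import numpy as np

def haar2d(img):
    """Single-level 2D Haar decomposition into approximation (LL)
    and three directional detail bands (LH, HL, HH)."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    a[:, 0::2] = ll + lh
    a[:, 1::2] = ll - lh
    d = np.empty_like(a)
    d[:, 0::2] = hl + hh
    d[:, 1::2] = hl - hh
    img = np.empty((a.shape[0] * 2, a.shape[1]))
    img[0::2, :] = a + d
    img[1::2, :] = a - d
    return img

def fuse_pixel_level(img_a, img_b):
    """Fuse two registered, equal-size images: average the
    approximation bands, and for each detail band keep the
    coefficient with the larger magnitude (illustrative rule)."""
    ca, cb = haar2d(img_a), haar2d(img_b)
    ll = (ca[0] + cb[0]) / 2.0
    details = [np.where(np.abs(da) >= np.abs(db), da, db)
               for da, db in zip(ca[1:], cb[1:])]
    return ihaar2d(ll, *details)
```

Selecting the larger-magnitude detail coefficient favors whichever sensor carries the stronger edge or texture response at that scale and orientation; a practical system would decompose over several levels and, as in the region-based approach, condition the selection on segmented region characteristics rather than on isolated coefficients.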
- Infrared Detection and Detectors
- Miscellaneous Detection and Detectors