Multi-sensor Image Interpretation Using Laser Radar and Thermal Images
Interim technical report.
TEXAS UNIV AT AUSTIN COMPUTER AND VISION RESEARCH CENTER
This paper presents a knowledge-based system for interpreting registered laser radar and thermal images. The objective is to detect and recognize man-made objects at kilometer range in outdoor scenes. A multisensor fusion approach is applied to several sensing modalities (range, intensity, velocity, and thermal) to improve both image segmentation and interpretation. The ability to use multiple sensors greatly helps an intelligent platform understand and interact with its environment. The knowledge-based interpretation system, AIMS, is constructed using KEE and Lisp. Low-level attributes of image segments (regions) are computed by the segmentation modules and then converted into the KEE format. The interpretation system applies forward chaining in a bottom-up fashion to derive object-level interpretations from databases generated by the low-level processing modules. Segments are grouped into objects, and objects are then classified into predefined categories. AIMS employs a two-tiered software structure: efficiency is enhanced by transferring non-symbolic processing tasks to a concurrent service manager program, so that tasks with different characteristics are executed using different software tools and methodologies.
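The bottom-up flow described in the abstract (segment attributes, then grouping into objects, then rule-based classification) can be illustrated with a minimal sketch. This is not the AIMS/KEE implementation; all names, attributes, thresholds, and rules below are invented for illustration only.

```python
# Hypothetical sketch of AIMS-style bottom-up interpretation.
# Low-level segment attributes -> grouping -> forward-chaining-style rules.
from dataclasses import dataclass, field

@dataclass
class Segment:
    sid: int
    mean_range_m: float           # e.g. from the laser-radar range channel
    mean_temp_c: float            # e.g. from the thermal channel
    neighbors: set = field(default_factory=set)

def group_segments(segments):
    """Group a segment with its adjacent, similar-range neighbors into one object."""
    objects, assigned = [], set()
    for seg in segments:
        if seg.sid in assigned:
            continue
        obj = [seg]
        assigned.add(seg.sid)
        for other in segments:
            if (other.sid not in assigned and other.sid in seg.neighbors
                    and abs(other.mean_range_m - seg.mean_range_m) < 5.0):
                obj.append(other)
                assigned.add(other.sid)
        objects.append(obj)
    return objects

def classify(obj):
    """Rule mapping object-level attributes to a predefined category (illustrative)."""
    mean_temp = sum(s.mean_temp_c for s in obj) / len(obj)
    if mean_temp > 30.0:
        return "vehicle"          # warm, compact group of segments
    return "background"

segments = [
    Segment(1, 950.0, 42.0, neighbors={2}),
    Segment(2, 952.0, 40.0, neighbors={1}),
    Segment(3, 400.0, 12.0),
]
for obj in group_segments(segments):
    print([s.sid for s in obj], classify(obj))
# → [1, 2] vehicle
# → [3] background
```

In the actual system the rules are KEE forward-chaining rules over symbolic frames, and non-symbolic work (segmentation, attribute computation) runs in a separate concurrent service manager rather than in the interpreter itself.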
- Infrared Detection and Detectors
- Active and Passive Radar Detection and Equipment