Paper
10 May 2012
Content-dependent on-the-fly visual information fusion for battlefield scenarios
Abstract
We report on a cooperative research program between the Army Research Laboratory (ARL), the Night Vision and Electronic Sensors Directorate (NVESD), and the University of Maryland (UMD). The program aims to develop advanced on-the-fly atmospheric image processing techniques based on local information fusion from one or more monochrome and color live video streams captured by imaging sensors in combat or reconnaissance situations. Local information fusion can be driven by various local metrics, including local image quality, local image-area motion, and spatio-temporal characteristics of image content. Tools developed in this program are used to identify and fuse critical information to enhance target identification and situational understanding under conditions of severe atmospheric turbulence.
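The local-quality-driven fusion described above can be illustrated with a minimal "lucky region" sketch: each frame is scored patch by patch with a local sharpness metric, and the output image is assembled from the sharpest patch at each location. This is only an illustrative assumption; the paper's actual metrics and fusion logic are not specified in the abstract, and the variance-of-Laplacian score used here is just one common choice of local image-quality measure.

```python
import numpy as np

def local_sharpness(frame, patch=16):
    """Variance of a discrete Laplacian within each patch.

    An illustrative local image-quality metric; the metrics used in the
    ARL/NVESD/UMD program may differ.
    """
    # 5-point Laplacian via circular shifts (edge wrap is acceptable for a sketch)
    lap = (-4.0 * frame
           + np.roll(frame, 1, axis=0) + np.roll(frame, -1, axis=0)
           + np.roll(frame, 1, axis=1) + np.roll(frame, -1, axis=1))
    h, w = frame.shape
    gh, gw = h // patch, w // patch
    blocks = lap[:gh * patch, :gw * patch].reshape(gh, patch, gw, patch)
    return blocks.var(axis=(1, 3))  # one sharpness score per patch

def fuse_frames(frames, patch=16):
    """Patch-wise fusion: keep, at each patch location, the patch from the
    frame whose local sharpness score is highest."""
    scores = np.stack([local_sharpness(f, patch) for f in frames])  # (N, gh, gw)
    best = scores.argmax(axis=0)                                    # (gh, gw)
    gh, gw = best.shape
    out = np.zeros((gh * patch, gw * patch), dtype=frames[0].dtype)
    for i in range(gh):
        for j in range(gw):
            k = best[i, j]
            out[i*patch:(i+1)*patch, j*patch:(j+1)*patch] = \
                frames[k][i*patch:(i+1)*patch, j*patch:(j+1)*patch]
    return out
```

In a live-video setting this selection would run per incoming frame against a short temporal buffer, which is one way to realize "on-the-fly" fusion from a single stream; multi-sensor fusion would score co-registered streams the same way.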
© (2012) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Mathieu Aubailly, Mikhail A. Vorontsov, Gary Carhart, J. Jiang Liu, and Richard Espinola "Content-dependent on-the-fly visual information fusion for battlefield scenarios", Proc. SPIE 8368, Photonic Applications for Aerospace, Transportation, and Harsh Environment III, 83680J (10 May 2012); https://doi.org/10.1117/12.918681
KEYWORDS
Image fusion
Image quality
Laser range finders
High dynamic range imaging
Information fusion
RGB color model
Visualization