A novel framework for processing forward looking infrared imagery with application to buried threat detection


Conference Paper

Forward Looking Infrared (FLIR) cameras have recently been studied as a sensing modality for buried threat detection systems. FLIR-based detection systems benefit from larger standoff distances and faster rates of advance than other sensing modalities, but they also present significant signal processing challenges. FLIR imagery typically yields multiple looks at each surface area, each obtained from a different relative camera pose and position. This multi-look imagery can be exploited for improved performance; however, open questions remain about the best ways to process and fuse such data. The utility of each look is also unclear: how many looks are needed, and from which poses? In this work we propose a general framework in which FLIR imagery is partitioned according to the relative camera pose from which it was collected. Each partition is then projected into a common spatial coordinate system, yielding several distinct images of the surface area. Buried threat detection algorithms can then be applied to each resulting image independently, or in aggregate. The proposed framework is evaluated using several detection algorithms on a FLIR dataset collected at a Western US test site, and the results indicate that the framework offers significant improvement over detection in the original FLIR imagery. Further experiments suggest that multiple looks by the FLIR camera can be used to improve detection performance. © 2013 SPIE.
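The pipeline the abstract describes (partition looks by relative camera pose, project each partition into a common ground-plane grid, detect per partition, then aggregate) can be sketched as below. This is a minimal illustration, not the paper's implementation: the homography-based ground projection, the per-partition median fusion, and the max-score aggregation are all assumptions chosen for concreteness.

```python
# Illustrative sketch of a pose-partitioned FLIR processing framework.
# All function names, the homography projection, the median fusion rule,
# and the max aggregation are assumptions, not the paper's actual method.
import numpy as np

def project_to_ground(image, H, out_shape):
    """Resample an image onto a common ground-plane grid.

    H is an assumed 3x3 homography mapping ground-grid coordinates
    (x, y, 1) to image coordinates; nearest-neighbor sampling is used,
    and unobserved ground cells are left as NaN.
    """
    rows, cols = out_shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(rows * cols)])
    uvw = H @ pts
    u = np.rint(uvw[0] / uvw[2]).astype(int)
    v = np.rint(uvw[1] / uvw[2]).astype(int)
    out = np.full(out_shape, np.nan)
    ok = (u >= 0) & (u < image.shape[1]) & (v >= 0) & (v < image.shape[0])
    out.reshape(-1)[ok] = image[v[ok], u[ok]]
    return out

def pose_partitioned_detect(frames, pose_bins, homographies, out_shape, detector):
    """Partition looks by pose bin, fuse within each bin, detect, aggregate."""
    # 1) Group ground-projected looks by their relative camera pose bin.
    partitions = {}
    for frame, pose, H in zip(frames, pose_bins, homographies):
        partitions.setdefault(pose, []).append(
            project_to_ground(frame, H, out_shape))
    # 2) Fuse the looks within each pose partition (median as a placeholder).
    fused = {pose: np.nanmedian(np.stack(looks), axis=0)
             for pose, looks in partitions.items()}
    # 3) Apply the detector to each pose image independently, then
    #    aggregate the per-pose confidence maps (max as a placeholder).
    scores = [detector(img) for img in fused.values()]
    return np.nanmax(np.stack(scores), axis=0)
```

Any pixel-wise anomaly detector can be plugged in as `detector`; running it on each pose partition separately keeps the within-partition imagery geometrically consistent before the scores are combined.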

Cited Authors

  • Malof, JM; Morton, KD; Collins, LM; Torrione, PA

Published Date

  • August 8, 2013

Volume / Issue

  • 8709

Electronic International Standard Serial Number (EISSN)

  • 1996-756X

International Standard Serial Number (ISSN)

  • 0277-786X

International Standard Book Number 13 (ISBN-13)

  • 9780819495006

Digital Object Identifier (DOI)

  • 10.1117/12.2016113

Citation Source

  • Scopus