Improving convolutional neural networks for buried target detection in ground penetrating radar using transfer learning via pre-training

Published

Conference Paper

© 2017 SPIE. Ground penetrating radar (GPR) is a remote sensing modality that has been used to collect data for the task of buried threat detection. GPR returns can be organized as images in which the characteristic visual patterns of threats can be leveraged for detection using visual descriptors. Recently, convolutional neural networks (CNNs) have been applied to this problem, inspired by their state-of-the-art performance on object recognition tasks in natural images. One well-known limitation of CNNs is that they require large amounts of data for training (i.e., parameter inference) to avoid overfitting (i.e., poor generalization). This presents a major challenge for target detection in GPR because of the relatively few labeled examples of target and non-target GPR data. In this work we use a popular transfer learning approach for CNNs to address this problem. In this approach we train two CNNs on other, much larger, datasets of grayscale imagery for different problems. Specifically, we pre-train our CNNs on (i) the popular CIFAR-10 dataset, and (ii) a dataset of high-resolution aerial imagery for detecting solar photovoltaic arrays. We then use varying subsets of the parameters from these two pre-trained CNNs to initialize the training of our buried threat detection networks for GPR data. We conduct experiments on a large collection of GPR data and demonstrate that these approaches improve the performance of CNNs for buried target detection in GPR data.
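The core mechanism the abstract describes — initializing a target network with a subset of parameters copied from a network pre-trained on a larger dataset, while the remaining layers are randomly initialized — can be sketched in a few lines. This is a minimal illustration only, not the authors' implementation: the layer shapes, the number of transferred layers, and the `init_with_transfer` helper are all hypothetical, and parameters are represented as plain NumPy arrays rather than a real CNN framework's weights.

```python
import numpy as np

def init_with_transfer(pretrained, target_shapes, n_transfer, rng):
    """Initialize a network's parameters, copying the first n_transfer
    layers from a pre-trained network and randomly initializing the rest.

    pretrained    : list of np.ndarray (source network parameters)
    target_shapes : parameter shapes of the target network
    n_transfer    : how many leading layers to transfer (a subset,
                    as in the varying-subset experiments described above)
    """
    params = []
    for i, shape in enumerate(target_shapes):
        if i < n_transfer and pretrained[i].shape == shape:
            params.append(pretrained[i].copy())             # transferred weights
        else:
            params.append(rng.normal(0.0, 0.01, size=shape))  # fresh random init
    return params

rng = np.random.default_rng(0)
# Hypothetical architecture: two conv layers plus one dense layer.
shapes = [(16, 1, 3, 3), (32, 16, 3, 3), (10, 32)]
pretrained = [rng.normal(0.0, 0.01, size=s) for s in shapes]  # stand-in for a pre-trained CNN

# Transfer only the first two (convolutional) layers; re-learn the rest.
new_params = init_with_transfer(pretrained, shapes, n_transfer=2, rng=rng)
```

After this initialization, the target network would be trained as usual on the (smaller) GPR dataset; transferring only the early layers reflects the common intuition that low-level filters learned on natural or aerial grayscale imagery generalize across domains.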

Cited Authors

  • Bralich, J; Reichman, D; Collins, LM; Malof, JM

Published Date

  • January 1, 2017

Volume / Issue

  • 10182

Electronic International Standard Serial Number (EISSN)

  • 1996-756X

International Standard Serial Number (ISSN)

  • 0277-786X

International Standard Book Number 13 (ISBN-13)

  • 9781510608658

Digital Object Identifier (DOI)

  • 10.1117/12.2263112

Citation Source

  • Scopus