
How do we choose the best model? The impact of cross-validation design on model evaluation for buried threat detection in ground penetrating radar

Publication, Conference
Malof, JM; Reichman, D; Collins, LM
Published in: Proceedings of SPIE - The International Society for Optical Engineering
January 1, 2018

A great deal of research has been focused on the development of computer algorithms for buried threat detection (BTD) in ground penetrating radar (GPR) data. Most recently proposed BTD algorithms are supervised, and therefore they employ machine learning models that infer their parameters from training data. Cross-validation (CV) is a popular method for evaluating the performance of such algorithms, in which the available data is systematically split into N disjoint subsets, and an algorithm is repeatedly trained on N-1 subsets and tested on the excluded subset. There are several common types of CV in BTD, which vary principally in the spatial criterion used to partition the data: site-based, lane-based, region-based, etc. The performance metrics obtained via CV are often used to suggest the superiority of one model over others; however, most studies utilize just one type of CV, and the impact of this choice is unclear. Here we employ several types of CV to evaluate algorithms from a recent large-scale BTD study. The results indicate that the rank order of algorithm performance varies substantially depending upon which type of CV is used. For example, the rank-1 algorithm under region-based CV is the lowest-ranked algorithm under site-based CV. This suggests that algorithm results should be interpreted carefully with respect to the type of CV employed. We discuss some potential interpretations of performance, given a particular type of CV.
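For context, the spatially grouped CV designs described above can be illustrated with a brief sketch. The example below is not from the paper; it uses scikit-learn's GroupKFold with hypothetical alarm features, labels, and site/lane assignments simply to show how changing the grouping criterion changes how the data are partitioned into disjoint folds, and thus can change the measured performance of the same model.

```python
# A minimal sketch (not from the paper) of spatially grouped cross-validation
# using scikit-learn's GroupKFold. All data below are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(0)
n_alarms = 600
X = rng.normal(size=(n_alarms, 16))        # hypothetical alarm feature vectors
y = rng.integers(0, 2, size=n_alarms)      # 1 = buried threat, 0 = clutter
site = rng.integers(0, 4, size=n_alarms)   # hypothetical site label per alarm
lane = rng.integers(0, 12, size=n_alarms)  # hypothetical lane label per alarm

# The same model is evaluated under two CV designs; only the grouping
# variable used to partition the data into disjoint folds changes.
for name, groups in [("site-based", site), ("lane-based", lane)]:
    aucs = []
    cv = GroupKFold(n_splits=4)
    for train_idx, test_idx in cv.split(X, y, groups=groups):
        model = RandomForestClassifier(n_estimators=100, random_state=0)
        model.fit(X[train_idx], y[train_idx])
        scores = model.predict_proba(X[test_idx])[:, 1]
        aucs.append(roc_auc_score(y[test_idx], scores))
    print(f"{name} CV: mean AUC = {np.mean(aucs):.3f}")
```

Under this kind of grouped design, alarms from the same site (or lane) never appear in both the training and test folds, which is the property that distinguishes the CV types compared in the paper.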


Published In

Proceedings of SPIE - The International Society for Optical Engineering

DOI

10.1117/12.2305793

EISSN

1996-756X

ISSN

0277-786X

ISBN

9781510617674

Publication Date

January 1, 2018

Volume

10628

Related Subject Headings

  • 5102 Atomic, molecular and optical physics
  • 4009 Electronics, sensors and digital hardware
  • 4006 Communications engineering
 

Citation

APA
Malof, J. M., Reichman, D., & Collins, L. M. (2018). How do we choose the best model? The impact of cross-validation design on model evaluation for buried threat detection in ground penetrating radar. In Proceedings of SPIE - The International Society for Optical Engineering (Vol. 10628). https://doi.org/10.1117/12.2305793

Chicago
Malof, J. M., D. Reichman, and L. M. Collins. “How do we choose the best model? The impact of cross-validation design on model evaluation for buried threat detection in ground penetrating radar.” In Proceedings of SPIE - The International Society for Optical Engineering, Vol. 10628, 2018. https://doi.org/10.1117/12.2305793.

ICMJE
Malof JM, Reichman D, Collins LM. How do we choose the best model? The impact of cross-validation design on model evaluation for buried threat detection in ground penetrating radar. In: Proceedings of SPIE - The International Society for Optical Engineering. 2018.

MLA
Malof, J. M., et al. “How do we choose the best model? The impact of cross-validation design on model evaluation for buried threat detection in ground penetrating radar.” Proceedings of SPIE - The International Society for Optical Engineering, vol. 10628, 2018. Scopus, doi:10.1117/12.2305793.

NLM
Malof JM, Reichman D, Collins LM. How do we choose the best model? The impact of cross-validation design on model evaluation for buried threat detection in ground penetrating radar. Proceedings of SPIE - The International Society for Optical Engineering. 2018.