Comparison of a distance-based likelihood ratio test and k-nearest neighbor classification methods

Published

Conference Paper

Several studies of the k-nearest neighbor (KNN) classifier have proposed non-uniform weighting of the k neighbors. It has been suggested that the distance to each neighbor can be used to calculate the individual weights in a weighted KNN approach; however, no consensus has been reached on the best method or framework for computing weights from those distances. In this paper, a distance likelihood ratio test (DLRT) is presented and evaluated using simulated data. The DLRT shares several characteristics with distance-weighted k-nearest neighbor methods but approaches the use of distance from a different perspective. Results illustrate the ability of the DLRT to approximate the likelihood ratio and compare it to two other k-neighborhood classification rules that use distance weighting. The DLRT performs favorably in classification-performance comparisons on the simulated data and offers an alternative nonparametric classification method to consider when designing a distance-weighted KNN classification rule. ©2008 IEEE.
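The distance-weighted KNN family the abstract refers to can be sketched as follows. This is an illustrative example only, not the paper's DLRT: inverse-distance weighting is used here as one common weighting choice, and the function name and parameters are hypothetical.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, k=3, eps=1e-9):
    """Classify x by a distance-weighted k-nearest-neighbor vote.

    Each of the k nearest training points votes for its class with
    weight 1/(distance + eps); inverse-distance weighting is one of
    several schemes for turning neighbor distances into weights.
    """
    # Euclidean distance from the query point to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k nearest neighbors
    nn = np.argsort(dists)[:k]
    # Closer neighbors receive larger voting weights
    weights = 1.0 / (dists[nn] + eps)
    # Accumulate the weighted vote for each class label
    votes = {}
    for idx, w in zip(nn, weights):
        votes[y_train[idx]] = votes.get(y_train[idx], 0.0) + w
    # Return the class with the largest total weighted vote
    return max(votes, key=votes.get)
```

For example, with two well-separated clusters, a query near the first cluster is assigned its label even when the neighborhood contains a point from the other class, because the closer neighbors dominate the weighted vote:

```python
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
weighted_knn_predict(X, y, np.array([0.05, 0.1]), k=3)  # -> 0
```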

Cited Authors

  • Remus, JJ; Morton, KD; Torrione, PA; Tantum, SL; Collins, LM

Published Date

  • December 1, 2008

Published In

  • Proceedings of the 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008

Start / End Page

  • 362 - 367

International Standard Book Number 13 (ISBN-13)

  • 9781424423767

Digital Object Identifier (DOI)

  • 10.1109/MLSP.2008.4685507

Citation Source

  • Scopus