A novel kernel-based maximum a posteriori classification method.

Journal Article

Kernel methods have been widely used in pattern recognition. Many kernel classifiers, such as the Support Vector Machine (SVM), assume that the data can be separated by a hyperplane in the kernel-induced feature space. These methods do not consider the data distribution and cannot directly output probabilities or confidences for classification. This paper proposes a novel Kernel-based Maximum A Posteriori (KMAP) classification method, which makes a Gaussian distribution assumption instead of a linear separability assumption in the feature space. Robust methods are further proposed to estimate the probability densities, and the kernel trick is utilized to compute the model. The model is theoretically and empirically important in the sense that: (1) it presents a more general classification model than other kernel-based algorithms, e.g., Kernel Fisher Discriminant Analysis (KFDA); (2) it can output probabilities or confidences for classification, thereby providing the potential for reasoning under uncertainty; and (3) multi-way classification is as straightforward as binary classification in this model, because only probability calculation is involved and no one-against-one or one-against-others voting is needed. Moreover, we conduct an extensive experimental comparison with state-of-the-art classification methods, such as SVM and KFDA, on eight UCI benchmark data sets and three face data sets. The results demonstrate that KMAP achieves very promising performance compared with the other models.
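
A minimal, hypothetical sketch of the general idea described in the abstract (not the authors' KMAP algorithm): class-conditional Gaussian densities are fitted in an approximate kernel-induced feature space, here obtained via kernel PCA rather than the paper's kernel trick derivation, and the predicted label is the class with the highest posterior probability. The class name, parameters, and regularization choice below are illustrative assumptions.

    # Hypothetical illustration only -- not the KMAP algorithm from the paper.
    # It fits Gaussian class-conditional densities in an approximate
    # kernel-induced feature space (kernel PCA) and predicts the class with
    # the maximum posterior probability, so class probabilities come for free
    # and multi-class prediction needs no one-against-one voting.
    import numpy as np
    from scipy.stats import multivariate_normal
    from sklearn.decomposition import KernelPCA

    class GaussianMAPClassifier:
        def __init__(self, n_components=10, gamma=0.5, reg=1e-3):
            # Kernel PCA provides an explicit, finite-dimensional approximation
            # of the RBF-kernel feature map, so densities can be fitted there.
            self.kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=gamma)
            self.reg = reg  # ridge term keeps class covariances well conditioned

        def fit(self, X, y):
            y = np.asarray(y)
            Z = self.kpca.fit_transform(X)            # map data into feature space
            self.classes_ = np.unique(y)
            self.priors_, self.densities_ = {}, {}
            for c in self.classes_:
                Zc = Z[y == c]
                self.priors_[c] = len(Zc) / len(Z)    # class prior P(y = c)
                cov = np.cov(Zc, rowvar=False) + self.reg * np.eye(Z.shape[1])
                self.densities_[c] = multivariate_normal(Zc.mean(axis=0), cov)
            return self

        def predict_proba(self, X):
            Z = self.kpca.transform(X)
            # Unnormalized posteriors p(x | y) * P(y), normalized per sample.
            scores = np.column_stack(
                [self.densities_[c].pdf(Z) * self.priors_[c] for c in self.classes_]
            )
            return scores / scores.sum(axis=1, keepdims=True)

        def predict(self, X):
            # MAP rule: pick the class with the largest posterior probability.
            return self.classes_[np.argmax(self.predict_proba(X), axis=1)]

Because prediction is an argmax over class posteriors, extending this sketch from binary to multi-way classification requires no additional machinery, which mirrors point (3) of the abstract.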

Cited Authors

  • Xu, Z; Huang, K; Zhu, J; King, I; Lyu, MR

Published Date

  • September 2009

Published In

  • Neural Networks

Volume / Issue

  • 22 / 7

Start / End Page

  • 977 - 987

PubMed ID

  • 19167865

Electronic International Standard Serial Number (EISSN)

  • 1879-2782

International Standard Serial Number (ISSN)

  • 0893-6080

Digital Object Identifier (DOI)

  • 10.1016/j.neunet.2008.11.005

Language

  • eng