
Arbitrary norm support vector machines.

Publication: Journal Article
Huang, K; Zheng, D; King, I; Lyu, MR
Published in: Neural computation
February 2009

Support vector machines (SVMs) are state-of-the-art classifiers. Typically the L2-norm or the L1-norm is adopted as the regularization term in SVMs, while other norm-based SVMs, for example, the L0-norm SVM or even the L(infinity)-norm SVM, are rarely seen in the literature. The major reason is that the L0-norm describes a discontinuous and nonconvex term, leading to a combinatorially NP-hard optimization problem. In this letter, motivated by Bayesian learning, we propose a novel framework that can implement arbitrary norm-based SVMs in polynomial time. One significant feature of this framework is that only a sequence of sequential minimal optimization problems needs to be solved, making it practical in many real applications. The proposed framework is important in the sense that Bayesian priors can be efficiently plugged into most learning methods without knowing their explicit form. Hence, this builds a connection between Bayesian learning and kernel machines. We derive the theoretical framework, demonstrate how our approach works on the L0-norm SVM as a typical example, and perform a series of experiments to validate its advantages. Experimental results on nine benchmark data sets are very encouraging. The implemented L0-norm SVM is competitive with or even better than the standard L2-norm SVM in terms of accuracy, but with a reduced number of support vectors, approximately 9.46% of the number on average. When compared with another sparse model, the relevance vector machine, our proposed algorithm also demonstrates better sparsity, with a training speed over seven times faster.
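As a hedged illustration of the norms the abstract contrasts (not code from the paper): the snippet below computes the L0, L1, L2, and L-infinity norms of a toy weight vector, making concrete why the L0 "norm", which simply counts nonzero coefficients, is discontinuous and directly promotes sparsity, whereas the others are smooth or convex.

```python
# Hypothetical sketch: the four norms discussed in the abstract, applied to a
# toy coefficient vector. Not an implementation of the paper's framework.

def l0(w):
    # L0 "norm": number of nonzero entries (discontinuous, nonconvex).
    return sum(1 for x in w if x != 0.0)

def l1(w):
    # L1 norm: sum of absolute values (convex, sparsity-inducing).
    return sum(abs(x) for x in w)

def l2(w):
    # L2 norm: Euclidean length (smooth, the standard SVM regularizer).
    return sum(x * x for x in w) ** 0.5

def linf(w):
    # L-infinity norm: largest absolute entry.
    return max(abs(x) for x in w)

w = [0.0, 3.0, -4.0, 0.0]
print(l0(w), l1(w), l2(w), linf(w))  # -> 2 7.0 5.0 4.0
```

Note how perturbing a zero coefficient of `w` by any tiny amount changes `l0(w)` by a whole unit while the other norms move smoothly, which is the source of the combinatorial hardness the paper circumvents.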


Published In

Neural computation

DOI

10.1162/neco.2008.12-07-667

EISSN

1530-888X

ISSN

0899-7667

Publication Date

February 2009

Volume

21

Issue

2

Start / End Page

560 / 582

Related Subject Headings

  • Reference Values
  • Neural Networks, Computer
  • Learning
  • Humans
  • Databases, Factual
  • Bayes Theorem
  • Artificial Intelligence & Image Processing
  • Artificial Intelligence
  • Algorithms
  • 52 Psychology

Citation

APA: Huang, K., Zheng, D., King, I., & Lyu, M. R. (2009). Arbitrary norm support vector machines. Neural Computation, 21(2), 560–582. https://doi.org/10.1162/neco.2008.12-07-667
Chicago: Huang, Kaizhu, Danian Zheng, Irwin King, and Michael R. Lyu. “Arbitrary norm support vector machines.” Neural Computation 21, no. 2 (February 2009): 560–82. https://doi.org/10.1162/neco.2008.12-07-667.
ICMJE: Huang K, Zheng D, King I, Lyu MR. Arbitrary norm support vector machines. Neural computation. 2009 Feb;21(2):560–82.
MLA: Huang, Kaizhu, et al. “Arbitrary norm support vector machines.” Neural Computation, vol. 21, no. 2, Feb. 2009, pp. 560–82. Epmc, doi:10.1162/neco.2008.12-07-667.
NLM: Huang K, Zheng D, King I, Lyu MR. Arbitrary norm support vector machines. Neural computation. 2009 Feb;21(2):560–582.