Imbalanced learning with a biased minimax probability machine.

Journal Article (Letter)

Imbalanced learning is a challenging task in machine learning: the data associated with one class are far fewer than those associated with the other class. Traditional methods that seek classification accuracy over the full range of instances are unsuitable for this problem, since they tend to classify all the data into the majority class, which is usually the less important one. In this correspondence, the authors describe a new approach, the biased minimax probability machine (BMPM), for imbalanced learning, and demonstrate that it provides an elegant and systematic way to handle the problem. More specifically, by controlling the accuracy of the majority class under all possible choices of class-conditional densities with a given mean and covariance matrix, the model incorporates a bias toward the minority class quantitatively and systematically. By establishing an explicit connection between classification accuracy and the bias, this approach distinguishes itself from many current imbalanced-learning methods, which often impose a bias on the minority data by adapting intermediate factors through trial and error. The authors detail the theoretical foundation, prove the model's solvability, propose an efficient optimization algorithm, and perform a series of experiments to evaluate it. Comparison with other competitive methods demonstrates the effectiveness of the new model.
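The idea of "controlling the accuracy of the majority class under all possible class-conditional densities with a given mean and covariance" can be sketched, following the minimax probability machine literature, as a worst-case constrained program. The notation below is an assumption for illustration, not copied from the paper: let the minority class $\mathbf{x}$ have mean $\bar{\mathbf{x}}$ and covariance $\Sigma_{\mathbf{x}}$, and the majority class $\mathbf{y}$ analogously; BMPM seeks a separating hyperplane $\mathbf{a}^{\top}\mathbf{z} = b$ via

```latex
\begin{aligned}
\max_{\alpha,\,\mathbf{a}\neq\mathbf{0},\,b}\;\; & \alpha \\
\text{s.t.}\;\; & \inf_{\mathbf{x}\sim(\bar{\mathbf{x}},\,\Sigma_{\mathbf{x}})}
    \Pr\{\mathbf{a}^{\top}\mathbf{x} \ge b\} \ \ge\ \alpha ,
    && \text{(worst-case minority accuracy, maximized)} \\
& \inf_{\mathbf{y}\sim(\bar{\mathbf{y}},\,\Sigma_{\mathbf{y}})}
    \Pr\{\mathbf{a}^{\top}\mathbf{y} \le b\} \ \ge\ \beta_{0},
    && \text{(worst-case majority accuracy, held at a preset level)}
\end{aligned}
```

where each infimum ranges over all distributions with the stated mean and covariance, and $\beta_{0}$ is the pre-specified lower bound on majority-class accuracy that quantifies the bias. By a multivariate Chebyshev bound, each probability constraint reduces to a second-order-cone condition of the form $\mathbf{a}^{\top}\bar{\mathbf{x}} - b \ge \kappa(\alpha)\sqrt{\mathbf{a}^{\top}\Sigma_{\mathbf{x}}\mathbf{a}}$ with $\kappa(\alpha)=\sqrt{\alpha/(1-\alpha)}$, which is what makes the problem tractable to optimize.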

Cited Authors

  • Huang, K; Yang, H; King, I; Lyu, MR

Published Date

  • August 2006

Published In

  • IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)

Volume / Issue

  • 36 / 4

Start / End Page

  • 913 - 923

PubMed ID

  • 16903374

Electronic International Standard Serial Number (EISSN)

  • 1941-0492

International Standard Serial Number (ISSN)

  • 1083-4419

Digital Object Identifier (DOI)

  • 10.1109/tsmcb.2006.870610

Language

  • eng