The rate of convergence of AdaBoost

Publication: Journal Article
Mukherjee, I; Rudin, C; Schapire, RE
Published in: Journal of Machine Learning Research
August 1, 2013

The AdaBoost algorithm was designed to combine many "weak" hypotheses that perform slightly better than random guessing into a "strong" hypothesis that has very low error. We study the rate at which AdaBoost iteratively converges to the minimum of the "exponential loss." Unlike previous work, our proofs do not require a weak-learning assumption, nor do they require that minimizers of the exponential loss are finite. Our first result shows that the exponential loss of AdaBoost's computed parameter vector will be at most ε more than that of any parameter vector of ℓ1-norm bounded by B in a number of rounds that is at most a polynomial in B and 1/ε. We also provide lower bounds showing that a polynomial dependence is necessary. Our second result is that within C/ε iterations, AdaBoost achieves a value of the exponential loss that is at most ε more than the best possible value, where C depends on the data set. We show that this dependence of the rate on ε is optimal up to constant factors, that is, at least Ω(1/ε) rounds are necessary to achieve within ε of the optimal exponential loss. © 2013 Indraneel Mukherjee, Cynthia Rudin and Robert E. Schapire.
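The loss the abstract refers to is the exponential loss (1/n) Σᵢ exp(−yᵢ f(xᵢ)), which AdaBoost drives down one round at a time. Below is a minimal illustrative sketch of standard AdaBoost with threshold stumps that tracks this quantity per round; the dataset, stump learner, and function name are assumptions for illustration, not the paper's constructions.

```python
import numpy as np

def adaboost_exp_loss(X, y, rounds=20):
    """Run AdaBoost with exhaustive threshold stumps on (X, y) with
    labels in {-1, +1}; return the exponential loss after each round."""
    n = len(y)
    w = np.full(n, 1.0 / n)      # distribution over training examples
    f = np.zeros(n)              # combined margin f(x_i) so far
    losses = []
    for _ in range(rounds):
        # Weak learner: pick the stump h(x) = s * sign(x_j - t)
        # with the smallest weighted error under w.
        best = None
        for j in range(X.shape[1]):
            for t in X[:, j]:
                for s in (1, -1):
                    h = s * np.sign(X[:, j] - t + 1e-12)
                    err = w[h != y].sum()
                    if best is None or err < best[0]:
                        best = (err, h)
        err, h = best
        err = min(max(err, 1e-10), 1 - 1e-10)      # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)      # AdaBoost's step size
        f += alpha * h
        w = np.exp(-y * f)
        losses.append(w.mean())   # exponential loss (1/n) sum_i exp(-y_i f(x_i))
        w /= w.sum()              # renormalize into a distribution
    return losses
```

Each round multiplies the exponential loss by 2·sqrt(err·(1−err)) ≤ 1, so the recorded values are non-increasing; the paper's results bound how many such rounds are needed to come within ε of the optimal loss.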


Published In

Journal of Machine Learning Research

EISSN

1533-7928

ISSN

1532-4435

Publication Date

August 1, 2013

Volume

14

Start / End Page

2315 / 2347

Related Subject Headings

  • Artificial Intelligence & Image Processing
  • 4905 Statistics
  • 4611 Machine learning
  • 17 Psychology and Cognitive Sciences
  • 08 Information and Computing Sciences

Citation

APA: Mukherjee, I., Rudin, C., & Schapire, R. E. (2013). The rate of convergence of AdaBoost. Journal of Machine Learning Research, 14, 2315–2347.
Chicago: Mukherjee, I., C. Rudin, and R. E. Schapire. "The rate of convergence of AdaBoost." Journal of Machine Learning Research 14 (August 1, 2013): 2315–47.
ICMJE: Mukherjee I, Rudin C, Schapire RE. The rate of convergence of AdaBoost. Journal of Machine Learning Research. 2013 Aug 1;14:2315–47.
MLA: Mukherjee, I., et al. "The rate of convergence of AdaBoost." Journal of Machine Learning Research, vol. 14, Aug. 2013, pp. 2315–47.
NLM: Mukherjee I, Rudin C, Schapire RE. The rate of convergence of AdaBoost. Journal of Machine Learning Research. 2013 Aug 1;14:2315–2347.
