On the dynamics of boosting
In order to understand AdaBoost's dynamics, especially its ability to maximize margins, we derive an associated simplified nonlinear iterated map and analyze its behavior in low-dimensional cases. We find stable cycles for these cases, which can be used to explicitly solve for AdaBoost's output. By considering AdaBoost as a dynamical system, we are able to prove Rätsch and Warmuth's conjecture that AdaBoost may fail to converge to a maximal-margin combined classifier when given a 'nonoptimal' weak learning algorithm. AdaBoost is known to be a coordinate descent method, but other known algorithms that explicitly aim to maximize the margin (such as AdaBoost* and arc-gv) are not. We consider a differentiable function for which coordinate ascent will yield a maximum margin solution. We then make a simple approximation to derive a new boosting algorithm whose updates are slightly more aggressive than those of arc-gv.
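To make the dynamics concrete, here is a minimal sketch of the reweighting map that plain AdaBoost iterates, using decision stumps on a toy one-dimensional dataset. The dataset, the weak learner, and all names below are illustrative assumptions rather than anything from the paper; the simplified iterated map studied there is an analytical reduction of these weight updates, not this code.

```python
# A hypothetical toy illustration of AdaBoost's weight dynamics:
# each round reweights the training points, which is the iterated
# map on the probability simplex that the paper analyzes.
import numpy as np

def stump_predictions(X):
    """All threshold stumps (both polarities) on a sorted 1-D feature."""
    thresholds = (X[:-1] + X[1:]) / 2.0
    preds = [np.where(X > t, 1, -1) for t in thresholds]
    preds += [-p for p in preds]          # opposite-polarity stumps
    return np.array(preds)                # rows: weak hypotheses

def adaboost_weight_history(X, y, T=20):
    """Run AdaBoost for T rounds, returning the sequence of weight vectors d_t."""
    H = stump_predictions(X)
    d = np.full(len(y), 1.0 / len(y))     # uniform initial weights
    history = [d.copy()]
    for _ in range(T):
        edges = H @ (d * y)               # r_j = sum_i d_i * y_i * h_j(x_i)
        j = np.argmax(edges)              # 'optimal' weak learner: max edge
        r = min(edges[j], 1 - 1e-10)      # guard against a perfect stump
        alpha = 0.5 * np.log((1 + r) / (1 - r))
        d = d * np.exp(-alpha * y * H[j]) # exponential reweighting step
        d /= d.sum()                      # renormalize (divide by Z_t)
        history.append(d.copy())
    return np.array(history)

X = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([-1, 1, -1, 1])              # no single stump is perfect here
print(adaboost_weight_history(X, y, T=10))
```

Printing the weight history on small examples like this is one way to look for the stable cycles the abstract refers to; the `np.argmax` choice corresponds to an 'optimal' weak learner, and replacing it with a weaker selection rule is the 'nonoptimal' setting of Rätsch and Warmuth's conjecture.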