
Fast Sparse Decision Tree Optimization via Reference Ensembles

Publication, Journal Article
McTavish, H; Zhong, C; Achermann, R; Karimalis, I; Chen, J; Rudin, C; Seltzer, M
Published in: Proceedings of the ... AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence
January 2022

Sparse decision tree optimization has been one of the most fundamental problems in AI since its inception and is a challenge at the core of interpretable machine learning. The problem is computationally hard, and despite steady effort since the 1960s, breakthroughs have come only within the past few years, primarily on finding optimal sparse decision trees. However, current state-of-the-art algorithms often require impractical amounts of computation time and memory to find optimal or near-optimal trees for some real-world datasets, particularly those with several continuous-valued features. Given that the search spaces of these optimization problems are massive, can we practically hope to find a sparse decision tree that competes in accuracy with a black box machine learning model? We address this problem via smart guessing strategies that can be applied to any branch-and-bound algorithm for optimal decision trees. The guesses come from knowledge gleaned from black box models. We show that by using these guesses, we can reduce the run time by multiple orders of magnitude while providing bounds on how far the resulting trees can deviate from the black box's accuracy and expressive power. Our approach supplies guesses about how to bin continuous features, the size of the tree, and lower bounds on the error of the optimal decision tree. Our experiments show that in many cases we can rapidly construct sparse decision trees that match the accuracy of black box models. To summarize: when you are having trouble optimizing, just guess.
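The threshold-guessing idea described in the abstract can be illustrated with a short sketch (an assumption-laden example, not the authors' released implementation): fit a small boosted-tree reference ensemble, collect the split thresholds it actually uses, and binarize the continuous features on those thresholds before handing the data to a branch-and-bound optimal-tree solver. The sketch assumes scikit-learn's GradientBoostingClassifier; the helper names guess_thresholds and binarize are illustrative only.

```python
# Illustrative sketch of threshold guessing from a reference ensemble
# (an assumed workflow, not the paper's released code).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def guess_thresholds(X, y, n_estimators=40, max_depth=1):
    """Fit a small boosted ensemble and collect the split thresholds it uses."""
    gbm = GradientBoostingClassifier(n_estimators=n_estimators,
                                     max_depth=max_depth).fit(X, y)
    thresholds = {j: set() for j in range(X.shape[1])}
    for stage in gbm.estimators_:          # each boosting stage holds regression tree(s)
        for tree in stage:
            t = tree.tree_
            for feat, thr in zip(t.feature, t.threshold):
                if feat >= 0:              # internal node; leaves are marked with -2
                    thresholds[feat].add(thr)
    return thresholds

def binarize(X, thresholds):
    """Encode each continuous feature as indicator columns X[:, j] <= threshold."""
    cols = [(X[:, j] <= thr).astype(int)
            for j, ts in thresholds.items()
            for thr in sorted(ts)]
    return np.column_stack(cols)

# Usage: X_bin = binarize(X, guess_thresholds(X, y)) yields the reduced binary
# feature set that a branch-and-bound optimal-tree solver would search over.
```

The binarized matrix typically contains far fewer candidate splits than one indicator per midpoint between sorted feature values, which is one reason such guesses can shrink the search space dramatically.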


Published In

Proceedings of the ... AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence

DOI

10.1609/aaai.v36i9.21194

EISSN

2374-3468

ISSN

2159-5399

Publication Date

January 2022

Volume

36

Issue

9

Start / End Page

9604 / 9613

Citation

APA
McTavish, H., Zhong, C., Achermann, R., Karimalis, I., Chen, J., Rudin, C., & Seltzer, M. (2022). Fast Sparse Decision Tree Optimization via Reference Ensembles. Proceedings of the ... AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence, 36(9), 9604–9613. https://doi.org/10.1609/aaai.v36i9.21194

Chicago
McTavish, Hayden, Chudi Zhong, Reto Achermann, Ilias Karimalis, Jacques Chen, Cynthia Rudin, and Margo Seltzer. “Fast Sparse Decision Tree Optimization via Reference Ensembles.” Proceedings of the ... AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence 36, no. 9 (January 2022): 9604–13. https://doi.org/10.1609/aaai.v36i9.21194.

ICMJE
McTavish H, Zhong C, Achermann R, Karimalis I, Chen J, Rudin C, et al. Fast Sparse Decision Tree Optimization via Reference Ensembles. Proceedings of the ... AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence. 2022 Jan;36(9):9604–13.

MLA
McTavish, Hayden, et al. “Fast Sparse Decision Tree Optimization via Reference Ensembles.” Proceedings of the ... AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence, vol. 36, no. 9, Jan. 2022, pp. 9604–13. Epmc, doi:10.1609/aaai.v36i9.21194.

NLM
McTavish H, Zhong C, Achermann R, Karimalis I, Chen J, Rudin C, Seltzer M. Fast Sparse Decision Tree Optimization via Reference Ensembles. Proceedings of the ... AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence. 2022 Jan;36(9):9604–9613.
