On the Existence of Simpler Machine Learning Models

Conference Publication
Semenova, L; Rudin, C; Parr, R
Published in: ACM International Conference Proceeding Series
June 21, 2022

It is almost always easier to find an accurate-but-complex model than an accurate-yet-simple model. Finding optimal, sparse, accurate models of various forms (linear models with integer coefficients, decision sets, rule lists, decision trees) is generally NP-hard. We often do not know whether the search for a simpler model will be worthwhile, and thus we do not go to the trouble of searching for one. In this work, we ask an important practical question: can accurate-yet-simple models be proven to exist, or shown likely to exist, before explicitly searching for them? We hypothesize that there is an important reason that simple-yet-accurate models often do exist. This hypothesis is that the size of the Rashomon set is often large, where the Rashomon set is the set of almost-equally-accurate models from a function class. If the Rashomon set is large, it contains numerous accurate models, and perhaps at least one of them is the simple model we desire. In this work, we formally present the Rashomon ratio as a new gauge of simplicity for a learning problem, depending on a function class and a data set. The Rashomon ratio is the ratio of the volume of the set of accurate models to the volume of the hypothesis space, and it is different from standard complexity measures from statistical learning theory. Insight from studying the Rashomon ratio provides an easy way to check whether a simpler model might exist for a problem before finding it, namely whether several different machine learning methods achieve similar performance on the data. In that sense, the Rashomon ratio is a powerful tool for understanding why and when an accurate-yet-simple model might exist. If, as we hypothesize in this work, many real-world data sets admit large Rashomon sets, the implications are vast: it means that simple or interpretable models may often be used for high-stakes decisions without losing accuracy.
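Though the paper's formal treatment is in terms of volumes over a function class, a minimal Monte Carlo sketch can illustrate the idea: sample many models from a hypothesis class, then measure the fraction whose empirical loss falls within a Rashomon parameter θ of the best sampled loss. The model class, sampling scheme, and value of θ below are illustrative assumptions, not the authors' procedure.

# Illustrative sketch (hypothetical, not from the paper): estimate the
# Rashomon ratio as the fraction of sampled models that are almost as
# accurate as the best one found.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

# Sample hypotheses from a simple class: shallow trees with random splits.
losses = []
for seed in range(1000):
    tree = DecisionTreeClassifier(max_depth=3, splitter="random",
                                  random_state=seed).fit(X, y)
    losses.append(1.0 - tree.score(X, y))  # empirical 0-1 loss
losses = np.array(losses)

theta = 0.02  # Rashomon parameter: tolerated excess loss over the best model
rashomon_ratio = np.mean(losses <= losses.min() + theta)
print(f"Estimated Rashomon ratio: {rashomon_ratio:.3f}")

A value near 1 would mean almost every sampled model ties with the best, which is the paper's signal that the Rashomon set is large and that a simple-yet-accurate model is likely to exist.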

Published In

ACM International Conference Proceeding Series

DOI

10.1145/3531146.3533232

Publication Date

June 21, 2022

Start / End Page

1827 / 1858

Citation

APA: Semenova, L., Rudin, C., & Parr, R. (2022). On the Existence of Simpler Machine Learning Models. In ACM International Conference Proceeding Series (pp. 1827–1858). https://doi.org/10.1145/3531146.3533232
Chicago: Semenova, L., C. Rudin, and R. Parr. “On the Existence of Simpler Machine Learning Models.” In ACM International Conference Proceeding Series, 1827–58, 2022. https://doi.org/10.1145/3531146.3533232.
ICMJE: Semenova L, Rudin C, Parr R. On the Existence of Simpler Machine Learning Models. In: ACM International Conference Proceeding Series. 2022. p. 1827–58.
MLA: Semenova, L., et al. “On the Existence of Simpler Machine Learning Models.” ACM International Conference Proceeding Series, 2022, pp. 1827–58. Scopus, doi:10.1145/3531146.3533232.
NLM: Semenova L, Rudin C, Parr R. On the Existence of Simpler Machine Learning Models. ACM International Conference Proceeding Series. 2022. p. 1827–1858.
