Exploring the cloud of variable importance for the set of all good models

Journal Article

Variable importance is central to scientific studies across many domains, including the social sciences, causal inference and healthcare. However, current notions of variable importance are often tied to a specific predictive model. This is problematic: what if there were multiple well-performing predictive models, and a specific variable is important to some of them but not to others? In that case, we cannot tell from a single well-performing model whether a variable is always important, sometimes important, never important, or perhaps important only when another variable is not. Ideally, we would like to explore variable importance for all approximately equally accurate predictive models within the same model class. In this way, we can understand the importance of a variable in the context of other variables, and for many good models. This work introduces the concept of a variable importance cloud, which maps every variable to its importance for every good predictive model. We show properties of the variable importance cloud and draw connections to other areas of statistics. We introduce variable importance diagrams as a two-dimensional projection of the variable importance cloud for visualization. Experiments with criminal justice, marketing and image classification tasks illustrate how dramatically variables can change in importance across approximately equally accurate predictive models.
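The idea of a variable importance cloud can be illustrated with a small sketch. The code below is not the authors' implementation; it is a hypothetical NumPy example that samples many linear classifiers, keeps the "good" ones (accuracy within a small tolerance of the best found), and records each one's permutation importance for every variable. Two correlated predictors mean either can substitute for the other, so importance varies widely across equally good models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification with two correlated predictors:
# either x1 or x2 can substitute for the other, so many distinct
# models perform almost equally well.
n = 2000
z = rng.normal(size=n)
X = np.column_stack([z + 0.3 * rng.normal(size=n),   # x1
                     z + 0.3 * rng.normal(size=n),   # x2
                     rng.normal(size=n)])            # x3: pure noise
y = (z + 0.2 * rng.normal(size=n) > 0).astype(int)

def accuracy(w):
    """Accuracy of the linear classifier sign(X @ w)."""
    return np.mean((X @ w > 0).astype(int) == y)

def permutation_importance(w):
    """Accuracy drop when each column is shuffled in turn."""
    base = accuracy(w)
    drops = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        drops.append(base - np.mean((Xp @ w > 0).astype(int) == y))
    return np.array(drops)

# Sample random unit-norm weight vectors; the "good" set contains
# every sampled model within 0.01 accuracy of the best one found.
W = rng.normal(size=(5000, 3))
W /= np.linalg.norm(W, axis=1, keepdims=True)
accs = np.array([accuracy(w) for w in W])
good = W[accs >= accs.max() - 0.01]

# The cloud: one importance vector per good model.
cloud = np.array([permutation_importance(w) for w in good])
print("good models:", len(good))
for j, name in enumerate(["x1", "x2", "x3"]):
    print(f"{name}: importance {cloud[:, j].min():+.3f} .. {cloud[:, j].max():+.3f}")
```

Projecting `cloud` onto its first two principal components would give a rough analogue of the paper's variable importance diagrams; the spread of each column already shows how a variable's importance ranges over the set of good models.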

Cited Authors

  • Dong, J; Rudin, C

Published Date

  • December 1, 2020

Published In

  • Nature Machine Intelligence

Volume / Issue

  • 2 / 12

Start / End Page

  • 810 - 824

Electronic International Standard Serial Number (EISSN)

  • 2522-5839

Digital Object Identifier (DOI)

  • 10.1038/s42256-020-00264-0

Citation Source

  • Scopus