
What you see may not be what you get: a brief, nontechnical introduction to overfitting in regression-type models.

Publication: Journal Article
Babyak, MA
Published in: Psychosom Med
2004

Statistical models, such as linear or logistic regression or survival analysis, are frequently used as a means to answer scientific questions in psychosomatic research. Many who use these techniques, however, apparently fail to appreciate fully the problem of overfitting, i.e., capitalizing on the idiosyncrasies of the sample at hand. Overfitted models will fail to replicate in future samples, thus creating considerable uncertainty about the scientific merit of the finding. The present article is a nontechnical discussion of the concept of overfitting and is intended to be accessible to readers with varying levels of statistical expertise. The notion of overfitting is presented in terms of asking too much from the available data. Given a certain number of observations in a data set, there is an upper limit to the complexity of the model that can be derived with any acceptable degree of uncertainty. Complexity arises as a function of the number of degrees of freedom expended (the number of predictors, including complex terms such as interactions and nonlinear terms) against the same data set during any stage of the data analysis. Theoretical and empirical evidence, with a special focus on the results of computer simulation studies, is presented to demonstrate the practical consequences of overfitting with respect to scientific inference. Three common practices, automated variable selection, pretesting of candidate predictors, and dichotomization of continuous variables, are shown to pose a considerable risk for spurious findings in models. The dilemma between overfitting and exploring candidate confounders is also discussed. Alternative means of guarding against overfitting are discussed, including variable aggregation and the fixing of coefficients a priori. Techniques that account and correct for complexity, including shrinkage and penalization, are also introduced.
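The abstract's central claim — that an overfitted model capitalizes on sample idiosyncrasies and fails to replicate — can be illustrated with a small simulation of the kind the article discusses. The sketch below (not taken from the article; sample sizes and seed are illustrative) fits an ordinary least-squares regression of a pure-noise outcome on 10 pure-noise predictors in a sample of 50, then scores the fitted coefficients on a fresh sample from the same null process. In-sample R² is noticeably above zero despite there being no true signal, while out-of-sample R² typically falls to zero or below.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 10

# Training sample: the outcome is pure noise, unrelated to any predictor.
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

# Fit OLS with an intercept via least squares.
X_design = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(X_design, y, rcond=None)

def r_squared(X, y, beta):
    """R^2 of predictions from fixed coefficients `beta` on data (X, y)."""
    resid = y - np.column_stack([np.ones(len(y)), X]) @ beta
    total = y - y.mean()
    return 1.0 - (resid @ resid) / (total @ total)

in_sample = r_squared(X, y, beta)

# Fresh sample from the same null process: the "finding" does not replicate.
X_new = rng.normal(size=(n, p))
y_new = rng.normal(size=n)
out_sample = r_squared(X_new, y_new, beta)

print(f"in-sample R^2:     {in_sample:.3f}")   # inflated despite zero true signal
print(f"out-of-sample R^2: {out_sample:.3f}")  # typically near or below zero
```

With 10 predictors and 50 observations, the expected in-sample R² under the null is roughly p/(n − 1) ≈ 0.2 — the "degrees of freedom expended" against the data buy apparent fit that vanishes in a new sample, which is the replication failure the article describes.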


Published In

Psychosom Med

DOI

10.1097/01.psy.0000127692.23278.a9

EISSN

1534-7796

Publication Date

2004

Volume

66

Issue

3

Start / End Page

411 / 421

Location

United States

Related Subject Headings

  • Statistics as Topic
  • Research Design
  • Regression Analysis
  • Psychosomatic Medicine
  • Psychophysiology
  • Psychiatry
  • Models, Statistical
  • Humans
  • Data Interpretation, Statistical
  • Computer Simulation

Citation

APA: Babyak, M. A. (2004). What you see may not be what you get: a brief, nontechnical introduction to overfitting in regression-type models. Psychosom Med, 66(3), 411–421. https://doi.org/10.1097/01.psy.0000127692.23278.a9
Chicago: Babyak, Michael A. “What you see may not be what you get: a brief, nontechnical introduction to overfitting in regression-type models.” Psychosom Med 66, no. 3 (2004): 411–21. https://doi.org/10.1097/01.psy.0000127692.23278.a9.
MLA: Babyak, Michael A. “What you see may not be what you get: a brief, nontechnical introduction to overfitting in regression-type models.” Psychosom Med, vol. 66, no. 3, 2004, pp. 411–21. Pubmed, doi:10.1097/01.psy.0000127692.23278.a9.
