
On Statistical Efficiency in Learning

Publication, Journal Article
Ding, J; Diao, E; Zhou, J; Tarokh, V
Published in: IEEE Transactions on Information Theory
April 1, 2021

A central issue in many statistical learning problems is to select an appropriate model from a set of candidate models. For a given dataset, large models tend to inflate the variance (overfitting), while small models tend to introduce bias (underfitting). In this work, we address the challenge of model selection to strike a balance between model fit and model complexity, thus gaining reliable predictive power. We consider the task of approaching the theoretical limit of statistical learning, meaning that the selected model performs as well as the best possible model within a class of potentially misspecified candidate models. We propose a generalized notion of Takeuchi's information criterion and prove that the proposed method can asymptotically achieve the optimal out-of-sample prediction loss under reasonable assumptions. To the best of our knowledge, this is the first proof of this asymptotic property of Takeuchi's information criterion. Our proof applies to a wide variety of nonlinear models, loss functions, and high-dimensional settings (in the sense that model complexity can grow with the sample size). The proposed method can be used as a computationally efficient surrogate for leave-one-out cross-validation. Moreover, for modeling streaming data, we propose an online algorithm that sequentially expands the model complexity to enhance selection stability and reduce computation cost. Experimental studies show that the proposed method has desirable predictive power and significantly lower computational cost than some popular methods.
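For context, the criterion the paper generalizes is Takeuchi's information criterion (TIC), which penalizes the fitted negative log-likelihood by the trace of the inverse Hessian times the score covariance, and which reduces to AIC when the model is correctly specified. The following is a minimal NumPy sketch of the classical TIC only, not the paper's generalized criterion or its online algorithm; the function name `takeuchi_ic` and the toy Gaussian demo are illustrative assumptions.

```python
import numpy as np

def takeuchi_ic(neg_log_lik, score_per_sample, hessian):
    """Classical Takeuchi information criterion (TIC) for one fitted model.

    TIC = 2 * neg_log_lik + 2 * tr(J^{-1} K), where
      J = average Hessian of the per-sample negative log-likelihood at the MLE,
      K = average outer product of the per-sample score vectors at the MLE.
    Under a correctly specified model, tr(J^{-1} K) approaches the number of
    parameters, and TIC coincides with AIC.
    """
    n = score_per_sample.shape[0]
    K = score_per_sample.T @ score_per_sample / n   # (p, p) empirical score covariance
    penalty = np.trace(np.linalg.solve(hessian, K)) # tr(J^{-1} K)
    return 2.0 * neg_log_lik + 2.0 * penalty

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(loc=1.0, scale=2.0, size=200)

    # Toy candidate model: Gaussian with unknown mean and variance fixed at 1
    # (deliberately misspecified, so the TIC penalty exceeds the parameter count).
    mu_hat = x.mean()
    sigma2 = 1.0
    nll = 0.5 * np.sum((x - mu_hat) ** 2 / sigma2 + np.log(2 * np.pi * sigma2))
    scores = ((x - mu_hat) / sigma2)[:, None]   # per-sample score w.r.t. mu, shape (n, 1)
    hess = np.array([[1.0 / sigma2]])           # average Hessian of per-sample NLL
    print("TIC:", takeuchi_ic(nll, scores, hess))
```

In a model-selection loop, one would compute this value for each candidate model and keep the candidate with the smallest TIC, analogously to choosing the model with the smallest leave-one-out cross-validation loss.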


Published In

IEEE Transactions on Information Theory

DOI

10.1109/TIT.2020.3047620
EISSN

1557-9654

ISSN

0018-9448

Publication Date

April 1, 2021

Volume

67

Issue

4

Start / End Page

2488 / 2506

Related Subject Headings

  • Networking & Telecommunications
  • 4613 Theory of computation
  • 4006 Communications engineering
  • 1005 Communications Technologies
  • 0906 Electrical and Electronic Engineering
  • 0801 Artificial Intelligence and Image Processing
 

Citation

APA
Ding, J., Diao, E., Zhou, J., & Tarokh, V. (2021). On Statistical Efficiency in Learning. IEEE Transactions on Information Theory, 67(4), 2488–2506. https://doi.org/10.1109/TIT.2020.3047620

Chicago
Ding, J., E. Diao, J. Zhou, and V. Tarokh. “On Statistical Efficiency in Learning.” IEEE Transactions on Information Theory 67, no. 4 (April 1, 2021): 2488–2506. https://doi.org/10.1109/TIT.2020.3047620.

ICMJE
Ding J, Diao E, Zhou J, Tarokh V. On Statistical Efficiency in Learning. IEEE Transactions on Information Theory. 2021 Apr 1;67(4):2488–506.

MLA
Ding, J., et al. “On Statistical Efficiency in Learning.” IEEE Transactions on Information Theory, vol. 67, no. 4, Apr. 2021, pp. 2488–506. Scopus, doi:10.1109/TIT.2020.3047620.

NLM
Ding J, Diao E, Zhou J, Tarokh V. On Statistical Efficiency in Learning. IEEE Transactions on Information Theory. 2021 Apr 1;67(4):2488–2506.
