Rate-distortion bounds on Bayes risk in supervised learning

Publication, Conference
Nokleby, M; Beirami, A; Calderbank, R
Published in: IEEE International Symposium on Information Theory - Proceedings
August 10, 2016

An information-theoretic framework is presented for estimating the number of labeled samples needed to train a classifier in a parametric Bayesian setting. Ideas from rate-distortion theory are used to derive bounds for the average L1 or L∞ distance between the learned classifier and the true maximum a posteriori classifier in terms of familiar information-theoretic quantities and the number of training samples available. The maximum a posteriori classifier is viewed as a random source, labeled training data are viewed as a finite-rate encoding of the source, and the L1 or L∞ Bayes risk is viewed as the average distortion. The result is a framework dual to the well-known probably approximately correct (PAC) framework. PAC bounds characterize worst-case learning performance of a family of classifiers whose complexity is captured by the Vapnik-Chervonenkis (VC) dimension. The rate-distortion framework, on the other hand, characterizes the average-case performance of a family of data distributions in terms of a quantity called the interpolation dimension, which represents the complexity of the family of data distributions. The resulting bounds do not suffer from the pessimism typical of the PAC framework, particularly when the training set is small.
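The framing above — the true MAP classifier as a random source, the training set as a finite-rate encoding, and the L1 distance between learned and true posteriors as distortion — can be illustrated with a small Monte Carlo sketch. This is an illustrative toy model, not the paper's construction: it assumes a uniform class prior, class-conditional Bernoulli(θ_y) likelihoods with θ_y drawn uniformly, and a Laplace-smoothed plug-in learner; all function names are hypothetical.

```python
import random

def posterior_p1(x, t0, t1):
    """P(Y=1 | X=x) under a uniform class prior and Bernoulli(theta_y) likelihoods."""
    l1 = t1 if x == 1 else 1 - t1
    l0 = t0 if x == 1 else 1 - t0
    return l1 / (l0 + l1)

def avg_l1_distortion(n, trials=2000, seed=0):
    """Average L1 distance between the true MAP posterior and a plug-in
    posterior learned from n labeled samples, averaged over the prior."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Draw the "source": class-conditional parameters theta_y ~ Uniform(0, 1).
        t0, t1 = rng.random(), rng.random()
        # Draw n labeled samples: Y uniform on {0,1}, X | Y=y ~ Bernoulli(theta_y).
        counts = {0: [0, 0], 1: [0, 0]}  # counts[y] = [num samples, num with x=1]
        for _ in range(n):
            y = rng.randrange(2)
            x = 1 if rng.random() < (t1 if y else t0) else 0
            counts[y][0] += 1
            counts[y][1] += x
        # Laplace-smoothed (posterior-mean) plug-in estimates of theta_y.
        e0 = (counts[0][1] + 1) / (counts[0][0] + 2)
        e1 = (counts[1][1] + 1) / (counts[1][0] + 2)
        # L1 distortion: E_X |eta(X) - eta_hat(X)| under the marginal of X.
        d = 0.0
        for x in (0, 1):
            px = 0.5 * ((t1 if x else 1 - t1) + (t0 if x else 1 - t0))
            d += px * abs(posterior_p1(x, t0, t1) - posterior_p1(x, e0, e1))
        total += d
    return total / trials
```

In this toy model the average distortion shrinks as the "encoding rate" (the number of labeled samples n) grows, mirroring the rate-distortion trade-off the abstract describes: a classifier learned from a larger training set sits closer, on average over the prior, to the true MAP classifier.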


Published In

IEEE International Symposium on Information Theory - Proceedings

DOI

10.1109/ISIT.2016.7541669

ISSN

2157-8095

Publication Date

August 10, 2016

Volume

2016-August

Start / End Page

2099 / 2103

Citation

APA: Nokleby, M., Beirami, A., & Calderbank, R. (2016). Rate-distortion bounds on Bayes risk in supervised learning. In IEEE International Symposium on Information Theory - Proceedings (Vol. 2016-August, pp. 2099–2103). https://doi.org/10.1109/ISIT.2016.7541669

Chicago: Nokleby, M., A. Beirami, and R. Calderbank. “Rate-distortion bounds on Bayes risk in supervised learning.” In IEEE International Symposium on Information Theory - Proceedings, 2016-August:2099–2103, 2016. https://doi.org/10.1109/ISIT.2016.7541669.

ICMJE: Nokleby M, Beirami A, Calderbank R. Rate-distortion bounds on Bayes risk in supervised learning. In: IEEE International Symposium on Information Theory - Proceedings. 2016. p. 2099–103.

MLA: Nokleby, M., et al. “Rate-distortion bounds on Bayes risk in supervised learning.” IEEE International Symposium on Information Theory - Proceedings, vol. 2016-August, 2016, pp. 2099–103. Scopus, doi:10.1109/ISIT.2016.7541669.

NLM: Nokleby M, Beirami A, Calderbank R. Rate-distortion bounds on Bayes risk in supervised learning. IEEE International Symposium on Information Theory - Proceedings. 2016. p. 2099–2103.
