CLUB: A contrastive log-ratio upper bound of mutual information

Conference Publication
Cheng, P; Hao, W; Dai, S; Liu, J; Gan, Z; Carin, L
Published in: 37th International Conference on Machine Learning, ICML 2020
January 1, 2020

Mutual information (MI) minimization has gained considerable interest in various machine learning tasks. However, estimating and minimizing MI in high-dimensional spaces remains a challenging problem, especially when only samples, rather than distribution forms, are accessible. Previous works mainly focus on MI lower-bound approximation, which is not applicable to MI minimization problems. In this paper, we propose a novel Contrastive Log-ratio Upper Bound (CLUB) of mutual information. We provide a theoretical analysis of the properties of CLUB and its variational approximation. Based on this upper bound, we introduce an MI minimization training scheme and further accelerate it with a negative sampling strategy. Simulation studies on Gaussian distributions show the reliable estimation ability of CLUB. Real-world MI minimization experiments, including domain adaptation and information bottleneck, demonstrate the effectiveness of the proposed method. The code is available at https://github.com/Linear95/CLUB.
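The abstract describes an upper bound built from a contrastive log-ratio, approximated variationally when the conditional distribution is unknown and accelerated with negative sampling. The following is a minimal, illustrative sketch of such an estimator, not the authors' implementation (see the linked repository for that): the diagonal-Gaussian network for q(y|x), the layer sizes, and the in-batch shuffling used as a simple negative-sampling stand-in are all assumptions made here for illustration.

```python
import torch
import torch.nn as nn


class CLUBEstimator(nn.Module):
    """Sketch of a CLUB-style variational MI upper bound.

    A diagonal-Gaussian network approximates q(y|x); the bound is estimated as
        E_{p(x,y)}[log q(y|x)] - E_{p(x)p(y)}[log q(y|x)],
    with the second expectation approximated by shuffling y within the batch
    (a simple stand-in for negative sampling).
    """

    def __init__(self, x_dim: int, y_dim: int, hidden: int = 64):
        super().__init__()
        self.mu = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.ReLU(), nn.Linear(hidden, y_dim))
        self.logvar = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.ReLU(), nn.Linear(hidden, y_dim))

    def log_q(self, x, y):
        # Gaussian log-density of y given x, summed over dimensions
        # (the constant -0.5*log(2*pi) is omitted; it cancels in the bound).
        mu, logvar = self.mu(x), self.logvar(x)
        return (-0.5 * (y - mu) ** 2 / logvar.exp() - 0.5 * logvar).sum(dim=-1)

    def learning_loss(self, x, y):
        # Fit q(y|x) to paired samples by maximum likelihood.
        return -self.log_q(x, y).mean()

    def mi_upper_bound(self, x, y):
        # Positive term uses paired samples; negative term uses shuffled pairs.
        positive = self.log_q(x, y)
        negative = self.log_q(x, y[torch.randperm(y.size(0))])
        return (positive - negative).mean()
```

In an MI minimization setting one would typically alternate between minimizing learning_loss, to keep q(y|x) fitted to the current samples, and adding mi_upper_bound as a penalty term in the main model's objective.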


Published In

37th International Conference on Machine Learning, ICML 2020

Publication Date

January 1, 2020

Volume

PartF168147-3

Start / End Page

1757 / 1766
 

Citation

APA: Cheng, P., Hao, W., Dai, S., Liu, J., Gan, Z., & Carin, L. (2020). CLUB: A contrastive log-ratio upper bound of mutual information. In 37th International Conference on Machine Learning, ICML 2020 (Vol. PartF168147-3, pp. 1757–1766).
Chicago: Cheng, P., W. Hao, S. Dai, J. Liu, Z. Gan, and L. Carin. “CLUB: A contrastive log-ratio upper bound of mutual information.” In 37th International Conference on Machine Learning, ICML 2020, PartF168147-3:1757–66, 2020.
ICMJE: Cheng P, Hao W, Dai S, Liu J, Gan Z, Carin L. CLUB: A contrastive log-ratio upper bound of mutual information. In: 37th International Conference on Machine Learning, ICML 2020. 2020. p. 1757–66.
MLA: Cheng, P., et al. “CLUB: A contrastive log-ratio upper bound of mutual information.” 37th International Conference on Machine Learning, ICML 2020, vol. PartF168147-3, 2020, pp. 1757–66.
NLM: Cheng P, Hao W, Dai S, Liu J, Gan Z, Carin L. CLUB: A contrastive log-ratio upper bound of mutual information. 37th International Conference on Machine Learning, ICML 2020. 2020. p. 1757–1766.
