Demystifying dropout

Publication, Conference
Gao, H; Pei, J; Huang, H
Published in: 36th International Conference on Machine Learning, ICML 2019
January 1, 2019

Dropout is a popular technique for training large-scale deep neural networks that alleviates overfitting. Numerous works have tried to explain the source of its gains from different perspectives. In this paper, unlike existing works, we explore dropout from a new perspective to provide fresh insight into this line of research. Specifically, we disentangle the forward and backward passes of dropout. We find that these two passes need different levels of noise to improve the generalization performance of deep neural networks. Based on this observation, we propose augmented dropout, which employs different dropping strategies in the forward and backward passes, to improve on standard dropout. Experimental results verify the effectiveness of the proposed method.
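The abstract's key idea — decoupling the noise applied in the forward pass from the noise applied to gradients in the backward pass — can be sketched as follows. This is a minimal illustrative toy, not the authors' published algorithm: the function name, the specific drop rates, and the use of independent inverted-dropout masks per pass are all assumptions made here for illustration.

```python
import numpy as np

def augmented_dropout(x, grad_out, p_forward=0.5, p_backward=0.3, rng=None):
    """Toy sketch of dropout with decoupled forward/backward noise.

    Standard inverted dropout reuses the SAME mask for the forward
    activation and the backward gradient. The paper's observation is
    that the two passes benefit from different noise levels, so this
    sketch draws independent masks with different drop rates.
    (Rates and names here are illustrative assumptions.)
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Inverted-dropout masks: 0 with prob p, else 1/(1-p) to keep
    # the expected activation/gradient magnitude unchanged.
    keep_f = (rng.random(x.shape) >= p_forward) / (1.0 - p_forward)
    keep_b = (rng.random(x.shape) >= p_backward) / (1.0 - p_backward)
    y = x * keep_f            # forward pass: one noise level
    grad_in = grad_out * keep_b  # backward pass: a different noise level
    return y, grad_in

x = np.ones((4, 8))
y, grad = augmented_dropout(x, np.ones_like(x))
```

With the default rates, forward activations are dropped more aggressively (p = 0.5) than gradients (p = 0.3); in a real training loop the backward mask would be applied inside a custom autograd function rather than to a precomputed gradient array.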


Published In

36th International Conference on Machine Learning, ICML 2019

ISBN

9781510886988

Publication Date

January 1, 2019

Volume

2019-June

Start / End Page

3692 / 3701

Citation

APA: Gao, H., Pei, J., & Huang, H. (2019). Demystifying dropout. In 36th International Conference on Machine Learning, ICML 2019 (Vol. 2019-June, pp. 3692–3701).

Chicago: Gao, H., J. Pei, and H. Huang. “Demystifying dropout.” In 36th International Conference on Machine Learning, ICML 2019, 2019-June:3692–3701, 2019.

ICMJE: Gao H, Pei J, Huang H. Demystifying dropout. In: 36th International Conference on Machine Learning, ICML 2019. 2019. p. 3692–701.

MLA: Gao, H., et al. “Demystifying dropout.” 36th International Conference on Machine Learning, ICML 2019, vol. 2019-June, 2019, pp. 3692–701.

NLM: Gao H, Pei J, Huang H. Demystifying dropout. 36th International Conference on Machine Learning, ICML 2019. 2019. p. 3692–3701.
