DEPTH SEPARATION WITH MULTILAYER MEAN-FIELD NETWORKS

Publication, Conference
Ren, Y; Zhou, M; Ge, R
Published in: 11th International Conference on Learning Representations, ICLR 2023
January 1, 2023

Depth separation, the question of why a deeper network is more powerful than a shallower one, has been a major problem in deep learning theory. Previous results often focus on representation power. For example, Safran et al. (2019) constructed a function that is easy to approximate using a 3-layer network but not approximable by any 2-layer network. In this paper, we show that this separation is in fact algorithmic: one can efficiently learn the function constructed by Safran et al. (2019) using an overparameterized network with polynomially many neurons. Our result relies on a new way of extending the mean-field limit to multilayer networks, and on a decomposition of the loss that factors out the error introduced by discretizing the infinite-width mean-field network.
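As a rough illustrative sketch only (not the authors' algorithm, their loss decomposition, or the exact target function of Safran et al. (2019)), the snippet below trains a wide 3-layer ReLU network with the 1/m output averaging typical of mean-field parameterizations, using plain gradient descent on a simple radial target. The width, the target exp(-||x||^2), and all hyperparameters are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)
d, m = 10, 200                 # input dimension, hidden width (overparameterized)
n, steps = 500, 400            # number of samples, gradient-descent steps
lr = 0.5 * m                   # learning rate scaled with width, matching the 1/m output scaling

# Inputs scaled so ||x|| concentrates around 1; the target is radial (depends only on ||x||).
X = rng.normal(size=(n, d)) / np.sqrt(d)
y = np.exp(-np.linalg.norm(X, axis=1) ** 2)

# 3-layer ReLU network: x -> relu(W1 x) -> relu(W2 h1) -> (1/m) a^T h2
W1 = rng.normal(size=(m, d)) / np.sqrt(d)
W2 = rng.normal(size=(m, m)) / np.sqrt(m)
a = rng.normal(size=m)

relu = lambda z: np.maximum(z, 0.0)

for t in range(steps):
    # Forward pass.
    H1 = relu(X @ W1.T)               # (n, m)
    H2 = relu(H1 @ W2.T)              # (n, m)
    pred = H2 @ a / m                 # mean-field style 1/m output averaging
    err = pred - y

    # Backward pass for the loss L = (1/2n) * sum(err^2), written out by hand.
    g_out = err / n                           # dL/d(pred)
    g_a = H2.T @ g_out / m                    # dL/da
    g_Z2 = np.outer(g_out, a) / m * (H2 > 0)  # dL/d(pre-activations of layer 2)
    g_W2 = g_Z2.T @ H1                        # dL/dW2
    g_Z1 = (g_Z2 @ W2) * (H1 > 0)             # dL/d(pre-activations of layer 1)
    g_W1 = g_Z1.T @ X                         # dL/dW1

    # Plain gradient descent on all layers.
    a -= lr * g_a
    W2 -= lr * g_W2
    W1 -= lr * g_W1

    if t % 100 == 0:
        print(f"step {t:3d}  mse {np.mean(err ** 2):.4f}")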

Published In

11th International Conference on Learning Representations, ICLR 2023

Publication Date

January 1, 2023
 

Citation

APA: Ren, Y., Zhou, M., & Ge, R. (2023). DEPTH SEPARATION WITH MULTILAYER MEAN-FIELD NETWORKS. In 11th International Conference on Learning Representations, ICLR 2023.
Chicago: Ren, Y., M. Zhou, and R. Ge. “DEPTH SEPARATION WITH MULTILAYER MEAN-FIELD NETWORKS.” In 11th International Conference on Learning Representations, ICLR 2023, 2023.
ICMJE: Ren Y, Zhou M, Ge R. DEPTH SEPARATION WITH MULTILAYER MEAN-FIELD NETWORKS. In: 11th International Conference on Learning Representations, ICLR 2023. 2023.
MLA: Ren, Y., et al. “DEPTH SEPARATION WITH MULTILAYER MEAN-FIELD NETWORKS.” 11th International Conference on Learning Representations, ICLR 2023, 2023.
NLM: Ren Y, Zhou M, Ge R. DEPTH SEPARATION WITH MULTILAYER MEAN-FIELD NETWORKS. 11th International Conference on Learning Representations, ICLR 2023. 2023.
