
Neural network approximation: Three hidden layers are enough.

Journal Article
Shen, Z; Yang, H; Zhang, S
Published in: Neural networks : the official journal of the International Neural Network Society
September 2021

A three-hidden-layer neural network with super approximation power is introduced. This network is built with the floor function (⌊x⌋), the exponential function (2^x), the step function (1_{x≥0}), or their compositions as the activation function in each neuron; hence we call such networks Floor-Exponential-Step (FLES) networks. For any width hyper-parameter N ∈ ℕ⁺, it is shown that FLES networks with width max{d, N} and three hidden layers can uniformly approximate a Hölder continuous function f on [0,1]^d with an exponential approximation rate 3λ(2√d)^α 2^{−αN}, where α ∈ (0,1] and λ > 0 are the Hölder order and constant, respectively. More generally, for an arbitrary continuous function f on [0,1]^d with a modulus of continuity ω_f(⋅), the constructive approximation rate is 2ω_f(2√d)2^{−N} + ω_f(2√d·2^{−N}). Moreover, we extend such a result to general bounded continuous functions on a bounded set E ⊆ ℝ^d. As a consequence, this new class of networks overcomes the curse of dimensionality in approximation power when the variation of ω_f(r) as r → 0 is moderate (e.g., ω_f(r) ≲ r^α for Hölder continuous functions), since the major term of concern in our approximation rate is essentially √d times a function of N independent of d within the modulus of continuity. Finally, we extend our analysis to derive similar approximation results in the L^p-norm for p ∈ [1, ∞) by replacing Floor-Exponential-Step activation functions with continuous activation functions.
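The three FLES activations and the stated Hölder approximation rate can be sketched as follows (a minimal illustration for the reader, not the authors' construction; the function names are ours):

```python
import numpy as np

# The three activation functions used in FLES networks, applied elementwise.
def floor_act(x):
    return np.floor(x)              # floor function ⌊x⌋

def exp_act(x):
    return np.power(2.0, x)         # exponential function 2^x

def step_act(x):
    return (x >= 0).astype(float)   # step function 1_{x >= 0}

# The stated uniform approximation rate for a Hölder continuous function
# (order alpha, constant lam) on [0,1]^d: 3 * lam * (2*sqrt(d))^alpha * 2^(-alpha*N).
# Note that N enters only through 2^(-alpha*N), independently of d.
def holder_rate(d, N, alpha=1.0, lam=1.0):
    return 3 * lam * (2 * np.sqrt(d)) ** alpha * 2.0 ** (-alpha * N)

print(holder_rate(d=10, N=20))  # decays exponentially as N grows
```

The exponential decay in N (with d appearing only through the mild factor (2√d)^α) is what lets the rate sidestep the curse of dimensionality.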


Published In

Neural networks : the official journal of the International Neural Network Society

DOI

10.1016/j.neunet.2021.04.011
EISSN

1879-2782

ISSN

0893-6080

Publication Date

September 2021

Volume

141

Start / End Page

160 / 173

Related Subject Headings

  • Neural Networks, Computer
  • Deep Learning
  • Artificial Intelligence & Image Processing
  • 4905 Statistics
  • 4611 Machine learning
  • 4602 Artificial intelligence
 

Citation

APA
Chicago
ICMJE
MLA
NLM
Shen, Z., Yang, H., & Zhang, S. (2021). Neural network approximation: Three hidden layers are enough. Neural Networks : The Official Journal of the International Neural Network Society, 141, 160–173. https://doi.org/10.1016/j.neunet.2021.04.011
Shen, Zuowei, Haizhao Yang, and Shijun Zhang. “Neural network approximation: Three hidden layers are enough.” Neural Networks : The Official Journal of the International Neural Network Society 141 (September 2021): 160–73. https://doi.org/10.1016/j.neunet.2021.04.011.
Shen Z, Yang H, Zhang S. Neural network approximation: Three hidden layers are enough. Neural networks : the official journal of the International Neural Network Society. 2021 Sep;141:160–73.
Shen, Zuowei, et al. “Neural network approximation: Three hidden layers are enough.” Neural Networks : The Official Journal of the International Neural Network Society, vol. 141, Sept. 2021, pp. 160–73. Epmc, doi:10.1016/j.neunet.2021.04.011.
Shen Z, Yang H, Zhang S. Neural network approximation: Three hidden layers are enough. Neural networks : the official journal of the International Neural Network Society. 2021 Sep;141:160–173.