
Optimal approximation rate of ReLU networks in terms of width and depth

Journal Article
Shen, Z; Yang, H; Zhang, S
Published in: Journal des Mathematiques Pures et Appliquees
January 1, 2022

This paper concentrates on the approximation power of deep feed-forward neural networks in terms of width and depth. It is proved by construction that ReLU networks with width $\mathcal{O}(\max\{d\lfloor N^{1/d}\rfloor,\, N+2\})$ and depth $\mathcal{O}(L)$ can approximate a Hölder continuous function on $[0,1]^d$ with an approximation rate $\mathcal{O}\big(\lambda\sqrt{d}\,(N^2L^2\ln N)^{-\alpha/d}\big)$, where $\alpha\in(0,1]$ and $\lambda>0$ are the Hölder order and constant, respectively. Such a rate is optimal up to a constant in terms of width and depth separately, while existing results are only nearly optimal without the logarithmic factor in the approximation rate. More generally, for an arbitrary continuous function $f$ on $[0,1]^d$, the approximation rate becomes $\mathcal{O}\big(\sqrt{d}\,\omega_f((N^2L^2\ln N)^{-1/d})\big)$, where $\omega_f(\cdot)$ is the modulus of continuity of $f$. We also extend our analysis to any continuous function $f$ on a bounded set. In particular, if ReLU networks with depth $31$ and width $\mathcal{O}(N)$ are used to approximate one-dimensional Lipschitz continuous functions on $[0,1]$ with a Lipschitz constant $\lambda>0$, the approximation rate in terms of the total number of parameters, $W=\mathcal{O}(N^2)$, becomes $\mathcal{O}\big(\lambda(W\ln W)^{-1}\big)$, which has not been discovered in the literature for fixed-depth ReLU networks.
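As a quick sanity check (our derivation, not text from the paper), the fixed-depth rate follows from the general Hölder rate by specializing to $d=1$, $\alpha=1$, and constant depth $L$:

\[
\mathcal{O}\!\big(\lambda\sqrt{d}\,(N^2L^2\ln N)^{-\alpha/d}\big)\Big|_{d=1,\;\alpha=1,\;L=\mathcal{O}(1)}
= \mathcal{O}\!\big(\lambda\,(N^2\ln N)^{-1}\big),
\]

and since $W=\mathcal{O}(N^2)$ implies $N^2\ln N = \Theta(W\ln W)$, this is $\mathcal{O}\big(\lambda(W\ln W)^{-1}\big)$, matching the parameter-count rate stated above.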

Published In

Journal des Mathematiques Pures et Appliquees

DOI

10.1016/j.matpur.2021.07.009

ISSN

0021-7824

Publication Date

January 1, 2022

Volume

157

Start / End Page

101 / 135

Related Subject Headings

  • General Mathematics
  • 4901 Applied mathematics
  • 0102 Applied Mathematics
  • 0101 Pure Mathematics
 

Citation

Shen, Z., Yang, H., & Zhang, S. (2022). Optimal approximation rate of ReLU networks in terms of width and depth. Journal Des Mathematiques Pures et Appliquees, 157, 101–135. https://doi.org/10.1016/j.matpur.2021.07.009