DEEP NETWORK APPROXIMATION FOR SMOOTH FUNCTIONS
Abstract. This paper establishes the optimal approximation error characterization of deep rectified linear unit (ReLU) networks for smooth functions in terms of both width and depth simultaneously. To that end, we first prove that multivariate polynomials can be approximated by deep ReLU networks of width $\mathcal{O}(N)$ and depth $\mathcal{O}(L)$ with an approximation error $\mathcal{O}(N^{-L})$. Through local Taylor expansions and their deep ReLU network approximations, we show that deep ReLU networks of width $\mathcal{O}(N\ln N)$ and depth $\mathcal{O}(L\ln L)$ can approximate $f\in C^s([0,1]^d)$ with a nearly optimal approximation error $\mathcal{O}(\|f\|_{C^s([0,1]^d)}\, N^{-2s/d} L^{-2s/d})$. Our estimate is nonasymptotic in the sense that it is valid for arbitrary width and depth specified by $N\in\mathbb{N}^+$ and $L\in\mathbb{N}^+$, respectively.
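The polynomial-approximation step rests on a standard building block: a deep ReLU network can approximate $x^2$ on $[0,1]$ with error decaying exponentially in depth, by subtracting rescaled compositions of a piecewise-linear "hat" function from the identity. The sketch below illustrates that classical construction in NumPy; it is a minimal illustration of the idea, not the paper's exact network, and the function names (`hat`, `relu_square`) are ours.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # One "tooth" built from three ReLU units:
    # g(x) = 2x on [0, 1/2] and 2 - 2x on [1/2, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def relu_square(x, m):
    # Depth-m ReLU approximation of x**2 on [0, 1]:
    #   x**2 ~ x - sum_{s=1}^{m} g_s(x) / 4**s,
    # where g_s is the s-fold composition of the hat function.
    # Each extra layer shrinks the uniform error by a factor of 4.
    out = x.copy()
    g = x.copy()
    for s in range(1, m + 1):
        g = hat(g)
        out -= g / 4.0**s
    return out

# Uniform error on a fine grid; the theory gives a bound of 4**-(m+1).
x = np.linspace(0.0, 1.0, 10001)
m = 6
err = np.max(np.abs(relu_square(x, m) - x**2))
```

Once $x^2$ is available, products follow from $xy = \tfrac12\big((x+y)^2 - x^2 - y^2\big)$, and monomials and polynomials follow from repeated products, which is how depth translates into the $\mathcal{O}(N^{-L})$ polynomial error rate.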