
Neural Network Approximation of Refinable Functions

Publication: Journal Article
Daubechies, I; Devore, R; Dym, N; Faigenbaum-Golovin, S; Kovalsky, SZ; Lin, KC; Park, J; Petrova, G; Sober, B
Published in: IEEE Transactions on Information Theory
January 1, 2023

In the effort to quantify the success of neural networks in deep learning and other applications, there is great interest in understanding which functions are efficiently approximated by the outputs of neural networks. By now, a variety of results show that a wide range of functions can be approximated, sometimes with surprising accuracy, by these outputs. For example, it is known that the set of functions that can be approximated with exponential accuracy (in terms of the number of parameters used) includes, on one hand, very smooth functions such as polynomials and analytic functions and, on the other hand, very rough functions such as the Weierstrass function, which is nowhere differentiable. In this paper, we add to the latter class of rough functions by showing that it also includes refinable functions. Namely, we show that refinable functions are approximated by the outputs of deep ReLU neural networks with a fixed width and increasing depth, with accuracy exponential in the number of network parameters. Our results apply to functions used in the standard construction of wavelets as well as to functions constructed via subdivision algorithms in Computer Aided Geometric Design.
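The refinable functions discussed in the abstract are solutions of a refinement equation φ(x) = Σ_k c_k φ(2x − k), and the subdivision algorithms mentioned at the end compute them iteratively. As a minimal illustration of that idea (this sketch is not code from the paper; the hat-function mask [1/2, 1, 1/2] is a standard textbook example), the subdivision operator applied to a delta sequence converges to samples of the refinable function on a dyadic grid:

```python
import numpy as np

# Refinement mask of the piecewise-linear "hat" B-spline, which satisfies
# phi(x) = 0.5*phi(2x) + 1.0*phi(2x - 1) + 0.5*phi(2x - 2).
MASK = np.array([0.5, 1.0, 0.5])

def subdivide(values, mask):
    """One subdivision step: (S a)_i = sum_k a_k * mask_(i - 2k)."""
    out = np.zeros(2 * len(values) + len(mask) - 2)
    for k, v in enumerate(values):
        out[2 * k : 2 * k + len(mask)] += v * mask
    return out

# Starting from a delta sequence, repeated subdivision produces samples of
# the refinable function on the dyadic grid with spacing 2^(-n).
vals = np.array([1.0])
for _ in range(3):
    vals = subdivide(vals, MASK)

# After 3 steps these are the hat function's values at spacing 1/8 on its
# support [0, 2]: a linear ramp up to 1 and symmetrically back down.
print(vals)
```

For this mask the scheme reproduces the hat function exactly at the dyadic points; for general masks, the paper's point is that the limit function, though typically very rough, is still approximated with exponential accuracy by fixed-width deep ReLU networks.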


Published In

IEEE Transactions on Information Theory

DOI

10.1109/TIT.2022.3199601

EISSN

1557-9654

ISSN

0018-9448

Publication Date

January 1, 2023

Volume

69

Issue

1

Start / End Page

482 / 495

Related Subject Headings

  • Networking & Telecommunications
  • 4613 Theory of computation
  • 4006 Communications engineering
  • 1005 Communications Technologies
  • 0906 Electrical and Electronic Engineering
  • 0801 Artificial Intelligence and Image Processing
 

Citation

APA: Daubechies, I., Devore, R., Dym, N., Faigenbaum-Golovin, S., Kovalsky, S. Z., Lin, K. C., … Sober, B. (2023). Neural Network Approximation of Refinable Functions. IEEE Transactions on Information Theory, 69(1), 482–495. https://doi.org/10.1109/TIT.2022.3199601

Chicago: Daubechies, I., R. Devore, N. Dym, S. Faigenbaum-Golovin, S. Z. Kovalsky, K. C. Lin, J. Park, G. Petrova, and B. Sober. “Neural Network Approximation of Refinable Functions.” IEEE Transactions on Information Theory 69, no. 1 (January 1, 2023): 482–95. https://doi.org/10.1109/TIT.2022.3199601.

ICMJE: Daubechies I, Devore R, Dym N, Faigenbaum-Golovin S, Kovalsky SZ, Lin KC, et al. Neural Network Approximation of Refinable Functions. IEEE Transactions on Information Theory. 2023 Jan 1;69(1):482–95.

MLA: Daubechies, I., et al. “Neural Network Approximation of Refinable Functions.” IEEE Transactions on Information Theory, vol. 69, no. 1, Jan. 2023, pp. 482–95. Scopus, doi:10.1109/TIT.2022.3199601.

NLM: Daubechies I, Devore R, Dym N, Faigenbaum-Golovin S, Kovalsky SZ, Lin KC, Park J, Petrova G, Sober B. Neural Network Approximation of Refinable Functions. IEEE Transactions on Information Theory. 2023 Jan 1;69(1):482–495.
