Deep Network Approximation: Achieving Arbitrary Accuracy with Fixed Number of Neurons

Publication: Journal Article
Shen, Z; Yang, H; Zhang, S
Published in: Journal of Machine Learning Research
September 1, 2022

This paper develops simple feed-forward neural networks that achieve the universal approximation property for all continuous functions with a fixed finite number of neurons. These networks are simple because they use a simple, computable, and continuous activation function σ built from a triangular-wave function and the softsign function. We first prove that σ-activated networks with width 36d(2d + 1) and depth 11 can approximate any continuous function on a d-dimensional hypercube within an arbitrarily small error. Hence, for supervised learning and its related regression problems, the hypothesis space generated by these networks with a size not smaller than 36d(2d + 1) × 11 is dense in the continuous function space C([a, b]^d) and therefore dense in the Lebesgue spaces L^p([a, b]^d) for p ∈ [1, ∞). Furthermore, we show that classification functions arising from image and signal classification lie in the hypothesis space generated by σ-activated networks with width 36d(2d + 1) and depth 12, provided there exist pairwise disjoint bounded closed subsets of R^d such that samples of the same class are located in the same subset. Finally, numerical experiments show that replacing the rectified linear unit (ReLU) activation function with ours improves the results.
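The abstract describes an activation σ combining a triangular-wave function with the softsign function x/(1 + |x|). The sketch below is purely illustrative: the choice to apply the triangular wave on the negative half-line and softsign on the non-negative half-line, as well as the wave's period and scaling, are assumptions for demonstration, not the authors' precise construction from the paper.

```python
import numpy as np

def triangular_wave(x, period=2.0):
    """Periodic triangular wave with values in [0, 1].

    Rises linearly from 0 to 1 over the first half of each period,
    then falls back to 0 over the second half.
    """
    t = np.mod(x, period) / period        # position within one period, in [0, 1)
    return 1.0 - np.abs(2.0 * t - 1.0)

def softsign(x):
    """Softsign activation: smooth, bounded in (-1, 1), cheap to compute."""
    return x / (1.0 + np.abs(x))

def sigma(x):
    """Illustrative combined activation (hypothetical split, not the
    paper's exact definition): triangular wave for x < 0, softsign
    for x >= 0. The result is continuous, computable, and bounded."""
    return np.where(x < 0, triangular_wave(x), softsign(x))
```

Because both pieces are bounded and elementary, σ is as cheap to evaluate as ReLU; the paper's key point is that, unlike ReLU networks, a fixed-size network with such an activation can already approximate any continuous function to arbitrary accuracy.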


Published In

Journal of Machine Learning Research

EISSN

1533-7928

ISSN

1532-4435

Publication Date

September 1, 2022

Volume

23

Related Subject Headings

  • Artificial Intelligence & Image Processing
  • 4905 Statistics
  • 4611 Machine learning
  • 17 Psychology and Cognitive Sciences
  • 08 Information and Computing Sciences
 

Citation

APA: Shen, Z., Yang, H., & Zhang, S. (2022). Deep Network Approximation: Achieving Arbitrary Accuracy with Fixed Number of Neurons. Journal of Machine Learning Research, 23.
Chicago: Shen, Z., H. Yang, and S. Zhang. “Deep Network Approximation: Achieving Arbitrary Accuracy with Fixed Number of Neurons.” Journal of Machine Learning Research 23 (September 1, 2022).
ICMJE: Shen Z, Yang H, Zhang S. Deep Network Approximation: Achieving Arbitrary Accuracy with Fixed Number of Neurons. Journal of Machine Learning Research. 2022 Sep 1;23.
MLA: Shen, Z., et al. “Deep Network Approximation: Achieving Arbitrary Accuracy with Fixed Number of Neurons.” Journal of Machine Learning Research, vol. 23, Sept. 2022.
NLM: Shen Z, Yang H, Zhang S. Deep Network Approximation: Achieving Arbitrary Accuracy with Fixed Number of Neurons. Journal of Machine Learning Research. 2022 Sep 1;23.
