Deep Network Approximation: Beyond ReLU to Diverse Activation Functions

Publication, Journal Article
Zhang, S; Lu, J; Zhao, H
Published in: Journal of Machine Learning Research
January 1, 2024

This paper explores the expressive power of deep neural networks for a diverse range of activation functions. An activation function set A is defined to encompass most commonly used activation functions, such as ReLU, LeakyReLU, ReLU2, ELU, CELU, SELU, Softplus, GELU, SiLU, Swish, Mish, Sigmoid, Tanh, Arctan, Softsign, dSiLU, and SRS. We demonstrate that for any activation function ϱ ∈ A, a ReLU network of width N and depth L can be approximated to arbitrary precision by a ϱ-activated network of width 3N and depth 2L on any bounded set. This finding enables the extension of most approximation results achieved with ReLU networks to a wide variety of other activation functions, albeit with slightly larger constants. Significantly, we establish that the (width, depth) scaling factors can be further reduced from (3, 2) to (1, 1) if ϱ falls within a specific subset of A. This subset includes activation functions such as ELU, CELU, SELU, Softplus, GELU, SiLU, Swish, and Mish.
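The intuition behind results of this kind can be illustrated with Softplus, one of the activations in the subset above: a scaled Softplus, log(1 + exp(kx))/k, converges uniformly to ReLU(x) as k grows, with pointwise error at most log(2)/k. The sketch below is an illustrative numerical check of that convergence, not the paper's construction (which approximates entire networks, not a single unit):

```python
import numpy as np

def scaled_softplus(x, k=1000.0):
    # log(1 + exp(k*x)) / k, computed stably via logaddexp to avoid
    # overflow in exp for large k*x. Tends to ReLU(x) as k -> infinity.
    return np.logaddexp(0.0, k * x) / k

x = np.linspace(-2.0, 2.0, 401)
relu = np.maximum(x, 0.0)
err = np.max(np.abs(scaled_softplus(x, k=1000.0) - relu))

# The worst case is at x = 0, where the gap equals log(2)/k.
print(f"max |softplus_k - ReLU| on [-2, 2]: {err:.2e}")  # about 6.9e-04
```

Increasing k tightens the approximation uniformly, which is the single-unit analogue of approximating a ReLU network to arbitrary precision on a bounded set.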

Published In

Journal of Machine Learning Research

EISSN

1533-7928

ISSN

1532-4435

Publication Date

January 1, 2024

Volume

25

Related Subject Headings

  • Artificial Intelligence & Image Processing
  • 4905 Statistics
  • 4611 Machine learning
  • 17 Psychology and Cognitive Sciences
  • 08 Information and Computing Sciences
 

Citation

APA: Zhang, S., Lu, J., & Zhao, H. (2024). Deep Network Approximation: Beyond ReLU to Diverse Activation Functions. Journal of Machine Learning Research, 25.

Chicago: Zhang, S., J. Lu, and H. Zhao. “Deep Network Approximation: Beyond ReLU to Diverse Activation Functions.” Journal of Machine Learning Research 25 (January 1, 2024).

ICMJE: Zhang S, Lu J, Zhao H. Deep Network Approximation: Beyond ReLU to Diverse Activation Functions. Journal of Machine Learning Research. 2024 Jan 1;25.

MLA: Zhang, S., et al. “Deep Network Approximation: Beyond ReLU to Diverse Activation Functions.” Journal of Machine Learning Research, vol. 25, Jan. 2024.

NLM: Zhang S, Lu J, Zhao H. Deep Network Approximation: Beyond ReLU to Diverse Activation Functions. Journal of Machine Learning Research. 2024 Jan 1;25.
