Dynamic Regularization on Activation Sparsity for Neural Network Efficiency Improvement

Publication, Journal Article
Yang, Q; Mao, J; Wang, Z; Li, H
Published in: ACM Journal on Emerging Technologies in Computing Systems
October 1, 2021

When deploying deep neural networks in embedded systems, it is crucial to reduce the model size and computational complexity to improve execution speed and efficiency. In addition to conventional compression techniques such as weight pruning and quantization, removing unimportant activations can also dramatically reduce data communication and computation cost. Unlike weight parameters, the activation pattern depends directly on the input data and therefore changes dynamically. To regulate this dynamic activation sparsity (DAS), in this work we propose a generic, low-cost approach based on a winners-take-all (WTA) dropout technique. A network enhanced with the proposed WTA dropout, termed DASNet, features structured activation sparsity with an improved sparsity level. Compared to static feature map pruning methods, DASNets provide greater reduction in computation cost. The WTA dropout technique can be easily applied to deep neural networks without introducing additional training variables. More importantly, DASNet can be seamlessly integrated with other compression techniques, such as weight pruning and quantization, without compromising accuracy. Our experiments on various networks and datasets demonstrate significant runtime speedups with negligible accuracy loss.
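To make the mechanism concrete, below is a minimal, hypothetical PyTorch sketch of a WTA-style dropout layer that keeps only the top-k channels of each feature map (ranked by mean activation magnitude) and zeroes out the rest, yielding structured activation sparsity. The keep ratio and the channel-wise ranking criterion are illustrative assumptions, not the authors' exact DASNet formulation.

import torch
import torch.nn as nn

class WTADropout(nn.Module):
    """Winners-take-all dropout sketch: keep the top-k channels per sample, zero the rest.

    Illustrative only; the keep ratio and channel-wise ranking criterion are
    assumptions, not the authors' exact DASNet formulation.
    """

    def __init__(self, keep_ratio: float = 0.5):
        super().__init__()
        self.keep_ratio = keep_ratio

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: convolutional feature maps of shape (N, C, H, W).
        n, c, h, w = x.shape
        k = max(1, int(c * self.keep_ratio))
        # Per-channel importance score: mean absolute activation over spatial dims.
        scores = x.abs().mean(dim=(2, 3))            # (N, C)
        winners = scores.topk(k, dim=1).indices      # (N, k) indices of winning channels
        mask = torch.zeros_like(scores)
        mask.scatter_(1, winners, 1.0)               # 1 for winners, 0 otherwise
        # Channel-wise (structured) sparsity: entire losing feature maps are zeroed.
        return x * mask.view(n, c, 1, 1)

# Usage sketch: place after an activation so downstream layers see sparse inputs.
block = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(), WTADropout(keep_ratio=0.5))
out = block(torch.randn(8, 3, 32, 32))

Because the zeroed units form whole channels rather than scattered scalars, the resulting sparsity is hardware-friendly and can translate directly into skipped computation at inference time.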

Published In

ACM Journal on Emerging Technologies in Computing Systems

DOI

10.1145/3447776

EISSN

1550-4840

ISSN

1550-4832

Publication Date

October 1, 2021

Volume

17

Issue

4

Related Subject Headings

  • Computer Hardware & Architecture
  • 4606 Distributed computing and systems software
  • 1007 Nanotechnology
  • 1006 Computer Hardware
  • 0906 Electrical and Electronic Engineering
 

Citation

APA
Yang, Q., Mao, J., Wang, Z., & Li, H. (2021). Dynamic Regularization on Activation Sparsity for Neural Network Efficiency Improvement. ACM Journal on Emerging Technologies in Computing Systems, 17(4). https://doi.org/10.1145/3447776

Chicago
Yang, Q., J. Mao, Z. Wang, and H. Li. “Dynamic Regularization on Activation Sparsity for Neural Network Efficiency Improvement.” ACM Journal on Emerging Technologies in Computing Systems 17, no. 4 (October 1, 2021). https://doi.org/10.1145/3447776.

ICMJE
Yang Q, Mao J, Wang Z, Li H. Dynamic Regularization on Activation Sparsity for Neural Network Efficiency Improvement. ACM Journal on Emerging Technologies in Computing Systems. 2021 Oct 1;17(4).

MLA
Yang, Q., et al. “Dynamic Regularization on Activation Sparsity for Neural Network Efficiency Improvement.” ACM Journal on Emerging Technologies in Computing Systems, vol. 17, no. 4, Oct. 2021. Scopus, doi:10.1145/3447776.

NLM
Yang Q, Mao J, Wang Z, Li H. Dynamic Regularization on Activation Sparsity for Neural Network Efficiency Improvement. ACM Journal on Emerging Technologies in Computing Systems. 2021 Oct 1;17(4).
