DASNet: Dynamic activation sparsity for neural network efficiency improvement

Conference Publication
Yang, Q; Mao, J; Wang, Z; Li, H
Published in: Proceedings - International Conference on Tools with Artificial Intelligence, ICTAI
November 1, 2019

To improve the execution speed and efficiency of neural networks in embedded systems, it is crucial to decrease the model size and computational complexity. In addition to conventional compression techniques such as weight pruning and quantization, removing unimportant activations can reduce both data communication and computation cost. Unlike weight parameters, the activation pattern depends directly on the input data and therefore changes dynamically. To regulate this dynamic activation sparsity (DAS), we propose a generic, low-cost approach based on a winners-take-all (WTA) dropout technique. A network enhanced with the proposed WTA dropout, named DASNet, features structured activation sparsity at an improved sparsity level. Compared with static feature-map pruning methods, DASNets achieve greater reductions in computation cost. The WTA technique can be applied to deep neural networks without introducing additional training variables. Our experiments on various networks and datasets show significant run-time speedups with negligible accuracy loss.
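The abstract describes the core mechanism only at a high level: at each layer, a WTA stage keeps the strongest activations and zeroes the rest, so the sparsity pattern adapts to every input. As a rough illustration only (the paper's exact formulation is in the full text), below is a minimal PyTorch sketch of a channel-level winners-take-all layer; the class name ChannelWTADropout, the mean-absolute-value channel score, and the keep_ratio parameter are illustrative assumptions, not the authors' API.

import torch
import torch.nn as nn

class ChannelWTADropout(nn.Module):
    """Illustrative sketch (assumption, not the authors' code): keep the
    top-k feature maps per sample, ranked by mean absolute activation,
    and zero the rest. Zeroing whole channels yields the structured
    activation sparsity the abstract refers to."""

    def __init__(self, keep_ratio: float = 0.5):
        super().__init__()
        self.keep_ratio = keep_ratio  # fraction of channels kept (hypothetical knob)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width)
        score = x.abs().mean(dim=(2, 3))               # per-channel importance, (B, C)
        k = max(1, int(self.keep_ratio * score.size(1)))
        thresh = score.topk(k, dim=1).values[:, -1:]   # k-th largest score per sample
        mask = (score >= thresh).to(x.dtype)           # winner mask, (B, C)
        return x * mask[:, :, None, None]              # losing channels -> 0

Because entire feature maps are zeroed, a sparsity-aware kernel can skip the corresponding multiply-accumulates in the next layer, which is where the run-time speedup reported in the abstract would come from. The layer itself introduces no trainable parameters, consistent with the claim that WTA dropout adds no additional training variables.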

Published In

Proceedings - International Conference on Tools with Artificial Intelligence, ICTAI

DOI

10.1109/ICTAI.2019.00197

ISSN

1082-3409

Publication Date

November 1, 2019

Volume

2019-November

Start / End Page

1401 / 1405
 

Citation

APA
Yang, Q., Mao, J., Wang, Z., & Li, H. (2019). DASNet: Dynamic activation sparsity for neural network efficiency improvement. In Proceedings - International Conference on Tools with Artificial Intelligence, ICTAI (Vol. 2019-November, pp. 1401–1405). https://doi.org/10.1109/ICTAI.2019.00197

Chicago
Yang, Q., J. Mao, Z. Wang, and H. Li. “DASNet: Dynamic activation sparsity for neural network efficiency improvement.” In Proceedings - International Conference on Tools with Artificial Intelligence, ICTAI, 2019-November:1401–5, 2019. https://doi.org/10.1109/ICTAI.2019.00197.

ICMJE
Yang Q, Mao J, Wang Z, Li H. DASNet: Dynamic activation sparsity for neural network efficiency improvement. In: Proceedings - International Conference on Tools with Artificial Intelligence, ICTAI. 2019. p. 1401–5.

MLA
Yang, Q., et al. “DASNet: Dynamic activation sparsity for neural network efficiency improvement.” Proceedings - International Conference on Tools with Artificial Intelligence, ICTAI, vol. 2019-November, 2019, pp. 1401–05. Scopus, doi:10.1109/ICTAI.2019.00197.

NLM
Yang Q, Mao J, Wang Z, Li H. DASNet: Dynamic activation sparsity for neural network efficiency improvement. Proceedings - International Conference on Tools with Artificial Intelligence, ICTAI. 2019. p. 1401–1405.
