Learning structured sparsity in deep neural networks

Publication, Conference
Wen, W; Wu, C; Wang, Y; Chen, Y; Li, H
Published in: Advances in Neural Information Processing Systems
January 1, 2016

High demand for computation resources severely hinders the deployment of large-scale Deep Neural Networks (DNNs) on resource-constrained devices. In this work, we propose a Structured Sparsity Learning (SSL) method to regularize the structures (i.e., filters, channels, filter shapes, and layer depth) of DNNs. SSL can (1) learn a compact structure from a larger DNN to reduce computation cost; (2) obtain a hardware-friendly structured sparsity that efficiently accelerates the DNN's evaluation; and (3) regularize the DNN structure to improve classification accuracy. Experimental results show that SSL achieves on average 5.1× and 3.1× speedups of convolutional-layer computation in AlexNet on CPU and GPU, respectively, with off-the-shelf libraries; these speedups are about twice those of non-structured sparsity. On CIFAR-10, regularization on layer depth reduces a 20-layer Deep Residual Network (ResNet) to 18 layers while improving the accuracy from 91.25% to 92.60%, which is still higher than that of the original 32-layer ResNet. For AlexNet, SSL reduces the error by ∼1%.
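The structural regularizer behind these results is a group Lasso penalty applied over structured groups of weights. Below is a minimal, illustrative sketch of the filter-wise variant in PyTorch (an assumption made here purely for illustration; it is not the authors' implementation): each output filter of a convolutional layer forms one group, and summing the groups' L2 norms drives entire filters to zero. The model and coefficient lambda_g are hypothetical placeholders.

```python
# Illustrative sketch (not the authors' code): filter-wise group Lasso,
# the kind of structured regularizer SSL applies to convolutional layers.
import torch
import torch.nn as nn

def filter_wise_group_lasso(conv: nn.Conv2d) -> torch.Tensor:
    # Each output filter W[n, :, :, :] is one group; the penalty is the
    # sum of the groups' L2 norms, which zeroes out whole filters.
    w = conv.weight  # shape: (out_channels, in_channels, kH, kW)
    return w.flatten(start_dim=1).norm(dim=1).sum()

# Hypothetical model and coefficient, for demonstration only.
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))
lambda_g = 1e-4  # strength of the structured-sparsity term
penalty = lambda_g * sum(filter_wise_group_lasso(m)
                         for m in model.modules()
                         if isinstance(m, nn.Conv2d))
# Training would minimize: data_loss + penalty
```

Channel-wise, shape-wise, and depth-wise SSL follow the same pattern, differing only in how the weights are grouped.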

Published In

Advances in Neural Information Processing Systems

ISSN

1049-5258

Publication Date

January 1, 2016

Start / End Page

2082 / 2090

Related Subject Headings

  • 4611 Machine learning
  • 1702 Cognitive Sciences
  • 1701 Psychology
 

Citation

APA: Wen, W., Wu, C., Wang, Y., Chen, Y., & Li, H. (2016). Learning structured sparsity in deep neural networks. In Advances in Neural Information Processing Systems (pp. 2082–2090).
Chicago: Wen, W., C. Wu, Y. Wang, Y. Chen, and H. Li. “Learning structured sparsity in deep neural networks.” In Advances in Neural Information Processing Systems, 2082–90, 2016.
ICMJE: Wen W, Wu C, Wang Y, Chen Y, Li H. Learning structured sparsity in deep neural networks. In: Advances in Neural Information Processing Systems. 2016. p. 2082–90.
MLA: Wen, W., et al. “Learning structured sparsity in deep neural networks.” Advances in Neural Information Processing Systems, 2016, pp. 2082–90.
NLM: Wen W, Wu C, Wang Y, Chen Y, Li H. Learning structured sparsity in deep neural networks. Advances in Neural Information Processing Systems. 2016. p. 2082–2090.
