
Discrete model compression with resource constraint for deep neural networks

Publication, Conference
Gao, S; Huang, F; Pei, J; Huang, H
Published in: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
January 1, 2020

In this paper, we address the problem of compressing and accelerating Convolutional Neural Networks (CNNs). Specifically, we propose a novel structural pruning method that yields a compact CNN with strong discriminative power. To find such networks, we propose an efficient discrete optimization method that directly optimizes channel-wise differentiable discrete gates under a resource constraint while freezing all other model parameters. Although directly optimizing discrete variables is a non-smooth, non-convex, NP-hard problem, our optimization method circumvents these difficulties by using the straight-through estimator. As a result, the sub-network discovered during training reflects the true pruned sub-network. We further extend the discrete gate to a stochastic version in order to thoroughly explore potential sub-networks. Unlike many previous methods that require per-layer hyper-parameters, our method requires only one hyper-parameter to control the FLOPs budget. Moreover, the discrete setting makes our method globally discrimination-aware. Experimental results on CIFAR-10 and ImageNet show that our method is competitive with state-of-the-art approaches.
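To make the channel-wise discrete gate described above concrete, the following is a minimal sketch, assuming a PyTorch setup, of a binary gate trained with the straight-through estimator (STE). It illustrates the general technique named in the abstract rather than the authors' implementation; the names BinaryGateSTE and ChannelGate are hypothetical.

```python
# A minimal sketch (not the authors' code) of a channel-wise binary gate
# trained with the straight-through estimator. Class names are illustrative.
import torch
import torch.nn as nn


class BinaryGateSTE(torch.autograd.Function):
    """Forward: hard 0/1 decision per channel. Backward: straight-through."""

    @staticmethod
    def forward(ctx, logits):
        # Discrete on/off gate, so the sub-network seen during training is
        # the same discrete sub-network obtained after pruning.
        return (logits > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: treat the hard threshold as identity
        # when back-propagating, sidestepping the non-differentiability.
        return grad_output


class ChannelGate(nn.Module):
    """Scales each feature-map channel by a learned discrete gate."""

    def __init__(self, num_channels):
        super().__init__()
        # In the setting described above, these gate logits would be the only
        # trainable parameters while the backbone weights stay frozen.
        self.logits = nn.Parameter(torch.ones(num_channels))  # start with gates open

    def forward(self, x):  # x: (N, C, H, W)
        gate = BinaryGateSTE.apply(self.logits)
        return x * gate.view(1, -1, 1, 1)


# Usage: channels whose logits are not positive are zeroed out.
gate = ChannelGate(num_channels=64)
x = torch.randn(8, 64, 32, 32)
y = gate(x)
```

The resource constraint from the abstract would enter as a FLOPs-based term on the gate variables governed by a single trade-off hyper-parameter; that part is omitted from this sketch.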


Published In

Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition

DOI

10.1109/CVPR42600.2020.00197

ISSN

1063-6919

Publication Date

January 1, 2020

Start / End Page

1896 / 1905

Citation

APA
Gao, S., Huang, F., Pei, J., & Huang, H. (2020). Discrete model compression with resource constraint for deep neural networks. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (pp. 1896–1905). https://doi.org/10.1109/CVPR42600.2020.00197

Chicago
Gao, S., F. Huang, J. Pei, and H. Huang. “Discrete model compression with resource constraint for deep neural networks.” In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1896–1905, 2020. https://doi.org/10.1109/CVPR42600.2020.00197.

ICMJE
Gao S, Huang F, Pei J, Huang H. Discrete model compression with resource constraint for deep neural networks. In: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition. 2020. p. 1896–905.

MLA
Gao, S., et al. “Discrete model compression with resource constraint for deep neural networks.” Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2020, pp. 1896–905. Scopus, doi:10.1109/CVPR42600.2020.00197.

NLM
Gao S, Huang F, Pei J, Huang H. Discrete model compression with resource constraint for deep neural networks. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition. 2020. p. 1896–1905.
