
TRP: Trained rank pruning for efficient deep neural networks

Publication · Conference
Xu, Y; Li, Y; Zhang, S; Wen, W; Wang, B; Qi, Y; Chen, Y; Lin, W; Xiong, H
Published in: IJCAI International Joint Conference on Artificial Intelligence
January 1, 2020

To enable DNNs on edge devices like mobile phones, low-rank approximation has been widely adopted because of its solid theoretical rationale and efficient implementations. Several previous works attempted to directly approximate a pre-trained model by low-rank decomposition; however, small approximation errors in the parameters can propagate into a large prediction loss. As a result, performance usually drops significantly, and a sophisticated fine-tuning effort is required to recover accuracy. Clearly, it is not optimal to separate low-rank approximation from training. Unlike previous works, this paper integrates low-rank approximation and regularization into the training process. We propose Trained Rank Pruning (TRP), which alternates between low-rank approximation and training. TRP maintains the capacity of the original network while imposing low-rank constraints during training. A nuclear-norm regularizer, optimized by stochastic sub-gradient descent, further promotes low rank in TRP. The TRP-trained network inherently has a low-rank structure and can be approximated with negligible performance loss, eliminating the fine-tuning step after low-rank decomposition. The proposed method is comprehensively evaluated on CIFAR-10 and ImageNet, outperforming previous compression methods based on low-rank approximation.
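The two mechanisms the abstract describes, truncated-SVD low-rank approximation of a weight matrix and a nuclear-norm sub-gradient step, can be sketched in NumPy as below. This is a minimal illustration, not the paper's implementation: the energy-based rank-selection rule, the step sizes, and the omission of the task-loss gradient are all assumptions made for the sketch.

```python
import numpy as np

def low_rank_approx(W, energy=0.98):
    """Truncate the SVD of W, keeping the smallest rank whose singular
    values retain the given fraction of the total nuclear norm.
    (This rank-selection rule is illustrative, not the paper's exact one.)"""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    keep = int(np.searchsorted(np.cumsum(s) / s.sum(), energy)) + 1
    return U[:, :keep] * s[:keep] @ Vt[:keep], keep

def nuclear_subgradient(W):
    """A sub-gradient of the nuclear norm ||W||_* (sum of singular values)
    is U @ Vt from the SVD of W."""
    U, _, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ Vt

rng = np.random.default_rng(0)
# A weight matrix that is nearly rank-2, plus small noise
W = rng.standard_normal((64, 2)) @ rng.standard_normal((2, 32))
W = W + 0.01 * rng.standard_normal(W.shape)

lam, lr = 1e-2, 1e-1
for _ in range(5):            # stand-ins for training iterations
    task_grad = 0.0           # task-loss gradient omitted in this sketch
    W -= lr * (task_grad + lam * nuclear_subgradient(W))

# Periodic low-rank step: replace W by its truncated-SVD approximation
W_lr, rank = low_rank_approx(W)
print(rank, np.linalg.norm(W - W_lr) / np.linalg.norm(W))
```

In TRP these two steps are interleaved with ordinary SGD on the task loss, so the network is driven toward weights that the final truncated SVD can approximate with negligible error.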


Published In

IJCAI International Joint Conference on Artificial Intelligence

ISSN

1045-0823

ISBN

9780999241165

Publication Date

January 1, 2020

Volume

2021-January

Start / End Page

977 / 983
 

Citation

Xu, Y., Li, Y., Zhang, S., Wen, W., Wang, B., Qi, Y., … Xiong, H. (2020). TRP: Trained rank pruning for efficient deep neural networks. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2021-January, pp. 977–983).
