TPrune: Efficient Transformer Pruning for Mobile Devices

Journal Article

The invention of the Transformer architecture has boosted the performance of Neural Machine Translation (NMT) to an unprecedented level. Much previous work has sought to make Transformer models more execution-friendly on resource-constrained platforms; these efforts fall into three key fields: model pruning, transfer learning, and efficient Transformer variants. The family of model pruning methods is popular for its simplicity in practice and promising compression rates, and has achieved great success on convolutional neural networks (CNNs) for many vision tasks. Nonetheless, previous Transformer pruning works did not perform a thorough analysis and evaluation of each Transformer component on off-the-shelf mobile devices. In this work, we analyze and prune Transformer models at line-wise granularity and implement our pruning method on real mobile platforms. We explore the properties of all Transformer components as well as their sparsity features, which are leveraged to guide Transformer model pruning. We name our complete Transformer analysis and pruning pipeline TPrune. In TPrune, we first propose Block-wise Structured Sparsity Learning (BSSL) to analyze Transformer model properties. Then, based on the characteristics derived from BSSL, we apply Structured Hoyer Square (SHS) to obtain the final pruned models. Compared with state-of-the-art Transformer pruning methods, TPrune achieves a higher model compression rate with less performance degradation. Experimental results show that our pruned models achieve 1.16×–1.92× speedups on mobile devices with 0%–8% BLEU score degradation compared with the original Transformer model.
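The Structured Hoyer Square (SHS) regularizer named in the abstract builds on the Hoyer-Square sparsity measure, H(w) = (Σ|wᵢ|)² / Σwᵢ², which ranges from 1 (a single nonzero entry) to n (all entries equal in magnitude), so minimizing it concentrates weight energy into fewer entries. A minimal sketch of the idea follows; the function names and the row/column grouping are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def hoyer_square(w, eps=1e-12):
    """Hoyer-Square measure: (sum |w|)^2 / sum w^2.

    Equals 1 when exactly one entry is nonzero and n when all n
    entries have equal magnitude; smaller values mean sparser w.
    """
    w = np.asarray(w, dtype=float).ravel()
    return (np.abs(w).sum() ** 2) / (np.square(w).sum() + eps)

def structured_hoyer_square(weight, axis=0, eps=1e-12):
    """Structured (group-wise) variant: apply Hoyer-Square to the
    L2 norms of rows (axis=0) or columns (axis=1) of a weight
    matrix, so whole groups are driven toward zero together and
    can be pruned as structured blocks.
    """
    weight = np.asarray(weight, dtype=float)
    group_norms = np.sqrt(np.square(weight).sum(axis=1 - axis))
    return hoyer_square(group_norms, eps)
```

In a training loop, a term like `lambda_reg * structured_hoyer_square(W)` would be added to the task loss for each prunable weight matrix; groups whose norms are driven near zero can then be removed as whole rows or columns.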

Cited Authors

  • Mao, J; Yang, H; Li, A; Li, H; Chen, Y

Published Date

  • July 1, 2021

Published In

  • ACM Transactions on Cyber-Physical Systems

Volume / Issue

  • 5 / 3

Electronic International Standard Serial Number (EISSN)

  • 2378-9638

International Standard Serial Number (ISSN)

  • 2378-962X

Digital Object Identifier (DOI)

  • 10.1145/3446640

Citation Source

  • Scopus