Restricted Recurrent Neural Networks

Publication, Conference Paper
Diao, E; Ding, J; Tarokh, V
Published in: Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019
December 1, 2019

Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU), have become standard building blocks for learning from online data of a sequential nature in many research areas, including natural language processing and speech data analysis. In this paper, we present a new methodology that significantly reduces the number of parameters in RNNs while maintaining performance comparable to, or even better than, that of classical RNNs. The new proposal, referred to as the Restricted Recurrent Neural Network (RRNN), restricts the weight matrices corresponding to the input data and hidden states at each time step to share a large proportion of parameters. The new architecture can be regarded as a compression of its classical counterpart, but it requires neither pre-training nor sophisticated parameter fine-tuning, both of which are major issues in most existing compression techniques. Experiments on natural language modeling show that, compared with its classical counterpart, the restricted recurrent architecture generally produces comparable results at about a 50% compression rate. In particular, the Restricted LSTM can outperform the classical RNN with even fewer parameters.
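
A minimal PyTorch sketch of the idea follows. The abstract does not spell out the exact sharing scheme, so everything here is an assumption for illustration only: the input-to-hidden and hidden-to-hidden weight matrices reuse one shared block plus small low-rank private corrections, and the class name RestrictedRNNCell and the private_rank argument are hypothetical, not taken from the paper.

import torch
import torch.nn as nn

class RestrictedRNNCell(nn.Module):
    """Illustrative restricted RNN cell (hypothetical sharing scheme).

    Both weight matrices reuse W_shared; each adds a small low-rank
    private term, so most parameters are shared between the two maps.
    """

    def __init__(self, input_size, hidden_size, private_rank=4):
        super().__init__()
        # This sketch assumes equal sizes so one square block can be shared.
        assert input_size == hidden_size, "sketch assumes input_size == hidden_size"
        self.W_shared = nn.Parameter(0.01 * torch.randn(hidden_size, hidden_size))
        # Low-rank private factors keep the extra cost per matrix at 2*H*r.
        self.A_x = nn.Parameter(0.01 * torch.randn(hidden_size, private_rank))
        self.B_x = nn.Parameter(0.01 * torch.randn(private_rank, input_size))
        self.A_h = nn.Parameter(0.01 * torch.randn(hidden_size, private_rank))
        self.B_h = nn.Parameter(0.01 * torch.randn(private_rank, hidden_size))
        self.bias = nn.Parameter(torch.zeros(hidden_size))

    def forward(self, x, h):
        W_x = self.W_shared + self.A_x @ self.B_x  # input-to-hidden weights
        W_h = self.W_shared + self.A_h @ self.B_h  # hidden-to-hidden weights
        return torch.tanh(x @ W_x.T + h @ W_h.T + self.bias)

# Usage: one step of the cell on a batch of 8 vectors of size 64.
cell = RestrictedRNNCell(input_size=64, hidden_size=64, private_rank=4)
x, h = torch.randn(8, 64), torch.zeros(8, 64)
h = cell(x, h)

Under this assumed scheme, a classical vanilla RNN cell with equal sizes stores 2*H*H recurrent weights, while the sketch stores H*H + 4*H*r, which approaches the roughly 50% compression rate the abstract reports as private_rank shrinks relative to the hidden size.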

Published In

Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019

DOI

10.1109/BigData47090.2019.9006257

Publication Date

December 1, 2019

Start / End Page

56 / 63

Citation

APA
Chicago
ICMJE
MLA
NLM
Diao, E., Ding, J., & Tarokh, V. (2019). Restricted Recurrent Neural Networks. Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019, 56–63. https://doi.org/10.1109/BigData47090.2019.9006257
Diao, E., J. Ding, and V. Tarokh. “Restricted Recurrent Neural Networks.” Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019, December 1, 2019, 56–63. https://doi.org/10.1109/BigData47090.2019.9006257.
Diao E, Ding J, Tarokh V. Restricted Recurrent Neural Networks. Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019. 2019 Dec 1;56–63.
Diao, E., et al. “Restricted Recurrent Neural Networks.” Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019, Dec. 2019, pp. 56–63. Scopus, doi:10.1109/BigData47090.2019.9006257.
Diao E, Ding J, Tarokh V. Restricted Recurrent Neural Networks. Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019. 2019 Dec 1;56–63.
