An end-to-end generative architecture for paraphrase generation

Publication, Conference
Yang, Q; Huo, Z; Shen, D; Cheng, Y; Wang, W; Wang, G; Carin, L
Published in: EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference
January 1, 2019

Generating high-quality paraphrases is a fundamental yet challenging natural language processing task. Despite the effectiveness of previous work based on generative models, such approaches still suffer from exposure bias in recurrent neural networks and often fail to generate realistic sentences. To overcome these challenges, we propose the first end-to-end conditional generative architecture for generating paraphrases via adversarial training, which does not depend on extra linguistic information. Extensive experiments on four public datasets demonstrate that the proposed method achieves state-of-the-art results, outperforming previous generative architectures on both automatic metrics (BLEU, METEOR, and TER) and human evaluations.
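
The abstract describes adversarial training of a conditional generative model for paraphrasing. Purely as an illustration of that general idea, and not of the authors' actual architecture, the sketch below shows a minimal GAN-style training loop with a sequence-to-sequence generator and a paraphrase-pair discriminator in PyTorch. The GRU modules, the Gumbel-softmax relaxation, all layer sizes, and the toy random data are assumptions chosen for brevity.

```python
# Minimal, illustrative sketch of adversarial training for conditional paraphrase
# generation. NOT the paper's implementation; all sizes and data are toy assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMB, HID, MAX_LEN = 1000, 64, 128, 12  # assumed toy dimensions


class Generator(nn.Module):
    """Encode a source sentence, then decode soft word distributions."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.enc = nn.GRU(EMB, HID, batch_first=True)
        self.dec = nn.GRUCell(EMB, HID)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, src):
        _, h = self.enc(self.emb(src))          # h: (1, B, HID)
        h = h.squeeze(0)
        inp = self.emb(torch.zeros(src.size(0), dtype=torch.long, device=src.device))
        words = []
        for _ in range(MAX_LEN):
            h = self.dec(inp, h)
            logits = self.out(h)
            # Gumbel-softmax keeps the discrete sampling step differentiable.
            soft = F.gumbel_softmax(logits, tau=1.0, hard=False)
            words.append(soft)
            inp = soft @ self.emb.weight        # soft embedding of the sampled word
        return torch.stack(words, dim=1)        # (B, MAX_LEN, VOCAB)


class Discriminator(nn.Module):
    """Score whether a (source, candidate) pair looks like a real paraphrase pair."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.cls = nn.Linear(2 * HID, 1)

    def encode(self, x):
        # x is either token ids (real sentences) or soft distributions (generated).
        e = self.emb(x) if x.dtype == torch.long else x @ self.emb.weight
        _, h = self.rnn(e)
        return h.squeeze(0)

    def forward(self, src, cand):
        return self.cls(torch.cat([self.encode(src), self.encode(cand)], dim=-1))


G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

# Toy batch: random token ids stand in for (source, reference paraphrase) pairs.
src = torch.randint(0, VOCAB, (8, MAX_LEN))
ref = torch.randint(0, VOCAB, (8, MAX_LEN))

for step in range(3):
    # Discriminator step: real pairs -> 1, generated pairs -> 0.
    fake = G(src).detach()
    d_loss = bce(D(src, ref), torch.ones(8, 1)) + bce(D(src, fake), torch.zeros(8, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: make generated pairs look real to the discriminator.
    g_loss = bce(D(src, G(src)), torch.ones(8, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    print(f"step {step}: d_loss={d_loss.item():.3f} g_loss={g_loss.item():.3f}")
```

The Gumbel-softmax relaxation is used here only so that gradients from the discriminator can reach the generator despite the discrete word choices; the paper may handle this differently, and a real system would train on actual paraphrase corpora with the generator also supervised by a reconstruction objective.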

Published In

EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference

Publication Date

January 1, 2019

Start / End Page

3132 / 3142

Citation

APA: Yang, Q., Huo, Z., Shen, D., Cheng, Y., Wang, W., Wang, G., & Carin, L. (2019). An end-to-end generative architecture for paraphrase generation. In EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference (pp. 3132–3142).

Chicago: Yang, Q., Z. Huo, D. Shen, Y. Cheng, W. Wang, G. Wang, and L. Carin. “An end-to-end generative architecture for paraphrase generation.” In EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference, 3132–42, 2019.

ICMJE: Yang Q, Huo Z, Shen D, Cheng Y, Wang W, Wang G, et al. An end-to-end generative architecture for paraphrase generation. In: EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference. 2019. p. 3132–42.

MLA: Yang, Q., et al. “An end-to-end generative architecture for paraphrase generation.” EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference, 2019, pp. 3132–42.

NLM: Yang Q, Huo Z, Shen D, Cheng Y, Wang W, Wang G, Carin L. An end-to-end generative architecture for paraphrase generation. EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference. 2019. p. 3132–3142.
