PLATO: Pre-trained dialogue generation model with discrete latent variable

Publication, Conference
Bao, S; He, H; Wang, F; Wu, H; Wang, H
Published in: Proceedings of the Annual Meeting of the Association for Computational Linguistics
January 1, 2020

Pre-trained models have proved effective for a wide range of natural language processing tasks. Inspired by this, we propose a novel dialogue generation pre-training framework to support various kinds of conversations, including chit-chat, knowledge grounded dialogues, and conversational question answering. In this framework, we adopt flexible attention mechanisms to fully leverage the bi-directional context and the uni-directional characteristic of language generation. We also introduce discrete latent variables to tackle the inherent one-to-many mapping problem in response generation. Two reciprocal tasks, response generation and latent act recognition, are designed and carried out simultaneously within a shared network. Comprehensive experiments on three publicly available datasets verify the effectiveness and superiority of the proposed framework.
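The "flexible attention mechanisms" in the abstract pair bi-directional attention over the dialogue context with uni-directional (left-to-right) attention over the response being generated, in the spirit of unified language-model pre-training. Below is a minimal sketch of such a mask, assuming a sequence laid out as [context tokens (including any prepended latent token), response tokens]; the function name and layout are illustrative assumptions, not taken from the released PLATO code.

```python
import torch

def plato_style_attention_mask(context_len: int, response_len: int) -> torch.Tensor:
    """Illustrative mask: context positions attend bidirectionally to each
    other; response positions attend to the full context and only to
    earlier response positions (uni-directional generation).

    mask[i, j] = True means position i may attend to position j.
    """
    total = context_len + response_len
    mask = torch.zeros(total, total, dtype=torch.bool)
    # Bidirectional block: every context position sees every context position.
    mask[:context_len, :context_len] = True
    # Response positions see the whole context ...
    mask[context_len:, :context_len] = True
    # ... plus a causal (lower-triangular) view of the response so far.
    mask[context_len:, context_len:] = torch.tril(
        torch.ones(response_len, response_len, dtype=torch.bool)
    )
    return mask

# Example: a 4-token context (e.g. one latent token + 3 context tokens)
# followed by a 3-token response.
print(plato_style_attention_mask(4, 3).int())
```

With such a mask, a single shared transformer can score the whole dialogue for latent act recognition (bi-directional part) while still decoding the response token by token (uni-directional part), which is how one network can serve both reciprocal tasks.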

Published In

Proceedings of the Annual Meeting of the Association for Computational Linguistics

ISSN

0736-587X

Publication Date

January 1, 2020

Start / End Page

85 / 96

Citation

APA: Bao, S., He, H., Wang, F., Wu, H., & Wang, H. (2020). PLATO: Pre-trained dialogue generation model with discrete latent variable. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 85–96).
Chicago: Bao, S., H. He, F. Wang, H. Wu, and H. Wang. “PLATO: Pre-trained dialogue generation model with discrete latent variable.” In Proceedings of the Annual Meeting of the Association for Computational Linguistics, 85–96, 2020.
ICMJE: Bao S, He H, Wang F, Wu H, Wang H. PLATO: Pre-trained dialogue generation model with discrete latent variable. In: Proceedings of the Annual Meeting of the Association for Computational Linguistics. 2020. p. 85–96.
MLA: Bao, S., et al. “PLATO: Pre-trained dialogue generation model with discrete latent variable.” Proceedings of the Annual Meeting of the Association for Computational Linguistics, 2020, pp. 85–96.
NLM: Bao S, He H, Wang F, Wu H, Wang H. PLATO: Pre-trained dialogue generation model with discrete latent variable. Proceedings of the Annual Meeting of the Association for Computational Linguistics. 2020. p. 85–96.
