
Structural Contrastive Pretraining for Cross-Lingual Comprehension

Publication, Conference
Chen, N; Shou, L; Song, T; Gong, M; Pei, J; Chang, J; Jiang, D; Li, J
Published in: Proceedings of the Annual Meeting of the Association for Computational Linguistics
January 1, 2023

Multilingual language models trained with pre-training tasks such as masked language modeling (MLM) have yielded encouraging results on a wide range of downstream tasks. Despite these promising performances, structural knowledge in cross-lingual corpora remains under-explored in current work, leading to semantic misalignment. In this paper, we propose a new pre-training task named Structural Contrast Pretraining (SCP) to align the structural words in a parallel sentence, improving the models' linguistic versatility and their capacity to understand multilingual representations. Concretely, SCP treats each structural word in the source and target languages as a positive pair. We further propose Cross-lingual Momentum Contrast (CL-MoCo) to increase the number of negative pairs by maintaining a large queue. CL-MoCo extends the original MoCo approach to cross-lingual training and jointly optimizes the source-to-target and target-to-source language representations in SCP, resulting in an encoder better suited for cross-lingual transfer learning. We conduct extensive experiments and demonstrate the effectiveness of the resulting model, named XLM-SCP, on three cross-lingual tasks across five datasets, including MLQA and WikiAnn. Our code is available at https://github.com/nuochenpku/SCP.
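The contrastive setup described in the abstract (aligned structural words as positive pairs, a large queue of negatives, and joint source-to-target and target-to-source optimization) follows the familiar InfoNCE-with-momentum-queue pattern. Below is a minimal PyTorch sketch of that pattern, assuming structural-word embeddings have already been extracted; the function names, hyperparameters, and symmetric averaging are illustrative assumptions rather than the paper's exact implementation.

import torch
import torch.nn.functional as F

def cl_moco_loss(q, k, queue, temperature=0.07):
    # Each row of q is a source-side structural-word embedding and each row of k
    # is its aligned target-side counterpart; queue holds momentum-encoded negatives.
    q = F.normalize(q, dim=-1)            # (batch, dim)
    k = F.normalize(k, dim=-1)            # (batch, dim)
    queue = F.normalize(queue, dim=-1)    # (queue_size, dim)

    pos = torch.einsum("bd,bd->b", q, k).unsqueeze(-1)   # (batch, 1): similarity to the positive
    neg = torch.einsum("bd,nd->bn", q, queue)            # (batch, queue_size): similarities to negatives
    logits = torch.cat([pos, neg], dim=1) / temperature
    labels = torch.zeros(q.size(0), dtype=torch.long, device=q.device)  # positive sits at index 0
    return F.cross_entropy(logits, labels)

def symmetric_scp_loss(src, tgt, queue_src, queue_tgt):
    # Optimize both directions jointly, mirroring CL-MoCo's source-to-target
    # and target-to-source objectives.
    return 0.5 * (cl_moco_loss(src, tgt, queue_tgt) + cl_moco_loss(tgt, src, queue_src))

In a MoCo-style setup, the key encoder would typically be updated as an exponential moving average of the query encoder, with the newest keys enqueued and the oldest dequeued each step; the exact momentum schedule and queue size used by XLM-SCP are not specified here.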

Published In

Proceedings of the Annual Meeting of the Association for Computational Linguistics

ISSN

0736-587X

Publication Date

January 1, 2023

Start / End Page

2042 / 2057
 

Citation

APA
Chen, N., Shou, L., Song, T., Gong, M., Pei, J., Chang, J., … Li, J. (2023). Structural Contrastive Pretraining for Cross-Lingual Comprehension. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 2042–2057).

Chicago
Chen, N., L. Shou, T. Song, M. Gong, J. Pei, J. Chang, D. Jiang, and J. Li. “Structural Contrastive Pretraining for Cross-Lingual Comprehension.” In Proceedings of the Annual Meeting of the Association for Computational Linguistics, 2042–57, 2023.

ICMJE
Chen N, Shou L, Song T, Gong M, Pei J, Chang J, et al. Structural Contrastive Pretraining for Cross-Lingual Comprehension. In: Proceedings of the Annual Meeting of the Association for Computational Linguistics. 2023. p. 2042–57.

MLA
Chen, N., et al. “Structural Contrastive Pretraining for Cross-Lingual Comprehension.” Proceedings of the Annual Meeting of the Association for Computational Linguistics, 2023, pp. 2042–57.

NLM
Chen N, Shou L, Song T, Gong M, Pei J, Chang J, Jiang D, Li J. Structural Contrastive Pretraining for Cross-Lingual Comprehension. In: Proceedings of the Annual Meeting of the Association for Computational Linguistics. 2023. p. 2042–2057.
