Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition

Publication, Conference
Liang, S; Gong, M; Pei, J; Shou, L; Zuo, W; Zuo, X; Jiang, D
Published in: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
August 14, 2021

Named entity recognition (NER) is a fundamental component of many applications, such as Web search and voice assistants. Although deep neural networks greatly improve NER performance, their need for large amounts of training data makes it hard to scale them to many languages in an industry setting. To tackle this challenge, cross-lingual NER transfers knowledge from a resource-rich source language to low-resource target languages through pre-trained multilingual language models. Rather than using training data in the target languages, cross-lingual NER relies only on training data in the source language, optionally augmented with translations of that data. However, existing cross-lingual NER methods make little use of the rich unlabeled data in target languages, which is relatively easy to collect in industry applications. To address this opportunity, in this paper we describe our practice at Microsoft of leveraging such large amounts of unlabeled target-language data in real production settings. To effectively extract weak supervision signals from the unlabeled data, we develop a novel approach based on ideas from semi-supervised learning and reinforcement learning. An empirical study on three benchmark data sets verifies that our approach establishes new state-of-the-art performance by clear margins. The NER techniques reported in this paper are on their way to becoming a fundamental component of Web ranking, Entity Pane, Answers Triggering, and Question Answering in the Microsoft Bing search engine, and they will also serve as part of the Spoken Language Understanding module of a commercial voice assistant. We plan to open source the code of the prototype framework after deployment.
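The abstract describes the approach only at a high level: a teacher trained on the source language pseudo-labels unlabeled target-language data, a student is distilled on those pseudo-labels, and a reinforcement-learning component decides which weakly supervised instances to keep. The sketch below is a minimal illustration of that general teacher-student-selector loop, not the paper's actual training procedure; the linear stand-in models, the REINFORCE-style selector update, and the dev-accuracy reward are all assumptions made for brevity.

```python
# Illustrative sketch only: teacher-student knowledge distillation on unlabeled
# target-language data with an RL-based instance selector. Shapes, models, and the
# reward are assumptions, not the authors' exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_LABELS, HIDDEN = 9, 32                # e.g. BIO tag set size (assumed)

teacher = nn.Linear(HIDDEN, NUM_LABELS)   # stands in for a source-language NER model
student = nn.Linear(HIDDEN, NUM_LABELS)   # target-language student to be distilled
selector = nn.Linear(HIDDEN, 1)           # RL policy scoring each unlabeled instance

opt_student = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_selector = torch.optim.Adam(selector.parameters(), lr=1e-3)

unlabeled = torch.randn(64, HIDDEN)       # dummy features for unlabeled target-language data
dev_x, dev_y = torch.randn(16, HIDDEN), torch.randint(0, NUM_LABELS, (16,))  # small dev set

for step in range(10):
    # 1. Teacher produces soft labels on the unlabeled target-language data.
    with torch.no_grad():
        soft_labels = F.softmax(teacher(unlabeled), dim=-1)

    # 2. Selector (policy) samples which pseudo-labeled instances to keep.
    keep_prob = torch.sigmoid(selector(unlabeled)).squeeze(-1)
    keep = torch.bernoulli(keep_prob.detach())

    # 3. Student is distilled on the selected instances with a KL divergence loss.
    log_probs = F.log_softmax(student(unlabeled), dim=-1)
    kl = F.kl_div(log_probs, soft_labels, reduction="none").sum(-1)
    distill_loss = (kl * keep).sum() / keep.sum().clamp(min=1.0)
    opt_student.zero_grad()
    distill_loss.backward()
    opt_student.step()

    # 4. Reward the selector with the student's dev accuracy (REINFORCE update).
    with torch.no_grad():
        reward = (student(dev_x).argmax(-1) == dev_y).float().mean()
    log_pi = keep * torch.log(keep_prob + 1e-8) + (1 - keep) * torch.log(1 - keep_prob + 1e-8)
    selector_loss = -(reward * log_pi).mean()
    opt_selector.zero_grad()
    selector_loss.backward()
    opt_selector.step()

# In an iterative scheme, the trained student could serve as the next round's teacher.
```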


Published In

Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining

DOI

10.1145/3447548.3467196

Publication Date

August 14, 2021

Start / End Page

3231 / 3239

Citation

APA
Liang, S., Gong, M., Pei, J., Shou, L., Zuo, W., Zuo, X., & Jiang, D. (2021). Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 3231–3239). https://doi.org/10.1145/3447548.3467196

Chicago
Liang, S., M. Gong, J. Pei, L. Shou, W. Zuo, X. Zuo, and D. Jiang. "Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition." In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 3231–39, 2021. https://doi.org/10.1145/3447548.3467196.

ICMJE
Liang S, Gong M, Pei J, Shou L, Zuo W, Zuo X, et al. Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition. In: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 2021. p. 3231–9.

MLA
Liang, S., et al. "Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition." Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2021, pp. 3231–39. Scopus, doi:10.1145/3447548.3467196.

NLM
Liang S, Gong M, Pei J, Shou L, Zuo W, Zuo X, Jiang D. Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition. Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 2021. p. 3231–3239.
