Boosting Cross-Lingual Transfer via Self-Learning with Uncertainty Estimation

Publication Date: 11/11/2021

Event: The 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021)

Reference: pp. 6716-6723, 2021

Authors: Liyan Xu, Emory University; Xuchao Zhang, NEC Laboratories America, Inc.; Xujiang Zhao, University of Texas at Dallas; Haifeng Chen, NEC Laboratories America, Inc.; Feng Chen, University of Texas at Dallas; Jinho D. Choi, Emory University

Abstract: Recent multilingual pre-trained language models have achieved remarkable zero-shot performance, where the model is fine-tuned on only one source language and directly evaluated on target languages. In this work, we propose a self-learning framework that further utilizes unlabeled data of target languages, combined with uncertainty estimation in the process to select high-quality silver labels. Three different uncertainties are adapted and analyzed specifically for cross-lingual transfer: Language Heteroscedastic Uncertainty (LEU), Language Homoscedastic Uncertainty (LOU), and Evidential Uncertainty (EVI). We evaluate our framework with these uncertainties on two cross-lingual tasks, Named Entity Recognition (NER) and Natural Language Inference (NLI), covering 40 languages in total; it outperforms the baselines significantly, by 10 F1 on average for NER and 2.5 accuracy points for NLI.
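The core of the framework described above is filtering model predictions on unlabeled target-language data by uncertainty before using them as silver labels. Below is a minimal sketch of that selection step, using predictive entropy as a generic uncertainty proxy; the paper's LEU/LOU/EVI estimators are more specialized, and the function names and threshold here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def predictive_entropy(probs):
    # Entropy of each row's predictive distribution over classes;
    # higher entropy = more uncertain prediction.
    return -np.sum(probs * np.log(probs + 1e-12), axis=1)

def select_silver_labels(probs, threshold):
    """Keep predictions whose entropy is below `threshold` (a hypothetical
    cutoff); return a boolean mask and the argmax silver labels."""
    entropy = predictive_entropy(probs)
    keep = entropy < threshold
    labels = probs.argmax(axis=1)
    return keep, labels

# Toy predictive distributions over 3 classes for 4 unlabeled examples.
probs = np.array([
    [0.98, 0.01, 0.01],  # confident -> low entropy, kept
    [0.34, 0.33, 0.33],  # near-uniform -> high entropy, dropped
    [0.90, 0.05, 0.05],  # confident -> kept
    [0.50, 0.25, 0.25],  # ambiguous -> dropped
])
keep, labels = select_silver_labels(probs, threshold=0.5)
# keep -> [True, False, True, False]
```

In a full self-learning loop, the kept (example, label) pairs would be added to the training set and the model re-trained, repeating over several rounds.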

Publication Link: