- [3] Low-Resource Neural Machine Translation Using XLNet Pre-training Model. Artificial Neural Networks and Machine Learning (ICANN 2021), Pt V, 2021, 12895: 503-514.
- [4] Character-Aware Low-Resource Neural Machine Translation with Weight Sharing and Pre-training. Chinese Computational Linguistics (CCL 2019), 2019, 11856: 321-333.
- [5] Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation. Findings of the Association for Computational Linguistics (ACL-IJCNLP 2021), 2021: 2706-2718.
- [8] Pre-training on High-Resource Speech Recognition Improves Low-Resource Speech-to-Text Translation. 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019), Vol. 1, 2019: 58-68.
- [10] Multi-Stage Pre-training for Low-Resource Domain Adaptation. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 5461-5468.