- [1] Cross-Lingual Knowledge Distillation for Chinese Video Captioning. Jisuanji Xuebao/Chinese Journal of Computers, 2021, 44(9): 1907-1921
- [2] Knowledge Distillation Based Training of Universal ASR Source Models for Cross-lingual Transfer. INTERSPEECH 2021, 2021: 3450-3454
- [4] Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition. KDD '21: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 2021: 3231-3239
- [5] Mongolian-Chinese Cross-lingual Topic Detection Based on Knowledge Distillation. 2024 International Conference on Asian Language Processing (IALP 2024), 2024: 383-388
- [6] Conversations Powered by Cross-Lingual Knowledge. SIGIR '21: Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2021: 1442-1451
- [7] DiffSLU: Knowledge Distillation Based Diffusion Model for Cross-Lingual Spoken Language Understanding. INTERSPEECH 2023, 2023: 715-719
- [9] Distillation Language Adversarial Network for Cross-lingual Sentiment Analysis. 2022 International Conference on Asian Language Processing (IALP 2022), 2022: 45-50