50 items in total
- [1] Improving AMR-to-text Generation with Multi-task Pre-training. Ruan Jian Xue Bao/Journal of Software, 2021, 32(10): 3036-3050
- [3] Pre-training Multi-task Contrastive Learning Models for Scientific Literature Understanding. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 12259-12275
- [6] Multi-task Pre-training with Soft Biometrics for Transfer-learning Palmprint Recognition. Neural Processing Letters, 2023, 55: 2341-2358
- [7] A Multi-Task Semantic Decomposition Framework with Task-specific Pre-training for Few-Shot NER. Proceedings of the 32nd ACM International Conference on Information and Knowledge Management (CIKM 2023), 2023: 430-440
- [8] A Multi-Task Learning Framework for Abstractive Text Summarization. Thirty-Third AAAI Conference on Artificial Intelligence (AAAI 2019), 2019: 9987-9988