50 records in total
- [1] Rethinking Model Selection and Decoding for Keyphrase Generation with Pre-trained Sequence-to-Sequence Models. 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023), 2023: 6642–6658
- [2] Pre-Trained Multilingual Sequence-to-Sequence Models: A Hope for Low-Resource Language Translation? Findings of the Association for Computational Linguistics (ACL 2022), 2022: 58–67
- [4] PhoBERT: Pre-trained language models for Vietnamese. Findings of the Association for Computational Linguistics (EMNLP 2020), 2020: 1037–1042
- [7] Sparse Sequence-to-Sequence Models. 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 1504–1519
- [9] Active Learning with Deep Pre-trained Models for Sequence Tagging of Clinical and Biomedical Texts. 2019 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), 2019: 482–489
- [10] Active Learning for Sequence Tagging with Deep Pre-trained Models and Bayesian Uncertainty Estimates. 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), 2021: 1698–1712