50 entries in total
- [1] Revisiting Pre-trained Models for Chinese Natural Language Processing. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 657-668
- [3] Exploiting Word Semantics to Enrich Character Representations of Chinese Pre-trained Models. Natural Language Processing and Chinese Computing (NLPCC 2022), Part I, 2022, 13551: 3-15
- [5] Impact of Morphological Segmentation on Pre-trained Language Models. Intelligent Systems, Part II, 2022, 13654: 402-416
- [6] Lattice-BERT: Leveraging Multi-Granularity Representations in Chinese Pre-trained Language Models. 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 1716-1731
- [8] Revisiting Pre-trained Language Models and their Evaluation for Arabic Natural Language Processing. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP): 3135-3151
- [10] Enhancing Pre-trained Chinese Character Representation with Word-aligned Attention. 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020), 2020: 3442-3448