50 records in total
- [22] Exploiting Syntactic Information to Boost the Fine-tuning of Pre-trained Models. 2022 IEEE 46th Annual Computers, Software, and Applications Conference (COMPSAC 2022), 2022: 575-582.
- [23] On the Language Neutrality of Pre-trained Multilingual Representations. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 1663-1674.
- [25] Imparting Fairness to Pre-Trained Biased Representations. 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2020), 2020: 75-82.
- [26] TrOCR: Transformer-Based Optical Character Recognition with Pre-trained Models. Thirty-Seventh AAAI Conference on Artificial Intelligence, Vol. 37, No. 11, 2023: 13094-13102.
- [27] Connecting Pre-trained Language Models and Downstream Tasks via Properties of Representations. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [28] Radical-vectors with Pre-trained Models for Chinese Text Classification. 2022 Euro-Asia Conference on Frontiers of Computer Science and Information Technology (FCSIT), 2022: 12-15.
- [29] Sub-word Information in Pre-trained Biomedical Word Representations: Evaluation and Hyper-parameter Optimization. SIGBioMed Workshop on Biomedical Natural Language Processing (BioNLP 2018), 2018: 56-66.
- [30] PTCSpell: Pre-trained Corrector Based on Character Shape and Pinyin for Chinese Spelling Correction. Findings of the Association for Computational Linguistics: ACL 2023, 2023: 6330-6343.