50 entries in total
- [21] Learning to Select Pre-trained Deep Representations with Bayesian Evidence Framework. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016: 5318-5326
- [22] XPhoneBERT: A Pre-trained Multilingual Model for Phoneme Representations for Text-to-Speech. Interspeech 2023, 2023: 5506-5510
- [23] Exploiting Word Semantics to Enrich Character Representations of Chinese Pre-trained Models. Natural Language Processing and Chinese Computing (NLPCC 2022), Pt I, 2022, 13551: 3-15
- [24] Data-Centric Explainable Debiasing for Improving Fairness in Pre-trained Language Models. Findings of the Association for Computational Linguistics: ACL 2024, 2024: 3773-3786
- [25] FairFix: Enhancing Fairness of Pre-trained Deep Neural Networks with Scarce Data Resources. Proceedings of the 2024 IEEE 10th International Conference on Intelligent Data and Security (IDS 2024), 2024: 14-20
- [26] Connecting Pre-trained Language Models and Downstream Tasks via Properties of Representations. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023
- [28] Enhancing Pre-Trained Language Representations with Rich Knowledge for Machine Reading Comprehension. 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 2346-2357
- [29] BiTimeBERT: Extending Pre-Trained Language Representations with Bi-Temporal Information. Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2023), 2023: 812-821
- [30] All Together Now! The Benefits of Adaptively Fusing Pre-trained Deep Representations. ICPRAM: Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods, 2019: 135-144