50 items in total
- [1] Fixed Encoder Self-Attention Patterns in Transformer-Based Machine Translation. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 556-568
- [2] Streaming Transformer-based Acoustic Models Using Self-attention with Augmented Memory. Interspeech 2020, 2020: 2132-2136
- [4] Simplified Self-Attention for Transformer-Based End-to-End Speech Recognition. 2021 IEEE Spoken Language Technology Workshop (SLT), 2021: 75-81
- [5] Re-Transformer: A Self-Attention Based Model for Machine Translation. AI in Computational Linguistics, 2021, 189: 3-10
- [6] Synthesizer: Rethinking Self-Attention for Transformer Models. International Conference on Machine Learning, 2021, 139: 7192-7203
- [7] Roles and Utilization of Attention Heads in Transformer-based Neural Language Models. 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020), 2020: 3404-3417
- [8] Transformer-Based Models for Predicting Molecular Structures from Infrared Spectra Using Patch-Based Self-Attention. Journal of Physical Chemistry A, 2025, 129 (08): 2077-2085
- [10] Transformer-Based Dual-Channel Self-Attention for UUV Autonomous Collision Avoidance. IEEE Transactions on Intelligent Vehicles, 2023, 8 (03): 2319-2331