50 items total
- [1] Linear Complexity Randomized Self-Attention Mechanism. International Conference on Machine Learning (ICML), Vol. 162, 2022.
- [2] Singularformer: Learning to Decompose Self-Attention to Linearize the Complexity of Transformer. Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI 2023), 2023: 4433-4441.
- [4] Comparison of Low Complexity Self-Attention Mechanisms for Acoustic Event Detection. 2021 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), 2021: 1139-1143.
- [5] Shyness and Self-Attention. Bulletin of the British Psychological Society, 1983, 36 (Feb): A5.
- [7] Attention and Self-Attention in Random Forests. Progress in Artificial Intelligence, 2023, 12: 257-273.
- [8] SummaryMixing: A Linear-Complexity Alternative to Self-Attention for Speech Recognition and Understanding. Interspeech 2024, 2024: 3460-3464.
- [10] Self-Attention for Cyberbullying Detection. 2020 International Conference on Cyber Situational Awareness, Data Analytics and Assessment (CyberSA 2020), 2020.