Total: 50 records
[32] Accelerating Training of Transformer-Based Language Models with Progressive Layer Dropping [J]. Advances in Neural Information Processing Systems (NeurIPS 2020), 2020, 33.
[33] Arlo: Serving Transformer-based Language Models with Dynamic Input Lengths [J]. 53rd International Conference on Parallel Processing (ICPP 2024), 2024: 367-376.
[34] Enhancing Address Data Integrity using Transformer-Based Language Models [J]. 32nd IEEE Signal Processing and Communications Applications Conference (SIU 2024), 2024.
[36] Transformer-Based Streaming ASR with Cumulative Attention [J]. 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022: 8272-8276.
[37] Attention Calibration for Transformer-based Sequential Recommendation [J]. Proceedings of the 32nd ACM International Conference on Information and Knowledge Management (CIKM 2023), 2023: 3595-3605.
[38] Korean Sign Language Recognition Using Transformer-Based Deep Neural Network [J]. Applied Sciences-Basel, 2023, 13(5).
[39] Quantifying the Bias of Transformer-Based Language Models for African American English in Masked Language Modeling [J]. Advances in Knowledge Discovery and Data Mining (PAKDD 2023), Pt I, 2023, 13935: 532-543.
[40] Incorporating Medical Knowledge to Transformer-based Language Models for Medical Dialogue Generation [J]. Proceedings of the 21st Workshop on Biomedical Language Processing (BioNLP 2022), 2022: 110-115.