[12] Self-knowledge distillation based on dynamic mixed attention [J]. Kongzhi yu Juece/Control and Decision, 2024, 39(12): 4099-4108.
[15] Boosting the Performance of Lightweight HAR Models with Attention and Knowledge Distillation [C]. 2024 International Conference on Intelligent Environments (IE 2024), 2024: 1-8.
[16] Fast and Scalable Recommendation Retrieval Model with Mixed Attention and Knowledge Distillation [C]. Intelligent Data Engineering and Automated Learning (IDEAL 2024), Part I, 2025, 15346: 244-253.
[17] KDAS: Knowledge Distillation via Attention Supervision Framework for Polyp Segmentation [C]. 2024 IEEE International Conference on Multimedia and Expo (ICME 2024), 2024.
[18] Knowledge Distillation with Category-Aware Attention and Discriminant Logit Losses [C]. 2019 IEEE International Conference on Multimedia and Expo (ICME), 2019: 1792-1797.