[31] Tea Buds Grading Method Based on Multiscale Attention Mechanism and Knowledge Distillation[J]. Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2022, 53(9): 399-407, 458.
[37] Knowledge Distillation in RNN-Attention Models for Early Prediction of Student Performance[C]//Proceedings of the 40th Annual ACM Symposium on Applied Computing, 2025: 64-73.
[39] Attention-Fused CNN Model Compression with Knowledge Distillation for Brain Tumor Segmentation[C]//Medical Image Understanding and Analysis (MIUA 2022), 2022, 13413: 328-338.