50 records in total
[41] Distilling Knowledge from an Ensemble of Models for Punctuation Prediction [C]// 18th Annual Conference of the International Speech Communication Association (INTERSPEECH 2017), Vols 1-6: Situated Interaction. 2017: 2779-2783.
[42] KENet: Distilling Convolutional Networks via Knowledge Enhancement [J]. IFAC-PapersOnLine, 2020, 53(5): 385-390.
[43] A Teacher-Student Knowledge Distillation Framework for Enhanced Detection of Anomalous User Activity [C]// 2023 IEEE 24th International Conference on Information Reuse and Integration for Data Science (IRI). 2023: 20-21.
[45] Distilling Knowledge on Text Graph for Social Media Attribute Inference [C]// Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '22). 2022: 2024-2028.
[46] Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning [C]// 2023 IEEE International Conference on Multimedia and Expo (ICME). 2023: 1943-1948.
[47] Distilling knowledge from ensembles of neural networks for speech recognition [C]// 17th Annual Conference of the International Speech Communication Association (INTERSPEECH 2016), Vols 1-5: Understanding Speech Processing in Humans and Machines. 2016: 3439-3443.
[50] Teacher-Student Synergetic Knowledge Distillation for Detecting Alcohol Consumption in NIR Iris Images [C]// Computer Analysis of Images and Patterns (CAIP 2023), Pt II. 2023, 14185: 162-171.