A hybrid model compression approach via knowledge distillation for predicting energy consumption in additive manufacturing

Cited by: 4
Authors
Li, Yixin [1 ]
Hu, Fu [1 ]
Liu, Ying [1 ]
Ryan, Michael [1 ]
Wang, Ray [2 ]
Affiliations
[1] Cardiff University, Institute of Mechanical and Manufacturing Engineering, School of Engineering, Cardiff, Wales
[2] Unicmicro Co. Ltd, Guangzhou, People's Republic of China
Keywords
Additive manufacturing; deep learning; neural network compression; knowledge distillation; energy consumption prediction; Industry 4.0; ensemble; optimization; sustainability; demand
DOI
10.1080/00207543.2022.2160501
Chinese Library Classification
T (Industrial Technology)
Subject classification code
08
Abstract
Recently, additive manufacturing (AM) has received increased attention due to its high energy consumption. By extracting hidden information or highly representative features from energy-relevant data, knowledge distillation (KD) reduces predictive model complexity and computational load. However, because the teacher and student models are largely predetermined and fixed, conventional distillation restricts the transfer of knowledge from one model to another. To reduce computational costs while maintaining acceptable performance, a teacher assistant (TA) was added to the teacher-student architecture. First, three baseline models were combined into a teacher ensemble to enhance accuracy. Second, a teacher assistant was introduced to bridge the capacity gap between the ensemble and the simplified student model, thereby reducing the student model's complexity. Using geometry-based features derived from layer-wise image data, a KD-based predictive model was developed, and its feasibility and effectiveness were evaluated against two independently trained student models. Compared with the independently trained student models, the proposed method achieved the lowest RMSE, MAE, and training time.
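The pipeline described in the abstract (teacher ensemble → teacher assistant → student) can be sketched for a regression target such as energy consumption. This is a minimal illustration under stated assumptions, not the authors' implementation: the mean-averaged ensemble, the weighted two-term loss, and all names and values here are illustrative.

```python
def ensemble_predict(models, x):
    """Average the predictions of several baseline models (the 'teacher ensemble')."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

def kd_regression_loss(pred, target_pred, y_true, alpha=0.5):
    """Weighted regression distillation loss.

    Blends the squared error against the ground truth with the squared error
    against the prediction of the model being distilled from (teacher or TA).
    """
    return alpha * (pred - y_true) ** 2 + (1 - alpha) * (pred - target_pred) ** 2

# Example: the TA is trained to match the ensemble, then the student to match the TA.
teachers = [lambda x: 1.0 * x, lambda x: 1.1 * x, lambda x: 0.9 * x]  # three baselines
x, y_true = 2.0, 2.1
teacher_pred = ensemble_predict(teachers, x)   # ensemble output for this sample
ta_pred = 2.05                                 # hypothetical TA prediction
student_pred = 2.02                            # hypothetical student prediction

ta_loss = kd_regression_loss(ta_pred, teacher_pred, y_true)
student_loss = kd_regression_loss(student_pred, ta_pred, y_true)
```

The TA stage matters because a small student distilled directly from a large ensemble may be unable to close the capacity gap; distilling in two hops gives each stage a nearer target.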
Pages: 4525-4547
Page count: 23