A hybrid model compression approach via knowledge distillation for predicting energy consumption in additive manufacturing

Cited by: 4
Authors
Li, Yixin [1]
Hu, Fu [1]
Liu, Ying [1]
Ryan, Michael [1]
Wang, Ray [2]
Affiliations
[1] Cardiff Univ, Inst Mech & Mfg Engn, Sch Engn, Cardiff, Wales
[2] Unicmicro Co Ltd, Guangzhou, Peoples R China
Keywords
Additive manufacturing; deep learning; neural network compression; knowledge distillation; energy consumption prediction; INDUSTRY 4.0; ENSEMBLE; OPTIMIZATION; SUSTAINABILITY; DEMAND
DOI
10.1080/00207543.2022.2160501
Chinese Library Classification (CLC)
T [Industrial Technology]
Subject Classification Code
08
Abstract
Recently, the high energy consumption of additive manufacturing (AM) has received increasing attention. Knowledge distillation (KD) reduces the complexity and computational load of predictive models that extract hidden information or highly representative features from energy-relevant data. However, because the teacher and student models are largely predetermined and fixed, the distillation process restricts how knowledge is transferred from one model to the other. To reduce computational costs while maintaining acceptable performance, a teacher assistant (TA) was added to the teacher-student architecture. Firstly, a teacher ensemble was formed by combining three baseline models to enhance accuracy. Secondly, a TA model was trained to bridge the capacity gap between the ensemble and the simplified student model, thereby reducing the student model's complexity. Using geometry-based features derived from layer-wise image data, a KD-based predictive model was developed, and its feasibility and effectiveness were evaluated against two independently trained student models. Compared with the independently trained student models, the proposed method achieved the lowest RMSE, MAE, and training time.
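As an illustration of the staged pipeline described above (teacher ensemble, then teacher assistant, then compact student), the following is a minimal sketch of distillation for a regression target such as energy consumption. It assumes PyTorch; the network sizes, feature dimension, loss weight alpha, and the synthetic data are hypothetical placeholders rather than the architectures or settings reported in the paper.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset


def mlp(in_dim, hidden, out_dim=1):
    # Simple fully connected regressor; depth and width are illustrative only.
    layers, d = [], in_dim
    for h in hidden:
        layers += [nn.Linear(d, h), nn.ReLU()]
        d = h
    layers.append(nn.Linear(d, out_dim))
    return nn.Sequential(*layers)


def fit(model, loader, epochs=10, lr=1e-3):
    # Plain supervised training, used here for the baseline teacher models.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    mse = nn.MSELoss()
    for _ in range(epochs):
        for x, y in loader:
            loss = mse(model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model


def distill(student, teacher, loader, alpha=0.5, epochs=10, lr=1e-3):
    # Train the student on a weighted mix of the ground-truth loss and the
    # loss against the frozen teacher's predictions (the "soft" targets).
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    mse = nn.MSELoss()
    teacher.eval()
    for _ in range(epochs):
        for x, y in loader:
            with torch.no_grad():
                soft_target = teacher(x)
            pred = student(x)
            loss = alpha * mse(pred, y) + (1 - alpha) * mse(pred, soft_target)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student


class Ensemble(nn.Module):
    # Averages the predictions of the member teacher models.
    def __init__(self, members):
        super().__init__()
        self.members = nn.ModuleList(members)

    def forward(self, x):
        return torch.stack([m(x) for m in self.members]).mean(dim=0)


# Synthetic stand-in data so the sketch runs end to end (hypothetical values).
in_dim = 16                                   # placeholder feature count
X = torch.randn(256, in_dim)                  # geometry-based features
y = X.sum(dim=1, keepdim=True)                # placeholder energy target
loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)

# Stage 1: train three baseline teachers and combine them into an ensemble.
ensemble = Ensemble([fit(mlp(in_dim, [128, 128]), loader) for _ in range(3)])

# Stage 2: distil the ensemble into a mid-sized teacher assistant (TA).
assistant = distill(mlp(in_dim, [64]), ensemble, loader)

# Stage 3: distil the TA into the compact student used for prediction.
student = distill(mlp(in_dim, [16]), assistant, loader)

Because the target is a continuous energy value rather than class logits, the teacher's raw prediction serves as the soft target and a single weight alpha balances it against the ground truth; the paper's actual loss formulation and weighting may differ.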
Pages: 4525-4547
Number of pages: 23