A Spatial-Temporal Adaptive Graph Convolutional Network with Multi-Sensor Signals for Tool Wear Prediction

Times Cited: 0
Authors
Xia, Yu [1 ]
Zheng, Guangji [1 ]
Li, Ye [1 ]
Liu, Hui [1 ]
Affiliations
[1] Central South University, Institute of Artificial Intelligence and Robotics (IAIR), School of Traffic and Transportation Engineering, Key Laboratory of Traffic Safety on Track, Ministry of Education, Changsha 410075, People's Republic of China
Source
APPLIED SCIENCES-BASEL | 2025, Vol. 15, No. 4
Keywords
tool wear prediction; spatial-temporal graph neural network; attention mechanism; multi-sensor fusion
DOI
10.3390/app15042058
CLC Classification
O6 [Chemistry]
Subject Classification
0703
Abstract
Tool wear monitoring is crucial for optimizing cutting performance, reducing costs, and improving production efficiency. Existing tool wear prediction models usually combine a convolutional neural network (CNN) and a recurrent neural network (RNN) to extract spatial and temporal features separately. However, such models ignore the topological structure of the multi-sensor network, which limits their ability to extract spatial features. To overcome these limitations, a novel spatial-temporal adaptive graph convolutional network (STAGCN) is proposed to capture spatial-temporal dependencies in multi-sensor signals. First, a simple linear model captures temporal patterns in each individual time series. Second, a spatial-temporal layer composed of a bidirectional Mamba and an adaptive graph convolution extracts degradation features and reflects the dynamic degradation trend through an adaptively learned graph. Third, multi-scale triple linear attention (MTLA) fuses the extracted multi-scale features across the spatial, temporal, and channel dimensions, adaptively assigning weights that retain important information and suppress redundant features. Finally, the fused features are fed into a linear regression layer to estimate tool wear. Experiments on the PHM2010 dataset demonstrate the effectiveness of the proposed STAGCN model, which achieves a mean absolute error (MAE) of 3.40 μm and a root mean square error (RMSE) of 4.32 μm averaged over the three cutter datasets.
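To make the adaptive-graph idea in the abstract concrete, the sketch below is a minimal PyTorch illustration of an adaptive graph convolution in the general sense used in the spatial-temporal GNN literature: the adjacency between sensor channels is learned from node embeddings rather than fixed by physical topology. It is an assumption-based sketch, not the authors' implementation; the class name, embedding dimension, and feature shapes are illustrative (PHM2010 provides 7 sensor channels: three force, three vibration, and one acoustic-emission signal).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveGraphConv(nn.Module):
    """Adaptive graph convolution over sensor channels (illustrative sketch).

    Instead of a fixed sensor topology, the adjacency matrix is learned
    from two node-embedding tables, so the spatial dependencies among
    sensor channels can adapt during training.
    """

    def __init__(self, num_nodes, in_dim, out_dim, emb_dim=16):
        super().__init__()
        # Learnable node embeddings that parameterize the adaptive adjacency.
        self.e1 = nn.Parameter(torch.randn(num_nodes, emb_dim))
        self.e2 = nn.Parameter(torch.randn(num_nodes, emb_dim))
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x):
        # x: (batch, num_nodes, in_dim) -- one feature vector per sensor channel.
        # Row-normalized similarity of node embeddings acts as the learned graph.
        adj = F.softmax(F.relu(self.e1 @ self.e2.t()), dim=-1)  # (N, N)
        # Propagate features over the learned graph, then project per node.
        return self.proj(adj @ x)  # (batch, num_nodes, out_dim)


# Example: PHM2010-style input with 7 sensor channels and 32-dim features.
layer = AdaptiveGraphConv(num_nodes=7, in_dim=32, out_dim=64)
out = layer(torch.randn(8, 7, 32))  # -> shape (8, 7, 64)
```

Learning the adjacency this way lets the model discover, for example, that force and vibration channels degrade in a correlated fashion, which a CNN applied channel-wise cannot express.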
Pages: 24