An attention-based long short-term memory prediction model for working conditions of copper electrolytic plates

Cited: 4
Authors
Zhu, Hongqiu [1 ,2 ]
Peng, Lei [1 ]
Zhou, Can [1 ]
Dai, Yusi [1 ]
Peng, Tianyu [1 ]
Affiliations
[1] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
[2] Cent South Univ, State Key Lab High Performance Complex Mfg, Changsha 410083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
plate states prediction; average gray value; LSTM; attention mechanism; SHORT-CIRCUIT DETECTION; MECHANISM;
DOI
10.1088/1361-6501/acc11f
Chinese Library Classification
T [Industrial Technology];
Subject Classification Code
08;
Abstract
Copper is an important non-ferrous metal, and electrolytic refining is one of the main methods of producing refined copper. In the electrolytic process, the working states of the plates strongly affect the output and quality of the copper, so timely and accurate prediction of these states is of great significance to the copper electrolytic refining process. To address the large lag, poor anti-interference ability and low accuracy of traditional plate-state detection algorithms, this paper proposes a plate-state prediction model based on a long short-term memory (LSTM) neural network with an attention mechanism. The average gray values of the plates in infrared imagery are used to characterize their working states. To cope with the large fluctuations and the large volume of time-series data involved, a double-layer LSTM structure is adopted to improve the efficiency and accuracy of model training. Meanwhile, given the periodicity of the time-series data and the possible correlation between adjacent samples, a dedicated attention mechanism is proposed that enables the model to learn this correlation between adjacent data and thereby improve prediction accuracy. Experimental results show that the proposed model reaches an accuracy of 95.11% for plate-state prediction and demonstrates stronger prediction ability than commonly used prediction algorithms.
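The abstract outlines the architecture (a double-layer LSTM whose hidden states are weighted by an attention mechanism before the plate state is classified) but not its exact formulation. The following is a minimal PyTorch sketch of that general idea, assuming a window of average gray values as input; the class name AttnLSTMPlateClassifier, the generic dot-product attention, and all layer sizes are illustrative placeholders, not the authors' implementation (whose attention mechanism is specifically designed around correlations between adjacent samples).

```python
# Minimal sketch (not the authors' implementation): two stacked LSTM layers,
# a simple dot-product attention over the hidden states, and a classifier
# that maps the attended context to a plate-state label.
import torch
import torch.nn as nn

class AttnLSTMPlateClassifier(nn.Module):
    def __init__(self, input_size=1, hidden_size=64, num_states=3):
        super().__init__()
        # Double-layer LSTM over the gray-value time series
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers=2, batch_first=True)
        # Learned query vector used to score each timestep (generic attention;
        # the paper's adjacency-aware mechanism is not reproduced here)
        self.query = nn.Parameter(torch.randn(hidden_size))
        self.fc = nn.Linear(hidden_size, num_states)

    def forward(self, x):                       # x: (batch, seq_len, 1)
        out, _ = self.lstm(x)                   # (batch, seq_len, hidden)
        scores = out @ self.query               # (batch, seq_len)
        weights = torch.softmax(scores, dim=1)  # attention weights over timesteps
        context = (weights.unsqueeze(-1) * out).sum(dim=1)  # (batch, hidden)
        return self.fc(context)                 # logits over plate states

# Example: a batch of 8 windows, each holding 30 consecutive average gray values
model = AttnLSTMPlateClassifier()
logits = model(torch.randn(8, 30, 1))
print(logits.shape)  # torch.Size([8, 3])
```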
Pages: 11
Related Papers
50 records in total
  • [31] A Customized Attention-Based Long Short-Term Memory Network for Distant Supervised Relation Extraction
    He, Dengchao
    Zhang, Hongjun
    Hao, Wenning
    Zhang, Rui
    Cheng, Kai
    NEURAL COMPUTATION, 2017, 29 (07) : 1964 - 1985
  • [32] Biomedical Ontology Matching Through Attention-Based Bidirectional Long Short-Term Memory Network
    Xue, Xingsi
    Jiang, Chao
    Zhang, Jie
    Hu, Cong
    JOURNAL OF DATABASE MANAGEMENT, 2021, 32 (04) : 14 - 27
  • [33] Effective Attention-based Neural Architectures for Sentence Compression with Bidirectional Long Short-Term Memory
    Nhi-Thao Tran
    Viet-Thang Luong
    Ngan Luu-Thuy Nguyen
    Minh-Quoc Nghiem
    PROCEEDINGS OF THE SEVENTH SYMPOSIUM ON INFORMATION AND COMMUNICATION TECHNOLOGY (SOICT 2016), 2016, : 123 - 130
  • [34] Milling tool wear prediction: optimized long short-term memory model based on attention mechanism
    Liu, Yiming
    Yang, Shucai
    Sun, Tao
    Zhang, Yuhua
    FERROELECTRICS, 2023, 607 (01) : 56 - 72
  • [35] Dam Deformation Interpretation and Prediction Based on a Long Short-Term Memory Model Coupled with an Attention Mechanism
    Su, Yan
    Weng, Kailiang
    Lin, Chuan
    Chen, Zeqin
    APPLIED SCIENCES-BASEL, 2021, 11 (14):
  • [36] Tool Wear Prediction Based on Adaptive Feature and Temporal Attention with Long Short-Term Memory Model
    Wang, Wanzhen
    Ngu, Sze Song
    Xin, Miaomiao
    Liu, Rong
    Wang, Qian
    Qiu, Man
    Zhang, Shengqun
    INTERNATIONAL JOURNAL OF ENGINEERING AND TECHNOLOGY INNOVATION, 2024, 14 (03) : 271 - 284
  • [37] Attention-based convolutional neural network and long short-term memory for short-term detection of mood disorders based on elicited speech responses
    Huang, Kun-Yi
    Wu, Chung-Hsien
    Su, Ming-Hsiang
    PATTERN RECOGNITION, 2019, 88 : 668 - 678
  • [38] A novel attention-based long short term memory and fully connected neural network approach for production energy consumption prediction under complex working conditions
    Yang, Yanfang
    Gao, Jujian
    Xiao, Jinhua
    Zhang, Xiaoshu
    Eynard, Benoit
    Pei, Eujin
    Shu, Liang
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 133
  • [40] Soybean futures price prediction with dual-stage attention-based long short-term memory: a decomposition and extension approach
    Fan, Kun
    Hu, Yanrong
    Liu, Hongjiu
    Liu, Qingyang
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2023, 45 (06) : 10579 - 10602