Online performance and proactive maintenance assessment of data driven prediction models

Cited by: 2
Authors
Shen, Yingjun [1]
Wang, Taohong [1]
Song, Zhe [1,2]
Affiliations
[1] Nanjing Univ, Business Sch, Nanjing 210093, Peoples R China
[2] Nanjing Univ, Smart Business & Data Sci Res Ctr, Nanjing 210093, Peoples R China
Keywords
Data driven model selection; Predictive model deterioration; Online predictive performance; Proactive maintenance evaluation; Machine learning; SELECTION CRITERIA; MUTUAL INFORMATION; NEURAL-NETWORK; DATA STREAMS; WIND-SPEED; TIME; POWER
D O I
10.1007/s10845-024-02357-8
CLC Classification Code
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Many data-driven decisions in manufacturing require accurate and reliable predictions. Owing to the high complexity and variability of working conditions, a prediction model may deteriorate over time after deployment. Traditional performance evaluation indexes mainly assess a prediction model from a static perspective, which makes it difficult to meet the practical needs of model selection and proactive maintenance and results in unstable online prediction performance. For regression-based prediction models, this paper designs online prediction performance evaluation indexes (OPPEI) that evaluate a prediction model in terms of its accuracy, degradation speed, and stability. For proactive maintenance, this paper proposes a model maintenance evaluation method based on Principal Component Analysis (PCA). We use PCA to transform the various performance indexes and extract the first principal component as a model maintenance evaluation index, which reduces the over-sensitivity or insensitivity of any single indicator. The effectiveness of the online prediction performance evaluation indexes and the PCA-based proactive maintenance evaluation method is verified by simulation and several real-world load forecasting experiments.
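The PCA-based fusion step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the index names, the choice of window count, and the use of scikit-learn are all assumptions; the only elements taken from the abstract are that several performance indexes (accuracy, degradation speed, stability) are standardized and that the first principal component serves as the maintenance evaluation index.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def maintenance_index(index_matrix: np.ndarray) -> np.ndarray:
    """Fuse several online performance indexes into one maintenance score.

    index_matrix: shape (n_windows, n_indexes); one row per evaluation
    window, one column per performance index (e.g. accuracy,
    degradation speed, stability -- hypothetical column names).
    """
    # Standardize each index so that no single one dominates the PCA.
    z = StandardScaler().fit_transform(index_matrix)
    # The first principal component acts as the fused evaluation index.
    pca = PCA(n_components=1)
    return pca.fit_transform(z).ravel()

# Toy example: three indexes tracked over eight evaluation windows.
rng = np.random.default_rng(0)
indexes = rng.normal(size=(8, 3))
scores = maintenance_index(indexes)
print(scores.shape)  # one fused score per window
```

A maintenance decision could then be triggered when the fused score crosses a threshold, which is the kind of single-indicator over-sensitivity the abstract says the PCA fusion is meant to dampen.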
Pages: 3959-3993 (35 pages)