A new hyper-parameter optimization method for machine learning in fault classification

Cited by: 5
Authors
Ye, Xingchen [1 ]
Gao, Liang [1 ]
Li, Xinyu [1 ]
Wen, Long [2 ]
Affiliations
[1] Huazhong Univ Sci & Technol, State Key Lab Digital Mfg Equipment & Technol, 1037 Luoyu Rd, Wuhan 430074, Peoples R China
[2] China Univ Geosci, Sch Mech Engn & Elect Informat, 388 Lumo Rd, Wuhan 430074, Peoples R China
Funding
National Key Research and Development Program of China;
Keywords
Hyper-parameter optimization; Fault classification; Dimension reduction; Partial dependencies; CONVOLUTIONAL NEURAL-NETWORK; FEATURE-SELECTION; DIAGNOSIS; MODEL;
DOI
10.1007/s10489-022-04238-0
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Accurate bearing fault classification is essential for the safe and stable operation of rotating machinery. The success of Machine Learning (ML) in fault classification depends mainly on efficient features and well-chosen pre-defined hyper-parameters. Various hyper-parameter optimization (HPO) methods have been proposed to tune the hyper-parameters of ML algorithms in low-dimensional spaces, but they ignore the hyper-parameters of Feature Engineering (FE). Because both FE and the ML algorithm contain many hyper-parameters, the overall hyper-parameter dimension is high. This paper proposes a new HPO method for high-dimensional spaces based on dimension reduction and partial dependencies. First, the whole hyper-parameter space is separated into two subspaces, one for FE and one for the ML algorithm, to reduce time consumption. Second, since the relationships among the hyper-parameters are nonlinear, partial dependencies are used to identify the sensitive intervals of the hyper-parameters. HPO is then conducted within these intervals to obtain higher accuracy. The proposed method is verified on three OpenML datasets and the CWRU bearing dataset. The results show that it automatically constructs efficient domain features and outperforms traditional HPO methods and well-known ML algorithms, while also being very time-efficient.
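For illustration, the Python sketch below mirrors the three-stage workflow the abstract outlines: the hyper-parameter space is split into an FE subspace and an ML-algorithm subspace, a surrogate fitted on coarsely sampled configurations is queried via partial dependence to locate a sensitive interval for each hyper-parameter, and the search is then refined inside those intervals only. This is a minimal sketch under stated assumptions, not the authors' implementation: the stand-in dataset (scikit-learn's breast-cancer data), the PCA + random-forest pipeline, the search ranges, the 25% interval width, and the helper evaluate() are all illustrative choices.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.inspection import partial_dependence
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

X, y = load_breast_cancer(return_X_y=True)   # stand-in dataset, not CWRU/OpenML
rng = np.random.default_rng(0)

def evaluate(n_components, n_estimators, max_depth):
    # Cross-validated accuracy of one configuration; PCA plays the role of the
    # FE subspace, the random forest the role of the ML-algorithm subspace.
    model = make_pipeline(
        PCA(n_components=int(n_components)),
        RandomForestClassifier(n_estimators=int(n_estimators),
                               max_depth=int(max_depth), random_state=0),
    )
    return cross_val_score(model, X, y, cv=3).mean()

# Stage 1: coarse random sampling over the full hyper-parameter ranges.
bounds = {"n_components": (2, 30), "n_estimators": (10, 300), "max_depth": (2, 20)}
samples = np.column_stack([rng.uniform(lo, hi, 40) for lo, hi in bounds.values()])
scores = np.array([evaluate(*row) for row in samples])

# Stage 2: fit a surrogate and use partial dependence to locate, for each
# hyper-parameter, an interval around the value with the best average response.
surrogate = RandomForestRegressor(random_state=0).fit(samples, scores)
intervals = {}
for j, (name, (lo, hi)) in enumerate(bounds.items()):
    pd_res = partial_dependence(surrogate, samples, features=[j], grid_resolution=20)
    grid, avg = pd_res["grid_values"][0], pd_res["average"][0]   # key names per scikit-learn >= 1.3
    best = grid[np.argmax(avg)]
    width = 0.25 * (hi - lo)            # assumed width of the "sensitive" interval
    intervals[name] = (max(lo, best - width), min(hi, best + width))

# Stage 3: refined random search restricted to the sensitive intervals.
refined = np.column_stack([rng.uniform(lo, hi, 20) for lo, hi in intervals.values()])
refined_scores = [evaluate(*row) for row in refined]
best_idx = int(np.argmax(refined_scores))
print("sensitive intervals:", intervals)
print("best refined config:", dict(zip(bounds, refined[best_idx])), refined_scores[best_idx])

In this sketch the surrogate and interval heuristic stand in for the paper's partial-dependence analysis; the same refinement loop could be repeated, or the FE and ML subspaces could be optimized in alternation, depending on the time budget.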
Pages: 14182 - 14200
Number of pages: 19
Related papers
50 records in total
  • [1] A new hyper-parameter optimization method for machine learning in fault classification
    Ye, Xingchen
    Gao, Liang
    Li, Xinyu
    Wen, Long
    APPLIED INTELLIGENCE, 2023, 53 : 14182 - 14200
  • [2] An efficient hyper-parameter optimization method for supervised learning
    Shi, Ying
    Qi, Hui
    Qi, Xiaobo
    Mu, Xiaofang
    APPLIED SOFT COMPUTING, 2022, 126
  • [3] Federated learning with hyper-parameter optimization
    Kundroo, Majid
    Kim, Taehong
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2023, 35 (09)
  • [4] Classification complexity assessment for hyper-parameter optimization
    Cai, Ziyun
    Long, Yang
    Shao, Ling
    PATTERN RECOGNITION LETTERS, 2019, 125 : 396 - 403
  • [5] A study on depth classification of defects by machine learning based on hyper-parameter search
    Chen, Haoze
    Zhang, Zhijie
    Yin, Wuliang
    Zhao, Chenyang
    Wang, Fengxiang
    Li, Yanfeng
    MEASUREMENT, 2022, 189
  • [6] Hyper-Parameter Optimization Using MARS Surrogate for Machine-Learning Algorithms
    Li, Yangyang
    Liu, Guangyuan
    Lu, Gao
    Jiao, Licheng
    Marturi, Naresh
    Shang, Ronghua
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2020, 4 (03): : 287 - 297
  • [7] CNN hyper-parameter optimization for environmental sound classification
    Inik, Ozkan
    APPLIED ACOUSTICS, 2023, 202
  • [8] Cultural Events Classification using Hyper-parameter Optimization of Deep Learning Technique
    Feng, Zhipeng
    Gani, Hamdan
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12 (05) : 603 - 609
  • [9] Quantum Inspired High Dimensional Hyper-Parameter Optimization of Machine Learning Model
    Li, Yangyang
    Lu, Gao
    Zhou, Linhao
    Jiao, Licheng
    2017 INTERNATIONAL SMART CITIES CONFERENCE (ISC2), 2017,
  • [10] A New Baseline for Automated Hyper-Parameter Optimization
    Geitle, Marius
    Olsson, Roland
    MACHINE LEARNING, OPTIMIZATION, AND DATA SCIENCE, 2019, 11943 : 521 - 530