A Self-Attention Integrated Learning Model for Landing Gear Performance Prediction

Cited by: 2
Authors
Lin, Lin [1 ]
Tong, Changsheng [1 ]
Guo, Feng [1 ]
Fu, Song [1 ]
Lv, Yancheng [1 ]
He, Wenhui [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Mechatron Engn, Harbin 150001, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation
Keywords
performance prediction; feature selection; data distribution; integrated learning; self-attention; DESIGN
DOI
10.3390/s23136219
CLC number
O65 [Analytical Chemistry]
Discipline codes
070302; 081704
Abstract
The landing gear structure suffers large loads during aircraft takeoff and landing, so accurate prediction of landing gear performance helps ensure flight safety. Nevertheless, machine-learning-based landing gear performance prediction relies heavily on the dataset, whose feature dimension and data distribution strongly affect prediction accuracy. To address these issues, a novel MCA-MLPSA model is developed. First, an MCA (multiple correlation analysis) method is proposed to select key features. Second, a heterogeneous multilearner integration framework built on different base learners is proposed. Third, an MLPSA (multilayer perceptron with self-attention) model is proposed to adaptively capture the data distribution and adjust the weight of each base learner. Finally, the excellent prediction performance of the proposed MCA-MLPSA is validated by a series of experiments on the landing gear data.
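The abstract's three-stage pipeline (correlation-based feature selection, heterogeneous base learners, attention-style weighting of their outputs) can be sketched as below. This is a hedged illustration under stated assumptions, not the paper's implementation: the synthetic data, the plain |Pearson| ranking standing in for the full MCA, the ridge/k-NN learner pair, and the softmax-over-validation-error weighting (a fixed-weight stand-in for the learned per-sample MLPSA attention) are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the landing gear dataset (hypothetical values):
# 200 samples, 8 features, only the first three drive the target.
X = rng.normal(size=(200, 8))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.8 * X[:, 2] + 0.1 * rng.normal(size=200)

# Stage 1 -- key-feature selection. The paper's MCA combines multiple
# correlation measures; a plain |Pearson| ranking stands in for it here.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
keep = np.argsort(corr)[::-1][:3]
Xs = X[:, keep]

# Train / validation / test split.
Xtr, ytr = Xs[:120], y[:120]
Xva, yva = Xs[120:150], y[120:150]
Xte, yte = Xs[150:], y[150:]

# Stage 2 -- heterogeneous base learners (here: ridge regression and k-NN).
def ridge(Xf, yf, Xp, lam=1.0):
    w = np.linalg.solve(Xf.T @ Xf + lam * np.eye(Xf.shape[1]), Xf.T @ yf)
    return Xp @ w

def knn(Xf, yf, Xp, k=5):
    d = np.linalg.norm(Xp[:, None, :] - Xf[None, :, :], axis=2)
    return yf[np.argsort(d, axis=1)[:, :k]].mean(axis=1)

learners = [ridge, knn]
val_preds = np.stack([f(Xtr, ytr, Xva) for f in learners], axis=1)
test_preds = np.stack([f(Xtr, ytr, Xte) for f in learners], axis=1)

# Stage 3 -- attention-style weighting. The MLPSA learns per-sample weights
# with self-attention; here a softmax over each learner's validation MSE
# yields fixed ensemble weights (a deliberate simplification).
val_mse = ((val_preds - yva[:, None]) ** 2).mean(axis=0)
weights = np.exp(-val_mse / 0.05)
weights /= weights.sum()

y_hat = test_preds @ weights
test_mse = float(((y_hat - yte) ** 2).mean())
print(keep, weights.round(3), round(test_mse, 4))
```

Note the softmax temperature (0.05) sharpens the weighting toward the stronger learner; the actual MLPSA instead adapts the weights to the local data distribution of each sample.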
Pages: 24
Related Papers
50 records
  • [41] The implicit mathematical reasoning model combining self-attention and convolution
    Yao, Zhuangkai
    Zeng, Bi
    Hu, Huiting
    Wei, Pengfei
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2023, 45 (01) : 975 - 988
  • [42] A mutual embedded self-attention network model for code search
    Hu, Haize
    Liu, Jianxun
    Zhang, Xiangping
    Cao, Ben
    Cheng, Siqiang
    Long, Teng
    JOURNAL OF SYSTEMS AND SOFTWARE, 2023, 198
  • [43] Self-Attention based Siamese Neural Network recognition Model
    Liu, Yuxing
    Chang, Geng
    Fu, Guofeng
    Wei, Yingchao
    Lan, Jie
    Liu, Jiarui
    2022 34TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2022, : 721 - 724
  • [44] BSLKT: A Bagging Model with Self-Attention and LightGBM for Knowledge Tracing
    Zhang, Zhuoxu
    Li, Haoyun
    PROCEEDINGS OF ACM TURING AWARD CELEBRATION CONFERENCE, ACM TURC 2021, 2021, : 126 - 130
  • [45] Prediction of oil production based on multivariate analysis and self-attention mechanism integrated with long short-term memory
    Yan, Hua
    Liu, Ming
    Yang, Bin
    Yang, Yang
    Ni, Hu
    Wang, Haoyu
    Wang, Ying
    PETROLEUM SCIENCE AND TECHNOLOGY, 2024
  • [46] A Self-attention Based Model for Offline Handwritten Text Recognition
    Nam Tuan Ly
    Trung Tan Ngo
    Nakagawa, Masaki
    PATTERN RECOGNITION, ACPR 2021, PT II, 2022, 13189 : 356 - 369
  • [47] Discriminant Feature Learning with Self-attention for Person Re-identification
    Li, Yang
    Jiang, Xiaoyan
    Hwang, Jenq-Neng
    NEURAL INFORMATION PROCESSING, ICONIP 2019, PT V, 2019, 1143 : 11 - 19
  • [48] A self-attention model with contrastive learning for online group recommendation in event-based social networks
    Zhou, Zhiheng
    Huang, Xiaomei
    Xiong, Naixue
    Liao, Guoqiong
    Deng, Xiaobin
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (07) : 9713 - 9741
  • [49] Learning-based correspondence classifier with self-attention hierarchical network
    Chu, Mingfan
    Ma, Yong
    Mei, Xiaoguang
    Huang, Jun
    Fan, Fan
    APPLIED INTELLIGENCE, 2023, 53 (20) : 24360 - 24376
  • [50] Self-attention guided representation learning for image-text matching
    Qi, Xuefei
    Zhang, Ying
    Qi, Jinqing
    Lu, Huchuan
    NEUROCOMPUTING, 2021, 450 : 143 - 155