AM-RP Stacking PILers: Random projection stacking pseudoinverse learning algorithm based on attention mechanism

Times Cited: 1
Authors
Cai, Zhenjiao [1 ]
Zhang, Sulan [1 ]
Guo, Ping [2 ]
Zhang, Jifu [1 ]
Hu, Lihua [1 ]
Affiliations
[1] Taiyuan Univ Sci & Technol, Sch Comp Sci & Technol, Taiyuan 030024, Peoples R China
[2] Beijing Normal Univ, Sch Syst Sci, Beijing 100875, Peoples R China
Source
VISUAL COMPUTER | 2024, Vol. 40, Issue 01
Keywords
Stacking pseudoinverse learner; Random projection; Attention mechanism; Classification; Regression; Networks
DOI
10.1007/s00371-023-02780-7
CLC Classification Number
TP31 [Computer Software];
Discipline Classification Code
081202 ; 0835 ;
Abstract
The stacking pseudoinverse learning algorithm can effectively improve the classification accuracy of a model while reducing training time. However, the effect of random projection blocks on the neural network is often ignored, which degrades the performance of stacked generalization. To improve the generalization performance of neural networks and obtain high-quality classification results, we propose a random projection stacking pseudoinverse learning algorithm based on an attention mechanism, named AM-RP stacking PILers. First, input weight matrices with specific distributions are randomly generated, and different random projection blocks are then obtained through the pseudoinverse learning algorithm. Second, the outputs of the different random projection blocks are taken as a new input dataset, and an attention mechanism is introduced to assign a different weight to each random projection block. Finally, the stacking pseudoinverse learning algorithm is used to train a single-hidden-layer neural network and obtain classification results with high generalization performance. Experimental results on a total of 76,788 images from three public datasets, Salinas, MNIST and COIL-20, show that our algorithm achieves better performance in terms of accuracy, precision, recall, F1 score and training time.
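As a rough illustration of the pipeline described in the abstract, the following Python sketch (not the authors' code) builds random projection blocks whose output weights are solved with the Moore-Penrose pseudoinverse, weights the block outputs with a simple attention-style softmax, and trains a final single-hidden-layer learner on the stacked, weighted outputs. The helper names (pil_block, attention_weights), the tanh activation, the error-based attention scores and the toy data are assumptions introduced here for illustration only.

# Illustrative sketch of the AM-RP-style pipeline (assumed details, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

def one_hot(y, n_classes):
    Y = np.zeros((y.size, n_classes))
    Y[np.arange(y.size), y] = 1.0
    return Y

def pil_block(X, Y, n_hidden, rng):
    # One random projection block: random input weights, output weights
    # solved in closed form with the Moore-Penrose pseudoinverse.
    W_in = rng.standard_normal((X.shape[1], n_hidden))   # random projection
    H = np.tanh(X @ W_in)                                # hidden activations
    W_out = np.linalg.pinv(H) @ Y                        # pseudoinverse solution
    return W_in, W_out

def block_predict(X, W_in, W_out):
    return np.tanh(X @ W_in) @ W_out

def attention_weights(block_outputs, Y):
    # Score each block by how well its output fits the targets and
    # normalize the scores with a softmax (attention-style weighting).
    scores = np.array([-np.mean((P - Y) ** 2) for P in block_outputs])
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Toy data standing in for Salinas / MNIST / COIL-20 features.
X = rng.standard_normal((500, 64))
y = rng.integers(0, 10, size=500)
Y = one_hot(y, 10)

# Step 1: train several random projection blocks with pseudoinverse learning.
n_blocks, n_hidden = 5, 128
blocks = [pil_block(X, Y, n_hidden, rng) for _ in range(n_blocks)]
outputs = [block_predict(X, W_in, W_out) for W_in, W_out in blocks]

# Step 2: weight each block's output by its attention weight and stack the
# weighted outputs as a new input dataset.
alpha = attention_weights(outputs, Y)
X_new = np.hstack([a * P for a, P in zip(alpha, outputs)])

# Step 3: train the final single-hidden-layer network on the stacked features,
# again via pseudoinverse learning.
W_in_f, W_out_f = pil_block(X_new, Y, n_hidden, rng)
pred = block_predict(X_new, W_in_f, W_out_f).argmax(axis=1)
print("training accuracy:", (pred == y).mean())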
Pages: 273-285
Number of pages: 13
Related Papers
50 records in total
  • [1] AM-RP Stacking PILers: Random projection stacking pseudoinverse learning algorithm based on attention mechanism
    Zhenjiao Cai
    Sulan Zhang
    Ping Guo
    Jifu Zhang
    Lihua Hu
    The Visual Computer, 2024, 40 : 273 - 285
  • [2] A Progressive Stacking Pseudoinverse Learning Framework via Active Learning in Random Subspaces
    Cai, Zhenjiao
    Zhang, Sulan
    Guo, Ping
    Zhang, Jifu
    Hu, Lihua
    IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2024, 54 (05): : 2822 - 2832
  • [3] Soil Salinity Inversion Based on a Stacking Integrated Learning Algorithm
    Dong, Haili
    Tian, Fei
    AGRICULTURE-BASEL, 2024, 14 (10):
  • [4] An E-mail Classification Algorithm based on Stacking Integrated Learning
    Wan, Li-Xia
    Huang, Wei-Xing
    Tang, Qing-Hua
    Journal of Computers (Taiwan), 2022, 33 (02) : 105 - 114
  • [5] Landslide spatial prediction based on cascade forest and stacking ensemble learning algorithm
    Chen, Sijing
    Pan, Yutong
    Lu, Chengda
    Wang, Yawu
    Wu, Min
    Pedrycz, Witold
    INTERNATIONAL JOURNAL OF SYSTEMS SCIENCE, 2025, 56 (03) : 658 - 670
  • [6] Dynamic Mechanical Strength Prediction of BFRC Based on Stacking Ensemble Learning and Genetic Algorithm Optimization
    Zheng, Jiayan
    Wang, Minghui
    Yao, Tianchen
    Tang, Yichen
    Liu, Haijing
    BUILDINGS, 2023, 13 (05)
  • [7] Short-Term Wind Power Prediction Based on a Modified Stacking Ensemble Learning Algorithm
    Yang, Yankun
    Li, Yuling
    Cheng, Lin
    Yang, Shiyou
    SUSTAINABILITY, 2024, 16 (14)
  • [8] Effective prediction of human skin cancer using stacking based ensemble deep learning algorithm
    Devadhas, David Neels Ponkumar
    Sugirtharaj, Hephzi Punithavathi Isaac
    Fernandez, Mary Harin
    Periyasamy, Duraipandy
    NETWORK-COMPUTATION IN NEURAL SYSTEMS, 2024,
  • [9] Semantic Segmentation Algorithm Based on Attention Mechanism and Transfer Learning
    Ye, Jianfeng
    Lu, Chong
    Xiong, Junfeng
    Wang, Huaming
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2020, 2020
  • [10] A Federated Incremental Learning Algorithm Based on Dual Attention Mechanism
    Hu, Kai
    Lu, Meixia
    Li, Yaogen
    Gong, Sheng
    Wu, Jiasheng
    Zhou, Fenghua
    Jiang, Shanshan
    Yang, Yi
    APPLIED SCIENCES-BASEL, 2022, 12 (19):