Central Attention Mechanism for Convolutional Neural Networks

Cited by: 0
Authors
Geng, Y.X. [1 ]
Wang, L. [2 ]
Wang, Z.Y. [3 ]
Wang, Y.G. [1 ]
Affiliations
[1] School of Computer Science and Software Engineering, University of Science and Technology Liaoning, Anshan, 114051, China
[2] School of Computer Science and Software Engineering, University of Science and Technology Liaoning, Anshan, 114051, China
[3] Automation Design Institute, Metallurgical Engineering Technology Co., Ltd., Dalian, 116000, China
Keywords
Tensors
DOI: Not available
Abstract
Channel attention has significantly enhanced model performance. In the channel attention approach, average pooling is used to aggregate feature information into representative values; however, the average pooling procedure introduces skewness that degrades the performance of the network architecture. Leveraging the central limit theorem, we hypothesize that a strip-shaped average pooling operation, which takes the spatial position information of the feature map into account, can generate a one-dimensional tensor. The resulting tensor serves as the representative value of the features, mitigating the skewness introduced during pooling. By incorporating the central limit theorem into the channel attention operation, this study introduces a novel attention mechanism, the Central Attention Mechanism (CAM). Instead of directly using average pooling to generate channel representative values, the central attention approach employs star-stripe average pooling to normalize multiple feature representative values into a single representative value. In this way, strip-shaped average pooling collects data and generates a one-dimensional tensor, while star-stripe average pooling provides feature representative values along different spatial directions. To generate channel attention for the complementary input features, the feature representative value of each channel is activated. Our attention module is flexible and can be seamlessly incorporated into a variety of traditional network structures. Through rigorous testing, we demonstrate the effectiveness of our attention strategy, which applies to a wide range of computer vision tasks and outperforms previous attention techniques. © 2024 International Association of Engineers. All rights reserved.
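The abstract describes CAM only at a high level, so the following is a minimal PyTorch sketch of one plausible reading of the idea: directional strip-shaped average pooling produces one-dimensional representative tensors, these are normalized into a single per-channel representative value, and a per-channel activation reweights the input features. The class name CentralAttentionSketch, the horizontal/vertical choice of strips, the simple averaging step standing in for star-stripe pooling, and the SE-style bottleneck MLP are all assumptions for illustration, not the paper's actual formulation.

```python
import torch
import torch.nn as nn

class CentralAttentionSketch(nn.Module):
    """Hypothetical sketch of a CAM-style channel attention block.

    Assumptions (not specified in the abstract): "strip-shaped average
    pooling" is modeled as horizontal/vertical adaptive average pooling,
    and "star-stripe average pooling" as averaging the directional
    representative values into one value per channel.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Strip-shaped average pooling along the two spatial directions.
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # -> (B, C, 1, W)
        # SE-style bottleneck MLP over the per-channel representative value.
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # One-dimensional tensors of directional representative values.
        h_repr = self.pool_h(x).squeeze(-1).mean(dim=-1)  # (B, C)
        w_repr = self.pool_w(x).squeeze(-2).mean(dim=-1)  # (B, C)
        # Normalize the multiple directional values into a single
        # representative value per channel (assumed star-stripe step).
        repr_value = 0.5 * (h_repr + w_repr)              # (B, C)
        # Activate per channel and reweight the input features.
        attn = self.sigmoid(self.fc(repr_value)).view(b, c, 1, 1)
        return x * attn

if __name__ == "__main__":
    # Usage: drop the block into an existing CNN stage.
    cam = CentralAttentionSketch(channels=64)
    feats = torch.randn(2, 64, 32, 32)
    print(cam(feats).shape)  # torch.Size([2, 64, 32, 32])
```

Like SE- or coordinate-attention modules, a block of this shape is input-size agnostic and adds only a small per-channel MLP, which is consistent with the abstract's claim that the mechanism can be inserted into various traditional network structures.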
Pages: 1642-1648
Related Papers
50 records in total
  • [21] Integrated Convolutional and Graph Attention Neural Networks for Electroencephalography
    Kang, Jae-eon
    Lee, Changha
    Lee, Jong-Hwan
    2024 12TH INTERNATIONAL WINTER CONFERENCE ON BRAIN-COMPUTER INTERFACE, BCI 2024, 2024
  • [22] Evaluating Attention in Convolutional Neural Networks for Blended Images
    Portscher, Andrea
    Stabinger, Sebastian
    Rodriguez-Sanchez, Antonio
    2022 IEEE 5TH INTERNATIONAL CONFERENCE ON IMAGE PROCESSING APPLICATIONS AND SYSTEMS, IPAS, 2022
  • [23] Spatial Decomposition and Aggregation for Attention in Convolutional Neural Networks
    Zhu, Meng
    Min, Weidong
    Xiang, Hongyue
    Zha, Cheng
    Huang, Zheng
    Li, Longfei
    Fu, Qiyan
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2024, 38 (01)
  • [24] Quantifying Student Attention using Convolutional Neural Networks
    Coaja, Andreea
    Rusu, Catalin V.
    ICAART: PROCEEDINGS OF THE 14TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE - VOL 3, 2022: 293-299
  • [25] GAttANet: Global Attention Agreement for Convolutional Neural Networks
    VanRullen, Rufin
    Alamia, Andrea
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT I, 2021, 12891: 281-293
  • [26] Spatial Channel Attention for Deep Convolutional Neural Networks
    Liu, Tonglai
    Luo, Ronghai
    Xu, Longqin
    Feng, Dachun
    Cao, Liang
    Liu, Shuangyin
    Guo, Jianjun
    MATHEMATICS, 2022, 10 (10)
  • [27] Circular Convolutional Neural Networks Based on Triplet Attention
    Wang J.
    Lei J.
    Zhang J.
    Sun S.
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2022, 35 (02): 116-129
  • [28] Spatial Pyramid Attention for Deep Convolutional Neural Networks
    Ma, Xu
    Guo, Jingda
    Sansom, Andrew
    McGuire, Mara
    Kalaani, Andrew
    Chen, Qi
    Tang, Sihai
    Yang, Qing
    Fu, Song
    IEEE TRANSACTIONS ON MULTIMEDIA, 2021, 23: 3048-3058
  • [29] TAME: Attention Mechanism Based Feature Fusion for Generating Explanation Maps of Convolutional Neural Networks
    Ntrougkas, Mariano
    Gkalelis, Nikolaos
    Mezaris, Vasileios
    2022 IEEE INTERNATIONAL SYMPOSIUM ON MULTIMEDIA (ISM), 2022: 58-65
  • [30] Combining Contextual Information by Self-attention Mechanism in Convolutional Neural Networks for Text Classification
    Wu, Xin
    Cai, Yi
    Li, Qing
    Xu, Jingyun
    Leung, Ho-fung
    WEB INFORMATION SYSTEMS ENGINEERING, WISE 2018, PT I, 2018, 11233: 453-467