Spatial Attention-Based Capsule Networks With Guaranteed Group Equivariance

Cited: 0
Authors
Zeng, Ru [1 ]
Song, Yan [1 ]
Qin, Yuzhang [1 ]
Affiliation
[1] Univ Shanghai Sci & Technol, Dept Control Sci & Engn, Shanghai 200093, Peoples R China
Funding
Natural Science Foundation of Shanghai; National Natural Science Foundation of China
Keywords
Routing; Attention mechanisms; Filters; Convolution; Robustness; Computational modeling; Visualization; Capsule network; equivariance; invariance; attention mechanism
DOI
10.1109/TASE.2024.3438190
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Several recently reported capsule networks (CapsNets) aim to make capsule poses equivariant and capsule descriptors invariant by adding extra loss functions as regularization, but without providing rigorous proof. To address this problem, a group equivariant spatial attention mechanism (GSA) is proposed that guarantees equivariance with mathematical proof while enhancing the spatial information in capsule poses. In addition, to alleviate the computational burden of the conventional routing algorithm, group poolings are developed to generate the descriptors and poses of capsules, which contribute greatly to preserving the invariance and equivariance of CapsNets. With the proposed GSA and group poolings, a new attentive CapsNet, namely the spatial attentive group equivariant CapsNet (SAGE-CapsNet), is constructed in this paper. To validate the invariance and equivariance of SAGE-CapsNets, we conduct experiments on classification, semantic segmentation, and visualization. The results of these experiments provide empirical evidence of the effectiveness of the proposed approach.
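The invariance/equivariance interplay the abstract describes can be illustrated with a minimal sketch over the C4 group (rotations by multiples of 90 degrees): pooling filter responses over the group axis yields a rotation-invariant descriptor, while the argmax over group elements acts as an equivariant pose. Note that `lift_c4` and `group_pool` below are hypothetical stand-ins for illustration, not the paper's actual layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def lift_c4(x, f):
    # Lift input x to the C4 group: one correlation response per 90-degree
    # rotation of x against a fixed filter f (hypothetical toy layer).
    return np.array([(np.rot90(x, k) * f).sum() for k in range(4)])

def group_pool(feats):
    # Max over the group axis gives a rotation-invariant descriptor;
    # the argmax index serves as an equivariant pose, shifting cyclically
    # when the input is rotated.
    return feats.max(), int(feats.argmax())

x = rng.standard_normal((8, 8))
f = rng.standard_normal((8, 8))

d0, p0 = group_pool(lift_c4(x, f))            # original input
d1, p1 = group_pool(lift_c4(np.rot90(x), f))  # input rotated by 90 degrees

assert np.isclose(d0, d1)   # descriptor is invariant under rotation
assert p1 == (p0 - 1) % 4   # pose index shifts equivariantly
```

Rotating the input permutes the group-indexed responses cyclically, so the pooled maximum is unchanged (invariance) while the argmax index shifts by one group element (equivariance), which is the property the paper proves for its GSA and group-pooling components.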
Pages: 6076-6087 (12 pages)