Orthogonal Transforms For Learning Invariant Representations In Equivariant Neural Networks

Cited by: 1
Authors
Singh, Jaspreet [1]
Singh, Chandan [1]
Rana, Ankur [1]
Affiliation
[1] Punjabi Univ, Patiala, Punjab, India
DOI
10.1109/WACV56688.2023.00157
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The convolutional layers of standard convolutional neural networks (CNNs) are equivariant to translation. Recently, a new class of CNNs has been introduced that is also equivariant to other affine geometric transformations, such as rotation and reflection, by replacing the standard convolutional layer with a group convolutional layer or by using steerable filters in the convolutional layer. We propose to embed a 2D positional encoding that is invariant to rotation, reflection and translation, computed using orthogonal polar harmonic transforms (PHTs), before flattening the feature maps for the fully-connected (classification) layers of the equivariant CNN architecture. We select PHTs from among several invariant transforms because they offer a strong combination of performance and speed. The proposed 2D positional encoding scheme, inserted between the convolutional and fully-connected layers of the equivariant networks, is shown to provide significant improvements in performance on the rotated MNIST, CIFAR-10 and CIFAR-100 datasets.
Pages: 1523-1530 (8 pages)