Learning translation invariance by self-supervised neural networks

Cited by: 0
Authors:
[1] Chudy, L.
[2] Koska, M.
[3] Chudy, V.
Keywords: Correlation-based learning; Input patterns; Local activity preservation feedback threshold; Self-supervised neural networks; Translation invariance
DOI: not available
Abstract
We discuss the critical factors of translation-invariant learning within the framework of competitive learning. Even with well-preprocessed input feature patterns that make the problem linearly separable, the conflict between the sparse nature of the input patterns and correlation-based learning still has to be handled. Motivated by a batch-mode solution of the problem, we introduce local mechanisms that avoid this conflict. A self-supervised learning scheme is proposed which uses the local activity preservation feedback threshold as a reinforcement signal and adapts the weights in a positionally local manner. This scheme can handle different object classes with overlapping features, which makes it more general than the recently proposed modified Hebbian learning for the translation-invariant classification of four lines of different orientation.
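The abstract does not give the update equations, but the scheme it describes — winner-take-all competition over a template correlated at every input position, with adaptation gated by an activity threshold and applied only at the winning position — can be sketched as follows. This is a minimal illustrative sketch under our own assumptions: the names `theta`, `eta`, `train_step`, and the normalization step are ours, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pos, k = 12, 3                 # 1-D input length, template width
n_classes = 2
W = rng.random((n_classes, k))   # one shared weight template per class
theta = 0.5                      # assumed activity-preservation feedback threshold
eta = 0.1                        # assumed learning rate

def local_responses(x, w):
    # Correlate template w with input x at every position:
    # weight sharing across positions gives translation invariance.
    return np.array([x[p:p + k] @ w for p in range(n_pos - k + 1)])

def train_step(x):
    # Competitive stage: each class unit responds with its best local match.
    resp = np.array([local_responses(x, w) for w in W])
    best = resp.max(axis=1)
    winner = int(best.argmax())          # winner-take-all among class units
    p = int(resp[winner].argmax())       # winning input position
    # Self-supervised reinforcement: adapt only if the winner's local
    # activity clears the feedback threshold.
    if best[winner] > theta:
        patch = x[p:p + k]                           # positionally local patch
        W[winner] += eta * (patch - W[winner])       # move template toward patch
        W[winner] /= np.linalg.norm(W[winner])       # keep weights bounded
    return winner, p
```

Because only the winning position's patch drives the update, sparse inputs do not smear the template across empty positions, which is one way to read the paper's resolution of the sparsity-versus-correlation conflict.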
Related papers (50 records)
  • [31] SELF-SUPERVISED ADAPTIVE NETWORKS
    Luttrell, S. P.
    IEE PROCEEDINGS-F RADAR AND SIGNAL PROCESSING, 1992, 139 (06) : 371 - 377
  • [32] TilinGNN: Learning to Tile with Self-Supervised Graph Neural Network
    Xu, Hao
    Hui, Ka-Hei
    Fu, Chi-Wing
    Zhang, Hao
    ACM TRANSACTIONS ON GRAPHICS, 2020, 39 (04):
  • [33] Self-Supervised Representation Learning for Evolutionary Neural Architecture Search
    Wei, Chen
    Tang, Yiping
    Niu, Chuang
    Hu, Haihong
    Wang, Yue
    Liang, Jimin
    IEEE COMPUTATIONAL INTELLIGENCE MAGAZINE, 2021, 16 (03) : 33 - 49
  • [34] Language-Aware Multilingual Machine Translation with Self-Supervised Learning
    Xu, Haoran
    Maillard, Jean
    Goswami, Vedanuj
    17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 526 - 539
  • [35] Self-supervised learning of materials concepts from crystal structures via deep neural networks
    Suzuki, Yuta
    Taniai, Tatsunori
    Saito, Kotaro
    Ushiku, Yoshitaka
    Ono, Kanta
    MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2022, 3 (04):
  • [36] Gated Self-supervised Learning for Improving Supervised Learning
    Fuadi, Erland Hillman
    Ruslim, Aristo Renaldo
    Wardhana, Putu Wahyu Kusuma
    Yudistira, Novanto
    2024 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI 2024, 2024, : 611 - 615
  • [37] Self-Supervised Dialogue Learning
    Wu, Jiawei
    Wang, Xin
    Wang, William Yang
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 3857 - 3867
  • [38] Self-supervised Learning of PSMNet via Generative Adversarial Networks
    Yang, Xinyi
    Lai, Haifeng
    Zou, Bin
    Fu, Hang
    Long, Qian
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT VI, ICIC 2024, 2024, 14880 : 469 - 479
  • [39] Self-supervised learning of monocular depth using quantized networks
    Lu, Keyu
    Zeng, Chengyi
    Zeng, Yonghu
    NEUROCOMPUTING, 2022, 488 : 634 - 646
  • [40] Graph Self-Supervised Learning With Application to Brain Networks Analysis
    Wen, Guangqi
    Cao, Peng
    Liu, Lingwen
    Yang, Jinzhu
    Zhang, Xizhe
    Wang, Fei
    Zaiane, Osmar R.
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2023, 27 (08) : 4154 - 4165