DEEP SELF-SUPERVISED BAND-LEVEL LEARNING FOR HYPERSPECTRAL CLASSIFICATION

Cited by: 1
Authors
Santiago, Jonathan Gonzalez [1 ]
Schenkel, Fabian [1 ]
Middelmann, Wolfgang [1 ]
Affiliation
[1] Fraunhofer IOSB, Gutleuthausstr 1, Ettlingen, Germany
Source
IMAGE AND SIGNAL PROCESSING FOR REMOTE SENSING XXVIII | 2022 / Vol. 12267
Keywords
Self-Supervised Learning; Band-Level Learning; Contrastive Learning; Transfer Learning; Deep Convolutional Neural Networks; Hyperspectral Classification;
DOI
10.1117/12.2636245
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Hyperspectral image classification is one of the most researched topics within hyperspectral analysis. Its importance stems from its immediate outcome, a classified image used for planning and decision-making in a variety of engineering and scientific disciplines. In recent years, researchers have addressed this task with self-supervised learning, learning robust feature representations to reduce the dependency on the large amounts of labels required by supervised deep learning. Aiming to learn representations for hyperspectral classification, several of these works apply dimensionality reduction that can exclude relevant information during feature learning. Moreover, they rely on contrastive instance learning, which requires a large memory bank to store the results of pairwise feature discriminations and therefore poses a computational hurdle. To overcome these challenges, the proposed approach performs self-supervised cluster assignments between sets of contiguous bands to learn semantically meaningful representations that contribute to solving the hyperspectral classification task with fewer labels. The approach starts by pre-processing the data for self-supervised learning. Subsequently, the self-supervised band-level learning phase takes the preprocessed image patches and learns relevant feature representations. Afterwards, the classification step takes the previously learned encoder, turns it into a pixel classifier, and performs the classification with fewer labels than would otherwise be required. Lastly, the validation uses the kappa coefficient as well as the overall and average accuracy as well-established metrics for assessing classification results. The method is evaluated on two benchmark datasets. Experimental results show that the classification quality of the proposed method surpasses supervised learning and contrastive instance learning-based methods for the majority of the studied data partition levels. Constructing a more suitable set of augmentations for hyperspectral imagery also indicated that the results can be improved further.
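For orientation, the sketch below shows how the three validation metrics named in the abstract (overall accuracy, average accuracy, and the kappa coefficient) are conventionally computed from a confusion matrix; it is not the authors' code, and the function and variable names (classification_metrics, conf_mat) are placeholders chosen for illustration.

# Minimal sketch (not the paper's implementation) of the evaluation metrics
# mentioned in the abstract, computed from a pixel-classification confusion
# matrix whose rows are true classes and columns are predicted classes.
import numpy as np

def classification_metrics(conf_mat: np.ndarray):
    """Return (overall accuracy, average accuracy, Cohen's kappa)."""
    conf_mat = conf_mat.astype(float)
    total = conf_mat.sum()
    diag = np.diag(conf_mat)

    # Overall accuracy (OA): fraction of all pixels classified correctly.
    overall_accuracy = diag.sum() / total

    # Average accuracy (AA): mean of the per-class recalls.
    per_class_recall = diag / conf_mat.sum(axis=1)
    average_accuracy = per_class_recall.mean()

    # Kappa coefficient: observed agreement corrected for chance agreement.
    p_o = overall_accuracy
    p_e = (conf_mat.sum(axis=1) * conf_mat.sum(axis=0)).sum() / total ** 2
    kappa = (p_o - p_e) / (1.0 - p_e)
    return overall_accuracy, average_accuracy, kappa

if __name__ == "__main__":
    # Toy 3-class confusion matrix, purely for illustration.
    cm = np.array([[50, 2, 3],
                   [4, 40, 6],
                   [5, 5, 45]])
    oa, aa, kappa = classification_metrics(cm)
    print(f"OA={oa:.3f}  AA={aa:.3f}  kappa={kappa:.3f}")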
Pages: 8