Self-supervised Anomaly Detection by Self-distillation and Negative Sampling

Cited by: 3
Authors
Rafiee, Nima [1 ]
Gholamipoor, Rahil [1 ]
Adaloglou, Nikolas [1 ]
Jaxy, Simon [1 ]
Ramakers, Julius [1 ]
Kollmann, Markus [1 ,2 ]
Affiliations
[1] Heinrich Heine Univ, Dept Comp Sci, Dusseldorf, Germany
[2] Heinrich Heine Univ, Dept Biol, Dusseldorf, Germany
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2022, PT IV | 2022 / Vol. 13532
Keywords
Anomaly detection; Self-supervised learning; Self-distillation; Negative sampling;
DOI
10.1007/978-3-031-15937-4_39
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Detecting whether examples belong to a given in-distribution or are out-of-distribution (OOD) requires identifying features that are specific to the in-distribution. In the absence of labels, these features can be learned by self-supervised representation learning techniques under the generic assumption that the most abstract features are those which are statistically most over-represented in comparison to other distributions from the same domain. This work shows that self-distillation of the in-distribution training set, together with contrasting against negative examples derived from shifting transformations of auxiliary data, strongly improves OOD detection. We find that this improvement depends on how the negative samples are generated, with the general observation that negative samples which keep the statistics of lower-level features but change the global semantics result in higher detection accuracy on average. For the first time, we introduce a sensitivity score that allows negative sampling to be optimised systematically in an unsupervised setting. We demonstrate the efficiency of our approach across a diverse range of OOD detection problems, setting new benchmarks for unsupervised OOD detection in the visual domain.
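The two ideas in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the rotation-based negative sampling stands in for the paper's "shifting transformations", and the similarity-based score is a generic illustrative assumption, not the method's actual OOD score.

```python
import numpy as np

def make_negatives(images):
    # Shifting transformation (illustrative choice: 90-degree rotation).
    # A rotation preserves low-level pixel statistics but changes the
    # global semantics, which the abstract identifies as the property
    # that makes negative samples effective for OOD detection.
    return np.rot90(images, k=1, axes=(1, 2))

def ood_score(test_feat, train_feats):
    # Generic similarity-based score (an assumption, not the paper's):
    # a test feature that is far from every in-distribution training
    # feature in cosine similarity receives a high OOD score.
    t = test_feat / np.linalg.norm(test_feat)
    f = train_feats / np.linalg.norm(train_feats, axis=1, keepdims=True)
    return 1.0 - float(np.max(f @ t))

# Usage: negatives from a batch of 4x4 "images", then score a feature.
batch = np.arange(2 * 4 * 4, dtype=float).reshape(2, 4, 4)
negatives = make_negatives(batch)            # same shape, rotated content
train_feats = np.eye(3)                      # toy in-distribution features
score = ood_score(np.array([1.0, 1.0, 0.0]), train_feats)
```

In the paper, the representation that produces such features is learned by self-distillation while contrasting against the negatives; the sketch only shows the sampling and scoring steps around that learned encoder.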
Pages: 459 - 470
Page count: 12
Related Papers
50 records
  • [1] Monocular Depth Estimation via Self-Supervised Self-Distillation
    Hu, Haifeng
    Feng, Yuyang
    Li, Dapeng
    Zhang, Suofei
    Zhao, Haitao
    SENSORS, 2024, 24 (13)
  • [2] Self-distillation improves self-supervised learning for DNA sequence inference
    Yu, Tong
    Cheng, Lei
    Khalitov, Ruslan
    Olsson, Erland B.
    Yang, Zhirong
    NEURAL NETWORKS, 2025, 183
  • [3] Self-supervised network for oriented synthetic aperture radar ship detection based on self-distillation
    Li, Wentao
    Xu, Haixia
    Shi, Furong
    Yuan, Liming
    Wen, Xianbin
    JOURNAL OF APPLIED REMOTE SENSING, 2024, 18 (04)
  • [4] Self-Supervised Spatiotemporal Graph Neural Networks With Self-Distillation for Traffic Prediction
    Ji, Junzhong
    Yu, Fan
    Lei, Minglong
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2023, 24 (02) : 1580 - 1593
  • [5] DinoSR: Self-Distillation and Online Clustering for Self-supervised Speech Representation Learning
    Liu, Alexander H.
    Chang, Heng-Jui
    Auli, Michael
    Hsu, Wei-Ning
    Glass, James
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [6] Self-supervised monocular depth estimation with self-distillation and dense skip connection
    Xiang, Xuezhi
    Li, Wei
    Wang, Yao
    El Saddik, Abdulmotaleb
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2024, 246
  • [7] Self-supervised learning with self-distillation on COVID-19 medical image classification
    Tan, Zhiyong
    Yu, Yuhai
    Meng, Jiana
    Liu, Shuang
    Li, Wei
    COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE, 2024, 243
  • [8] Hyperspectral anomaly detection with self-supervised anomaly prior
    Liu, Yidan
    Jiang, Kai
    Xie, Weiying
    Zhang, Jiaqing
    Li, Yunsong
    Fang, Leyuan
    NEURAL NETWORKS, 2025, 187
  • [9] SSSD: Self-Supervised Self Distillation
    Chen, Wei-Chi
    Chu, Wei-Ta
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 2769 - 2776
  • [10] Self-Supervised Anomaly Detection With Neural Transformations
    Qiu, Chen
    Kloft, Marius
    Mandt, Stephan
    Rudolph, Maja
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2025, 47 (03) : 2170 - 2185