WOOD: Wasserstein-Based Out-of-Distribution Detection

Cited by: 1
Authors
Wang, Yinan [1 ]
Sun, Wenbo [2 ]
Jin, Jionghua [3 ]
Kong, Zhenyu [4 ]
Yue, Xiaowei [5 ]
Affiliations
[1] Rensselaer Polytech Inst, Dept Ind & Syst Engn, Troy, NY 12180 USA
[2] Univ Michigan, Transportat Res Inst, Ann Arbor, MI 48109 USA
[3] Univ Michigan, Dept Ind & Operat Engn, Ann Arbor, MI 48109 USA
[4] Virginia Tech, Grad Dept Ind & Syst Engn, Blacksburg, VA 24060 USA
[5] Tsinghua Univ, Inst Qual & Reliabil, Dept Ind Engn, Beijing 100190, Peoples R China
Keywords
Cyber security; image classification; machine learning; OOD detection; Wasserstein distance;
DOI
10.1109/TPAMI.2023.3328883
Chinese Library Classification (CLC): TP18 [Theory of Artificial Intelligence]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
The training and testing data for deep-neural-network-based classifiers are usually assumed to be sampled from the same distribution. When part of the testing samples are drawn from a distribution that is sufficiently far away from that of the training samples (a.k.a. out-of-distribution (OOD) samples), the trained neural network has a tendency to make high-confidence predictions for these OOD samples. Detection of the OOD samples is critical when training a neural network used for image classification, object detection, etc. It can enhance the classifier's robustness to irrelevant inputs, and improve the system's resilience and security under different forms of attacks. Detection of OOD samples has three main challenges: (i) the proposed OOD detection method should be compatible with various architectures of classifiers (e.g., DenseNet, ResNet) without significantly increasing the model complexity and requirements on computational resources; (ii) the OOD samples may come from multiple distributions, whose class labels are commonly unavailable; (iii) a score function needs to be defined to effectively separate OOD samples from in-distribution (InD) samples. To overcome these challenges, we propose a Wasserstein-based out-of-distribution detection (WOOD) method. The basic idea is to define a Wasserstein-based score that evaluates the dissimilarity between a test sample and the distribution of InD samples. An optimization problem is then formulated and solved based on the proposed score function. The statistical learning bound of the proposed method is investigated to guarantee that the loss value achieved by the empirical optimizer approximates the global optimum. The comparison study results demonstrate that the proposed WOOD consistently outperforms other existing OOD detection methods.
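The abstract describes scoring a test sample by the Wasserstein dissimilarity between its predicted class distribution and the confident (one-hot) distributions expected for in-distribution data. The following is a minimal illustrative sketch of that general idea, not the paper's actual formulation: it assumes a 1-D Wasserstein distance over class indices and a hypothetical `wood_style_score` helper that takes the minimum distance to any one-hot vector, so diffuse (OOD-like) predictions receive larger scores.

```python
import numpy as np

def wasserstein_1d(p, q):
    # 1-D Wasserstein distance between two discrete distributions on the
    # same unit-spaced integer support: sum of absolute CDF differences.
    return float(np.abs(np.cumsum(p) - np.cumsum(q)).sum())

def wood_style_score(softmax_probs, num_classes):
    # Illustrative score: minimum Wasserstein distance from the predicted
    # distribution to any one-hot (fully confident) class distribution.
    # A large score means the prediction is far from every class -> OOD-like.
    scores = []
    for k in range(num_classes):
        onehot = np.zeros(num_classes)
        onehot[k] = 1.0
        scores.append(wasserstein_1d(softmax_probs, onehot))
    return min(scores)

# A confident prediction scores low; a uniform (uncertain) one scores high.
ind_like = np.array([0.97, 0.01, 0.01, 0.01])
ood_like = np.array([0.25, 0.25, 0.25, 0.25])
print(wood_style_score(ind_like, 4), wood_style_score(ood_like, 4))
```

A threshold on this score would then separate OOD from InD samples; the paper additionally formulates and solves an optimization problem over the score function, which this sketch does not attempt.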
Pages: 944-956 (13 pages)
Related Papers
50 items total
  • [31] Out-of-Distribution Detection by Cross-Class Vicinity Distribution of In-Distribution Data
    Zhao, Zhilin
    Cao, Longbing
    Lin, Kun-Yu
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (10) : 13777 - 13788
  • [32] Personalized Purchase Prediction of Market Baskets with Wasserstein-Based Sequence Matching
    Kraus, Mathias
    Feuerriegel, Stefan
    KDD'19: PROCEEDINGS OF THE 25TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2019, : 2643 - 2652
  • [33] Typicality Excels Likelihood for Unsupervised Out-of-Distribution Detection in Medical Imaging
    Abdi, Lemar
    Valiuddin, M. M. Amaan
    Viviers, Christiaan G. A.
    de With, Peter H. N.
    van der Sommen, Fons
    UNCERTAINTY FOR SAFE UTILIZATION OF MACHINE LEARNING IN MEDICAL IMAGING, UNSURE 2024, 2025, 15167 : 149 - 159
  • [34] Image Dataset Quality Assessment Through Descriptive Out-of-Distribution Detection
    Kharma, Sarni
    Grossmann, Juergen
    KI 2024: ADVANCES IN ARTIFICIAL INTELLIGENCE, KI 2024, 2024, 14992 : 147 - 159
  • [35] Out-of-Distribution Detection of Human Activity Recognition with Smartwatch Inertial Sensors
    Boyer, Philip
    Burns, David
    Whyne, Cari
    SENSORS, 2021, 21 (05) : 1 - 23
  • [36] An Empirical Evaluation of Out-of-Distribution Detection Using Pretrained Language Models
    Yoon, Byungmu
    Kim, Jaeyoung
    2023 5TH INTERNATIONAL CONFERENCE ON CONTROL AND ROBOTICS, ICCR, 2023, : 302 - 308
  • [37] Rethinking Out-of-Distribution Detection From a Human-Centric Perspective
    Zhu, Yao
    Chen, Yuefeng
    Li, Xiaodan
    Zhang, Rong
    Xue, Hui
    Tian, Xiang
    Jiang, Rongxin
    Zheng, Bolun
    Chen, Yaowu
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2024, 132 (10) : 4633 - 4650
  • [38] FLaNS: Feature-Label Negative Sampling for Out-of-Distribution Detection
    Lim, Chaejin
    Hyeon, Junhee
    Lee, Kiseong
    Han, Dongil
    IEEE ACCESS, 2025, 13 : 43878 - 43888
  • [39] Diversify: A General Framework for Time Series Out-of-Distribution Detection and Generalization
    Lu, Wang
    Wang, Jindong
    Sun, Xinwei
    Chen, Yiqiang
    Ji, Xiangyang
    Yang, Qiang
    Xie, Xing
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (06) : 4534 - 4550
  • [40] An Out-of-Distribution Generalization Framework Based on Variational Backdoor Adjustment
    Su, Hang
    Wang, Wei
    MATHEMATICS, 2024, 12 (01)