Out-of-Distribution Detection via Conditional Kernel Independence Model

Cited by: 0
Authors
Wang, Yu [1 ]
Zou, Jingjing [2 ]
Lin, Jingyang [3 ]
Ling, Qing [3 ]
Pan, Yingwei [4 ]
Yao, Ting [4 ]
Mei, Tao [4 ]
Affiliations
[1] Qiyuan Lab, Beijing, Peoples R China
[2] Univ Calif San Diego, San Diego, CA USA
[3] Sun Yat Sen Univ, Guangzhou, Peoples R China
[4] JD AI Res, Beijing, Peoples R China
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022 | 2022
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, various methods have been introduced to address the out-of-distribution (OOD) detection problem with training-time outlier exposure. These methods usually rely on a discriminative softmax metric or an energy-based score to screen OOD samples. In this paper, we probe an alternative hypothesis on OOD detection by constructing a novel latent variable model based on independent component analysis (ICA) techniques. This method, named Conditional-i, builds upon a probabilistic formulation and applies the Hilbert-Schmidt Independence Criterion, which offers a convenient solution for optimizing variable dependencies. Conditional-i exclusively encodes the useful class condition into the probabilistic model, which provides the desired convenience in delivering theoretical support for the OOD detection task. To facilitate implementation of the Conditional-i model, we construct dedicated memory bank architectures that allow for convenient end-to-end training within a tractable budget. Empirical results demonstrate a clear performance boost over state-of-the-art methods on standard benchmarks. We also provide theoretical justification that our training strategy is guaranteed to bound the error in the context of OOD detection. Code is available at: https://github.com/OODHSIC/conditional-i.
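The abstract's key ingredient is the Hilbert-Schmidt Independence Criterion (HSIC), a kernel measure of dependence between two variables. As background only (not the authors' Conditional-i model, whose class-conditional construction and memory banks are described in the paper itself), the following is a minimal sketch of the standard biased empirical HSIC estimator with Gaussian kernels; the kernel bandwidth `sigma` is an illustrative choice.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # RBF kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    # Biased empirical HSIC estimator: (1/n^2) * tr(K H L H),
    # where H = I - (1/n) 11^T centers the kernel matrices.
    # Values near 0 indicate (approximate) independence of X and Y.
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    L = gaussian_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n ** 2)
```

Because HSIC is a smooth function of the data, it can serve directly as a differentiable (in)dependence penalty during training, which is what makes it convenient for enforcing the conditional independence structure the paper proposes.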
Pages: 15
Related Papers
50 results
  • [1] Out-of-Distribution Generalization in Kernel Regression
    Canatar, Abdulkadir
    Bordelon, Blake
    Pehlevan, Cengiz
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [2] Out-of-distribution Detection Learning with Unreliable Out-of-distribution Sources
    Zheng, Haotian
    Wang, Qizhou
    Fang, Zhen
    Xia, Xiaobo
    Liu, Feng
    Liu, Tongliang
    Han, Bo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [3] Out-of-Distribution Node Detection Based on Graph Heat Kernel Diffusion
    Li, Fangfang
    Wang, Yangshuai
    Du, Xinyu
    Li, Xiaohua
    Yu, Ge
    MATHEMATICS, 2024, 12 (18)
  • [4] Learning by Erasing: Conditional Entropy Based Transferable Out-of-Distribution Detection
    Xing, Meng
    Feng, Zhiyong
    Su, Yong
    Oh, Changjae
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 6, 2024, : 6261 - 6269
  • [5] Principled Out-of-Distribution Detection via Multiple Testing
    Magesh, Akshayaa
    Veeravalli, Venugopal V.
    Roy, Anirban
    Jha, Susmit
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [6] On the Learnability of Out-of-distribution Detection
    Fang, Zhen
    Li, Yixuan
    Liu, Feng
    Han, Bo
    Lu, Jie
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25
  • [7] An Object Detection Model Robust to Out-of-Distribution Data
    Park, Ho-rim
    Hwang, Kyu-hong
    Ha, Young-guk
    2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP 2021), 2021, : 275 - 278
  • [8] Entropic Out-of-Distribution Detection
    Macedo, David
    Ren, Tsang Ing
    Zanchettin, Cleber
    Oliveira, Adriano L. I.
    Ludermir, Teresa
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [9] Watermarking for Out-of-distribution Detection
    Wang, Qizhou
    Liu, Feng
    Zhang, Yonggang
    Zhang, Jing
    Gong, Chen
    Liu, Tongliang
    Han, Bo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [10] Is Out-of-Distribution Detection Learnable?
    Fang, Zhen
    Li, Yixuan
    Lu, Jie
    Dong, Jiahua
    Han, Bo
    Liu, Feng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022