Cross-domain Ensemble Distillation for Domain Generalization

Times Cited: 0
Authors
Lee, Kyungmoon [1,2]
Kim, Sungyeon [2]
Kwak, Suha [2]
Affiliations
[1] NALBI Inc, Seoul, South Korea
[2] POSTECH, Pohang, South Korea
Keywords
Domain generalization; Knowledge distillation; Flat minima
DOI
10.1007/978-3-031-19806-9_1
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Domain generalization is the task of learning models that generalize to unseen target domains. We propose a simple yet effective method for domain generalization, named cross-domain ensemble distillation (XDED), that learns domain-invariant features while encouraging the model to converge to flat minima, which has recently been shown to be a sufficient condition for domain generalization. To this end, our method generates an ensemble of the output logits from training data with the same label but from different domains, and then penalizes each output for its mismatch with the ensemble. We also present a de-stylization technique that standardizes features to encourage the model to produce style-consistent predictions even in an arbitrary target domain. Our method greatly improves generalization capability on public benchmarks for cross-domain image classification, cross-dataset person re-ID, and cross-dataset semantic segmentation. Moreover, we show that models learned by our method are robust against adversarial attacks and unseen corruptions.
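The following is a minimal PyTorch sketch of the two components the abstract describes, reconstructed from the abstract alone: a class-wise ensemble of softened outputs used as a distillation target, and a feature standardization step that removes image-specific style. The function names, the temperature value, the KL direction, and the stop-gradient on the ensemble are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F


def xded_loss(logits: torch.Tensor, labels: torch.Tensor,
              temperature: float = 4.0) -> torch.Tensor:
    """Sketch of cross-domain ensemble distillation.

    For each label present in the mini-batch, the softened predictions of
    all samples carrying that label (drawn from different source domains)
    are averaged into an ensemble teacher; each sample is then penalized
    via KL divergence for deviating from that ensemble.
    """
    probs = F.softmax(logits / temperature, dim=1)          # softened outputs
    log_probs = F.log_softmax(logits / temperature, dim=1)  # student side of the KL

    loss = logits.new_zeros(())
    num_classes_seen = 0
    for c in labels.unique():
        mask = labels == c
        if mask.sum() < 2:
            continue  # a single sample cannot form a cross-domain ensemble
        # Class-wise ensemble teacher; detached so no gradient flows into it
        # (an assumption, in the spirit of standard knowledge distillation).
        ensemble = probs[mask].mean(dim=0).detach()
        loss = loss + F.kl_div(
            log_probs[mask],
            ensemble.expand_as(log_probs[mask]),
            reduction="batchmean",
        )
        num_classes_seen += 1
    return loss / max(num_classes_seen, 1)


def destylize(feat: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Sketch of de-stylization as feature standardization.

    Removing each image's own channel-wise mean and standard deviation
    (instance statistics) discards image-specific "style", encouraging
    style-consistent predictions on arbitrary target domains.
    feat: (N, C, H, W).
    """
    mu = feat.mean(dim=(2, 3), keepdim=True)
    sigma = feat.var(dim=(2, 3), keepdim=True, unbiased=False).sqrt()
    return (feat - mu) / (sigma + eps)
```

In use, xded_loss would presumably be added to the usual cross-entropy term on mini-batches sampled so that each label appears across several source domains; otherwise the per-class ensemble degenerates to a single prediction.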
Pages: 1-20
Related Papers (50 total)
  • [1] Du, Dapeng; Chen, Jiawei; Li, Yuexiang; Ma, Kai; Wu, Gangshan; Zheng, Yefeng; Wang, Limin. Cross-Domain Gated Learning for Domain Generalization. International Journal of Computer Vision, 2022, 130(11): 2842-2857.
  • [2] Liu, Yingnan; Zou, Yingtian; Qiao, Rui; Liu, Fusheng; Lee, Mong Li; Hsu, Wynne. Cross-Domain Feature Augmentation for Domain Generalization. Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence (IJCAI 2024), 2024: 1146-1154.
  • [3] Gong, Xuan; Sharma, Abhishek; Karanam, Srikrishna; Wu, Ziyan; Chen, Terrence; Doermann, David; Innanje, Arun. Preserving Privacy in Federated Learning with Ensemble Cross-Domain Knowledge Distillation. Thirty-Sixth AAAI Conference on Artificial Intelligence (AAAI 2022), 2022: 11891-11899.
  • [4] Lin, Jianxin; Tang, Yongqiang; Wang, Junping; Zhang, Wensheng. Constrained Maximum Cross-Domain Likelihood for Domain Generalization. IEEE Transactions on Neural Networks and Learning Systems, 2023: 1-15.
  • [5] Wang, Xiaoshun; Luo, Sibei; Gao, Yiming. Ladder Curriculum Learning for Domain Generalization in Cross-Domain Classification. IEEE Access, 2024, 12: 95356-95367.
  • [6] Zhang, Shaokang; Jiang, Lei; Tan, Jianlong. Cross-Domain Knowledge Distillation for Text Classification. Neurocomputing, 2022, 509: 11-20.
  • [7] Li, Xiuze; Huang, Zhenhua; Wu, Zhengyang; Wang, Changdong; Chen, Yunwen. Cross-Domain Recommendation via Knowledge Distillation. Knowledge-Based Systems, 2025, 311.
  • [8] Fried, Daniel; Kitaev, Nikita; Klein, Dan. Cross-Domain Generalization of Neural Constituency Parsers. 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 323-330.
  • [9] Doumas, Leonidas A. A.; Puebla, Guillermo; Martin, Andrea E.; Hummel, John E. A Theory of Relation Learning and Cross-Domain Generalization. Psychological Review, 2022, 129(5): 999-1041.