Distilling object detectors with mask-guided feature and relation-based knowledge

Cited: 0
Authors
Zeng, Liang [1 ]
Ma, Liyan [1 ]
Luo, Xiangfeng [1 ]
Guo, Yinsai [1 ]
Chen, Xue [1 ,2 ]
Affiliations
[1] Shanghai Univ, Sch Comp Engn & Sci, Shanghai 200444, Peoples R China
[2] State Key Lab Math Engn & Adv Comp, Wuxi 214083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
knowledge distillation; multi-value mask; object detection;
DOI
10.1504/IJCSE.2024.137291
CLC number
TP39 [Computer applications];
Subject classification codes
081203; 0835;
Abstract
Knowledge distillation (KD) is an effective technique for network compression and model accuracy enhancement in image classification, semantic segmentation, pre-trained language models, and other tasks. However, existing KD methods are specialised for image classification and transfer poorly to object detection, owing to two limitations: the imbalance between foreground and background instances, and the neglect of relation-based knowledge during distillation. In this paper, we present a general mask-guided feature and relation-based knowledge distillation framework (MAR) consisting of two components, mask-guided distillation and relation-based distillation, to address these problems. Mask-guided distillation emphasises the student's learning of close-to-object features via multi-value masks, while relation-based distillation mimics the relational information between different feature pixels on the classification head. Extensive experiments show that our method achieves excellent AP improvements on both one-stage and two-stage detectors. Specifically, Faster R-CNN with a ResNet50 backbone achieves 40.6% mAP under the 1× schedule on the COCO dataset, which is 3.2% higher than the baseline and even surpasses the teacher detector.
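The abstract describes the two distillation terms only at a high level. The following is a minimal PyTorch-style sketch of how a multi-value-mask-weighted feature imitation loss and a pairwise relation loss on the classification head could be combined; the function names, the particular mask weights, and the loss weighting are illustrative assumptions for this record, not the authors' MAR implementation.

```python
# Minimal sketch of the two distillation terms described in the abstract.
# Function names, mask values (inner/outer/bg), and loss weights are
# illustrative assumptions, not the paper's actual implementation.
import torch
import torch.nn.functional as F


def multi_value_mask(boxes, feat_h, feat_w, stride, inner=1.0, outer=0.5, bg=0.1):
    """Assumed multi-value mask: pixels inside ground-truth boxes get a high
    weight, a thin ring around each box a medium weight, background a low one."""
    mask = torch.full((feat_h, feat_w), bg)
    for x1, y1, x2, y2 in boxes:                      # boxes in image coordinates
        fx1, fy1 = int(x1 / stride), int(y1 / stride)
        fx2, fy2 = int(x2 / stride) + 1, int(y2 / stride) + 1
        # medium weight on the "close-to-object" border region
        mask[max(fy1 - 1, 0):fy2 + 1, max(fx1 - 1, 0):fx2 + 1] = outer
        # high weight inside the object
        mask[fy1:fy2, fx1:fx2] = inner
    return mask


def mask_guided_loss(feat_s, feat_t, mask):
    """Feature imitation: per-pixel MSE between student and teacher feature
    maps (C, H, W), weighted by the multi-value mask."""
    diff = (feat_s - feat_t) ** 2
    return (diff * mask.unsqueeze(0)).sum() / mask.sum().clamp(min=1.0)


def relation_loss(cls_s, cls_t):
    """Relation term: match the pairwise cosine-similarity matrices between
    feature pixels of the student and teacher classification heads."""
    s = F.normalize(cls_s.flatten(1).t(), dim=1)      # (HW, C)
    t = F.normalize(cls_t.flatten(1).t(), dim=1)
    return F.mse_loss(s @ s.t(), t @ t.t())           # (HW, HW) relation matrices


# Example with random tensors standing in for one FPN level (stride 16)
feat_t, feat_s = torch.randn(256, 50, 68), torch.randn(256, 50, 68)
cls_t, cls_s = torch.randn(80, 50, 68), torch.randn(80, 50, 68)
mask = multi_value_mask([(32, 40, 200, 180)], 50, 68, stride=16)
total = mask_guided_loss(feat_s, feat_t, mask) + 0.5 * relation_loss(cls_s, cls_t)
```

The multi-value mask is what counters the foreground/background imbalance mentioned in the abstract: background pixels still contribute, but with a much smaller weight than object and near-object pixels.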
Pages: 195-203
Number of pages: 10
Related papers
20 records in total
  • [1] Distilling object detectors with efficient logit mimicking and mask-guided feature imitation
    Lu, Xin
    Cao, Yichao
    Chen, Shikun
    Li, Weixuan
    Zhou, Xin
    Lu, Xiaobo
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 245
  • [2] Distilling Object Detectors with Global Knowledge
    Tang, Sanli
    Zhang, Zhongyu
    Cheng, Zhanzhan
    Lu, Jing
    Xu, Yunlu
    Niu, Yi
    He, Fan
    COMPUTER VISION, ECCV 2022, PT IX, 2022, 13669 : 422 - 438
  • [3] Mask-guided SSD for small-object detection
    Sun, Chang
    Ai, Yibo
    Wang, Sheng
    Zhang, Weidong
    APPLIED INTELLIGENCE, 2021, 51 (06) : 3311 - 3322
  • [4] Mask-guided SSD for small-object detection
    Chang Sun
    Yibo Ai
    Sheng Wang
    Weidong Zhang
    Applied Intelligence, 2021, 51 : 3311 - 3322
  • [5] Mask-Guided Transformer for Human-Object Interaction Detection
    Ying, Daocheng
    Yang, Hua
    Sun, Jun
2022 IEEE INTERNATIONAL CONFERENCE ON VISUAL COMMUNICATIONS AND IMAGE PROCESSING (VCIP), 2022
  • [6] Semantic Assistance in SAR Object Detection: A Mask-Guided Approach
    Liu, Wei
    Zhou, Lifan
    Zhong, Shan
    Gong, Shengrong
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2024, 17 : 19395 - 19407
  • [7] A Feature Prefusion and Mask-Guided Network for Camera Decoration Defect Detection
    Wang, Hui
    Zhao, Yuqian
    Zhang, Fan
    Gui, Gui
    Luo, Qiwu
    Yang, Chunhua
    Gui, Weihua
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73
  • [8] Relation-Based Knowledge Distillation for Anomaly Detection
    Cheng, Hekai
    Yang, Lu
    Liu, Zulong
    PATTERN RECOGNITION AND COMPUTER VISION, PT I, 2021, 13019 : 105 - 116
  • [9] Knowledge distillation for object detection based on Inconsistency-based Feature Imitation and Global Relation Imitation
    Ju, Peng
    Zhang, Yi
    NEUROCOMPUTING, 2024, 566
  • [10] Mask-Guided Clothes-Irrelevant and Background-Irrelevant Network with Knowledge Propagation for Cloth-Changing Person Re-identification
    Zhu, Gaofeng
    Liu, Gan
    Chen, Longtao
    Liao, Guoxing
    Zeng, Huanqiang
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2024, PT XII, 2025, 15042 : 229 - 242