Overcoming Catastrophic Forgetting in Incremental Object Detection via Elastic Response Distillation

Cited by: 61
Authors
Feng, Tao [1]
Wang, Mang [1]
Yuan, Hangjie [2]
Affiliations
[1] Alibaba Group, Hangzhou, China
[2] Zhejiang University, Hangzhou, China
Source
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022
DOI
10.1109/CVPR52688.2022.00921
CLC number: TP18 [Artificial intelligence theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Traditional object detectors are ill-equipped for incremental learning: fine-tuning a well-trained detection model directly on new data alone leads to catastrophic forgetting. Knowledge distillation is a flexible way to mitigate catastrophic forgetting. In Incremental Object Detection (IOD), previous work mainly focuses on distilling a combination of features and responses, but under-explores the information contained in the responses themselves. In this paper, we propose a response-based incremental distillation method, dubbed Elastic Response Distillation (ERD), which elastically learns responses from both the classification head and the regression head. First, our method transfers category knowledge while equipping the student detector with the ability to retain localization information during incremental learning. In addition, we evaluate the quality of all locations and select valuable responses via an Elastic Response Selection (ERS) strategy. Finally, we show that knowledge from different responses should be assigned different importance during incremental distillation. Extensive experiments on MS COCO demonstrate that our method achieves state-of-the-art results and substantially narrows the performance gap towards full training.
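The abstract describes two generic ingredients of response-based distillation: a loss that pulls the student's head outputs toward the teacher's softened outputs, and a selection step that keeps only informative responses. The sketch below is an illustration only, assuming standard knowledge-distillation conventions (temperature-scaled softmax, KL divergence, a confidence threshold); the function names and the threshold rule are hypothetical and do not reproduce the paper's actual ERD/ERS formulation.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete probability distributions."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def response_distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Standard response-based distillation objective for one location:
    soften both classification responses with a temperature, then measure
    how far the student's distribution is from the teacher's."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return (temperature ** 2) * kl_divergence(p, q)

def select_confident_responses(all_logits, threshold=0.5):
    """Toy selection in the spirit of ERS (not the paper's rule): keep only
    teacher responses whose peak class confidence clears a threshold, so
    the student is not asked to imitate uninformative locations."""
    return [l for l in all_logits if max(softmax(l)) >= threshold]
```

In practice such a loss would be applied per anchor or per query across the detection head's outputs, with the regression head handled by a separate localization-aware term; the point here is only the shape of the response-distillation computation.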
Pages: 9417-9426 (10 pages)