Class incremental learning via Multi-hinge distillation

Cited by: 0
Authors
Lin, Qianhe [1 ]
Yu, Yuanlong [2 ]
Huang, Zhiyong [3 ]
Affiliations
[1] FuZhou Univ, Coll Software, Fuzhou, Peoples R China
[2] FuZhou Univ, Coll Math & Comp Sci, Fuzhou, Peoples R China
[3] ZheJiang Lab, Intelligent Robot Res Ctr, Hangzhou, Peoples R China
Source
2020 CHINESE AUTOMATION CONGRESS (CAC 2020) | 2020
Keywords
class incremental learning; catastrophic forgetting; distillation
DOI
10.1109/CAC51589.2020.9327814
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Class incremental learning addresses the setting in which training data arrives continuously and is used to extend a model's knowledge. It has been applied in many fields, such as pattern recognition and computer vision. However, traditional incremental learning methods face the challenge of catastrophic forgetting. To this end, we use a generative adversarial network to assist the model's class incremental learning. Some existing methods use the old classifier's soft-target outputs on generated pseudo-data to help the classifier retain knowledge of old classes, but training on generated samples with soft targets alongside new-class samples with explicit labels creates an information asymmetry between old and new classes during incremental learning. This paper presents a new knowledge distillation method based on a multi-hinge cGAN: multi-hinge distillation transfers class-correlation information among all classes from the discriminator to the classifier, helping the classifier overcome catastrophic forgetting. Experiments on MNIST and CIFAR-10 show that our method is comparable to many state-of-the-art incremental learning methods.
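For a concrete picture of the distillation idea the abstract describes, the sketch below shows one plausible form of the training objective: a multi-class hinge loss that distills the discriminator's class scores on generated pseudo-data into the classifier, combined with standard cross-entropy on real new-class data. This is only an illustration assuming a PyTorch setup with a class-conditional generator and a discriminator that outputs per-class scores; all names (`multi_hinge_distill_loss`, `incremental_step`, `num_old_classes`, `lam`) are hypothetical and not taken from the paper.

```python
# A minimal sketch of multi-hinge distillation with pseudo-replay, assuming
# PyTorch. `classifier`, `discriminator`, `generator`, `num_old_classes`,
# `z_dim`, and `lam` are illustrative assumptions, not the authors' code.
import torch
import torch.nn.functional as F

def multi_hinge_distill_loss(cls_logits, disc_logits, margin=1.0):
    """Push the classifier's score for the discriminator's predicted class
    above every other class score by at least `margin` (Crammer-Singer
    multi-class hinge, as implemented by F.multi_margin_loss).

    cls_logits:  (B, K) classifier logits on generated pseudo-samples
    disc_logits: (B, K) discriminator class scores on the same samples
    """
    target = disc_logits.argmax(dim=1)  # discriminator's class belief
    return F.multi_margin_loss(cls_logits, target, margin=margin)

def incremental_step(classifier, discriminator, generator,
                     new_x, new_y, num_old_classes,
                     batch_size=64, z_dim=128, lam=1.0):
    """One training step: cross-entropy on labelled new-class data plus
    multi-hinge distillation on pseudo-data replayed for the old classes."""
    # Supervised loss on the new task's real samples.
    ce_loss = F.cross_entropy(classifier(new_x), new_y)

    # Replay: draw pseudo-samples for old classes from the conditional GAN.
    z = torch.randn(batch_size, z_dim)
    old_y = torch.randint(0, num_old_classes, (batch_size,))
    fake_x = generator(z, old_y).detach()

    # Distill the discriminator's class knowledge into the classifier.
    distill = multi_hinge_distill_loss(classifier(fake_x),
                                       discriminator(fake_x).detach())
    return ce_loss + lam * distill
```

In this sketch, replacing the usual soft-target distillation with a multi-class hinge gives old-class pseudo-data the same margin-based supervision as the explicitly labelled new-class data, which is one way to avoid the information asymmetry the abstract describes.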
Pages: 5996-6000
Number of pages: 5