Feature Estimations Based Correlation Distillation for Incremental Image Retrieval

Cited by: 18
Authors
Chen, Wei [1 ]
Liu, Yu [2 ]
Pu, Nan [1 ]
Wang, Weiping [3 ]
Liu, Li [3 ,4 ]
Lew, Michael S. [1 ]
Affiliations
[1] Leiden Univ, Leiden Inst Adv Comp Sci, NL-2311 EZ Leiden, Netherlands
[2] Dalian Univ Technol, DUT RU Int Sch Informat Sci & Engn, Dalian 116024, Peoples R China
[3] NUDT, Coll Syst Engn, Changsha 410073, Peoples R China
[4] Univ Oulu, Ctr Machine Vis & Signal Anal, Oulu 90014, Finland
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Correlation; Data models; Modeling; Training; Context modeling; Image retrieval; Incremental learning; fine-grained image retrieval; correlations distillation; feature estimation; KNOWLEDGE;
DOI
10.1109/TMM.2021.3073279
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Deep learning for fine-grained image retrieval in an incremental context is under-investigated. In this paper, we explore this task to realize a model's continuous retrieval ability; that is, the model should perform well on new incoming data while reducing forgetting of the knowledge learned on preceding old tasks. For this purpose, we distill the knowledge of semantic correlations among representations extracted from the new data only, so as to regularize parameter updates within a teacher-student framework. In particular, when learning multiple tasks sequentially, aside from the correlations distilled from the penultimate model, we estimate the representations of all prior models, and further their semantic correlations, using the representations extracted from the new data. The estimated correlations then serve as an additional regularization that further prevents catastrophic forgetting over all previous tasks, making it unnecessary to save the stream of models trained on these tasks. Extensive experiments demonstrate that the proposed method performs favorably in retaining performance on already-trained old tasks while achieving good accuracy on the current task, whether new data are added at once or sequentially.
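The correlation-distillation idea described above can be sketched minimally as follows: compute pairwise semantic-correlation matrices over the same batch of new data for the teacher (old model) and student (new model), then penalize their drift. This is an illustrative assumption of the mechanism, not the paper's exact formulation; the choice of cosine similarity for the correlation matrix and a mean-squared penalty are hypothetical details here.

```python
import numpy as np

def correlation_matrix(feats):
    # L2-normalize each feature vector, then take the cosine-similarity
    # matrix as the batch's semantic correlations.
    norms = np.linalg.norm(feats, axis=1, keepdims=True)
    f = feats / np.clip(norms, 1e-12, None)
    return f @ f.T

def correlation_distillation_loss(student_feats, teacher_feats):
    # Regularization term: penalize divergence between the correlation
    # structures of the student and the (frozen) teacher on new data.
    c_student = correlation_matrix(student_feats)
    c_teacher = correlation_matrix(teacher_feats)
    return float(np.mean((c_student - c_teacher) ** 2))

# Toy batch of "new task" features: identical representations
# yield zero distillation loss, i.e. no forgetting penalty.
rng = np.random.default_rng(0)
batch = rng.standard_normal((4, 8))
print(correlation_distillation_loss(batch, batch))  # → 0.0
```

In training, this loss would be added to the task loss so that the student can fit new data while its correlation structure stays close to the teacher's.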
Pages: 1844-1856
Page count: 13