Progressive Meta-Learning With Curriculum

Cited by: 43
Authors
Zhang, Ji [1 ]
Song, Jingkuan [2 ]
Gao, Lianli [1 ]
Liu, Ye [1 ]
Shen, Heng Tao [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Ctr Future Media, Chengdu 611730, Sichuan, Peoples R China
[2] Univ Elect Sci & Technol China, Sichuan Prov Peoples Hosp, Inst Neurol, Chengdu 610072, Sichuan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Training; Adaptation models; Computational modeling; Ear; Standards; Pediatrics; Few-shot learning; meta-learning; curriculum learning; self-paced learning; hard task-sampling;
DOI
10.1109/TCSVT.2022.3164190
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Subject classification codes
0808; 0809;
Abstract
Meta-learning offers an effective solution for learning new concepts under scarce supervision through an episodic-training scheme: a series of target-like tasks sampled from base classes is sequentially fed into a meta-learner to extract cross-task knowledge, which facilitates the quick acquisition of task-specific knowledge of the target task from few samples. Despite its noticeable improvements, the episodic-training strategy samples tasks randomly and uniformly, without considering their hardness and quality, and thus may not progressively improve the meta-learner's generalization. In this paper, we propose Progressive Meta-learning, which presents tasks from easy to hard. First, based on a predefined curriculum, we develop a Curriculum-Based Meta-learning (CubMeta) method. CubMeta works in a stepwise manner: in each step, we design a BrotherNet module to construct harder tasks and an effective learning scheme to obtain an ensemble of stronger meta-learners. We then move a step further and propose an end-to-end Self-Paced Meta-learning (SepMeta) method. In SepMeta, the curriculum is integrated into the objective as a regularization term, so that the meta-learner can measure task hardness adaptively according to what the model has already learned. Extensive experiments on benchmark datasets demonstrate the effectiveness of the proposed methods. Our code is available at https://github.com/nobody-777.
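The abstract states only that SepMeta folds the curriculum into the meta-objective as a regularization term that weighs task hardness adaptively; the exact form is not given here. As a rough illustration (not the paper's actual formulation), the sketch below applies the classic hard-weight self-paced regularizer (Kumar et al., 2010) at the task level: each sampled episode i with meta-loss L_i receives a binary weight v_i minimizing sum_i v_i * L_i - lambda * sum_i v_i, and the pace parameter lambda is raised during training so that harder tasks are admitted progressively. The function name and toy loss values are illustrative assumptions.

import numpy as np

def self_paced_task_weights(task_losses, lam):
    # Closed-form minimizer of sum_i v_i * L_i - lam * sum_i v_i over v in [0, 1]^N:
    # keep a task (v_i = 1) only if its current meta-loss is below the pace lam.
    return (np.asarray(task_losses, dtype=float) < lam).astype(float)

# Toy usage: meta-losses of five sampled episodes under a growing pace parameter.
episode_losses = np.array([0.3, 0.9, 1.5, 0.6, 2.1])
for lam in (0.5, 1.0, 2.0):  # easy-to-hard curriculum: raise lam as training proceeds
    v = self_paced_task_weights(episode_losses, lam)
    weighted_loss = float(v @ episode_losses)  # the data term sum_i v_i * L_i
    print(f"lam={lam}: weights={v.tolist()}, weighted meta-loss={weighted_loss:.2f}")

As lambda grows, more (and harder) episodes enter the weighted meta-loss, mirroring the easy-to-hard progression described in the abstract.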
Pages: 5916-5930
Number of pages: 15