Remember the Past: Distilling Datasets into Addressable Memories for Neural Networks

Cited: 0
Authors
Deng, Zhiwei [1 ]
Russakovsky, Olga [1 ]
Affiliations
[1] Princeton Univ, Dept Comp Sci, Princeton, NJ 08544 USA
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022) | 2022
Funding
US National Science Foundation
Keywords
DOI
(none)
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405
Abstract
We propose an algorithm that compresses the critical information of a large dataset into compact addressable memories. These memories can then be recalled to quickly re-train a neural network and recover the performance (instead of storing and re-training on the full original dataset). Building upon the dataset distillation framework, we make a key observation that a shared common representation allows for more efficient and effective distillation. Concretely, we learn a set of bases (aka "memories") which are shared between classes and combined through learned flexible addressing functions to generate a diverse set of training examples. This leads to several benefits: 1) the size of compressed data does not necessarily grow linearly with the number of classes; 2) an overall higher compression rate with more effective distillation is achieved; and 3) more generalized queries are allowed beyond recalling the original classes. We demonstrate state-of-the-art results on the dataset distillation task across six benchmarks, including up to 16.5% and 9.7% in retained accuracy improvement when distilling CIFAR10 and CIFAR100 respectively. We then leverage our framework to perform continual learning, achieving state-of-the-art results on four benchmarks, with 23.2% accuracy improvement on MANY. The code is released on our project webpage(1).
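The mechanism described in the abstract, a set of class-agnostic memory bases recombined through learned addressing functions into synthetic training examples, can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: the sizes, the random placeholders standing in for the learned tensors, and the names `memories`, `addressing`, and `recall` are all hypothetical; in the actual method both tensors would be optimized through the dataset distillation objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): K shared bases ("memories"),
# each a flattened d-dimensional example; C classes; R recalled examples
# per class.
K, d, C, R = 16, 64, 10, 5

# In the actual method these are learned; random placeholders here.
memories = rng.normal(size=(K, d))       # shared bases, not tied to any class
addressing = rng.normal(size=(C, R, K))  # per-class addressing coefficients

def recall(class_idx: int) -> np.ndarray:
    """Recall R synthetic training examples for one class as linear
    combinations of the shared memory bases."""
    return addressing[class_idx] @ memories  # (R, K) @ (K, d) -> (R, d)

batch = recall(3)
print(batch.shape)  # (5, 64)
```

Under these assumptions the stored footprint is K*d + C*R*K numbers rather than C*R*d for storing the recalled examples directly, which illustrates benefit (1) in the abstract: because the bases are shared, the compressed size need not grow linearly with the number of classes.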
Pages: 14