Remember the Past: Distilling Datasets into Addressable Memories for Neural Networks

Cited by: 0
Authors
Deng, Zhiwei [1 ]
Russakovsky, Olga [1 ]
Affiliations
[1] Princeton University, Department of Computer Science, Princeton, NJ 08544, USA
Source
Advances in Neural Information Processing Systems 35 (NeurIPS 2022) | 2022
Funding
U.S. National Science Foundation
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We propose an algorithm that compresses the critical information of a large dataset into compact addressable memories. These memories can then be recalled to quickly re-train a neural network and recover the performance (instead of storing and re-training on the full original dataset). Building upon the dataset distillation framework, we make a key observation that a shared common representation allows for more efficient and effective distillation. Concretely, we learn a set of bases (aka "memories") which are shared between classes and combined through learned flexible addressing functions to generate a diverse set of training examples. This leads to several benefits: 1) the size of the compressed data does not necessarily grow linearly with the number of classes; 2) an overall higher compression rate with more effective distillation is achieved; and 3) more generalized queries are allowed beyond recalling the original classes. We demonstrate state-of-the-art results on the dataset distillation task across six benchmarks, including retained-accuracy improvements of up to 16.5% when distilling CIFAR10 and 9.7% when distilling CIFAR100. We then leverage our framework to perform continual learning, achieving state-of-the-art results on four benchmarks, with a 23.2% accuracy improvement on MANY. The code is released on our project webpage.
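The abstract describes synthetic training examples being "recalled" as combinations of shared memory bases through learned addressing functions. Below is a minimal sketch of that idea, not the authors' released implementation: the class name, tensor shapes, the linear form of the addressing, and the CIFAR10-sized defaults are all assumptions made for illustration.

```python
# Minimal sketch (illustrative only, not the paper's released code):
# synthetic examples are generated as learned linear combinations of
# shared memory bases, addressed per class. Shapes and names are assumed.
import torch
import torch.nn as nn


class AddressableMemory(nn.Module):
    def __init__(self, num_bases=64, feat_dim=3 * 32 * 32,
                 num_classes=10, recalls_per_class=10):
        super().__init__()
        # Shared bases ("memories"), reused across all classes.
        self.bases = nn.Parameter(torch.randn(num_bases, feat_dim) * 0.01)
        # Addressing coefficients: each class recalls several synthetic
        # examples as weighted combinations of the shared bases.
        self.addresses = nn.Parameter(
            torch.randn(num_classes, recalls_per_class, num_bases) * 0.01)

    def recall(self):
        # (classes, recalls, bases) x (bases, feat) -> (classes, recalls, feat)
        x = torch.einsum('crb,bf->crf', self.addresses, self.bases)
        num_classes, recalls, _ = x.shape
        images = x.reshape(num_classes * recalls, 3, 32, 32)
        labels = torch.arange(num_classes).repeat_interleave(recalls)
        return images, labels


# Usage: recall the distilled set, then train a small network on it.
memory = AddressableMemory()
images, labels = memory.recall()
print(images.shape, labels.shape)  # (100, 3, 32, 32) and (100,)
```

Because the bases are shared across classes and only the addressing coefficients are per-class, the stored size need not grow linearly with the number of classes, which is the benefit the abstract highlights; in the paper both the bases and the addressing functions are optimized end-to-end under a dataset distillation objective.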
Pages: 14