Deep associative neural network for associative memory based on unsupervised representation learning

Cited by: 36
Authors
Liu, Jia [1 ]
Gong, Maoguo [1 ]
He, Haibo [2 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Key Lab Intelligent Percept & Image Understanding, Minist Educ, Xian 710071, Shaanxi, Peoples R China
[2] Univ Rhode Isl, Dept Elect Comp & Biomed Engn, Kingston, RI 02881 USA
Funding
National Natural Science Foundation of China; US National Science Foundation;
Keywords
Unsupervised representation learning; Deep neural network; Associative memory; Image recovery;
DOI
10.1016/j.neunet.2019.01.004
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a deep associative neural network (DANN) based on unsupervised representation learning for associative memory. In the brain, knowledge is learned by associating different types of sensory data, such as images and voice. Associative memory models that imitate such a learning process have been studied for decades, but their simple architectures prevent them from handling large-scale complex data as well as deep neural networks do. Therefore, we define a deep architecture consisting of a perception layer and hierarchical propagation layers. To learn the network parameters, we define a probabilistic model over the whole network, inspired by unsupervised representation learning models. The model is optimized by a modified contrastive divergence algorithm with a novel iterated sampling process. After training, given new or corrupted data, the network associates the correct label or the corrupted part. The DANN can address many machine learning problems, including not only classification but also depicting the data associated with a given label and recovering corrupted images. Experiments on the MNIST digits and CIFAR-10 datasets demonstrate the learning capability of the proposed DANN. (C) 2019 Elsevier Ltd. All rights reserved.
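The abstract refers to training with a modified contrastive divergence algorithm. As a rough illustration of the underlying principle only, the following Python sketch shows a plain CD-1 update for a single binary restricted Boltzmann machine layer; the function names, layer sizes, and the CD-1 simplification are assumptions, and the paper's actual DANN (perception layer, hierarchical propagation layers, iterated sampling) is not reproduced here.

    # Minimal sketch of one contrastive-divergence (CD-1) update for a single
    # binary RBM layer. Illustration of the training principle only; not the
    # authors' DANN or their modified CD with iterated sampling.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_step(v0, W, b, c, lr=0.01):
        """One CD-1 parameter update for a binary RBM.

        v0 : (batch, n_visible) binary data batch
        W  : (n_visible, n_hidden) weights
        b  : (n_visible,) visible biases
        c  : (n_hidden,) hidden biases
        """
        # Positive phase: sample hidden units conditioned on the data.
        ph0 = sigmoid(v0 @ W + c)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)

        # Negative phase: one Gibbs step down to the visible layer and back up.
        pv1 = sigmoid(h0 @ W.T + b)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + c)

        # Gradient approximation: data statistics minus model statistics.
        batch = v0.shape[0]
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / batch
        b += lr * (v0 - v1).mean(axis=0)
        c += lr * (ph0 - ph1).mean(axis=0)
        return W, b, c

    # Toy usage on random binary vectors (e.g. binarized MNIST-sized inputs).
    n_visible, n_hidden = 784, 128
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b = np.zeros(n_visible)
    c = np.zeros(n_hidden)
    batch = (rng.random((32, n_visible)) < 0.5).astype(float)
    W, b, c = cd1_step(batch, W, b, c)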
Pages: 41-53
Number of pages: 13