Biased ART: A neural architecture that shifts attention toward previously disregarded features following an incorrect prediction

Cited by: 24
Authors
Carpenter, Gail A. [1 ]
Gaddam, Sai Chaitanya [1 ]
Affiliations
[1] Boston Univ, Dept Cognit & Neural Syst, Boston, MA 02215 USA
Funding
U.S. National Science Foundation;
Keywords
Adaptive resonance theory; ART; ARTMAP; Featural biasing; Supervised learning; Top-down/bottom-up interactions; MULTIDIMENSIONAL MAPS; NETWORK ARCHITECTURE; INFORMATION FUSION; RECOGNITION;
DOI
10.1016/j.neunet.2009.07.025
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Memories in Adaptive Resonance Theory (ART) networks are based on matched patterns that focus attention on those portions of bottom-up inputs that match active top-down expectations. While this learning strategy has proved successful for both brain models and applications, computational examples show that attention to early critical features may later distort memory representations during online fast learning. For supervised learning, biased ARTMAP (bARTMAP) solves the problem of over-emphasis on early critical features by directing attention away from previously attended features after the system makes a predictive error. Small-scale, hand-computed analog and binary examples illustrate key model dynamics. Two-dimensional simulation examples demonstrate the evolution of bARTMAP memories as they are learned online. Benchmark simulations show that featural biasing also improves performance on large-scale examples. One example, which predicts movie genres and is based, in part, on the Netflix Prize database, was developed for this project. Both first principles and consistent performance improvements on all simulation studies suggest that featural biasing should be incorporated by default in all ARTMAP systems. Benchmark datasets and bARTMAP code are available from the CNS Technology Lab Website: http://techlab.bu.edu/bART/. (C) 2009 Elsevier Ltd. All rights reserved.
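The abstract's central mechanism, shifting attention away from previously attended features after a predictive error, can be illustrated with a short sketch. The Python fragment below is a minimal illustration only, assuming a simplified rectified-difference bias applied to complement-coded inputs; the function names (biased_input, update_bias_after_error) and parameters (lam, decay) are hypothetical and do not reproduce the exact bARTMAP equations, which are given in the full paper and in the code available at http://techlab.bu.edu/bART/.

import numpy as np

# Minimal sketch (NOT the paper's exact equations): complement coding,
# a fuzzy-ART-style choice function, and a bias vector e that suppresses
# features attended at the time of a predictive error.

def complement_code(a):
    """Complement code an analog input a in [0,1]^M: A = (a, 1 - a)."""
    a = np.asarray(a, dtype=float)
    return np.concatenate([a, 1.0 - a])

def biased_input(A, e, lam=1.0):
    """Assumed simplification: damp previously attended features
    with a rectified difference [A - lam*e]^+."""
    return np.maximum(A - lam * e, 0.0)

def choose_category(A_b, weights, alpha=0.001):
    """Fuzzy ART choice function T_j = |A_b ^ w_j| / (alpha + |w_j|)."""
    T = [np.minimum(A_b, w).sum() / (alpha + w.sum()) for w in weights]
    return int(np.argmax(T))

def update_bias_after_error(e, A, w_J, decay=0.5):
    """After a predictive error, add the attended pattern x = A ^ w_J
    to the bias vector so those features receive less attention later."""
    attended = np.minimum(A, w_J)
    return np.clip(decay * e + attended, 0.0, 1.0)

# Toy usage: one input, an uncommitted category, then a predictive error.
A = complement_code([0.9, 0.1])
w_J = np.ones_like(A)                 # uncommitted category weights
e = np.zeros_like(A)
e = update_bias_after_error(e, A, w_J)
print(biased_input(A, e))             # attended features are suppressed

In the full model, biasing interacts with ARTMAP match tracking and vigilance during supervised learning; this fragment isolates only the attention-shifting idea described in the abstract.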
Pages: 435-451
Page count: 17