Neural network processing for multiset data

Cited: 0
Author
McGregor, Simon
Affiliation
Source
Artificial Neural Networks - ICANN 2007, Pt 1, Proceedings | 2007 / Vol. 4668
Keywords
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper introduces the notion of the variadic neural network (VNN). The inputs to a variadic network are an arbitrary-length list of n-tuples of real numbers, where n is fixed. In contrast to a recurrent network which processes a list sequentially, typically being affected more by more recent list elements, a variadic network processes the list simultaneously and is affected equally by all list elements. Formally speaking, the network can be seen as instantiating a function on a multiset along with a member of that multiset. I describe a simple implementation of a variadic network architecture, the multi-layer variadic perceptron (MLVP), and present experimental results showing that such a network can learn various variadic functions by back-propagation.
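The key property the abstract describes is that every list element influences the network equally and simultaneously, with the output conditioned on one member of the multiset. The sketch below is not the paper's MLVP implementation; it is a minimal illustration of that property, assuming a shared per-element encoder and mean pooling (the pooling choice and layer sizes are my own assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def variadic_forward(X, W_elem, W_out):
    """Symmetric processing of an arbitrary-length list of n-tuples.

    Each row of X is one n-tuple. A shared layer encodes every element,
    the encodings are mean-pooled so that all elements contribute
    equally regardless of order, and the pooled context is combined
    with each element's own encoding to give a per-element output --
    a function of the multiset together with one of its members.
    """
    H = np.tanh(X @ W_elem)                      # shared per-element encoding
    context = H.mean(axis=0)                     # permutation-invariant pooling
    combined = np.concatenate([H, np.broadcast_to(context, H.shape)], axis=1)
    return np.tanh(combined @ W_out)

n, hidden = 3, 8                                 # tuple size and hidden width (assumed)
W_elem = rng.normal(size=(n, hidden)) * 0.5
W_out = rng.normal(size=(2 * hidden, 1)) * 0.5

X = rng.normal(size=(5, n))                      # a multiset of five 3-tuples
out = variadic_forward(X, W_elem, W_out)

# Permuting the input list only permutes the outputs accordingly:
# the computation depends on the multiset, not on list order.
perm = rng.permutation(5)
assert np.allclose(variadic_forward(X[perm], W_elem, W_out), out[perm])
```

Unlike a recurrent network, nothing here privileges recent elements; the same weights would accept a list of any length.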
Pages: 460-470
Page count: 11
References (7 items)
[1] Anonymous (1998). Lecture Notes in Computer Science. DOI 10.1007/S13928716.
[2] Collins, M. (2002). Advances in Neural Information Processing Systems, Vol. 14.
[3] Lodhi, H., Saunders, C., Shawe-Taylor, J., Cristianini, N., & Watkins, C. (2002). Text classification using string kernels. Journal of Machine Learning Research, 2(3), 419-444.
[4] Møller, M. F. (1993). A scaled conjugate gradient algorithm for fast supervised learning. Neural Networks, 6(4), 525-533.
[5] Petrovsky, A. B. (2006). International Journal of Management and Decision Making, Vol. 7, p. 166.
[6] Riedmiller, M. (1992). RPROP fast adaptive.
[7] Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, 323(6088), 533-536.