Interpreting and Improving Multi-Layered Networks by Free Energy-Based Competitive Learning

Cited by: 0
Author
Kamimura, Ryotaro [1 ]
Affiliation
[1] Tokai Univ, Informat Sci Educ Ctr, Kanagawa 2591292, Japan
Source
2008 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC), VOLS 1-6 | 2008
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we propose a new algorithm to interpret and improve multi-layered neural networks. The method simplifies and interprets the final representations obtained by conventional learning methods such as BP or RBF networks, and it is also used to retrain networks so as to improve generalization performance. Neural networks have produced effective internal representations for many problems, but because the acquired information is highly distributed over many elements, the meaning of those representations is difficult to interpret. To address this problem, we use competitive learning, which lets us focus on a small number of elements in a network. In addition, because the number of effective elements can be significantly reduced, improved generalization can be expected. Competitive processes are treated as a process of entropy minimization in which a small number of units acquire the majority of the information on the input patterns. To simplify computation, we use a free energy in which entropy and training errors are used simultaneously. We applied the method to an artificial data set and to a cabinet approval rating estimation problem. In both problems, we succeeded in extracting the main features of the input patterns and obtained improved generalization performance.
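The abstract's core idea, a free energy that combines training error (distortion) and entropy of unit responsibilities, can be illustrated with a generic soft competitive learning step. This is a minimal sketch of the standard free energy formulation F = -(1/β) Σ log Σ exp(-β·d²), not the paper's exact algorithm; the function name, the inverse temperature `beta`, and the learning rate are illustrative assumptions.

```python
import numpy as np

def soft_competitive_step(X, W, beta=2.0, lr=0.5):
    """One step of soft competitive learning driven by a free energy.

    X: (n, d) input patterns; W: (k, d) competitive-unit weights.
    beta (inverse temperature) controls entropy: large beta concentrates
    responsibility on a few units (low entropy, near winner-take-all),
    small beta spreads responsibility diffusely.
    NOTE: a generic sketch of free energy-based soft competition,
    not the specific method proposed in this paper.
    """
    # Squared distance from every pattern to every unit: (n, k).
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=-1)
    logits = -beta * d2
    # Log-sum-exp trick for numerical stability.
    m = logits.max(axis=1, keepdims=True)
    lse = m[:, 0] + np.log(np.exp(logits - m).sum(axis=1))
    # Free energy = expected distortion minus temperature-weighted entropy.
    F = -(1.0 / beta) * lse.sum()
    # Soft responsibilities p(j|x) proportional to exp(-beta * d2).
    p = np.exp(logits - m)
    p /= p.sum(axis=1, keepdims=True)
    # Move each unit toward the responsibility-weighted mean of its inputs.
    W_new = W + lr * (p.T @ X / p.sum(axis=0)[:, None] - W)
    return W_new, F, p
```

Iterating this step on clustered data drives the free energy down while a small number of units come to dominate the responsibilities, which is the entropy minimization view of competition described in the abstract.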
Pages: 1811-1817
Page count: 7