STATIONARY-POINTS OF A SINGLE-LAYER PERCEPTRON FOR NONSEPARABLE DATA MODELS

Cited by: 4
Authors
SHYNK, JJ [1 ]
BERSHAD, NJ [1 ]
Affiliation
[1] UNIV CALIF IRVINE,IRVINE,CA 92717
Keywords
PERCEPTRONS; STOCHASTIC TRAINING MODELS; NONSEPARABLE DATA MODELS; LEARNING ALGORITHMS; ROSENBLATT ALGORITHM; CONVERGENCE ANALYSIS; STATIONARY POINTS; PERFORMANCE SURFACES;
DOI
10.1016/0893-6080(93)90016-P
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A single-layer perceptron divides the input signal space into two regions separated by a hyperplane. In many applications, however, the training signal of the adaptive algorithm represents more complicated decision regions that may not be linearly separable. For these cases, it is usually not obvious how the adaptive algorithm of a single-layer perceptron will perform, in terms of its convergence properties and the optimum location of the hyperplane boundary. In this paper, we determine the stationary points of Rosenblatt's learning algorithm for a single-layer perceptron and two nonseparable models of the training data. Our analysis is based on a system identification formulation of the training signal, and the perceptron input signals are modeled as independent Gaussian sequences. Expressions for the corresponding performance function are also derived, and computer simulations are presented that verify the analytical results.
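The setting described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's exact model: it assumes the training signal is generated by a fixed "reference" perceptron (a system identification formulation), made nonseparable by additive Gaussian noise on the reference output, with independent Gaussian input vectors, and adapts the weights with Rosenblatt's rule. The noise level, step size, and dimensions are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, steps, mu = 4, 20000, 0.01

# Assumed system-identification setup: the desired signal comes from a
# fixed reference weight vector w_ref through a hard limiter, and additive
# noise makes the resulting training data linearly nonseparable.
w_ref = rng.standard_normal(n)
w = np.zeros(n)

for _ in range(steps):
    x = rng.standard_normal(n)                 # independent Gaussian inputs
    d = np.sign(w_ref @ x + 0.5 * rng.standard_normal())  # noisy, nonseparable target
    y = np.sign(w @ x)                         # perceptron output (hard limiter)
    w += mu * (d - y) * x                      # Rosenblatt's learning rule

# For this model, the adapted hyperplane tends to align in direction with
# the reference hyperplane, which we can check via cosine similarity.
cos = w @ w_ref / (np.linalg.norm(w) * np.linalg.norm(w_ref))
```

Because the data are nonseparable, the weights never settle exactly; the interesting quantity, as in the paper's analysis, is the direction of the hyperplane rather than the weight magnitude, which is why the sketch reports a normalized alignment measure.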
Pages: 189-202
Page count: 14