Self-splitting competitive learning: A new on-line clustering paradigm

Cited by: 155
Authors
Zhang, YJ [1]
Liu, ZQ [2]
Affiliations
[1] Univ Melbourne, Dept Comp Sci & Software Engn, Parkville, Vic 3010, Australia
[2] City Univ Hong Kong, Sch Creat Media, Hong Kong, Hong Kong, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2002, Vol. 13, No. 2
Keywords
clustering; competitive learning; one-prototype-take-one-cluster (OPTOC); self-splitting; unsupervised learning; validity measure; winner-take-all (WTA);
DOI
10.1109/72.991422
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Clustering in the neural-network literature is generally based on the competitive learning paradigm. This paper addresses two major issues associated with conventional competitive learning, namely, sensitivity to initialization and difficulty in determining the number of prototypes. In general, selecting the appropriate number of prototypes is a difficult task, as we do not usually know the number of clusters in the input data a priori. It is therefore desirable to develop an algorithm that has no dependency on the initial prototype locations and is able to adaptively generate prototypes to fit the input data patterns. In this paper, we present a new, more powerful competitive learning algorithm, self-splitting competitive learning (SSCL), that is able to find the natural number of clusters based on the one-prototype-take-one-cluster (OPTOC) paradigm and a self-splitting validity measure. It starts with a single prototype randomly initialized in the feature space and splits adaptively during the learning process until all clusters are found; each cluster is then associated with a prototype at its center. We have conducted extensive experiments to demonstrate the effectiveness of the SSCL algorithm. The results show that SSCL performs as desired in a variety of applications, including unsupervised classification, curve detection, and image segmentation.
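As a reading aid, the abstract's OPTOC-style self-splitting loop can be pictured with the minimal Python sketch below. The function name sscl_sketch, the learning rates, the auxiliary-vector (APV) update, and the distance-based split test are illustrative assumptions only; they are not the paper's actual update rules or validity measure, for which see the DOI above.

    # Illustrative sketch only: a simplified one-prototype-take-one-cluster (OPTOC)
    # style loop with self-splitting. All constants and the split heuristic are
    # placeholder choices, not the published SSCL rules.
    import numpy as np

    def sscl_sketch(X, max_prototypes=10, epochs=20, seed=0):
        rng = np.random.default_rng(seed)
        prototypes = [X[rng.integers(len(X))].astype(float)]   # start from one random prototype
        apvs = [X[rng.integers(len(X))].astype(float)]          # one auxiliary vector per prototype

        for _ in range(max_prototypes):
            for _ in range(epochs):
                for x in X[rng.permutation(len(X))]:
                    # winner-take-all step: nearest prototype wins this input
                    k = int(np.argmin([np.linalg.norm(x - p) for p in prototypes]))
                    p, a = prototypes[k], apvs[k]
                    # OPTOC flavour: the winner only learns from inputs closer to it
                    # than its auxiliary vector, so it settles on a single cluster.
                    if np.linalg.norm(x - p) <= np.linalg.norm(a - p):
                        prototypes[k] = p + 0.05 * (x - p)
                        apvs[k] = a + 0.5 * (x - a)   # auxiliary vector drifts into the claimed cluster
            # crude split test (stand-in for the paper's validity measure): if some
            # inputs remain far from every prototype, spawn a new prototype there.
            dists = np.array([min(np.linalg.norm(x - p) for p in prototypes) for x in X])
            if dists.max() < 3.0 * dists.mean():
                break
            far = X[int(np.argmax(dists))]
            prototypes.append(far.astype(float))
            apvs.append(far.astype(float) + rng.normal(scale=1e-3, size=far.shape))
        return np.array(prototypes)

    # usage on toy data (three Gaussian blobs):
    # X = np.vstack([np.random.randn(100, 2) + c for c in ([0, 0], [6, 6], [0, 6])])
    # print(sscl_sketch(X))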
Pages: 369-380
Number of pages: 12