Robust Dimension Reduction for Clustering With Local Adaptive Learning

Cited by: 30
Authors:
Wang, Xiao-Dong [1 ,2 ]
Chen, Rung-Ching [2 ]
Zeng, Zhi-Qiang [1 ]
Hong, Chao-Qun [1 ]
Yan, Fei [1 ]
Affiliations:
[1] Xiamen Univ Technol, Coll Comp & Informat Engn, Xiamen 361024, Peoples R China
[2] Chaoyang Univ Technol, Dept Informat Management, Taichung 413, Taiwan
Keywords:
Dimension reduction; K-means; l(2,1)-norm; manifold learning; K-MEANS; ALGORITHMS; FRAMEWORK; EXTENSIONS;
DOI:
10.1109/TNNLS.2018.2850823
CLC classification:
TP18 [Artificial Intelligence Theory];
Discipline codes:
081104 ; 0812 ; 0835 ; 1405 ;
Abstract:
In pattern recognition and data mining, clustering is a classical technique for grouping matters of interest and has been widely employed in numerous applications. Among various clustering algorithms, K-means (KM) clustering is the most popular for its simplicity and efficiency. However, with the rapid development of social networks, high-dimensional data are frequently generated, which poses a considerable challenge to traditional KM clustering, known as the curse of dimensionality. In such scenarios, it is difficult to directly cluster such high-dimensional data, which always contain redundant features and noise. Although existing approaches try to solve this problem using joint subspace learning and KM clustering, they still have the following limitations: 1) the discriminative information in the low-dimensional subspace is not well captured; 2) the intrinsic geometric information is seldom considered; and 3) the optimization of the discrete cluster indicator matrix is vulnerable to noise. In this paper, we propose a novel clustering model to cope with the above-mentioned challenges. Within the proposed model, discriminative information is adaptively explored by unifying local adaptive subspace learning and KM clustering. We extend the proposed model using a robust l(2,1)-norm loss function, where the robust cluster centroid can be calculated in a weighted iterative procedure. We also explore and discuss the relationships between the proposed algorithm and several related studies. Extensive experiments on various benchmark data sets demonstrate the advantage of the proposed model compared with state-of-the-art clustering approaches.
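The weighted iterative procedure mentioned in the abstract can be illustrated with a minimal sketch. The snippet below is NOT the authors' implementation (which jointly learns the subspace); it only shows, under the usual l(2,1)-norm formulation, how a robust centroid update works: each centroid is re-estimated as a weighted mean with weights 1 / (2 * ||x_i - c||), so outliers pull the centroid far less than under the squared-error loss of standard K-means. The function name `robust_kmeans` is hypothetical.

```python
import numpy as np

def robust_kmeans(X, k, init=None, n_iter=50, eps=1e-8, seed=0):
    """Illustrative sketch of K-means under an l2,1-norm loss.

    Hypothetical helper, not the paper's algorithm: the update step
    re-estimates each centroid as a weighted mean with weights
    1 / (2 * ||x_i - c||), one IRLS iteration per K-means pass.
    """
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    if init is None:
        centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    else:
        centroids = np.asarray(init, dtype=float).copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assignment step: nearest centroid in Euclidean distance.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: weighted mean per cluster; small distances get
        # large weights, so far-away (noisy) points contribute little.
        for j in range(k):
            pts = X[labels == j]
            if len(pts) == 0:
                continue  # keep an empty cluster's centroid unchanged
            w = 1.0 / (2.0 * np.linalg.norm(pts - centroids[j], axis=1) + eps)
            centroids[j] = (w[:, None] * pts).sum(axis=0) / w.sum()
    return centroids, labels
```

With a gross outlier added to two well-separated Gaussian clusters, the centroids stay close to the true cluster centers, whereas a squared-error mean would be dragged toward the outlier.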
Pages: 657-669 (13 pages)
Related papers (50 total):
  • [21] Dimension reduction for model-based clustering
    Scrucca, Luca
    STATISTICS AND COMPUTING, 2010, 20 (04) : 471 - 484
  • [22] An adaptive estimation of dimension reduction space
    Xia, YC
    Tong, H
    Li, WK
    Zhu, LX
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 2002, 64 : 363 - 388
  • [23] A note on structural adaptive dimension reduction
    Polzehl, Joerg
    Sperlich, Stefan
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2009, 79 (06) : 805 - 818
  • [24] Minimax adaptive dimension reduction for regression
    Paris, Quentin
    JOURNAL OF MULTIVARIATE ANALYSIS, 2014, 128 : 186 - 202
  • [25] Structure adaptive approach for dimension reduction
    Hristache, M
    Juditsky, A
    Polzehl, J
    Spokoiny, V
    ANNALS OF STATISTICS, 2001, 29 (06): : 1537 - 1566
  • [26] LOCAL MANIFOLD LEARNING WITH ROBUST NEIGHBORS SELECTION FOR HYPERSPECTRAL DIMENSIONALITY REDUCTION
    Hong, Danfeng
    Yokoya, Naoto
    Zhu, Xiaoxiang
    2016 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), 2016, : 40 - 43
  • [27] Adaptive weighted ensemble clustering via kernel learning and local information preservation
    Li, Taiyong
    Shu, Xiaoyang
    Wu, Jiang
    Zheng, Qingxiao
    Lv, Xi
    Xu, Jiaxuan
    KNOWLEDGE-BASED SYSTEMS, 2024, 294
  • [28] Discriminative sparse embedding based on adaptive graph for dimension reduction
    Liu, Zhonghua
    Shi, Kaiming
    Zhang, Kaibing
    Ou, Weihua
    Wang, Lin
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2020, 94
  • [29] Local Regression and Global Information-Embedded Dimension Reduction
    Yao, Chao
    Han, Junwei
    Nie, Feiping
    Xiao, Fu
    Li, Xuelong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (10) : 4882 - 4893
  • [30] Robust dimension reduction based on canonical correlation
    Zhou, Jianhui
    JOURNAL OF MULTIVARIATE ANALYSIS, 2009, 100 (01) : 195 - 209