Adaptive local Principal Component Analysis improves the clustering of high-dimensional data

Cited by: 5
Authors
Migenda, Nico [1 ,3 ]
Moeller, Ralf [2 ]
Schenck, Wolfram [1 ]
Affiliations
[1] Bielefeld Univ Appl Sci & Arts, Ctr Appl Data Sci CfADS, Bielefeld, Germany
[2] Bielefeld Univ, Fac Technol, Comp Engn Grp, Bielefeld, Germany
[3] Schulstr 10, D-33330 Gutersloh, Germany
Keywords
High-dimensional clustering; Potential function; Adaptive learning rate; Ranking criteria; Neural network-based PCA; Mixture PCA; Local PCA; LEARNING ALGORITHM; DECOMPOSITION; CONVERGENCE;
DOI
10.1016/j.patcog.2023.110030
CLC number (Chinese Library Classification)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In local Principal Component Analysis (PCA), a distribution is approximated by multiple units, each representing a local region by a hyper-ellipsoid obtained through PCA. We present an extension of local PCA that adaptively adjusts both the learning rate of each unit and the potential function that guides the competition between the local units. Our local PCA method is an online neural network approach in which unit centers and shapes are modified after the presentation of each data point. For several benchmark distributions, we demonstrate that our method improves the overall quality of clustering, especially for high-dimensional distributions where many conventional methods do not perform satisfactorily. Our online method is also well suited for the processing of streaming data: the two adaptive mechanisms lead to a quick reorganization of the clustering when the underlying distribution changes.
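As context for the abstract, the following is a minimal, self-contained sketch (Python with NumPy) of the generic online local PCA idea it describes: several units compete for each incoming data point, and the winning unit moves its center toward the point and updates a principal direction with Oja's rule. This is not the method proposed in the paper; in particular, it uses a fixed learning rate and a plain Euclidean distance in place of the adaptive learning rates and adaptive potential function that are the paper's contribution. The names LocalPCAUnit and fit_online, and all parameter values, are illustrative assumptions.

import numpy as np


class LocalPCAUnit:
    """One local unit: a center plus a single principal direction."""

    def __init__(self, center, dim, rng):
        self.center = np.array(center, dtype=float)
        w = rng.normal(size=dim)
        self.w = w / np.linalg.norm(w)   # random unit-length start direction

    def distance(self, x):
        # Plain Euclidean distance stands in for the potential function
        # that guides the competition between units in the paper.
        return np.linalg.norm(x - self.center)

    def update(self, x, lr_center=0.05, lr_pca=0.05):
        # Move the center a small step toward the data point.
        self.center += lr_center * (x - self.center)
        # Oja's rule on the centered point keeps w close to the local
        # first principal direction and approximately unit length.
        d = x - self.center
        y = self.w @ d
        self.w += lr_pca * y * (d - y * self.w)


def fit_online(data, n_units=3, seed=0):
    """Single pass over the data stream with hard competition between units."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    idx = rng.choice(len(data), size=n_units, replace=False)
    units = [LocalPCAUnit(data[i], dim, rng) for i in idx]
    for x in data:
        winner = min(units, key=lambda u: u.distance(x))
        winner.update(x)
    return units


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two elongated Gaussian blobs as a toy data stream.
    a = rng.normal([0.0, 0.0, 0.0], [3.0, 0.5, 0.5], size=(500, 3))
    b = rng.normal([8.0, 8.0, 8.0], [0.5, 3.0, 0.5], size=(500, 3))
    data = rng.permutation(np.vstack([a, b]))
    for u in fit_online(data, n_units=2):
        print("center", np.round(u.center, 2), "direction", np.round(u.w, 2))

With only one principal direction per unit, each unit approximates its local region by a line segment rather than a full hyper-ellipsoid; a fuller sketch would maintain several orthonormalized directions per unit, as the local PCA literature does.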
Pages: 16