Optimal Reduced Sets for Sparse Kernel Spectral Clustering

Cited: 0
Authors
Mall, Raghvendra [1 ]
Mehrkanoon, Siamak [1 ]
Langone, Rocco [1 ]
Suykens, Johan A. K. [1 ]
Affiliations
[1] ESAT SCD, Kasteelpk Arenberg 10,Bus 2446, B-3001 Heverlee, Belgium
Source
PROCEEDINGS OF THE 2014 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2014
Keywords
SELECTION; REGRESSION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Kernel spectral clustering (KSC) solves a weighted kernel principal component analysis problem in a primal-dual optimization framework and yields a clustering model from the dual solution of the problem. KSC has a powerful out-of-sample extension property that gives good clustering generalization on unseen data points. This property makes it possible to build a sparse model on a small training set, which introduces a first level of sparsity. The dual clustering model, however, is expressed in terms of non-sparse kernel expansions to which every point in the training set contributes. The goal is therefore to find a reduced set of training points that best approximates the original solution. In this paper a second level of sparsity is introduced in order to reduce the time complexity of the computationally expensive out-of-sample extension. We investigate several penalty-based reduced set techniques, including Group Lasso, L0, and L1 + L0 penalization, and compare the amount of sparsity gained with a previous L1 penalization technique. We observe that in the majority of cases the Group Lasso penalization yields the best results in terms of sparsity. We showcase the effectiveness of the proposed approaches on several real-world datasets and an image segmentation dataset.
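The Group Lasso reduced-set idea described in the abstract can be sketched as follows: the dual coefficient matrix is approximated by a row-sparse matrix, where each group is the row of coefficients belonging to one training point, so rows shrunk to zero drop that point from the kernel expansion. This is a minimal illustration only, not the authors' implementation: the RBF kernel, the proximal-gradient solver, and all parameter values are assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and Y (assumed kernel choice)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def group_lasso_reduced_set(K, alpha, lam=0.1, n_iter=500):
    """Approximate the dual coefficients `alpha` (N x k) by a row-sparse
    `beta`, minimizing
        0.5 * ||K @ (alpha - beta)||_F^2 + lam * sum_i ||beta_i||_2
    with proximal gradient descent. Each group is one row of `beta`;
    rows driven to zero correspond to training points removed from
    the reduced set (illustrative solver, not the paper's algorithm)."""
    beta = alpha.copy()
    L = np.linalg.norm(K, 2) ** 2            # Lipschitz constant of the smooth term
    step = 1.0 / L
    for _ in range(n_iter):
        grad = K.T @ (K @ (beta - alpha))    # gradient of the quadratic fit term
        z = beta - step * grad
        norms = np.linalg.norm(z, axis=1, keepdims=True)
        shrink = np.maximum(0.0, 1.0 - step * lam / np.maximum(norms, 1e-12))
        beta = shrink * z                     # group soft-thresholding, row by row
    return beta

# toy data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 2))
K = rbf_kernel(X, X, sigma=2.0)
alpha = rng.standard_normal((60, 3))         # stand-in for KSC dual coefficients
beta = group_lasso_reduced_set(K, alpha, lam=5.0)
reduced = np.flatnonzero(np.linalg.norm(beta, axis=1) > 1e-8)
print(f"reduced set size: {reduced.size} / {X.shape[0]}")
```

The out-of-sample score for a new point then only needs kernel evaluations against the surviving rows of `beta`, which is where the speed-up of the second sparsity level comes from.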
Pages: 2436-2443
Page count: 8