Affine Subspace Robust Low-Rank Self-Representation: From Matrix to Tensor

Cited by: 23
Authors
Tang, Yongqiang [1 ]
Xie, Yuan [2 ]
Zhang, Wensheng [1 ,3 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, State Key Lab Multimodal Artificial Intelligence, Beijing 100190, Peoples R China
[2] East China Normal Univ, Sch Comp Sci & Technol, Shanghai 200050, Peoples R China
[3] Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 101408, Peoples R China
Funding
Natural Science Foundation of Shanghai; National Natural Science Foundation of China;
Keywords
Affine subspace; low-rank representation; low-rank tensor; multi-view learning; subspace clustering; CLASSIFICATION; FACTORIZATION; APPROXIMATION; ALGORITHM;
DOI
10.1109/TPAMI.2023.3257407
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Low-rank self-representation based subspace learning has demonstrated great effectiveness in a broad range of applications. Nevertheless, existing studies mainly focus on exploring the global linear subspace structure, and cannot adequately handle the case where the samples approximately (i.e., the samples contain data errors) lie in several more general affine subspaces. To overcome this drawback, in this paper, we propose to introduce affine and nonnegative constraints into low-rank self-representation learning. While conceptually simple, we provide their underlying theoretical insight from a geometric perspective. Together, the two constraints geometrically restrict each sample to be expressed as a convex combination of other samples in the same subspace. In this way, when exploring the global affine subspace structure, we can also account for the specific local distribution of data in each subspace. To comprehensively demonstrate the benefits of introducing the two constraints, we instantiate three low-rank self-representation methods ranging from single-view low-rank matrix learning to multi-view low-rank tensor learning. We carefully design the solution algorithms to efficiently optimize the proposed three approaches. Extensive experiments are conducted on three typical tasks, including single-view subspace clustering, multi-view subspace clustering, and multi-view semi-supervised classification. The notably superior experimental results convincingly verify the effectiveness of our proposals.
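Geometrically, imposing both the affine constraint (each column of the self-representation matrix Z sums to one) and the nonnegativity constraint restricts each column of Z to the probability simplex, so every sample is reconstructed as a convex combination of the other samples. The sketch below is an illustration of this constraint set only, not the authors' full method: it omits the low-rank (nuclear-norm) term and solves just the simplex-constrained least-squares self-representation by projected gradient descent; all function and variable names are hypothetical.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of a vector onto {z : z >= 0, sum(z) = 1}."""
    u = np.sort(v)[::-1]                      # sort descending
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * idx > css - 1)[0][-1]  # largest index kept active
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def affine_nonneg_self_representation(X, n_iter=200):
    """Minimize ||X - X Z||_F^2 s.t. each column of Z lies on the simplex.

    X: (d, n) data matrix with samples as columns. Returns Z of shape (n, n)
    whose columns are convex-combination coefficients over all samples.
    (A sketch: the paper's methods additionally enforce low rank on Z.)
    """
    n = X.shape[1]
    G = X.T @ X                               # Gram matrix
    L = 2.0 * np.linalg.norm(G, 2)            # Lipschitz constant of the gradient
    Z = np.full((n, n), 1.0 / n)              # feasible initialization
    for _ in range(n_iter):
        grad = 2.0 * (G @ Z - G)              # gradient of ||X - XZ||_F^2
        Z = Z - grad / L                      # gradient step
        Z = np.apply_along_axis(project_simplex, 0, Z)  # per-column projection
    return Z

# Toy usage: 20 samples in a 5-dimensional ambient space.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 20))
Z = affine_nonneg_self_representation(X)
```

The per-column simplex projection is what couples the two constraints: after each gradient step, every column of Z is again a valid set of convex-combination weights.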
Pages: 9357-9373
Page count: 17