Differentiable Bi-Sparse Multi-View Co-Clustering

Cited by: 37
Authors
Du, Shide [1 ,2 ]
Liu, Zhanghui [1 ,2 ]
Chen, Zhaoliang [1 ,2 ]
Yang, Wenyuan [3 ]
Wang, Shiping [1 ,2 ]
Affiliations
[1] Fuzhou Univ, Coll Math & Comp Sci, Fuzhou 350116, Peoples R China
[2] Fuzhou Univ, Fujian Prov Key Lab Network Comp & Intelligent In, Fuzhou 350116, Peoples R China
[3] Minnan Normal Univ, Fujian Key Lab Granular Comp & Applicat, Zhangzhou 363000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Collaboration; Deep learning; multi-view clustering; co-clustering; sparse representation; differentiable blocks;
DOI
10.1109/TSP.2021.3101979
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronics & Communication Technology];
Discipline codes
0808 ; 0809 ;
Abstract
Deep multi-view clustering uses neural networks to extract the latent complementary and consistent information shared among multi-view features, yielding a consistent representation that improves clustering performance. Although a multitude of deep multi-view clustering approaches have been proposed, most lack theoretical interpretability despite their good performance. In this paper, we propose an effective differentiable network with alternating iterative optimization for multi-view co-clustering, termed differentiable bi-sparse multi-view co-clustering (DBMC), together with an extension named elevated DBMC (EDBMC). The proposed methods are transformed into equivalent deep networks derived from the constructed objective loss functions, combining the strong interpretability of classical machine learning methods with the superior performance of deep networks. Moreover, DBMC and EDBMC learn a joint, consistent collaborative representation from multi-source features and guarantee sparsity in both the multi-view feature space and the single-view sample space. They can also be converted into deep differentiable network frameworks with block-wise iterative training. Correspondingly, we design two three-step iterative differentiable networks to solve the resulting optimization problems with theoretically guaranteed convergence. Extensive experiments on six multi-view benchmark datasets demonstrate that the proposed frameworks outperform other state-of-the-art multi-view clustering methods.
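The core idea behind "differentiable blocks" in the abstract can be illustrated with a minimal sketch of algorithm unrolling. This is not the authors' DBMC implementation; the function `unfolded_ista`, the dictionary `D`, and the block count are illustrative assumptions. An ISTA-style sparse-coding update, repeated for a fixed number of steps, becomes a feed-forward network whose layers are the iterations and whose outputs are sparse codes:

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of the l1 norm: shrinks entries toward zero,
    # which is what induces sparsity in the learned representation.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def unfolded_ista(D, x, num_blocks=3, lam=0.1):
    """Unroll ISTA for the sparse coding problem
        min_z 0.5 * ||x - D z||^2 + lam * ||z||_1
    into a fixed number of differentiable blocks (layers)."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(num_blocks):
        grad = D.T @ (D @ z - x)           # gradient of the quadratic term
        z = soft_threshold(z - grad / L, lam / L)  # one differentiable block
    return z
```

Because every block is a composition of differentiable operations, the step sizes and thresholds can be made learnable parameters and trained block-wise, which is the general mechanism the paper's differentiable network frameworks build on.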
Pages: 4623-4636
Number of pages: 14