Unpaired Multi-View Graph Clustering With Cross-View Structure Matching

Cited by: 30
Authors
Wen, Yi [1 ]
Wang, Siwei [1 ]
Liao, Qing [2 ]
Liang, Weixuan [1 ]
Liang, Ke [1 ]
Wan, Xinhang [1 ]
Liu, Xinwang [1 ]
Affiliations
[1] Natl Univ Def Technol, Sch Comp, Changsha 410073, Peoples R China
[2] Harbin Inst Technol, Dept Comp Sci & Technol, Shenzhen 150006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Kernel; Visualization; Matrix decomposition; Learning systems; Fuses; Clustering algorithms; Graph fusion; graph learning; multi-view clustering (MVC); unpaired data; ALGORITHM; ROBUST;
DOI
10.1109/TNNLS.2023.3291696
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Multi-view clustering (MVC), which effectively fuses information from multiple views for better performance, has received increasing attention. Most existing MVC methods assume that multi-view data are fully paired, meaning the mappings between all corresponding samples across views are predefined or given in advance. However, the data correspondence is often incomplete in real-world applications due to data corruption or sensor differences, referred to as the data-unpaired problem (DUP) in the multi-view literature. Although several attempts have been made to address the DUP issue, they suffer from the following drawbacks: 1) most methods focus on feature representations while ignoring the structural information of multi-view data, which is essential for clustering tasks; 2) existing methods for partially unpaired problems rely on pregiven cross-view alignment information, so they cannot handle fully unpaired problems; and 3) the hyperparameters they inevitably introduce degrade the efficiency and applicability of the models. To tackle these issues, we propose a novel parameter-free graph clustering framework, termed unpaired multi-view graph clustering with cross-view structure matching (UPMGC-SM). Specifically, unlike existing methods, UPMGC-SM effectively utilizes the structural information from each view to refine cross-view correspondences. In addition, UPMGC-SM is a unified framework for both fully and partially unpaired multi-view graph clustering. Moreover, existing graph clustering methods can adopt UPMGC-SM to enhance their ability to handle unpaired scenarios. Extensive experiments demonstrate the effectiveness and generalization of our proposed framework on both paired and unpaired datasets.
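The abstract's core idea, recovering cross-view sample correspondences from each view's graph structure rather than from feature values, can be illustrated with a minimal sketch. This is not the authors' UPMGC-SM algorithm: the k-NN graph construction, the walk-count structural signatures, and the Hungarian matching step below are all illustrative assumptions chosen for a self-contained demo.

```python
# Hedged sketch of structure-based cross-view matching (NOT the paper's
# exact method). Each view gets a k-NN graph; per-node walk-count
# signatures are permutation-covariant, so matching signatures across
# views can recover the unknown sample correspondence.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist


def knn_adjacency(X, k=5):
    """Symmetric binary k-NN adjacency matrix over the rows of X."""
    D = cdist(X, X)
    np.fill_diagonal(D, np.inf)          # a sample is not its own neighbor
    idx = np.argsort(D, axis=1)[:, :k]   # indices of the k nearest neighbors
    A = np.zeros((len(X), len(X)))
    rows = np.repeat(np.arange(len(X)), k)
    A[rows, idx.ravel()] = 1.0
    return np.maximum(A, A.T)            # symmetrize


def structure_signature(A, d=4):
    """Per-node signature: counts of walks of length 1..d.

    Row sums of A^m count length-m walks from each node, a simple
    feature that depends only on graph structure, not on node order.
    """
    sig, P = [], A.copy()
    for _ in range(d):
        sig.append(P.sum(axis=1))
        P = P @ A
    return np.stack(sig, axis=1)         # shape (n_samples, d)


def match_views(X1, X2, k=5):
    """Align view-2 samples to view-1 samples by matching graph
    signatures with the Hungarian algorithm; returns indices into X2."""
    s1 = structure_signature(knn_adjacency(X1, k))
    s2 = structure_signature(knn_adjacency(X2, k))
    cost = cdist(s1, s2)                 # pairwise signature distances
    _, perm = linear_sum_assignment(cost)
    return perm


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X1 = rng.normal(size=(30, 8))
    true_perm = rng.permutation(30)      # unknown cross-view shuffle
    X2 = X1[true_perm] + 0.01 * rng.normal(size=(30, 8))
    perm = match_views(X1, X2)
    acc = float(np.mean(true_perm[perm] == np.arange(30)))
    print(f"valid permutation: {sorted(perm.tolist()) == list(range(30))}, "
          f"match rate: {acc:.2f}")
```

In this toy setting the two views share geometry up to a permutation and small noise, so the k-NN graphs are (near-)isomorphic and signature matching can recover much of the correspondence; real unpaired multi-view data is far harder, which is the gap the paper's structure-matching framework targets.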
Pages: 16049-16063 (15 pages)
Related Papers
Total: 75 entries
[11] Huang, Z. 2020. Proc. Adv. Neural Inf. Process. Syst., p. 2892.
[12] Kang, Zhao; Zhao, Xinjia; Peng, Chong; Zhu, Hongyuan; Zhou, Joey Tianyi; Peng, Xi; Chen, Wenyu; Xu, Zenglin. Partition level multiview subspace clustering. Neural Networks, 2020, 122: 279-288.
[13] Li, Liang; Wang, Siwei; Liu, Xinwang; Zhu, En; Shen, Li; Li, Kenli; Li, Keqin. Local Sample-Weighted Multiple Kernel Clustering With Consensus Discriminative Graph. IEEE Transactions on Neural Networks and Learning Systems, 2024, 35(2): 1721-1734.
[14] Li, M. 2016. Proc. 25th Int. Joint Conf. Artif. Intell., p. 1704.
[15] Li, R. H. 2019. Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, p. 2916.
[16] Li, Xiumin; Liu, Hui; Xue, Fangzheng; Zhou, Hongjun; Song, Yongduan. Liquid computing of spiking neural network with multi-clustered and active-neuron-dominant structure. Neurocomputing, 2017, 243: 155-165.
[17] Li, Xuelong; Zhang, Han; Wang, Rong; Nie, Feiping. Multiview Clustering: A Scalable and Parameter-Free Bipartite Graph Fusion Method. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(1): 330-344.
[18] Li, Zhenglai; Tang, Chang; Liu, Xinwang; Zheng, Xiao; Zhang, Wei; Zhu, En. Consensus Graph Learning for Multi-View Clustering. IEEE Transactions on Multimedia, 2022, 24: 2461-2472.
[19] Liang, Ke; Tan, Jim; Zeng, Dongrui; Huang, Yongzhe; Huang, Xiaolei; Tan, Gang. ABSLearn: a GNN-based framework for aliasing and buffer-size information retrieval. Pattern Analysis and Applications, 2023, 26(3): 1171-1189.
[20] Liang, K. 2022. arXiv:2212.05767, DOI: 10.48550/arXiv.2212.05767.