Towards Long-Tailed Recognition for Graph Classification via Collaborative Experts

Cited by: 2
Authors
Yi S.-Y. [1 ]
Mao Z. [2 ]
Ju W. [2 ]
Zhou Y.-D. [1 ]
Liu L. [2 ]
Luo X. [3 ]
Zhang M. [2 ]
Affiliations
[1] School of Statistics and Data Science, Nankai University, Tianjin
[2] School of Computer Science, Peking University, Beijing
[3] Department of Computer Science, University of California, Los Angeles, CA 90095
Source
IEEE Transactions on Big Data | 2023, Vol. 9, Issue 6
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China
Keywords
Balanced contrastive learning; class-imbalanced learning; hard class extraction; multi-expert learning
DOI
10.1109/TBDATA.2023.3313029
Abstract
Graph classification, which aims to learn graph-level representations for effective class assignment, has achieved remarkable progress, but this success relies heavily on high-quality datasets with balanced class distributions. In reality, most real-world graph data follows a long-tailed distribution in which the head classes contain far more samples than the tail classes; studying graph-level classification over long-tailed data is therefore essential, yet it remains largely unexplored. Moreover, most existing long-tailed learning methods in computer vision fail to jointly optimize representation learning and classifier training, and they neglect the mining of hard-to-classify classes. Directly applying such methods to graphs may lead to sub-optimal performance, since models trained on graphs are more sensitive to long-tailed distributions due to their complex topological characteristics. Hence, in this paper, we propose a novel long-tailed graph-level classification framework via Collaborative Multi-expert Learning (CoMe) to tackle the problem. To balance the contributions of head and tail classes, we first develop balanced contrastive learning from the perspective of representation learning, and then design individual-expert classifier training based on hard class mining. In addition, we perform gated fusion and disentangled knowledge distillation among the multiple experts to promote collaboration within the multi-expert framework. Comprehensive experiments on seven widely-used benchmark datasets demonstrate the superiority of our method CoMe over state-of-the-art baselines. © 2023 IEEE.
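The "balanced contrastive learning" mentioned in the abstract refers to a family of losses in which each class contributes equally to the contrastive denominator, so head classes with many samples cannot dominate the gradient. The sketch below is an illustrative NumPy implementation of that general idea under stated assumptions (class-averaged denominator, L2-normalized embeddings, temperature `tau`); it is not the exact loss used by CoMe, and the function name and signature are hypothetical.

```python
import numpy as np

def balanced_contrastive_loss(z, labels, tau=0.5):
    """Illustrative class-balanced contrastive loss (not CoMe's exact loss).

    z      : (N, d) array of embeddings (L2-normalized inside).
    labels : (N,) integer class labels.
    tau    : temperature for the similarity scores.
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = np.exp(z @ z.T / tau)          # pairwise exp-similarities
    classes = np.unique(labels)
    n = len(z)
    total, count = 0.0, 0
    for i in range(n):
        # Class-averaged denominator: each class contributes its MEAN
        # similarity, so head classes do not dominate by sheer sample count.
        denom = 0.0
        for c in classes:
            idx = np.where(labels == c)[0]
            idx = idx[idx != i]          # exclude the anchor itself
            if len(idx):
                denom += sim[i, idx].mean()
        pos = np.where((labels == labels[i]) & (np.arange(n) != i))[0]
        if len(pos) == 0 or denom == 0.0:
            continue                     # anchor has no positives to pull
        total += -np.log(sim[i, pos] / denom).mean()
        count += 1
    return total / max(count, 1)
```

In a plain supervised contrastive loss, the denominator sums over all samples, so a head class with hundreds of samples contributes hundreds of repulsion terms while a tail class contributes a handful; averaging per class equalizes that pressure, which is the balancing principle the abstract alludes to.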
Pages: 1683-1696
Number of pages: 13