A Multichannel Convolutional Decoding Network for Graph Classification

Cited by: 8
Authors
Guang, Mingjian [1 ,2 ]
Yan, Chungang [1 ,2 ]
Xu, Yuhua [1 ,2 ]
Wang, Junli [1 ,2 ]
Jiang, Changjun [1 ,2 ]
Affiliations
[1] Tongji Univ, Key Lab Embedded Syst & Serv Comp, Minist Educ, Shanghai 201804, Peoples R China
[2] Tongji Univ, Natl Prov Minist Joint Collaborat Innovat Ctr Fina, Shanghai 201804, Peoples R China
Keywords
Decoding; Convolution; Task analysis; Convolutional neural networks; Data mining; Training; Computational complexity; Global-to-local; graph classification; graph convolutional network (GCN); regularization
DOI
10.1109/TNNLS.2023.3266243
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Graph convolutional networks (GCNs) have shown superior performance on graph classification tasks, and their structure can be viewed as an encoder-decoder pair. However, most existing methods do not jointly consider global and local information during decoding, which leads to the loss of global information or the neglect of local information in large graphs. Moreover, the commonly used cross-entropy loss is essentially a global loss over the encoder-decoder pair and cannot supervise the training states of its two components (the encoder and the decoder) individually. We propose a multichannel convolutional decoding network (MCCD) to address these problems. MCCD first adopts a multichannel GCN encoder, which generalizes better than a single-channel GCN encoder because different channels extract graph information from different perspectives. We then propose a novel decoder with a global-to-local learning pattern, which better extracts both global and local information when decoding graph information. We also introduce a balanced regularization loss that supervises the training states of the encoder and the decoder so that both are sufficiently trained. Experiments on standard datasets demonstrate the effectiveness of MCCD in terms of accuracy, runtime, and computational complexity.
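The abstract describes the MCCD design only at a high level and the paper's implementation is not reproduced in this record. The following is a minimal PyTorch-style sketch of the general idea under stated assumptions: dense adjacency matrices, arbitrary layer sizes, mean/max pooling as stand-ins for the global and local views, and a parameter-norm balance term as a rough placeholder for the paper's balanced regularization loss. The names DenseGCNLayer, MultichannelEncoder, GlobalToLocalDecoder, and balanced_loss are hypothetical and not from the paper.

# Illustrative sketch only, NOT the authors' code; all sizes and formulations are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseGCNLayer(nn.Module):
    # One GCN layer on a dense adjacency matrix with symmetric normalization.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # adj: (N, N) adjacency with self-loops already added
        deg = adj.sum(dim=-1).clamp(min=1.0)
        d_inv_sqrt = deg.pow(-0.5)
        norm_adj = d_inv_sqrt.unsqueeze(-1) * adj * d_inv_sqrt.unsqueeze(0)
        return F.relu(self.lin(norm_adj @ x))

class MultichannelEncoder(nn.Module):
    # Several independent GCN channels reading the same graph from different perspectives.
    def __init__(self, in_dim, hidden_dim, num_channels=3):
        super().__init__()
        self.channels = nn.ModuleList(
            [DenseGCNLayer(in_dim, hidden_dim) for _ in range(num_channels)])

    def forward(self, x, adj):
        # Concatenate per-channel node embeddings: (N, num_channels * hidden_dim)
        return torch.cat([ch(x, adj) for ch in self.channels], dim=-1)

class GlobalToLocalDecoder(nn.Module):
    # Pool a graph-level (global) vector first, then refine it with node-level (local) detail.
    def __init__(self, node_dim, num_classes):
        super().__init__()
        self.global_fc = nn.Linear(node_dim, node_dim)
        self.local_fc = nn.Linear(node_dim, node_dim)
        self.classifier = nn.Linear(node_dim, num_classes)

    def forward(self, node_emb):
        global_vec = self.global_fc(node_emb.mean(dim=0))       # global view
        local_vec = self.local_fc(node_emb).max(dim=0).values   # local view
        return self.classifier(global_vec + local_vec)

def balanced_loss(logits, label, encoder, decoder, alpha=0.1):
    # Cross-entropy plus a crude term that keeps encoder and decoder parameter
    # norms comparable; a placeholder for the paper's balanced regularization loss.
    ce = F.cross_entropy(logits.unsqueeze(0), label.unsqueeze(0))
    enc_norm = sum(p.norm() for p in encoder.parameters())
    dec_norm = sum(p.norm() for p in decoder.parameters())
    return ce + alpha * (enc_norm - dec_norm).abs()

# Toy usage with made-up sizes: a 4-node graph, 7-dim node features, 2 classes.
x = torch.randn(4, 7)
adj = ((torch.eye(4) + torch.bernoulli(torch.full((4, 4), 0.3))) > 0).float()
adj = ((adj + adj.t()) > 0).float()
encoder = MultichannelEncoder(in_dim=7, hidden_dim=16, num_channels=3)
decoder = GlobalToLocalDecoder(node_dim=3 * 16, num_classes=2)
logits = decoder(encoder(x, adj))
loss = balanced_loss(logits, torch.tensor(1), encoder, decoder)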
Pages: 13206-13216
Number of pages: 11