Contrastive fine-tuning for low-resource graph-level transfer learning

Cited by: 1
Authors
Duan, Yutai [1 ]
Liu, Jie [1 ]
Chen, Shaowei [1 ]
Wu, Jianhua [1 ]
Affiliations
[1] Nankai Univ, Coll Artificial Intelligence, Engn Res Ctr Trusted Behav Intelligence, Natl Key Lab Intelligent Tracking & Forecasting In, Tianjin, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph neural networks; Graph-level tasks; Low-resource scenarios; Transfer learning; Contrastive learning;
DOI
10.1016/j.ins.2023.120066
CLC number
TP [Automation technology; Computer technology];
Discipline classification code
0812;
Abstract
Due to insufficient supervision and the gap between pre-training pretext tasks and downstream tasks, transferring pre-trained graph neural networks (GNNs) to downstream tasks in low-resource scenarios remains challenging. In this paper, a Contrastive Fine-tuning (Con-tuning) framework is proposed for low-resource graph-level transfer learning, and a graph-level supervised contrastive learning (SCL) task is designed within the framework as the first attempt to introduce SCL into the fine-tuning of pre-trained GNNs. The SCL task compensates for the insufficient supervision in low-resource scenarios and narrows the gap between pretext tasks and downstream tasks. To further reinforce the supervision signal in the SCL task, we devise a graphon-theory-based labeled graph generator that extracts the generalized knowledge of a specific class of graphs. Based on this knowledge, graph-level templates are generated for each class and used as contrastive samples in the SCL task. The proposed Con-tuning framework then jointly learns the SCL task and downstream tasks to effectively fine-tune the pre-trained GNNs. Extensive experiments on eight real-world datasets show that the Con-tuning framework enables pre-trained GNNs to achieve better performance on graph-level downstream tasks in low-resource settings.
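To make the core objective concrete: the abstract describes a graph-level supervised contrastive task in which representations sharing a class label (including the class templates) act as positives. The paper's exact loss is not given in this record, so below is a minimal SupCon-style sketch over already-pooled graph embeddings; the function name, the numpy implementation, and the temperature value are illustrative assumptions, not the authors' code.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.5):
    """SupCon-style loss over graph-level embeddings (illustrative sketch).

    embeddings: (N, d) array of pooled graph representations
    labels:     (N,) integer class labels; same-class pairs are positives
    """
    # L2-normalize so the dot product is cosine similarity
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    np.fill_diagonal(sim, -np.inf)  # exclude each anchor's self-similarity

    # log-softmax over all other samples for each anchor
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))

    labels = np.asarray(labels)
    pos_mask = (labels[:, None] == labels[None, :]).astype(float)
    np.fill_diagonal(pos_mask, 0.0)

    # average log-probability over each anchor's positives, then negate;
    # anchors with no positive in the batch are skipped
    pos_counts = pos_mask.sum(axis=1)
    masked_lp = np.where(pos_mask > 0, log_prob, 0.0).sum(axis=1)
    loss_per_anchor = -masked_lp / np.maximum(pos_counts, 1.0)
    return loss_per_anchor[pos_counts > 0].mean()
```

Under this formulation, pulling same-class graph embeddings (and the per-class templates) together yields a lower loss than a batch where labels are uncorrelated with embedding geometry, which is the supervision signal the SCL task adds during fine-tuning.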
Pages: 12