A Hierarchical Mixture-Of-Experts Framework for Few Labeled Node Classification

Cited: 0
Authors
Wang, Yimeng [1 ,2 ]
Yang, Zhiyao [1 ,2 ]
Che, Xiangjiu [1 ,2 ]
Affiliations
[1] Jilin Univ, Coll Comp Sci & Technol, Changchun 130012, Jilin, Peoples R China
[2] Jilin Univ, Key Lab Symbol Computat & Knowledge Engn MOE, Changchun 130012, Jilin, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Node classification; Mixture of experts; Data augmentation; Few labeled graph;
DOI
10.1016/j.neunet.2025.107285
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Few Labeled Node Classification (FLNC) is a challenging subtask of node classification in which training nodes are extremely limited, often to only one or two labeled nodes per class. While Graph Neural Networks (GNNs) show promise, they often suffer from feature convergence. A common remedy is multi-perspective feature extraction, with the Mixture of Experts (MoE) model being a popular approach; however, directly applying MoE to FLNC frequently results in overfitting. To address these issues, we propose the Hierarchical Mixture-of-Experts (HMoE) framework. First, we mitigate overfitting by applying three data augmentation techniques to enrich the input features. Next, we design a novel hierarchical mixture-of-experts encoder that produces diversified feature representations: the first layer extracts unique feature information, and the second layer refines the shared information. Additionally, we design an auxiliary task that distinguishes original from augmented data, using a gradient reversal mechanism to strengthen the feature representations learned from graph data. Experimental results show that HMoE outperforms the baseline methods, achieving an average 1.2% performance improvement across six datasets.
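A minimal sketch of the two mechanisms named in the abstract, assuming PyTorch: a two-level mixture-of-experts encoder (per-expert features, then a shared refinement layer) and a gradient-reversal branch for the original-versus-augmented auxiliary task. The soft gating scheme, layer shapes, and the use of plain linear layers in place of the paper's GNN and augmentation components are all illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    # Identity in the forward pass; scaled, negated gradient in the backward pass.
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # One gradient per forward input; lambd is a constant, so None.
        return grad_output.neg() * ctx.lambd, None


class TwoLevelMoE(nn.Module):
    # Level 1: independent experts extract "unique" features.
    # Level 2: a shared layer refines the softly gated mixture.
    def __init__(self, in_dim, hid_dim, n_experts=4):
        super().__init__()
        self.experts = nn.ModuleList(nn.Linear(in_dim, hid_dim) for _ in range(n_experts))
        self.gate = nn.Linear(in_dim, n_experts)
        self.shared = nn.Linear(hid_dim, hid_dim)

    def forward(self, x):
        expert_out = torch.stack([F.relu(e(x)) for e in self.experts], dim=1)  # (N, E, H)
        weights = F.softmax(self.gate(x), dim=-1)                              # (N, E)
        mixed = (weights.unsqueeze(-1) * expert_out).sum(dim=1)                # (N, H)
        return F.relu(self.shared(mixed))


class HMoESketch(nn.Module):
    def __init__(self, in_dim, hid_dim, n_classes, n_experts=4, lambd=1.0):
        super().__init__()
        self.encoder = TwoLevelMoE(in_dim, hid_dim, n_experts)
        self.classifier = nn.Linear(hid_dim, n_classes)
        self.discriminator = nn.Linear(hid_dim, 2)  # original vs. augmented
        self.lambd = lambd

    def forward(self, x):
        h = self.encoder(x)
        logits = self.classifier(h)
        # Reversed gradients train the encoder to *fool* the discriminator.
        aux = self.discriminator(GradReverse.apply(h, self.lambd))
        return logits, aux

In training, the node-classification loss on labeled nodes and the auxiliary discrimination loss would be combined; because gradients flowing back from the discriminator are reversed, the encoder is pushed toward features that are invariant to augmentation.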
Pages: 12
Related Papers
50 records in total
  • [1] A Multilevel Mixture-of-Experts Framework for Pedestrian Classification
    Enzweiler, Markus
    Gavrila, Dariu M.
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2011, 20 (10) : 2967 - 2979
  • [2] Extension of mixture-of-experts networks for binary classification of hierarchical data
    Ng, Shu-Kay
    McLachlan, Geoffrey J.
    ARTIFICIAL INTELLIGENCE IN MEDICINE, 2007, 41 (01) : 57 - 67
  • [3] Semi-supervised mixture-of-experts classification
    Karakoulas, G
    Salakhutdinov, R
FOURTH IEEE INTERNATIONAL CONFERENCE ON DATA MINING, PROCEEDINGS, 2004 : 138 - 145
  • [4] A mixture-of-experts framework for adaptive Kalman filtering
    Chaer, WS
    Bishop, RH
    Ghosh, J
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 1997, 27 (03) : 452 - 464
  • [5] METAL: A framework for mixture-of-experts task and attention learning
Mirian, Maryam S.
    Araabi, Babak N.
    Ahmadabadi, Majid Nili
    Siegwart, Roland R.
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2012, 23 (04) : 111 - 128
  • [6] Hierarchical Mixture-of-Experts approach for neural compact modeling of MOSFETs
    Park, Chanwoo
    Vincent, Premkumar
    Chong, Soogine
    Park, Junghwan
    Cha, Ye Sle
    Cho, Hyunbo
    SOLID-STATE ELECTRONICS, 2023, 199
  • [7] Hierarchical mixture-of-experts models for count variables with excessive zeros
    Park, Myung Hyun
    Kim, Joseph H. T.
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2022, 51 (12) : 4072 - 4096
  • [8] Spatial Mixture-of-Experts
    Dryden, Nikoli
    Hoefler, Torsten
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [9] Multiscale Segmentation of Elevation Images Using a Mixture-of-Experts Framework
    Nagarajan, K.
    Slatton, K. C.
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2009, 6 (04) : 865 - 869
  • [10] A Mixture-of-Experts Prediction Framework for Evolutionary Dynamic Multiobjective Optimization
    Rambabu, Rethnaraj
    Vadakkepat, Prahlad
    Tan, Kay Chen
    Jiang, Min
    IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (12) : 5099 - 5112