Unsupervised Graph Neural Architecture Search with Disentangled Self-supervision

Cited by: 0
Authors
Zhang, Zeyang [1,2]
Wang, Xin [1]
Zhang, Ziwei [1]
Shen, Guangyao [2]
Shen, Shiqi [2]
Zhu, Wenwu [1]
Affiliations
[1] Tsinghua Univ, Dept Comp Sci & Technol, BNRist, Beijing, Peoples R China
[2] Tencent, Wechat, Shenzhen, Peoples R China
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023) | 2023
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Existing graph neural architecture search (GNAS) methods rely heavily on supervised labels during the search process and therefore fail to handle the ubiquitous scenarios where supervision is not available. In this paper, we study the problem of unsupervised graph neural architecture search, which remains unexplored in the literature. The key problem is to discover the latent graph factors that drive the formation of graph data, as well as the underlying relations between these factors and the optimal neural architectures. Handling this problem is challenging because the latent graph factors and the architectures are highly entangled, owing to the nature of graph data and the complexity of the neural architecture search process. To address this challenge, we propose a novel Disentangled Self-supervised Graph Neural Architecture Search (DSGAS) model, which discovers optimal architectures capturing various latent graph factors in a self-supervised fashion from unlabeled graph data. Specifically, we first design a disentangled graph super-network that incorporates multiple architectures with factor-wise disentanglement and optimizes them simultaneously. Then, we estimate the performance of architectures under different factors with the proposed self-supervised training with joint architecture-graph disentanglement. Finally, we propose a contrastive search with architecture augmentations to discover architectures with factor-specific expertise. Extensive experiments on 11 real-world datasets demonstrate that the proposed DSGAS model achieves state-of-the-art performance against several baseline methods in an unsupervised manner.
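To make the three components described in the abstract more concrete, the sketch below illustrates the general flavour of a factor-wise disentangled super-network trained with a label-free contrastive objective. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes PyTorch, uses dense adjacency matrices and plain linear propagation as stand-in candidate operations, simulates "architecture-augmented views" with feature dropout, and all names (CandidateOp, FactorSuperNet, nt_xent) are hypothetical.

```python
# Minimal sketch: per-factor architecture parameters over a shared operation pool,
# optimized with a contrastive (label-free) loss. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CandidateOp(nn.Module):
    """One candidate operation: a simple GCN-like propagation act(A X W)."""
    def __init__(self, dim, act):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.act = act

    def forward(self, adj, x):
        return self.act(adj @ self.lin(x))


class FactorSuperNet(nn.Module):
    """K factor-specific sub-architectures sharing one weight-sharing super-network.

    Each latent factor k keeps its own architecture parameters alpha[k], so the
    search can assign a different mixture of operations to each factor.
    """
    def __init__(self, dim, num_factors=4):
        super().__init__()
        acts = [torch.relu, torch.tanh, torch.sigmoid, lambda t: t]
        self.ops = nn.ModuleList([CandidateOp(dim, a) for a in acts])
        # Architecture parameters: one softmax-normalised mixture per factor.
        self.alpha = nn.Parameter(torch.zeros(num_factors, len(self.ops)))

    def forward(self, adj, x):
        # Returns one disentangled representation per factor: (K, N, dim).
        weights = F.softmax(self.alpha, dim=-1)                  # (K, |ops|)
        op_outs = torch.stack([op(adj, x) for op in self.ops])   # (|ops|, N, dim)
        return torch.einsum('ko,ond->knd', weights, op_outs)


def nt_xent(z1, z2, tau=0.5):
    """Normalised-temperature cross entropy between two views of the same nodes."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0))
    return F.cross_entropy(logits, labels)


# Toy usage: a random graph, two augmented views, contrastive loss summed over factors.
N, D, K = 32, 16, 4
adj = torch.eye(N) + torch.rand(N, N).round()
adj = adj / adj.sum(-1, keepdim=True)        # row-normalised dense adjacency
x = torch.randn(N, D)

net = FactorSuperNet(D, num_factors=K)
z_a = net(adj, F.dropout(x, 0.2))            # view 1 (stand-in augmentation)
z_b = net(adj, F.dropout(x, 0.2))            # view 2
loss = sum(nt_xent(z_a[k], z_b[k]) for k in range(K))
loss.backward()                              # updates both weights and alpha
print(float(loss))
```

In the actual DSGAS method the candidate operation pool, the joint architecture-graph disentanglement objective, and the architecture augmentations are considerably richer; the sketch only shows how per-factor architecture parameters and a label-free contrastive loss can be combined.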
Pages: 16