Molformer: Motif-Based Transformer on 3D Heterogeneous Molecular Graphs

Cited by: 0
Authors
Wu, Fang [1 ,3 ]
Radev, Dragomir [2 ]
Li, Stan Z. [1 ]
Affiliations
[1] Westlake Univ, Sch Engn, Hangzhou, Peoples R China
[2] Yale Univ, Dept Comp Sci, New Haven, CT USA
[3] Tsinghua Univ, Inst AI Ind Res, Beijing, Peoples R China
Source
THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 4 | 2023
Keywords
DATABASE;
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Procuring expressive molecular representations underpins AI-driven molecule design and scientific discovery. Research has mainly focused on atom-level homogeneous molecular graphs, ignoring the rich information in subgraphs or motifs, even though it is widely accepted that substructures play a dominant role in identifying and determining molecular properties. To address this issue, we formulate heterogeneous molecular graphs (HMGs) and introduce a novel architecture that exploits both molecular motifs and 3D geometry. Specifically, we extract functional groups as motifs for small molecules and employ reinforcement learning to adaptively select quaternary amino acids as motif candidates for proteins. HMGs are then constructed with both atom-level and motif-level nodes. To better accommodate these HMGs, we introduce a Transformer variant named Molformer, which adopts a heterogeneous self-attention layer to distinguish the interactions between multi-level nodes. It is further coupled with a multi-scale mechanism that captures fine-grained local patterns at increasing contextual scales, and an attentive farthest point sampling algorithm is proposed to obtain the final molecular representation. We validate Molformer across a broad range of domains, including quantum chemistry, physiology, and biophysics. Extensive experiments show that Molformer outperforms or matches several state-of-the-art baselines. Our work provides a promising way to utilize informative motifs from the perspective of multi-level graph construction. The code is available at https://github.com/smiles724/Molformer.
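The abstract mentions an attentive farthest point sampling step for pooling node features into a molecular representation. The paper's exact formulation is in the linked repository; as a minimal sketch of the general idea, the following assumes a hypothetical scheme in which classic farthest point sampling over 3D coordinates is biased by per-node importance scores (e.g. attention values). The function name `attentive_fps` and the distance-times-score weighting are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def attentive_fps(coords, scores, k):
    """Select k representative nodes by farthest point sampling,
    biased by per-node importance scores (illustrative sketch).

    coords: (N, 3) array of 3D node positions
    scores: (N,) array of importance weights, e.g. attention values
    k:      number of nodes to keep
    """
    n = coords.shape[0]
    # Start from the node with the highest importance score.
    selected = [int(np.argmax(scores))]
    min_dist = np.full(n, np.inf)  # distance to nearest selected node
    for _ in range(k - 1):
        last = coords[selected[-1]]
        dist = np.linalg.norm(coords - last, axis=1)
        min_dist = np.minimum(min_dist, dist)
        # Weight geometric separation by importance so that distant
        # AND important nodes are preferred (assumed combination rule).
        gain = min_dist * scores
        gain[selected] = -np.inf  # never re-pick a selected node
        selected.append(int(np.argmax(gain)))
    return selected
```

Plain farthest point sampling is the special case where all scores are equal; the selected subset can then be mean- or attention-pooled into a fixed-size representation.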
Pages: 5312-5320 (9 pages)