Reverse graph self-attention for target-directed atomic importance estimation

Cited by: 5
Authors
Na, Gyoung S. [1 ]
Kim, Hyun Woo [1 ]
Affiliations
[1] Korea Research Institute of Chemical Technology (KRICT), 141 Gajeong-Ro, Daejeon, South Korea
Keywords
Scientific application; Representation learning; Graph neural networks; Attention mechanism; Database
DOI
10.1016/j.neunet.2020.09.022
CLC classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Estimating the importance of each atom in a molecule is one of the most appealing and challenging problems in chemistry, physics, and materials science. The most common way to estimate atomic importance is to compute the electronic structure using density functional theory (DFT) and then interpret it using the domain knowledge of human experts. However, this conventional approach is impractical for large molecular databases because DFT calculations are computationally expensive, with O(n^4) time complexity with respect to the number of electronic basis functions. Furthermore, the calculation results must be manually interpreted by human experts to estimate atomic importance in terms of the target molecular property. To tackle this problem, we propose the first machine learning approach to atomic importance estimation, based on reverse self-attention on graph neural networks integrated with graph-based molecular descriptions. Our method provides an efficient, automated, and target-directed way to estimate atomic importance without any domain knowledge of chemistry or physics. (c) 2020 Elsevier Ltd. All rights reserved.
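As a concrete illustration of the abstract, the sketch below (PyTorch) shows one plausible reading of reverse self-attention on a molecular graph: a GAT-style layer exposes its attention matrix, and each atom is scored by the total attention it receives from the rest of the molecule. This is a minimal sketch of the general idea under stated assumptions, not the authors' published implementation; the names AttentiveGraphLayer and reverse_attention_importance, and the column-sum scoring rule, are illustrative choices.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveGraphLayer(nn.Module):
    # GAT-style message passing that also returns its attention matrix.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)
        self.att = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        # x: (n_atoms, in_dim) atom features; adj: (n_atoms, n_atoms) 0/1
        # adjacency mask with self-loops, so every softmax row is finite.
        h = self.lin(x)
        n = h.size(0)
        hi = h.unsqueeze(1).expand(n, n, -1)  # features of attending atom i
        hj = h.unsqueeze(0).expand(n, n, -1)  # features of attended atom j
        e = F.leaky_relu(self.att(torch.cat([hi, hj], dim=-1))).squeeze(-1)
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)      # row i: attention atom i pays out
        return alpha @ h, alpha

def reverse_attention_importance(alpha):
    # "Reverse" reading of the attention map: instead of the attention each
    # atom pays (rows), sum the attention each atom receives (columns).
    # Atoms the rest of the molecule attends to most score as most important.
    return alpha.sum(dim=0)

# Toy usage on a random 5-atom molecule (self-loops added to adj).
x = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.t() + torch.eye(5)) > 0).float()
layer = AttentiveGraphLayer(8, 16)
_, alpha = layer(x, adj)
print(reverse_attention_importance(alpha))  # one importance score per atom

In a target-directed setting, such a layer would be trained end-to-end against the target molecular property, so the received-attention scores reflect each atom's contribution to that specific prediction.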
Pages: 1-10 (10 pages)
Related papers (50 results)
  • [1] Feature Importance Estimation with Self-Attention Networks
    Skrlj, Blaz
    Dzeroski, Saso
    Lavrac, Nada
    Petkovic, Matej
    ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, 325: 1491-1498
  • [2] Self-Attention Graph Pooling
    Lee, Junhyun
    Lee, Inyeop
    Kang, Jaewoo
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019
  • [3] GSSA: Pay attention to graph feature importance for GCN via statistical self-attention
    Zheng, Jin
    Wang, Yang
    Xu, Wanjun
    Gan, Zilu
    Li, Ping
    Lv, Jiancheng
    NEUROCOMPUTING, 2020, 417: 458-470
  • [4] Self-Attention Factor Graph Neural Network for Multiagent Collaborative Target Tracking
    Xu, Cheng
    Su, Ran
    Wang, Ran
    Duan, Shihong
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (20): 32381-32392
  • [5] Target-directed attention: Sequential decision-making for gaze planning
    Vogel, Julia
    de Freitas, Nando
    2008 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-9, 2008: 2372-2379
  • [6] Target-directed visual attention is a prerequisite for action-specific perception
    Canal-Bruland, Rouwen
    Zhu, Frank F.
    van der Kamp, John
    Masters, Rich S. W.
    ACTA PSYCHOLOGICA, 2011, 136 (03): 285-289
  • [7] Universal Graph Transformer Self-Attention Networks
    Dai Quoc Nguyen
    Tu Dinh Nguyen
    Dinh Phung
    COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2022, WWW 2022 COMPANION, 2022: 193-196
  • [8] Global Self-Attention as a Replacement for Graph Convolution
    Hussain, Md Shamim
    Zaki, Mohammed J.
    Subramanian, Dharmashankar
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022: 655-665
  • [9] SparseBERT: Rethinking the Importance Analysis in Self-attention
    Shi, Han
    Gao, Jiahui
    Ren, Xiaozhe
    Xu, Hang
    Liang, Xiaodan
    Li, Zhenguo
    Kwok, James T.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021