Reverse graph self-attention for target-directed atomic importance estimation

Cited by: 5
|
Authors
Na, Gyoung S. [1 ]
Kim, Hyun Woo [1 ]
Affiliations
[1] Korea Res Inst Chem Technol KRICT, 141 Gajeong Ro, Daejeon, South Korea
Keywords
Scientific application; Representation learning; Graph neural networks; Attention mechanism; DATABASE;
DOI
10.1016/j.neunet.2020.09.022
Chinese Library Classification (CLC) number
TP18 [人工智能理论];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Estimating the importance of each atom in a molecule is one of the most appealing and challenging problems in chemistry, physics, and materials science. The most common way to estimate atomic importance is to compute the electronic structure using density functional theory (DFT) and then interpret the result using the domain knowledge of human experts. However, this conventional approach is impractical for large molecular databases because DFT calculation is computationally expensive, with O(n^4) time complexity in the number of electronic basis functions. Furthermore, the calculation results must be manually interpreted by human experts to estimate atomic importance with respect to the target molecular property. To tackle this problem, we propose the first machine learning-based approach to atomic importance estimation, based on reverse self-attention on graph neural networks and integrated with graph-based molecular description. Our method provides an efficient, automated, and target-directed way to estimate atomic importance without any domain knowledge of chemistry or physics. (c) 2020 Elsevier Ltd. All rights reserved.
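The core idea in the abstract can be illustrated with a toy example. The following is a minimal NumPy sketch, not the authors' implementation: the function names (`gat_attention`, `reverse_importance`), the tanh scoring, and all shapes and weights are assumptions for demonstration. The "reverse" readout interprets the attention each atom *receives* from other atoms (a column sum of the attention matrix) as its importance score.

```python
import numpy as np

def gat_attention(H, A, W, a):
    """GAT-style attention: alpha[i, j] is how much atom i attends to
    neighbor j when updating its own representation (row-softmax)."""
    n = A.shape[0]
    Z = H @ W                          # projected atom features
    alpha = np.zeros((n, n))
    for i in range(n):
        nbrs = np.flatnonzero(A[i])    # neighbors of atom i (incl. self-loop)
        scores = np.array([np.tanh(a @ np.concatenate([Z[i], Z[j]]))
                           for j in nbrs])
        e = np.exp(scores - scores.max())
        alpha[i, nbrs] = e / e.sum()   # softmax over i's neighborhood
    return alpha

def reverse_importance(alpha):
    """Reverse self-attention readout: an atom's importance is the total
    attention it receives from all atoms, i.e. a column sum of alpha."""
    return alpha.sum(axis=0)

# Toy 3-atom "molecule" with self-loops (path graph 0-1-2)
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))            # initial atom features
W = rng.normal(size=(4, 2))            # projection weights (random stand-in)
a = rng.normal(size=4)                 # attention vector (2 * output dim)

alpha = gat_attention(H, A, W, a)
importance = reverse_importance(alpha)
```

In a trained model, `W` and `a` would be learned against the target molecular property, which is what makes the resulting importance scores target-directed rather than generic.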
Pages: 1 - 10
Number of pages: 10
Related papers
50 records in total
  • [21] Joint state and parameter estimation for a target-directed nonlinear dynamic system model
    Togneri, R
    Deng, L
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2003, 51 (12) : 3061 - 3070
  • [22] SAG-DTA: Prediction of Drug-Target Affinity Using Self-Attention Graph Network
    Zhang, Shugang
    Jiang, Mingjian
    Wang, Shuang
    Wang, Xiaofeng
    Wei, Zhiqiang
    Li, Zhen
    INTERNATIONAL JOURNAL OF MOLECULAR SCIENCES, 2021, 22 (16)
  • [23] Self-Attention Based Sequential Recommendation With Graph Convolutional Networks
    Seng, Dewen
    Wang, Jingchang
    Zhang, Xuefeng
    IEEE ACCESS, 2024, 12 : 32780 - 32787
  • [24] Multi-scale self-attention mixup for graph classification *
    Kong, Youyong
    Li, Jiaxing
    Zhang, Ke
    Wu, Jiasong
    PATTERN RECOGNITION LETTERS, 2023, 168 : 100 - 106
  • [25] Dynamic Graph Embedding via Self-Attention in the Lorentz Space
    Duan, Dingyang
    Zha, Daren
    Lie, Zeyi
    Chen, Yu
    PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024, : 199 - 204
  • [26] Deep relational self-Attention networks for scene graph generation
    Li, Ping
    Yu, Zhou
    Zhan, Yibing
    PATTERN RECOGNITION LETTERS, 2022, 153 : 200 - 206
  • [28] What Dense Graph Do You Need for Self-Attention?
    Wang, Yuxing
    Lee, Chu-Tak
    Guo, Qipeng
    Yin, Zhangyue
    Zhou, Yunhua
    Huang, Xuanjing
    Qiu, Xipeng
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [29] Gaze estimation via self-attention augmented convolutions
    Vieira, Gabriel Lefundes
    Oliveira, Luciano
    2021 34TH SIBGRAPI CONFERENCE ON GRAPHICS, PATTERNS AND IMAGES (SIBGRAPI 2021), 2021, : 49 - 56
  • [30] Bathymetry estimation for coastal regions using self-attention
    Zhang, Xiaoxiong
    Al Shehhi, Maryam R.
    SCIENTIFIC REPORTS, 2025, 15 (01):