Exploration of transferable and uniformly accurate neural network interatomic potentials using optimal experimental design

Cited by: 14
Authors
Zaverkin, Viktor [1 ]
Kaestner, Johannes [1 ]
Affiliations
[1] Univ Stuttgart, Inst Theoret Chem, Pfaffenwaldring 55, D-70569 Stuttgart, Germany
Source
MACHINE LEARNING: SCIENCE AND TECHNOLOGY | 2021, Vol. 2, No. 3
Keywords
molecular machine learning; atomistic neural networks; active learning; optimal experimental design; computational chemistry; FORCE-FIELDS; MOLECULES
DOI
10.1088/2632-2153/abe294
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Machine learning has been shown to have the potential to bridge the gap between the accuracy of ab initio methods and the efficiency of empirical force fields. Neural networks are one of the most frequently used approaches to construct high-dimensional potential energy surfaces. Unfortunately, they lack an inherent uncertainty estimate, which is necessary for efficient and automated sampling of chemical and conformational space to find extrapolative configurations. Identifying such configurations is needed for the construction of transferable and uniformly accurate potential energy surfaces. In this paper, we propose an active learning approach that uses the model's estimated output variance, derived in the framework of optimal experimental design. This method has several advantages over established active learning approaches such as query-by-committee, Monte Carlo dropout, and feature or latent distances, in terms of both predictive power and computational efficiency. We show that applying the proposed active learning scheme yields transferable and uniformly accurate potential energy surfaces constructed from only a small fraction of the data points. Additionally, it is possible to define a natural threshold value for the proposed uncertainty metric, making it possible to generate highly informative training data on-the-fly.
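The variance-based selection described in the abstract can be illustrated with a toy sketch. The following is not the authors' implementation: it assumes a linearized (last-layer feature) view of the network, in which the predictive variance of a candidate configuration is the quadratic form of its feature vector with the inverse regularized information matrix of the training set, as in classical optimal experimental design. The feature map, pool sizes, and regularization constant below are all illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature map standing in for the network's last-layer
# representation of a configuration (purely illustrative).
def features(x):
    return np.stack([np.sin(x), np.cos(x), x, np.ones_like(x)], axis=-1)

# Toy 1-D "configurations": training pool inside [-1, 1],
# candidate pool extending beyond it.
X_train = rng.uniform(-1.0, 1.0, size=20)
X_cand = rng.uniform(-3.0, 3.0, size=200)

lam = 1e-3                                     # regularization (prior precision)
Phi = features(X_train)                        # design matrix of the training set
A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])   # regularized information matrix

# Estimated output variance of the linearized model:
# sigma^2(x) = phi(x)^T A^{-1} phi(x)
Phi_c = features(X_cand)
var = np.einsum("ij,jk,ik->i", Phi_c, np.linalg.inv(A), Phi_c)

# Active learning step: query the most "extrapolative" candidates,
# i.e. those with the largest estimated output variance.
selected = X_cand[np.argsort(var)[::-1][:5]]
print(selected)
```

Under this linearized view, candidates far outside the region covered by the training pool receive the largest variance, so the selection naturally targets extrapolative configurations; a fixed threshold on `var` would correspond to the on-the-fly data generation criterion mentioned in the abstract.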
Pages: 19
Related Papers
20 items in total
  • [1] Neural network exploration using optimal experiment design
    Cohn, DA
    NEURAL NETWORKS, 1996, 9 (06) : 1071 - 1083
  • [2] Optimal experimental design with fast neural network surrogate models
    Stuckner, Joshua
    Piekenbrock, Matthew
    Arnold, Steven M.
    Ricks, Trenton M.
    COMPUTATIONAL MATERIALS SCIENCE, 2021, 200
  • [3] Accurate, scalable, and efficient Bayesian optimal experimental design with derivative-informed neural operators
    Go, Jinwoo
    Chen, Peng
    COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2025, 438
  • [4] AP-Net: An atomic-pairwise neural network for smooth and transferable interaction potentials
    Glick, Zachary L.
    Metcalf, Derek P.
    Koutsoukas, Alexios
    Spronk, Steven A.
    Cheney, Daniel L.
    Sherrill, C. David
    JOURNAL OF CHEMICAL PHYSICS, 2020, 153 (04)
  • [5] Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments
    Zaverkin, Viktor
    Holzmueller, David
    Steinwart, Ingo
    Kaestner, Johannes
    JOURNAL OF CHEMICAL THEORY AND COMPUTATION, 2021, 17 (10) : 6658 - 6670
  • [6] Optimal experimental design for reservoir property estimates in geothermal exploration
    Seidler, Ralf
    Padalkina, Kateryna
    Buecker, H. Martin
    Ebigbo, Anozie
    Herty, Michael
    Marquart, Gabriele
    Niederau, Jan
    COMPUTATIONAL GEOSCIENCES, 2016, 20 (02) : 375 - 383
  • [7] Information-driven optimal experimental design with deep neural network surrogate model for composite materials
    Jang, Kyung Suk
    Yun, Gun Jin
    MECHANICS OF ADVANCED MATERIALS AND STRUCTURES, 2024, 31 (01) : 210 - 217
  • [8] Optimal Experimental Design for a Bistable Gene Regulatory Network
    Braniff, Nathan
    Richards, Addison
    Ingalls, Brian
    IFAC PAPERSONLINE, 2019, 52 (26) : 255 - 261
  • [9] Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting
    Thaler, Stephan
    Zavadlav, Julija
    NATURE COMMUNICATIONS, 2021, 12 (01)