Benchmarking graph neural networks for materials chemistry

Cited: 220
Authors
Fung, Victor [1 ]
Zhang, Jiaxin [2 ]
Juarez, Eric [1 ]
Sumpter, Bobby G. [1 ]
Affiliations
[1] Oak Ridge Natl Lab, Ctr Nanophase Mat Sci, Oak Ridge, TN 37830 USA
[2] Oak Ridge Natl Lab, Comp Sci & Math Div, Oak Ridge, TN USA
Keywords
MACHINE; REPOSITORY; MOLECULES;
DOI
10.1038/s41524-021-00554-0
Chinese Library Classification (CLC)
O64 [Physical chemistry (theoretical chemistry); chemical physics];
Discipline classification codes
070304 ; 081704 ;
Abstract
Graph neural networks (GNNs) have received intense interest as a rapidly expanding class of machine learning models remarkably well-suited for materials applications. To date, a number of successful GNNs have been proposed and demonstrated for systems ranging from crystal stability and electronic property prediction to surface chemistry and heterogeneous catalysis. However, a consistent benchmark of these models is still lacking, hindering the development and consistent evaluation of new models in the materials field. Here, we present a workflow and testing platform, MatDeepLearn, for quickly and reproducibly assessing and comparing GNNs and other machine learning models. We use this platform to optimize and evaluate a selection of top-performing GNNs on several representative datasets in computational materials chemistry. Our investigations highlight the importance of hyperparameter selection, and we find roughly similar performances for the top models once optimized. We identify several strengths of GNNs over conventional models: they perform well on compositionally diverse datasets, and their learned, rather than predefined, representations make them flexible with respect to inputs. We also observe several weaknesses of GNNs, including high data requirements, and we discuss suggestions for further improvement for applications in materials chemistry.
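The abstract stresses that hyperparameter optimization was essential before the GNNs could be compared fairly. As a minimal, hedged illustration of that step (not MatDeepLearn's actual API), the sketch below runs a random search over a toy validation-error surface; `validation_error` and its minimum are hypothetical stand-ins for training and validating a real model.

```python
import random

def validation_error(lr, hidden):
    # Hypothetical stand-in for "train a model, return validation error".
    # Toy surface with a minimum near lr = 0.01, hidden = 64 (illustrative only).
    return (lr - 0.01) ** 2 * 1e4 + (hidden - 64) ** 2 / 1e3

def random_search(trials=50, seed=0):
    """Random hyperparameter search: sample, evaluate, keep the best."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        lr = 10 ** rng.uniform(-4, -1)               # log-uniform learning rate
        hidden = rng.choice([16, 32, 64, 128, 256])  # discrete hidden size
        err = validation_error(lr, hidden)
        if best is None or err < best[0]:
            best = (err, lr, hidden)
    return best

best_err, best_lr, best_hidden = random_search()
print(f"best error {best_err:.4f} at lr={best_lr:.4g}, hidden={best_hidden}")
```

The same pattern, with the toy surface replaced by an actual train/validate cycle, is what makes the "roughly similar performance once optimized" comparison in the abstract meaningful: each model gets an equal budget of tuning trials before its score is reported.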
Pages: 8