The rapid development of machine learning interatomic potentials (MLIPs) and the large body of literature surrounding them can make it difficult for researchers who are not experts, but who wish to use these tools, to know how to proceed. The spirit of this review is to help such researchers by serving as a practical, accessible guide to the state of the art in MLIPs. The review covers a broad range of topics, including (i) the central aspects of how and why MLIPs enable many exciting advances in molecular modeling; (ii) the main underpinnings of the different types of MLIPs, including their basic structure and formalism; (iii) the potentially transformative impact of universal MLIPs for both organic and inorganic systems, including an overview of the most recent advances, capabilities, downsides, and potential applications of this nascent class of MLIPs; (iv) a practical guide for estimating and understanding the execution speed of MLIPs, with guidance based on hardware availability, the type of MLIP used, and the prospective simulation size and time; (v) guidance on which MLIP to choose for a given application, considering hardware resources, speed requirements, and energy and force accuracy requirements, as well as on whether to use a pre-trained potential or fit a new potential from scratch; (vi) a discussion of MLIP infrastructure, including sources of training data, pre-trained potentials, and hardware resources for training; (vii) a summary of key limitations of present MLIPs and current approaches to mitigate them, including methods for incorporating long-range interactions, handling magnetic systems, and treating excited states; and (viii) some more speculative thoughts on what the future holds for the development and application of MLIPs over the next 3–10+ years.
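The execution-speed estimation described in point (iv) amounts to back-of-the-envelope arithmetic: wall time scales with atoms times timesteps divided by the model's throughput on the available hardware. The sketch below illustrates this; the throughput figures are illustrative placeholders chosen for this example, not benchmark numbers from the review.

```python
# Back-of-the-envelope wall-time estimate for an MD run with an MLIP.
# Assumes cost scales linearly with (atoms x timesteps) at a fixed
# per-hardware throughput, which holds approximately for local MLIPs.

def md_walltime_hours(n_atoms, n_steps, atom_steps_per_second):
    """Estimated wall time in hours for an MD simulation."""
    return n_atoms * n_steps / atom_steps_per_second / 3600.0

# Hypothetical throughputs (atom-steps per second on a single GPU).
# Real values vary by orders of magnitude with model, code, and hardware.
throughput = {
    "fast_descriptor_mlip": 5e6,   # e.g., a linear/descriptor-based model
    "equivariant_gnn_mlip": 1e5,   # e.g., a deep message-passing model
}

for name, rate in throughput.items():
    hours = md_walltime_hours(n_atoms=10_000, n_steps=1_000_000,
                              atom_steps_per_second=rate)
    print(f"{name}: ~{hours:.1f} h")
```

Comparing such estimates against the available compute budget is a quick way to decide whether a given MLIP class is feasible for a target simulation size and time before committing to it.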
Disciplines:
Chemistry
Author, co-author:
Jacobs, Ryan; Department of Materials Science and Engineering, University of Wisconsin-Madison, Madison, United States
Morgan, Dane; Department of Materials Science and Engineering, University of Wisconsin-Madison, Madison, United States
Attarian, Siamak; Department of Materials Science and Engineering, University of Wisconsin-Madison, Madison, United States
Meng, Jun; Department of Materials Science and Engineering, University of Wisconsin-Madison, Madison, United States
Shen, Chen; Department of Materials Science and Engineering, University of Wisconsin-Madison, Madison, United States
Wu, Zhenghao; Department of Chemistry and Materials Science, Xi'an Jiaotong-Liverpool University, Suzhou, China
Xie, Clare Yijia; John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, United States
Yang, Julia H.; Harvard University Center for the Environment, Harvard University, Cambridge, United States; John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, United States
Artrith, Nongnuch; Materials Chemistry and Catalysis, Debye Institute for Nanomaterials Science, Utrecht University, Utrecht, Netherlands
Blaiszik, Ben; Globus, University of Chicago, Chicago, United States ; Data Science and Learning Division, Argonne National Laboratory, Lemont, United States
Ceder, Gerbrand; Department of Materials Science and Engineering, University of California, Berkeley, United States ; Materials Sciences Division, Lawrence Berkeley National Laboratory, United States
Choudhary, Kamal; Material Measurement Laboratory, National Institute of Standards and Technology, Gaithersburg, United States
Csanyi, Gabor; Department of Engineering, University of Cambridge, Cambridge, United Kingdom
Cubuk, Ekin Dogus; Google DeepMind, Mountain View, United States
Deng, Bowen; Department of Materials Science and Engineering, University of California, Berkeley, United States ; Materials Sciences Division, Lawrence Berkeley National Laboratory, United States
Drautz, Ralf; Interdisciplinary Centre for Advanced Materials Simulation (ICAMS), Ruhr-Universität Bochum, Bochum, Germany
Fu, Xiang; Fundamental AI Research (FAIR) at Meta, United States
Godwin, Jonathan; Orbital Materials, London, United Kingdom
Honavar, Vasant; Department of Computer Science and Engineering, The Pennsylvania State University, University Park, United States ; College of Information Sciences and Technology, The Pennsylvania State University, University Park, United States ; Artificial Intelligence Research Laboratory, The Pennsylvania State University, United States ; Center for Artificial Intelligence Foundations and Scientific Applications, The Pennsylvania State University, United States
Isayev, Olexandr; Department of Chemistry, Mellon College of Science, Carnegie Mellon University, United States ; Computational Biology Department, School of Computer Science, Carnegie Mellon University, Pittsburgh, United States
Johansson, Anders; John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, United States
Martiniani, Stefano; Courant Institute of Mathematical Sciences, New York University, New York, United States ; Center for Soft Matter Research, Department of Physics, New York University, New York, United States ; Simons Center for Computational Physical Chemistry, Department of Chemistry, New York University, New York, United States
Ong, Shyue Ping; Aiiso Yufeng Li Family Department of Chemical and Nano Engineering, University of California, San Diego, La Jolla, United States
Poltavskyi, Igor; Department of Physics and Materials Science (DPHYMS), Faculty of Science, Technology and Medicine (FSTM), University of Luxembourg, Luxembourg
Schmidt, K.J.; Globus, University of Chicago, Chicago, United States ; Data Science and Learning Division, Argonne National Laboratory, Lemont, United States
Takamoto, So; Preferred Networks, Inc., Chiyoda-ku, Japan
Thompson, Aidan P.; Center for Computing Research, Sandia National Laboratories, Albuquerque, United States
Westermayr, Julia; Wilhelm-Ostwald-Institut für Physikalische und Theoretische Chemie, Universität Leipzig, Germany
Wood, Brandon M.; Fundamental AI Research (FAIR) at Meta, United States
Kozinsky, Boris; John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, United States
Funding for the “Machine Learning Potentials – Status and Future (MLIP-SAFE)” workshop and development of this paper was provided by the National Science Foundation through an AI Institute Planning Grant, Award Number 2020243. KC thanks the National Institute of Standards and Technology for funding, computational, and data management resources. This work was performed with funding from the CHIPS Metrology Program, part of CHIPS for America, National Institute of Standards and Technology, U.S. Department of Commerce. Certain commercial equipment, instruments, software, or materials are identified in this paper in order to specify the experimental procedure adequately. Such identifications are not intended to imply recommendation or endorsement by NIST, nor is it intended to imply that the materials or equipment identified are necessarily the best available for the purpose. SM acknowledges support from NSF Grant OAC-2311632 and the Simons Center for Computational Physical Chemistry (Simons Foundation grant 839534, MT).
Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R., Machine learning force fields. Chem. Rev. 121 (2021), 10142–10186, 10.1021/acs.chemrev.0c01111.
Wan, K., He, J., Shi, X., Construction of high accuracy machine learning interatomic potential for surface/interface of nanomaterials—a review. Adv. Mater., 36, 2024, 10.1002/adma.202305758.
Duignan, T.T., The potential of neural network potentials. ACS Phys. Chem. Au 4 (2024), 232–241, 10.1021/acsphyschemau.4c00004.
Smith, J.S., Isayev, O., Roitberg, A.E., ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chem. Sci. 8 (2017), 3192–3203, 10.1039/C6SC05720A.
Musaelian, A., Batzner, S., Johansson, A., Sun, L., Owen, C.J., Kornbluth, M., Kozinsky, B., Learning local equivariant representations for large-scale atomistic dynamics. Nat. Commun., 14, 2023, 579, 10.1038/s41467-023-36329-y.
Drautz, R., Atomic cluster expansion for accurate and transferable interatomic potentials. Phys. Rev. B 99 (2019), 014104, 10.1103/PhysRevB.99.014104.
Artrith, N., Urban, A., An implementation of artificial neural-network potentials for atomistic materials simulations: performance for TiO2. Comput. Mater. Sci 114 (2016), 135–150, 10.1016/j.commatsci.2015.11.047.
Artrith, N., Urban, A., Ceder, G., Efficient and accurate machine-learning interpolation of atomic energies in compositions with many species. Phys. Rev. B 96 (2017), 014112, 10.1103/PhysRevB.96.014112.
Choudhary, K., DeCost, B., Major, L., Butler, K., Thiyagalingam, J., Tavazza, F., Unified graph neural network force-field for the periodic table: solid state applications. Digital Discovery 2 (2023), 346–355, 10.1039/d2dd00096b.
D.M. Anstine, R. Zubatyuk, O. Isayev, AIMNet2: A Neural Network Potential to Meet your Neutral, Charged, Organic, and Elemental-Organic Needs, ChemRxiv (2023). 10.26434/chemrxiv-2023-296ch.
Behler, J., Parrinello, M., Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys. Rev. Lett. 98 (2007), 146401, 10.1103/PhysRevLett.98.146401.
Deng, B., Zhong, P., Jun, K., Riebesell, J., Han, K., Bartel, C.J., Ceder, G., CHGNet: Pretrained universal neural network potential for charge-informed atomistic modeling. Nat. Mach. Intell., 2023, 10.1038/s42256-023-00716-3.
Zeng, J., Zhang, D., Lu, D., Mo, P., Li, Z., Chen, Y., Rynik, M., Huang, L., Li, Z., Shi, S., Wang, Y., Ye, H., Tuo, P., Yang, J., Ding, Y., Li, Y., Tisi, D., Zeng, Q., Bao, H., Xia, Y., Huang, J., Muraoka, K., Wang, Y., Chang, J., Yuan, F., Bore, S.L., Cai, C., Lin, Y., Wang, B., Xu, J., Zhu, J.X., Luo, C., Zhang, Y., Goodall, R.E.A., Liang, W., Singh, A.K., Yao, S., Zhang, J., Wentzcovitch, R., Han, J., Liu, J., Jia, W., York, D.M., Weinan, E., Car, R., Zhang, L., Wang, H., DeePMD-kit v2: A software package for deep potential models. J. Chem. Phys., 159, 2023, 10.1063/5.0155600.
Wang, H., Zhang, L., Han, J., E, W., DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics. Comput. Phys. Commun. 228 (2018), 178–184, 10.1016/j.cpc.2018.03.016.
Rodriguez, A., Lin, C., Yang, H., Al-Fahdi, M., Shen, C., Choudhary, K., Zhao, Y., Hu, J., Cao, B., Zhang, H., Hu, M., Million-scale data integrated deep neural network for phonon properties of heuslers spanning the periodic table. NPJ Comput. Mater., 9, 2023, 20, 10.1038/s41524-023-00974-0.
L. Barroso-Luque, M. Shuaibi, X. Fu, B. Wood, M. Dzamba, M. Gao, A. Rizvi, Open Materials 2024 (OMat24) Inorganic Materials Dataset and Models, (n.d.).
Vandermause, J., Torrisi, S.B., Batzner, S., Xie, Y., Sun, L., Kolpak, A.M., Kozinsky, B., On-the-fly active learning of interpretable Bayesian force fields for atomistic rare events. NPJ Comput. Mater., 6, 2020, 20, 10.1038/s41524-020-0283-z.
Bartók, A.P., Payne, M.C., Kondor, R., Csányi, G., Gaussian approximation potentials: the accuracy of quantum mechanics, without the electrons. Phys. Rev. Lett. 104 (2010), 136403, 10.1103/PhysRevLett.104.136403.
Xie, F., Lu, T., Meng, S., Liu, M., GPTFF: a high-accuracy out-of-the-box universal AI force field for arbitrary inorganic materials. Sci Bull (Beijing), 2024, 10.1016/j.scib.2024.08.039.
Merchant, A., Batzner, S., Schoenholz, S.S., Aykol, M., Cheon, G., Cubuk, E.D., Scaling deep learning for materials discovery. Nature 624 (2023), 80–85, 10.1038/s41586-023-06735-9.
Bochkarev, A., Lysogorskiy, Y., Drautz, R., Graph atomic cluster expansion for semilocal interactions beyond equivariant message passing. Phys. Rev. X 14 (2024), 021036, 10.1103/PhysRevX.14.021036.
H. Yang, C. Hu, Y. Zhou, X. Liu, Y. Shi, J. Li, G. Li, C. Zeni, M. Horton, R. Pinsler, MatterSim: A Deep Learning Atomistic Model Across Elements, Temperatures and Pressures, (n.d.).
I. Batatia, D. Kovacs, G. Simm, C. Ortner, G. Csanyi, MACE: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields, in: 36th Conference on Neural Information Processing Systems (NeurIPS 2022), 2022.
I. Batatia, P. Benner, Y. Chiang, A.M. Elena, D.P. Kovács, J. Riebesell, X.R. Advincula, M. Asta, M. Avaylon, W.J. Baldwin, F. Berger, N. Bernstein, A. Bhowmik, S.M. Blau, V. Cărare, J.P. Darby, S. De, F. Della Pia, V.L. Deringer, R. Elijošius, Z. El-Machachi, F. Falcioni, E. Fako, A.C. Ferrari, A. Genreith-Schriever, J. George, R.E.A. Goodall, C.P. Grey, P. Grigorev, S. Han, W. Handley, H.H. Heenen, K. Hermansson, C. Holm, J. Jaafar, S. Hofmann, K.S. Jakob, H. Jung, V. Kapil, A.D. Kaplan, N. Karimitari, J.R. Kermode, N. Kroupa, J. Kullgren, M.C. Kuner, D. Kuryla, G. Liepuoniute, J.T. Margraf, I.-B. Magdău, A. Michaelides, J.H. Moore, A.A. Naik, S.P. Niblett, S.W. Norwood, N. O'Neill, C. Ortner, K.A. Persson, K. Reuter, A.S. Rosen, L.L. Schaaf, C. Schran, B.X. Shi, E. Sivonxay, T.K. Stenczel, V. Svahn, C. Sutton, T.D. Swinburne, J. Tilly, C. van der Oord, E. Varga-Umbrich, T. Vegge, M. Vondrák, Y. Wang, W.C. Witt, F. Zills, G. Csányi, A foundation model for atomistic materials chemistry, ArXiv (2023). http://arxiv.org/abs/2401.00096.
D.P. Kovács, J.H. Moore, N.J. Browning, I. Batatia, J.T. Horton, V. Kapil, W.C. Witt, I.-B. Magdău, D.J. Cole, G. Csányi, MACE-OFF23: Transferable Machine Learning Force Fields for Organic Molecules, (2023). http://arxiv.org/abs/2312.15211.
Shapeev, A.V., Moment tensor potentials: A class of systematically improvable interatomic potentials. Multiscale Model. Simul., 14, 2016, 10.1137/15M1054183.
Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B., E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nat. Commun., 13, 2022, 10.1038/s41467-022-29939-5.
M. Neumann, J. Gin, B. Rhodes, S. Bennett, Z. Li, H. Choubisa, A. Hussey, Orb: A Fast, Scalable Neural Network Potential, (n.d.).
Takamoto, S., Okanohara, D., Li, Q.-J., Li, J., Towards universal neural network interatomic potential. J. Materiomics 9 (2023), 447–454, 10.1016/j.jmat.2022.12.007.
Park, Y., Kim, J., Hwang, S., Han, S., Scalable parallel algorithm for graph neural network interatomic potentials in molecular dynamics simulations. J. Chem. Theory Comput. 20 (2024), 4857–4868, 10.1021/acs.jctc.4c00190.
Schütt, K.T., Arbabzadah, F., Chmiela, S., Müller, K.R., Tkatchenko, A., Quantum-chemical insights from deep tensor neural networks. Nat. Commun., 8, 2017, 13890, 10.1038/ncomms13890.
Thompson, A.P., Swiler, L.P., Trott, C.R., Foiles, S.M., Tucker, G.J., Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials. J. Comput. Phys., 285, 2015, 10.1016/j.jcp.2014.12.018.
Chen, C., Ong, S.P., A universal graph deep learning interatomic potential for the periodic table. Nat. Comput. Sci. 2 (2022), 718–728, 10.1038/s43588-022-00349-3.
A.P. Thompson, H.M. Aktulga, R. Berger, D.S. Bolintineanu, W.M. Brown, P.S. Crozier, P.J. in ’t Veld, A. Kohlmeyer, S.G. Moore, T.D. Nguyen, R. Shan, M.J. Stevens, J. Tranchida, C. Trott, S.J. Plimpton, LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales, Comput Phys Commun 271 (2022) 108171. 10.1016/j.cpc.2021.108171.
Daw, M.S., Baskes, M.I., Embedded-atom method: Derivation and application to impurities, surfaces, and other defects in metals. Phys. Rev. B 29 (1984), 6443–6453, 10.1103/PhysRevB.29.6443.
Daw, M.S., Foiles, S.M., Baskes, M.I., The embedded-atom method: a review of theory and applications. Mater. Sci. Rep. 9 (1993), 251–310, 10.1016/0920-2307(93)90001-U.
A. Rohskopf, C. Sievers, N. Lubbers, M.A. Cusentino, J. Goff, J. Janssen, M. McCarthy, D.M. de Oca Zapiain, S. Nikolov, K. Sargsyan, D. Sema, E. Sikorski, L. Williams, A.P. Thompson, M.A. Wood, FitSNAP: Atomistic machine learning with LAMMPS, J Open Source Softw 8 (2023) 5118. 10.21105/joss.05118.
Y. Mishin, Machine-learning interatomic potentials for materials science, Acta Mater 214 (2021) 116980. 10.1016/j.actamat.2021.116980.
Morrow, J.D., Gardner, J.L.A., Deringer, V.L., How to validate machine-learned interatomic potentials. J. Chem. Phys., 158, 2023, 10.1063/5.0139611.
Artrith, N., Behler, J., High-dimensional neural network potentials for metal surfaces: A prototype study for copper. Phys. Rev. B 85 (2012), 045439, 10.1103/PhysRevB.85.045439.
Cooper, A.M., Kästner, J., Urban, A., Artrith, N., Efficient training of ANN potentials by including atomic forces via Taylor expansion and application to water and a transition-metal oxide. NPJ Comput. Mater., 6, 2020, 54, 10.1038/s41524-020-0323-8.
Rohskopf, A., Goff, J., Sema, D., Gordiz, K., Nguyen, N.C., Henry, A., Thompson, A.P., Wood, M.A., Exploring model complexity in machine learned potentials for simulated properties. J. Mater. Res. 38 (2023), 5136–5150, 10.1557/s43578-023-01152-0.
Batatia, I., Batzner, S., Kovács, D.P., Musaelian, A., Simm, G.N.C., Drautz, R., Ortner, C., Kozinsky, B., Csányi, G., The design space of E(3)-equivariant atom-centred interatomic potentials. Nat Mach Intell 7 (2025), 56–67, 10.1038/s42256-024-00956-x.
Lysogorskiy, Y., van der Oord, C., Bochkarev, A., Menon, S., Rinaldi, M., Hammerschmidt, T., Mrovec, M., Thompson, A., Csányi, G., Ortner, C., Drautz, R., Performant implementation of the atomic cluster expansion (PACE) and application to copper and silicon. NPJ Comput. Mater. 7 (2021), 1–12, 10.1038/s41524-021-00559-9.
P. Weiner, P. Kollman, AMBER: Assisted model building with energy refinement. A general program for modeling molecules and their interactions, J Comput Chem 2 (1981).
A.K. Rappe, C.J. Casewit, K.S. Colwell, W.A. Goddard III, W.M. Skiff, UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations, J. Am. Chem. Soc. 114 (1992) 10024–10035.
Case, D.A., Cheatham, T.E., Darden, T., Gohlke, H., Luo, R., Merz, K.M., Onufriev, A., Simmerling, C., Wang, B., Woods, R.J., The Amber biomolecular simulation programs. J. Comput. Chem. 26 (2005), 1668–1688, 10.1002/jcc.20290.
Smith, J.S., Isayev, O., Roitberg, A.E., ANI-1, A data set of 20 million calculated off-equilibrium conformations for organic molecules. Sci. Data, 4, 2017, 170193, 10.1038/sdata.2017.193.
Devereux, C., Smith, J.S., Huddleston, K.K., Barros, K., Zubatyuk, R., Isayev, O., Roitberg, A.E., Extending the applicability of the ANI deep learning molecular potential to sulfur and halogens. J. Chem. Theory Comput. 16 (2020), 4192–4202, 10.1021/acs.jctc.0c00121.
Zhang, S., Makoś, M.Z., Jadrich, R.B., Kraka, E., Barros, K., Nebgen, B.T., Tretiak, S., Isayev, O., Lubbers, N., Messerly, R.A., Smith, J.S., Exploring the frontiers of condensed-phase chemistry with a general reactive machine learning potential. Nat. Chem., 2024, 10.1038/s41557-023-01427-3.
Matlantis, https://matlantis.com/ (2023).
Liao, Y.-L., Wood, B., Das, A., Smidt, T., EquiformerV2: improved equivariant transformer for scaling to higher-degree representations. ArXiv, 2023.
Chen, C., Ye, W., Zuo, Y., Zheng, C., Ong, S.P., Graph networks as a universal machine learning framework for molecules and crystals. Chem. Mater. 31 (2019), 3564–3572, 10.1021/acs.chemmater.9b01294.
Choudhary, K., DeCost, B., Atomistic line graph neural network for improved materials property predictions. npj Comput. Mater., 7, 2021, 185, 10.1038/s41524-021-00650-1.
Y. Zhou, S. Hu, C. Wang, L.-W. Wang, G. Tan, W. Jia, FastCHGNet: Training one Universal Interatomic Potential to 1.5 Hours with 32 GPUs, ArXiv (2024). 10.48550/arXiv.2412.20796.
H. Yu, M. Giantomassi, G. Materzanini, J. Wang, G.-M. Rignanese, Systematic assessment of various universal machine-learning interatomic potentials, ArXiv (2024). http://arxiv.org/abs/2403.05729.
B. Focassio, L.P.M. Freitas, G.R. Schleder, Performance Assessment of Universal Machine Learning Interatomic Potentials: Challenges and Directions for Materials’ Surfaces, ArXiv (2024). http://arxiv.org/abs/2403.04217.
Deng, B., Choi, Y., Zhong, P., Riebesell, J., Anand, S., Li, Z., Jun, K., Persson, K., Ceder, G., Overcoming systematic softening in universal machine learning interatomic potentials by fine-tuning. ArXiv, 2024.
J. Riebesell, R.E.A. Goodall, P. Benner, Y. Chiang, B. Deng, A.A. Lee, A. Jain, K.A. Persson, Matbench Discovery – A framework to evaluate machine learning crystal stability predictions, (2023). http://arxiv.org/abs/2308.14920.
Wang, H.-C., Botti, S., Marques, M.A.L., Predicting stable crystalline compounds using chemical similarity. NPJ Comput. Mater., 7, 2021, 12, 10.1038/s41524-020-00481-6.
Casillas-Trujillo, L., Parackal, A.S., Armiento, R., Alling, B., Evaluating and improving the predictive accuracy of mixing enthalpies and volumes in disordered alloys from universal pretrained machine learning potentials. Phys. Rev. Mater. 8 (2024), 113803, 10.1103/PhysRevMaterials.8.113803.
D. Wines, K. Choudhary, CHIPS-FF: Evaluating Universal Machine Learning Force Fields for Material Properties, ArXiv (2024). 10.48550/arXiv.2412.10516.
Takamoto, S., Shinagawa, C., Motoki, D., Nakago, K., Li, W., Kurata, I., Watanabe, T., Yayama, Y., Iriguchi, H., Asano, Y., Onodera, T., Ishii, T., Kudo, T., Ono, H., Sawada, R., Ishitani, R., Ong, M., Yamaguchi, T., Kataoka, T., Hayashi, A., Charoenphakdee, N., Ibuka, T., Towards universal neural network potential for material discovery applicable to arbitrary combination of 45 elements. Nat. Commun., 13, 2022, 10.1038/s41467-022-30687-9.
N. Shoghi, A. Kolluru, J. Kitchin, Z. Ulissi, L.C. Zitnick, B.M. Wood, From Molecules to Materials: Pre-training Large Generalizable Models for Atomic Property Prediction, ArXiv (2024). 10.48550/arXiv.2310.16802.
T. Xie, X. Fu, O.-E. Ganea, R. Barzilay, T. Jaakkola, Crystal Diffusion Variational Autoencoder for Periodic Material Generation, ArXiv:2110.06197 (2021). http://arxiv.org/abs/2110.06197.
D. Wines, T. Xie, K. Choudhary, Inverse Design of Next-generation Superconductors Using Data-driven Deep Generative Models, ArXiv:2304.08446 (2023). 10.1021/acs.jpclett.3c01260.
C. Zeni, R. Pinsler, D. Zugner, A. Fowler, M. Horton, X. Fu, S. Shysheya, J. Crabbe, L. Sun, J. Smith, B. Nguyen, H. Schulz, S. Lewis, C.-W. Huang, Z. Lu, Y. Zhou, H. Yang, H. Hao, J. Li, R. Tomioka, T. Xie, MatterGen: a generative model for inorganic materials design, ArXiv:2312.03687v2 (2023).
T.M. Nguyen, S.A. Tawfik, T. Tran, S. Gupta, S. Rana, S. Venkatesh, Efficient Symmetry-Aware Materials Generation via Hierarchical Generative Flow Networks, ArXiv (2024). 10.48550/arXiv.2411.04323.
S. Yang, K. Cho, A. Merchant, P. Abbeel, D. Schuurmans, I. Mordatch, E.D. Cubuk, Scalable Diffusion for Materials Generation, ArXiv (2024). 10.48550/arXiv.2311.09235.
N. Gruver, A. Sriram, A. Madotto, A.G. Wilson, C.L. Zitnick, Z. Ulissi, Fine-Tuned Language Models Generate Stable Inorganic Materials as Text, (2024). http://arxiv.org/abs/2402.04379.
Choudhary, K., AtomGPT: atomistic generative pretrained transformer for forward and inverse materials design. J. Phys. Chem. Lett. 15 (2024), 6909–6917, 10.1021/acs.jpclett.4c01126.
N. Bernstein, From GAP to ACE to MACE, ArXiv (2024).
LAMMPS Benchmarks, (n.d.). https://www.lammps.org/bench.html#oneproc (accessed October 30, 2023).
D. Lu, H. Wang, M. Chen, L. Lin, R. Car, W. E, W. Jia, L. Zhang, 86 PFLOPS Deep Potential Molecular Dynamics simulation of 100 million atoms with ab initio accuracy, Comput Phys Commun 259 (2021). 10.1016/j.cpc.2020.107624.
Wen, M., Afshar, Y., Elliott, R.S., Tadmor, E.B., KLIFF: A framework to develop physics-based and machine learning interatomic potentials. Comput. Phys. Commun., 272, 2022, 108218, 10.1016/j.cpc.2021.108218.
K. Nguyen-Cong, J.T. Willman, S.G. Moore, A.B. Belonoshko, R. Gayatri, E. Weinberg, M.A. Wood, A.P. Thompson, I.I. Oleynik, Billion atom molecular dynamics simulations of carbon at extreme conditions and experimental time and length scales, in: International Conference for High Performance Computing, Networking, Storage and Analysis, SC, 2021. 10.1145/3458817.3487400.
Musaelian, A., Johansson, A., Batzner, S., Kozinsky, B., Scaling the leading accuracy of deep equivariant models to biomolecular simulations of realistic size. ArXiv, 2023.
Johansson, A., Xie, Y., Owen, C., Lim, J., Sun, L., Vandermause, J., Kozinsky, B., Micron-scale heterogeneous catalysis with Bayesian force fields from first principles and active learning. ArXiv, 2022.
Zhang, X., Sundram, S., Oppelstrup, T., Kokkila-Schumacher, S.I.L., Carpenter, T.S., Ingólfsson, H.I., Streitz, F.H., Lightstone, F.C., Glosli, J.N., ddcMD: A fully GPU-accelerated molecular dynamics program for the Martini force field. J. Chem. Phys. 153 (2020), 10.1063/5.0014500.
MN-Core, (n.d.).
Frank, J.T., Unke, O.T., Müller, K.-R., Chmiela, S., A Euclidean transformer for fast and stable machine learned force fields. Nat. Commun., 15, 2024, 6539, 10.1038/s41467-024-50620-6.
The ColabFit Exchange: Data for Advanced Materials Science, (2024). https://materials.colabfit.org/ (accessed July 4, 2024).
NIST Interatomic Potentials Repository, (2024). https://www.ctcms.nist.gov/potentials/ (accessed July 4, 2024).
Open Knowledgebase of Interatomic Models (OpenKIM), (2009). https://openkim.org/ (accessed May 7, 2024).
M3Gnet Github repository, (2023). https://github.com/materialsvirtuallab/m3gnet (accessed May 6, 2024).
CHGNet Github repository, (2024). https://github.com/CederGroupHub/chgnet (accessed May 6, 2024).
MACE Github repository, (2024). https://github.com/ACEsuit/mace (accessed May 6, 2024).
X. Pan, et al., The Training of Machine Learning Potentials for Reactive Systems: A Colab Tutorial on Basic Models, ChemRxiv (2023).
Tokita, A.M., Behler, J., How to train a neural network potential. J. Chem. Phys., 159, 2023, 10.1063/5.0160326.
Attarian, S., Shen, C., Morgan, D., Szlufarska, I., Best practices for fitting machine learning interatomic potentials for molten salts: A case study using NaCl-MgCl2. Comput. Mater. Sci, 246, 2025, 113409, 10.1016/j.commatsci.2024.113409.
Owen, C., Torrisi, S., Xie, Y., Batzner, S., Bystrom, K., Coulter, J., Musaelian, A., Sun, L., Kozinsky, B., Complexity of many-body interactions in transition metals via machine-learned force fields from the TM23 Data Set. ArXiv, 2023.
Jinnouchi, R., Miwa, K., Karsai, F., Kresse, G., Asahi, R., On-the-fly active learning of interatomic potentials for large-scale atomistic simulations. J. Phys. Chem. Lett., 11, 2020, 10.1021/acs.jpclett.0c01061.
Xiang Fu, Zhanghao Wu, Wujie Wang, Tian Xie, Sinan Keten, Rafael Gomez-Bombarelli, Tommi Jaakkola, Forces are not Enough: Benchmark and Critical Evaluation for Machine Learning Force Fields with Molecular Simulations, ArXiv (2022). 10.48550/arXiv.2210.07237.
Y. Zhai, A. Caruso, S.L. Bore, Z. Luo, F. Paesani, A “short blanket” dilemma for a state-of-the-art neural network potential for water: Reproducing experimental properties or the physics of the underlying many-body interactions?, J Chem Phys 158 (2023). 10.1063/5.0142843.
Lysogorskiy, Y., Bochkarev, A., Mrovec, M., Drautz, R., Active learning strategies for atomic cluster expansion models. Phys. Rev. Mater. 7 (2023), 043801, 10.1103/PhysRevMaterials.7.043801.
Podryabinkin, E.V., Shapeev, A.V., Active learning of linearly parametrized interatomic potentials. Comput. Mater. Sci 140 (2017), 171–180, 10.1016/j.commatsci.2017.08.031.
Smith, J.S., Nebgen, B., Lubbers, N., Isayev, O., Roitberg, A.E., Less is more: sampling chemical space with active learning. J. Chem. Phys., 148, 2018, 10.1063/1.5023802.
Jinnouchi, R., Karsai, F., Kresse, G., On-the-fly machine learning force field generation: application to melting points. Phys. Rev. B 100 (2019), 014105, 10.1103/PhysRevB.100.014105.
Jinnouchi, R., Lahnsteiner, J., Karsai, F., Kresse, G., Bokdam, M., Phase transitions of hybrid perovskites simulated by machine-learning force fields trained on the fly with Bayesian inference. Phys. Rev. Lett. 122 (2019), 225701, 10.1103/PhysRevLett.122.225701.
Vandermause, J., Xie, Y., Lim, J.S., Owen, C.J., Kozinsky, B., Active learning of reactive Bayesian force fields applied to heterogeneous catalysis dynamics of H/Pt. Nat. Commun., 13, 2022, 10.1038/s41467-022-32294-0.
Kulichenko, M., Barros, K., Lubbers, N., Li, Y.W., Messerly, R., Tretiak, S., Smith, J.S., Nebgen, B., Uncertainty-driven dynamics for active learning of interatomic potentials. Nat. Comput. Sci. 3 (2023), 230–239, 10.1038/s43588-023-00406-5.
van der Oord, C., Sachs, M., Kovács, D.P., Ortner, C., Csányi, G., Hyperactive learning for data-driven interatomic potentials. NPJ Comput. Mater., 9, 2023, 168, 10.1038/s41524-023-01104-6.
Ward, L., Dunn, A., Faghaninia, A., Zimmermann, N.E.R., Bajaj, S., Wang, Q., Montoya, J., Chen, J., Bystrom, K., Dylla, M., Chard, K., Asta, M., Persson, K.A., Je, G., Foster, I., Jain, A., Matminer: an open source toolkit for materials data mining. Comput. Mater. Sci 152 (2018), 60–69, 10.1016/j.commatsci.2018.05.018.
Attarian, S., Morgan, D., Szlufarska, I., Thermophysical properties of FLiBe using moment tensor potentials. J. Mol. Liq., 368, 2022, 120803, 10.1016/j.molliq.2022.120803.
Chen, C., Deng, Z., Tran, R., Tang, H., Chu, I.H., Ong, S.P., Accurate force field for molybdenum by machine learning large materials data. Phys. Rev. Mater. 1 (2017), 043603, 10.1103/PhysRevMaterials.1.043603.
Yang, M., Bonati, L., Polino, D., Parrinello, M., Using metadynamics to build neural network potentials for reactive events: the case of urea decomposition in water. Catal. Today 387 (2022), 143–149, 10.1016/j.cattod.2021.03.018.
Rodriguez, A., Lam, S., Hu, M., Thermodynamic and transport properties of LiF and FLiBe molten salts with deep learning potentials. ACS Appl. Mater. Interfaces 13 (2021), 55367–55379, 10.1021/acsami.1c17942.
Zeng, J., Cao, L., Xu, M., Zhu, T., Zhang, J.Z.H., Complex reaction processes in combustion unraveled by neural network-based molecular dynamics simulation. Nat. Commun. 11 (2020), 1–9, 10.1038/s41467-020-19497-z.
Loose, T., Sahrmann, P., Qu, T., Voth, G., Coarse-graining with equivariant neural networks: a path towards accurate and data-efficient models. ArXiv, 2023.
Vita, J., Fuemmeler, E., Gupta, A., Wolfe, G., Tao, A., Elliott, R., Martiniani, S., Tadmor, E., ColabFit Exchange: open-access datasets for data-driven interatomic potentials. ArXiv, 2023.
Andolina, C.M., Saidi, W.A., Highly transferable atomistic machine-learning potentials from curated and compact datasets across the periodic table, Digital. Discovery 2 (2023), 1070–1077, 10.1039/D3DD00046J.
Fuemmeler, E., Wolfe, G., Gupta, A., Vita, J.A., Tadmor, E.B., Martiniani, S., Advancing the ColabFit Exchange towards a web-scale data source for machine learning interatomic potentials. AI for Accelerated Materials Design - NeurIPS 2024, 2024.
Blaiszik, B., Engler, W., Schmidt, K., Garden: a FAIR framework for publishing and applying AI models for translational research in science, engineering, education, and industry, 2024. https://thegardens.ai/#/home (accessed February 20, 2024).
Jacobs, R., Schultz, L., Scourtas, A., Schmidt, K.J., Price-Skelly, O., Engler, W., Foster, I., Blaiszik, B., Voyles, P., Morgan, D., Machine learning materials properties with accurate predictions, uncertainty estimates, domain guidance, and persistent online accessibility. Mach. Learn.: Sci. Technol. 5(4), 2024, 10.1088/2632-2153/ad95db.
Blaiszik, B., Chard, K., Pruyne, J., Ananthakrishnan, R., Tuecke, S., Foster, I., The materials data facility: data services to advance materials science research. JOM 68 (2016), 2045–2052, 10.1007/s11837-016-2001-3.
Blaiszik, B., Ward, L., Schwarting, M., Gaff, J., Chard, R., Pike, D., Chard, K., Foster, I., A data ecosystem to support machine learning in materials science. MRS Commun. 9 (2019), 1125–1133, 10.1557/mrc.2019.118.
Schmidt, K., Scourtas, A., Ward, L., Wangen, S., Schwarting, M., Darling, I., Truelove, E., Ambadkar, A., Bose, R., Katok, Z., Wei, J., Li, X., Jacobs, R., Schultz, L., Kim, D., Ferris, M., Voyles, P.M., Morgan, D., Foster, I., Blaiszik, B., Foundry-ML - software and services to simplify access to machine learning datasets in materials science. J. Open Source Softw. 9 (2024), 5467, 10.21105/joss.05467.
Schoenholz, S.S., Cubuk, E.D., JAX MD: a framework for differentiable physics. Adv. Neural Inf. Process. Syst., 2020, 10.1088/1742-5468/ac3ae9.
Hjorth Larsen, A., Jørgen Mortensen, J., Blomqvist, J., Castelli, I.E., Christensen, R., Dułak, M., Friis, J., Groves, M.N., Hammer, B., Hargus, C., Hermes, E.D., Jennings, P.C., Bjerre Jensen, P., Kermode, J., Kitchin, J.R., Leonhard Kolsbjerg, E., Kubal, J., Kaasbjerg, K., Lysgaard, S., Bergmann Maronsson, J., Maxson, T., Olsen, T., Pastewka, L., Peterson, A., Rostgaard, C., Schiøtz, J., Schütt, O., Strange, M., Thygesen, K.S., Vegge, T., Vilhelmsen, L., Walter, M., Zeng, Z., Jacobsen, K.W., The atomic simulation environment - a Python library for working with atoms. J. Phys.: Condens. Matter 29 (2017), 273002, 10.1088/1361-648X/aa680e.
Anstine, D.M., Isayev, O., Machine learning interatomic potentials and long-range physics. J. Phys. Chem. A 127 (2023), 2417–2431, 10.1021/acs.jpca.2c06778.
Glick, Z.L., Metcalf, D.P., Koutsoukas, A., Spronk, S.A., Cheney, D.L., Sherrill, C.D., AP-Net: An atomic-pairwise neural network for smooth and transferable interaction potentials. J. Chem. Phys., 153, 2020, 10.1063/5.0011521.
Pan, G., Ding, J., Du, Y., Lee, D.-J., Lu, Y., A DFT accurate machine learning description of molten ZnCl2 and its mixtures: 2. Potential development and properties prediction of ZnCl2-NaCl-KCl ternary salt for CSP. Comput. Mater. Sci. 187 (2021), 110055, 10.1016/j.commatsci.2020.110055.
Bu, M., Liang, W., Lu, G., Molecular dynamics simulations on AlCl3-LiCl molten salt with deep learning potential. Comput. Mater. Sci, 210, 2022, 111494, 10.1016/j.commatsci.2022.111494.
Chahal, R., Roy, S., Brehm, M., Banerjee, S., Bryantsev, V., Lam, S.T., Transferable deep learning potential reveals intermediate-range ordering effects in LiF–NaF–ZrF4 molten salt. JACS Au 2 (2022), 2693–2702, 10.1021/jacsau.2c00526.
Zhang, L., Wang, H., Muniz, M.C., Panagiotopoulos, A.Z., Car, R., Weinan, E., A deep potential model with long-range electrostatic interactions. J. Chem. Phys., 156, 2022, 10.1063/5.0083669.
Gao, A., Remsing, R.C., Self-consistent determination of long-range electrostatics in neural network potentials. Nat. Commun., 13, 2022, 1572, 10.1038/s41467-022-29243-2.
Kabylda, A., Frank, J.T., Dou, S.S., Khabibrakhmanov, A., Sandonas, L.M., Unke, O.T., Chmiela, S., Müller, K.-R., Tkatchenko, A., Molecular simulations with a pretrained neural network and universal pairwise force fields. ChemRxiv, 2024, 10.26434/chemrxiv-2024-bdfr0.
Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R., Machine learning of accurate energy-conserving molecular force fields. Sci. Adv. 3 (2017), e1603015, 10.1126/sciadv.1603015.
von Barth, U., Hedin, L., A local exchange-correlation potential for the spin polarized case: I. J. Phys. C: Solid State Phys. 5 (1972), 1629–1642, 10.1088/0022-3719/5/13/012.
Stoner, E., Collective electron ferromagnetism. Proc. R. Soc. Lond. A 165 (1938), 372–414, 10.1098/rspa.1938.0066.
Drautz, R., Pettifor, D.G., Valence-dependent analytic bond-order potential for transition metals. Phys. Rev. B 74 (2006), 174117, 10.1103/PhysRevB.74.174117.
Heisenberg, W., Zur Theorie des Ferromagnetismus. Z. Phys. 49 (1928), 619–636.
Tranchida, J., Plimpton, S.J., Thibaudeau, P., Thompson, A.P., Massively parallel symplectic algorithm for coupled magnetic spin dynamics and molecular dynamics. J. Comput. Phys. 372 (2018), 406–425, 10.1016/j.jcp.2018.06.042.
Eckhoff, M., Behler, J., High-dimensional neural network potentials for magnetic systems using spin-dependent atom-centered symmetry functions. npj Comput. Mater., 7, 2021, 170, 10.1038/s41524-021-00636-z.
Novikov, I., Grabowski, B., Körmann, F., Shapeev, A., Magnetic Moment Tensor Potentials for collinear spin-polarized materials reproduce different magnetic states of bcc Fe. npj Comput. Mater., 8, 2022, 13, 10.1038/s41524-022-00696-9.
Domina, M., Cobelli, M., Sanvito, S., Spectral neighbor representation for vector fields: machine learning potentials including spin. Phys. Rev. B 105 (2022), 214439, 10.1103/PhysRevB.105.214439.
Chapman, J.B.J., Ma, P.-W., A machine-learned spin-lattice potential for dynamic simulations of defective magnetic iron. Sci. Rep., 12, 2022, 22451, 10.1038/s41598-022-25682-5.
Drautz, R., Atomic cluster expansion of scalar, vectorial, and tensorial properties including magnetism and charge transfer. Phys. Rev. B 102 (2020), 024104, 10.1103/PhysRevB.102.024104.
Rinaldi, M., Mrovec, M., Bochkarev, A., Lysogorskiy, Y., Drautz, R., Non-collinear magnetic atomic cluster expansion for iron. ArXiv, 2023, http://arxiv.org/abs/2305.15137.
Yu, H., Zhong, Y., Ji, J., Gong, X., Xiang, H., Time-reversal equivariant neural network potential and Hamiltonian for magnetic materials. ArXiv, 2022.
Yu, H., Zhong, Y., Hong, L., Xu, C., Ren, W., Gong, X., Xiang, H., Spin-dependent graph neural network potential for magnetic materials. ArXiv, 2023.
Westermayr, J., Marquetand, P., Machine learning for electronically excited states of molecules. Chem. Rev. 121 (2021), 9873–9926, 10.1021/acs.chemrev.0c00749.
Westermayr, J., Gastegger, M., Marquetand, P., Combining SchNet and SHARC: the SchNarc machine learning approach for excited-state dynamics. J. Phys. Chem. Lett. 11 (2020), 3828–3834, 10.1021/acs.jpclett.0c00527.
Westermayr, J., Gastegger, M., Menger, M.F.S.J., Mai, S., González, L., Marquetand, P., Machine learning enables long time scale molecular photodynamics simulations. Chem. Sci. 10 (2019), 8100–8107, 10.1039/C9SC01742A.
Zhang, Y., Maurer, R.J., Jiang, B., Symmetry-adapted high dimensional neural network representation of electronic friction tensor of adsorbates on metals. J. Phys. Chem. C 124 (2020), 186–195, 10.1021/acs.jpcc.9b09965.
Zhang, Y., Jiang, B., Universal machine learning for the response of atomistic systems to external fields. Nat. Commun., 14, 2023, 6424, 10.1038/s41467-023-42148-y.
Fonseca, G., Poltavsky, I., Vassilev-Galindo, V., Tkatchenko, A., Improving molecular force fields across configurational space by combining supervised and unsupervised machine learning. J. Chem. Phys., 154, 2021, 10.1063/5.0035530.
Kipf, T., Welling, M., Semi-supervised classification with graph convolutional networks. ArXiv, 2017.
OpenKIM: Interatomic Potentials and Analytics for Molecular Simulation, (2023). https://openkim.org/ (accessed October 30, 2023).
Jinnouchi, R., Karsai, F., Kresse, G., On-the-fly machine learning force field generation: application to melting points. Phys. Rev. B 100 (2019), 014105, 10.1103/PhysRevB.100.014105.
Santos, K., Moore, S., Oppelstrup, T., Sharifian, A., Sharapov, I., Thompson, A., Kalchev, D., Perez, D., Schreiber, R., Breaking the molecular dynamics timescale barrier using a wafer-scale system. ArXiv, 2024.
Allen, A., Lubbers, N., Matin, S., Smith, J., Messerly, R., Tretiak, S., Barros, K., Learning Together: towards foundational models for machine learning interatomic potentials with meta-learning. ArXiv, 2023.
Shiota, T., Ishihara, K., Do, T.M., Mori, T., Mizukami, W., Taming multi-domain, -fidelity data: towards foundation models for atomistic scale simulations. ArXiv, 2024, 10.48550/arXiv.2412.13088.
Yu, Z., Annamareddy, A., Morgan, D., Wang, B., How close are the classical two-body potentials to ab initio calculations? Insights from linear machine learning based force matching. J. Chem. Phys., 160, 2024, 10.1063/5.0175756.
Ellis, J.A., Fiedler, L., Popoola, G.A., Modine, N.A., Stephens, J.A., Thompson, A.P., Cangi, A., Rajamanickam, S., Accelerating finite-temperature Kohn-Sham density functional theory with deep neural networks. Phys. Rev. B 104 (2021), 035120, 10.1103/PhysRevB.104.035120.
Hermann, J., Schätzle, Z., Noé, F., Deep-neural-network solution of the electronic Schrödinger equation. Nat. Chem. 12 (2020), 891–897, 10.1038/s41557-020-0544-y.
Shen, C., Attarian, S., Zhang, Y., Zhang, H., Asta, M., Szlufarska, I., Morgan, D., SuperSalt: equivariant neural network force fields for multicomponent molten salt systems. ArXiv, 2024, 10.48550/arXiv.2412.19353.
Bochkarev, A., Lysogorskiy, Y., Ortner, C., Csányi, G., Drautz, R., Multilayer atomic cluster expansion for semilocal interactions. Phys. Rev. Res. 4 (2022), L042019, 10.1103/PhysRevResearch.4.L042019.