Crossref journal-article
American Association for the Advancement of Science (AAAS)
Science Advances (221)
Abstract

The law of energy conservation is used to develop an efficient machine learning approach to construct accurate force fields.
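
The record itself contains no code, but the idea summarized above — obtaining forces as the exact negative gradient of a single learned energy surrogate, so that the resulting force field is conservative — can be sketched briefly. The Python snippet below is a minimal illustration under assumed toy settings (a two-dimensional double-well stand-in for the potential-energy surface, a Gaussian kernel, hand-picked width and regularization); it is not the gradient-domain (GDML) estimator developed in the paper, only the underlying energy-conservation principle.

import numpy as np

rng = np.random.default_rng(0)

# Toy 2D double-well potential standing in for an ab initio potential-energy
# surface (an assumed stand-in, not data from the paper).
def energy_true(R):                      # R: (n, 2) array of configurations
    x, y = R[:, 0], R[:, 1]
    return (x**2 - 1.0) ** 2 + 0.5 * y**2

def forces_true(R):                      # reference forces: -grad of the toy potential
    x, y = R[:, 0], R[:, 1]
    return -np.stack([4.0 * x * (x**2 - 1.0), y], axis=1)

# Training data: energies only, sampled over the relevant configuration range.
R_train = rng.uniform(-1.5, 1.5, size=(200, 2))
E_train = energy_true(R_train)

sigma, lam = 0.5, 1e-8                   # Gaussian kernel width and ridge strength (assumed values)

def kernel(A, B):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

# Kernel ridge regression for an energy surrogate E_hat(r) = sum_i alpha_i k(r, r_i).
alpha = np.linalg.solve(kernel(R_train, R_train) + lam * np.eye(len(R_train)), E_train)

def energy_pred(R):
    return kernel(R, R_train) @ alpha

def forces_pred(R):
    # Analytic negative gradient of the same surrogate,
    # F_hat(r) = -grad E_hat(r) = sum_i alpha_i k(r, r_i) (r - r_i) / sigma^2,
    # so the predicted force field is conservative by construction.
    K = kernel(R, R_train)                       # shape (m, n)
    diff = R[:, None, :] - R_train[None, :, :]   # shape (m, n, 2)
    return np.einsum('mn,mnd,n->md', K, diff, alpha) / sigma**2

# Check 1: predicted vs. reference energies and forces at a few test configurations.
R_test = rng.uniform(-1.2, 1.2, size=(5, 2))
print("max |E_pred - E_true|:", np.abs(energy_pred(R_test) - energy_true(R_test)).max())
print("max |F_pred - F_true|:", np.abs(forces_pred(R_test) - forces_true(R_test)).max())

# Check 2: work done by the predicted force around a closed loop is ~0,
# as required of any conservative force field.
t = np.linspace(0.0, 2.0 * np.pi, 400)
loop = 0.8 * np.stack([np.cos(t), np.sin(t)], axis=1)
dr = np.diff(loop, axis=0)
F_mid = forces_pred(0.5 * (loop[1:] + loop[:-1]))
print("closed-loop work:", np.sum(F_mid * dr))

Because energies and forces both derive from one scalar surrogate, the closed-loop work in the last check vanishes up to discretization error, which is the practical meaning of an energy-conserving force field.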

Bibliography

Chmiela, S., Tkatchenko, A., Sauceda, H. E., Poltavsky, I., Schütt, K. T., & Müller, K.-R. (2017). Machine learning of accurate energy-conserving molecular force fields. Science Advances, 3(5), e1603015. https://doi.org/10.1126/sciadv.1603015

Authors 6
  1. Stefan Chmiela (first)
  2. Alexandre Tkatchenko (additional)
  3. Huziel E. Sauceda (additional)
  4. Igor Poltavsky (additional)
  5. Kristof T. Schütt (additional)
  6. Klaus-Robert Müller (additional)
References 36 Referenced 898
  1. 10.1103/PhysRevLett.98.146401
  2. 10.1063/1.2746232
  3. 10.1103/PhysRevLett.104.136403
  4. J. Behler, Atom-centered symmetry functions for constructing high-dimensional neural network potentials. J. Chem. Phys. 134, 074106 (2011). (10.1063/1.3553717)
  5. J. Behler, Neural network potential-energy surfaces in chemistry: A tool for large-scale simulations. Phys. Chem. Chem. Phys. 13, 17930–17955 (2011). (10.1039/c1cp21668f)
  6. K. V. J. Jose, N. Artrith, J. Behler, Construction of high-dimensional neural network potentials using environment-dependent atom pairs. J. Chem. Phys. 136, 194111 (2012). (10.1063/1.4712397)
  7. 10.1103/PhysRevB.87.184115
  8. A. P. Bartók, G. Csányi, Gaussian approximation potentials: A brief tutorial introduction. Int. J. Quantum Chem. 115, 1051–1057 (2015). (10.1002/qua.24927)
  9. S. De, A. P. Bartók, G. Csányi, M. Ceriotti, Comparing molecules and solids across structural and alchemical space. Phys. Chem. Chem. Phys. 18, 13754–13769 (2016). (10.1039/C6CP00415F)
  10. 10.1103/PhysRevLett.108.058301
  11. G. Montavon, M. Rupp, V. Gobre, A. Vazquez-Mayagoitia, K. Hansen, A. Tkatchenko, K.-R. Müller, O. A. von Lilienfeld, Machine learning of molecular electronic properties in chemical compound space. New J. Phys. 15, 095003 (2013). (10.1088/1367-2630/15/9/095003)
  12. K. Hansen, G. Montavon, F. Biegler, S. Fazli, M. Rupp, M. Scheffler, O. A. von Lilienfeld, A. Tkatchenko, K.-R. Müller, Assessment and validation of machine learning methods for predicting molecular atomization energies. J. Chem. Theory Comput. 9, 3404–3419 (2013). (10.1021/ct400195d)
  13. K. Hansen, F. Biegler, R. Ramakrishnan, W. Pronobis, O. A. von Lilienfeld, K.-R. Müller, A. Tkatchenko, Machine learning predictions of molecular properties: Accurate many-body potentials and nonlocality in chemical space. J. Phys. Chem. Lett. 6, 2326–2331 (2015). (10.1021/acs.jpclett.5b00831)
  14. M. Rupp, R. Ramakrishnan, O. A. von Lilienfeld, Machine learning for quantum mechanical properties of atoms in molecules. J. Phys. Chem. Lett. 6, 3309–3313 (2015). (10.1021/acs.jpclett.5b01456)
  15. V. Botu, R. Ramprasad, Learning scheme to predict atomic forces and accelerate materials simulations. Phys. Rev. B 92, 094306 (2015). (10.1103/PhysRevB.92.094306)
  16. M. Hirn, N. Poilvert, S. Mallat, Quantum energy regression using scattering transforms. arXiv:1502.02077 (2015).
  17. 10.1063/1.4966192
  18. Z. Li, J. R. Kermode, A. De Vita, Molecular dynamics with on-the-fly machine learning of quantum-mechanical forces. Phys. Rev. Lett. 114, 096405 (2015). (10.1103/PhysRevLett.114.096405)
  19. C. A. Micchelli, M. A. Pontil, On learning vector-valued functions. Neural Comput. 17, 177–204 (2005). (10.1162/0899766052530802)
  20. A. Caponnetto, C. A. Micchelli, M. Pontil, Y. Ying, Universal multi-task kernels. J. Mach. Learn. Res. 9, 1615–1646 (2008).
  21. V. Sindhwani, H. Q. Minh, A. C. Lozano, Scalable matrix-valued kernel learning for high-dimensional nonlinear multivariate regression and Granger causality, in Proceedings of the 29th Conference on Uncertainty in Artificial Intelligence (UAI’13), 12 to 14 July 2013.
  22. B. Matérn, Spatial Variation, Lecture Notes in Statistics (Springer-Verlag, 1986). (10.1007/978-1-4615-7892-5)
  23. I. S. Gradshteyn, I. M. Ryzhik, Table of Integrals, Series, and Products, A. Jeffrey, D. Zwillinger, Eds. (Academic Press, ed. 7, 2007).
  24. T. Gneiting, W. Kleiber, M. Schlather, Matérn cross-covariance functions for multivariate random fields. J. Am. Stat. Assoc. 105, 1167–1177 (2010). (10.1198/jasa.2010.tm09420)
  25. H. Helmholtz, Über Integrale der hydrodynamischen Gleichungen, welche den Wirbelbewegungen entsprechen. J. Reine Angew. Math. 55, 25–55 (1858).
  26. W. H. Press, S. A. Teukolsky, W. T. Vetterling, B. P. Flannery, Numerical Recipes: The Art of Scientific Computing (Cambridge Univ. Press, ed. 3, 2007).
  27. 10.1103/PhysRevLett.77.3865
  28. 10.1103/PhysRevLett.102.073005
  29. M. Ceriotti, J. More, D. E. Manolopoulos, i-PI: A Python interface for ab initio path integral molecular dynamics simulations. Comput. Phys. Commun. 185, 1019–1026 (2014). (10.1016/j.cpc.2013.10.027)
  30. I. Poltavsky, A. Tkatchenko, Modeling quantum nuclei with perturbed path integral molecular dynamics. Chem. Sci. 7, 1368–1372 (2016). (10.1039/C5SC03443D)
  31. A. J. Smola, B. Schölkopf, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond (MIT Press, 2001).
  32. 10.1103/PhysRevLett.108.253002
  33. J. C. Snyder, M. Rupp, K.-R. Müller, K. Burke, Nonlinear gradient denoising: Finding accurate extrema from inaccurate functional derivatives. Int. J. Quantum Chem. 115, 1102–1114 (2015). (10.1002/qua.24937)
  34. 10.1162/089976698300017467
  35. B. Schölkopf, S. Mika, C. J. C. Burges, P. Knirsch, K.-R. Müller, G. Rätsch, A. J. Smola, Input space versus feature space in kernel-based methods. IEEE Trans. Neural Netw. Learn. Syst. 10, 1000–1017 (1999). (10.1109/72.788641)
  36. K.-R. Müller, S. Mika, G. Rätsch, K. Tsuda, B. Schölkopf, An introduction to kernel-based learning algorithms. IEEE Trans. Neural Netw. Learn. Syst. 12, 181–201 (2001). (10.1109/72.914517)
Dates
Type             When
Created          May 5, 2017, 8:40 p.m.
Deposited        Jan. 9, 2024, 11:30 a.m.
Indexed          Aug. 21, 2025, 6:32 a.m.
Issued           May 5, 2017
Published        May 5, 2017
Published Print  May 5, 2017
Funders 2
  1. Deutsche Forschungsgemeinschaft 10.13039/501100001659
     Region: Europe
     gov (National government)
     Labels 3
       1. German Research Association
       2. German Research Foundation
       3. DFG
     Awards 2
       1. ID0EVBAI16416
       2. MU 987/20-1
  2. Ministry of Education, Science and Technology 10.13039/501100004085
     Region: Asia
     gov (National government)
     Labels 2
       1. Korean Ministry of Education, Science and Technology
       2. MEST
     Awards 2
       1. ID0EQGAI16417
       2. 2012-005741

@article{Chmiela_2017,
  title     = {Machine learning of accurate energy-conserving molecular force fields},
  volume    = {3},
  ISSN      = {2375-2548},
  url       = {http://dx.doi.org/10.1126/sciadv.1603015},
  DOI       = {10.1126/sciadv.1603015},
  number    = {5},
  journal   = {Science Advances},
  publisher = {American Association for the Advancement of Science (AAAS)},
  author    = {Chmiela, Stefan and Tkatchenko, Alexandre and Sauceda, Huziel E. and Poltavsky, Igor and Schütt, Kristof T. and Müller, Klaus-Robert},
  year      = {2017},
  month     = may
}