Sampling thermodynamic systems with neural network surrogates

Authors

Ibrahim, Y.

DOI:

https://doi.org/10.56919/usci.1122.043

Keywords:

Neural networks, surrogate modeling, Ising model, statistical sampling, Monte Carlo simulation, machine learning

Abstract

Traditional sampling methods such as Monte Carlo are computationally expensive and often infeasible for large, complex systems. Yet these methods are essential for developing new materials, optimizing chemical reactions, and understanding biological processes. Simulating thermodynamic systems at physically relevant sizes is computationally challenging, partly because the configuration space grows exponentially with system size. With current Monte Carlo methods, investigating different properties of the same system means repeating the expensive computation each time. In this article, I show that thermodynamic systems can be sampled using a surrogate neural network model, thereby avoiding the computationally expensive Monte Carlo proposal step in subsequent investigations. To demonstrate the method, I trained a feed-forward neural network surrogate for the Boltzmann distribution of the Ising model. This approach could help accelerate Monte Carlo simulations aimed at understanding the physics of novel materials and some biological processes.
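The idea sketched in the abstract can be illustrated on a toy scale. The snippet below is a minimal sketch, not the paper's implementation: the lattice size, coupling, temperature, network width, and learning rate are all assumptions chosen for the example. It trains a small feed-forward network (one tanh hidden layer, written directly in NumPy) to regress the log-Boltzmann weight, -βE(s), of a 1D periodic Ising chain, after which the surrogate's output can stand in for the exact exp(-βE) when weighting or proposing configurations.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: 1D Ising chain with periodic boundaries.
N, J, beta = 8, 1.0, 0.5

# Enumerate all 2^N spin configurations (feasible only for tiny N).
configs = np.array(list(itertools.product([-1, 1], repeat=N)), dtype=float)

def energy(s):
    # E(s) = -J * sum_i s_i s_{i+1}, with wrap-around via np.roll.
    return -J * np.sum(s * np.roll(s, 1, axis=-1), axis=-1)

# Regression target: the unnormalised log-Boltzmann weight, -beta * E(s).
targets = -beta * energy(configs)

# One-hidden-layer MLP trained by full-batch gradient descent on MSE.
H, lr = 32, 0.02
W1 = rng.normal(0.0, 0.3, (N, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.3, (H, 1)); b2 = np.zeros(1)

for step in range(5000):
    h = np.tanh(configs @ W1 + b1)       # hidden activations
    pred = (h @ W2 + b2).ravel()         # surrogate log-weights
    err = pred - targets
    loss = np.mean(err ** 2)
    if step == 0:
        loss0 = loss                     # record initial loss
    # Backpropagate the mean-squared-error gradient.
    g_pred = 2.0 * err[:, None] / len(configs)
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)
    gW1 = configs.T @ g_h; gb1 = g_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Final forward pass: surrogate weights replace exp(-beta * E).
h = np.tanh(configs @ W1 + b1)
pred = (h @ W2 + b2).ravel()
surrogate_w = np.exp(pred)
```

Once trained, evaluating `surrogate_w` costs only a forward pass, so repeated investigations of the same system reuse the network rather than rerunning the expensive sampling; for realistic lattice sizes the exhaustive enumeration above would of course be replaced by a training set of sampled configurations.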


Published

2022-09-30

How to Cite

Ibrahim, Y. (2022). Sampling thermodynamic systems with neural network surrogates. UMYU Scientifica, 1(1), 336–341. https://doi.org/10.56919/usci.1122.043