eScholarship
Open Access Publications from the University of California

Dynamical Large Deviations of Two-Dimensional Kinetically Constrained Models Using a Neural-Network State Ansatz

Abstract

We adapt a neural-network ansatz originally designed for the variational optimization of quantum systems to study dynamical large deviations in classical ones. Using recurrent neural networks, we describe the large deviations of the dynamical activity of model glasses: kinetically constrained models in two dimensions. We present the first finite-size-scaling analysis of the large-deviation functions of the two-dimensional Fredrickson-Andersen model, and explore the spatial structure of the high-activity sector of the South-or-East model. These results provide a new route to the study of dynamical large-deviation functions, and highlight the broad applicability of the neural-network state ansatz across domains in physics.
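The core object in this approach is an autoregressive neural-network ansatz: a recurrent network that assigns a normalized probability to each lattice configuration, site by site, so that configurations can be sampled exactly and their log-probabilities evaluated for variational optimization. The sketch below is a minimal illustration of that idea only; the architecture, hidden size, and all names here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

class RNNAnsatz:
    """Minimal autoregressive RNN over binary site occupations.

    Illustrative sketch only: a vanilla tanh RNN emits a softmax over
    the next site's state conditioned on all previous sites, so the
    product of conditionals is a normalized distribution over all
    2**n_sites configurations.
    """

    def __init__(self, n_sites, hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.n = n_sites
        self.h = hidden
        # Input is a one-hot encoding of the previous site's state.
        self.W_in = rng.normal(0.0, 0.1, (hidden, 2))
        self.W_hh = rng.normal(0.0, 0.1, (hidden, hidden))
        self.b_h = np.zeros(hidden)
        self.W_out = rng.normal(0.0, 0.1, (2, hidden))
        self.b_out = np.zeros(2)

    def _step(self, h, prev_onehot):
        # One recurrent update, then a softmax over the next site's state.
        h = np.tanh(self.W_in @ prev_onehot + self.W_hh @ h + self.b_h)
        logits = self.W_out @ h + self.b_out
        logits -= logits.max()  # numerical stability
        p = np.exp(logits)
        return h, p / p.sum()

    def log_prob(self, config):
        """Exact log-probability of a full configuration (tuple of 0/1)."""
        h = np.zeros(self.h)
        prev = np.array([1.0, 0.0])  # fixed start token
        lp = 0.0
        for s in config:
            h, p = self._step(h, prev)
            lp += np.log(p[s])
            prev = np.eye(2)[s]
        return lp

    def sample(self, rng):
        """Draw one configuration by ancestral (site-by-site) sampling."""
        h = np.zeros(self.h)
        prev = np.array([1.0, 0.0])
        cfg = []
        for _ in range(self.n):
            h, p = self._step(h, prev)
            s = rng.choice(2, p=p)
            cfg.append(int(s))
            prev = np.eye(2)[s]
        return cfg
```

Because each site's conditional is a softmax, the distribution is normalized by construction; summing `exp(log_prob(c))` over all configurations of a small lattice returns 1, which is what makes exact sampling and unbiased variational estimates possible.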
