Learning Generalized Statistical Mechanics with Matrix Product States


Bibliographic Details
Main Authors: Díez-Valle, Pablo; Martínez-García, Fernando; García-Ripoll, Juan José; Porras, Diego
Format: Journal Article
Language: English
Published: 12.09.2024

Summary: We introduce a variational algorithm based on Matrix Product States that is trained by minimizing a generalized free energy defined using Tsallis entropy instead of the standard Gibbs entropy. As a result, our model can generate the probability distributions associated with generalized statistical mechanics. The model can be trained efficiently, since the generalized free energy and its gradient can be computed exactly through tensor network contractions, in contrast to standard methods that must estimate the Gibbs entropy by sampling. We devise a variational annealing scheme that ramps up the inverse temperature, which allows us to train the model while avoiding getting trapped in local minima. We demonstrate the validity of our approach on Ising spin-glass problems by comparing it to exact numerical results and quasi-exact analytical approximations. Our work opens up new possibilities for studying generalized statistical physics and solving combinatorial optimization problems with tensor networks.
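The generalized free energy described in the summary can be illustrated on a toy system. The sketch below is an assumption-laden illustration, not the authors' MPS implementation: it enumerates all configurations of a tiny Ising chain (feasible only for a few spins, whereas the paper evaluates the same quantities exactly via tensor network contractions) and evaluates the Tsallis free energy F_q = <E> - (1/beta) S_q along a ramp of the inverse temperature beta. The trial q-exponential distribution and the helper names (`ising_energy`, `tsallis_entropy`) are hypothetical choices for illustration.

```python
import numpy as np
from itertools import product

def ising_energy(spins, J=1.0):
    # Nearest-neighbour Ising chain: E = -J * sum_i s_i * s_{i+1}
    s = np.asarray(spins)
    return -J * np.sum(s[:-1] * s[1:])

def tsallis_entropy(p, q):
    # S_q = (1 - sum_i p_i^q) / (q - 1); recovers the Gibbs entropy as q -> 1
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def generalized_free_energy(p, energies, beta, q):
    # F_q = <E>_p - (1/beta) * S_q[p]
    return np.dot(p, energies) - tsallis_entropy(p, q) / beta

# Enumerate all 2^n configurations of a small chain (brute force, illustrative only).
n = 6
configs = list(product([-1, 1], repeat=n))
energies = np.array([ising_energy(c) for c in configs])

# Ramp up the inverse temperature, mimicking an annealing schedule.
# For each beta we evaluate F_q at a q-exponential (Tsallis-Boltzmann)
# trial distribution, p_i ~ [1 + (1 - q)(-beta E_i)]_+^{1/(1-q)}, with q < 1.
q = 0.8
for beta in [0.1, 0.5, 1.0, 2.0]:
    w = np.maximum(1.0 + (1.0 - q) * (-beta) * energies, 0.0) ** (1.0 / (1.0 - q))
    p = w / w.sum()
    print(f"beta={beta:.1f}  F_q={generalized_free_energy(p, energies, beta, q):+.4f}")
```

Because every configuration is enumerated, the entropy and expectation are exact here by construction; the point of the paper's MPS approach is that the same exactness is retained at system sizes where enumeration is impossible.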
DOI: 10.48550/arxiv.2409.08352