Learning Generalized Statistical Mechanics with Matrix Product States
Format | Journal Article |
---|---|
Language | English |
Published | 12.09.2024 |
Summary: | We introduce a variational algorithm based on Matrix Product States that is trained by minimizing a generalized free energy defined using the Tsallis entropy instead of the standard Gibbs entropy. As a result, our model can generate the probability distributions associated with generalized statistical mechanics. The model can be trained efficiently, since this free energy and its gradient can be computed exactly through tensor-network contractions, in contrast to standard methods, which must estimate the Gibbs entropy by sampling. We devise a variational annealing scheme that ramps up the inverse temperature, allowing us to train the model while avoiding getting trapped in local minima. We demonstrate the validity of our approach on Ising spin-glass problems by comparing it to exact numerical results and quasi-exact analytical approximations. Our work opens up new possibilities for studying generalized statistical physics and solving combinatorial optimization problems with tensor networks. |
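The summary refers to the Tsallis entropy and a generalized free energy without stating them. For reference, the standard textbook definitions it alludes to are given below; the precise variational objective used in the paper may differ in details such as normalization conventions:

```latex
% Tsallis entropy of a distribution p over configurations s,
% with entropic index q; the Gibbs entropy is recovered as q -> 1
S_q[p] \;=\; \frac{1}{q-1}\left(1 - \sum_{s} p(s)^{q}\right),
\qquad
\lim_{q \to 1} S_q[p] \;=\; -\sum_{s} p(s)\,\ln p(s)

% Generalized free energy minimized at inverse temperature beta
F_q[p] \;=\; \langle E \rangle_p \;-\; \frac{1}{\beta}\, S_q[p]
\;=\; \sum_{s} p(s)\,E(s) \;-\; \frac{1}{\beta}\, S_q[p]
```

Minimizing $F_q$ over a variational family of distributions $p$ (here, those representable by a Matrix Product State) generalizes the usual variational free-energy principle, which is recovered at $q = 1$.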
DOI: | 10.48550/arxiv.2409.08352 |
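As a toy illustration of the variational annealing scheme described in the summary (minimize $F_q = \langle E\rangle - S_q/\beta$ while ramping up $\beta$), the following sketch enumerates a tiny two-spin Ising model explicitly instead of using a Matrix Product State, parameterizes the distribution with a softmax, and uses finite-difference gradient descent. The entropic index `q = 1.5`, the $\beta$ schedule, and the learning rate are illustrative choices of this sketch, not values from the paper:

```python
import numpy as np

def tsallis_entropy(p, q):
    """S_q(p) = (1 - sum_i p_i^q) / (q - 1); reduces to the Gibbs
    entropy -sum_i p_i log p_i in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(np.clip(p, 1e-300, None)))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def softmax(theta):
    w = np.exp(theta - theta.max())
    return w / w.sum()

def free_energy(theta, energies, beta, q):
    """Generalized free energy F_q = <E>_p - S_q(p) / beta for the
    variational distribution p = softmax(theta)."""
    p = softmax(theta)
    return p @ energies - tsallis_entropy(p, q) / beta

def anneal(energies, q=1.5, betas=np.linspace(0.1, 5.0, 50),
           lr=0.1, steps=300, eps=1e-6):
    """Minimize F_q by finite-difference gradient descent while
    ramping up the inverse temperature beta (variational annealing)."""
    theta = np.zeros_like(energies)
    for beta in betas:
        for _ in range(steps):
            f0 = free_energy(theta, energies, beta, q)
            grad = np.zeros_like(theta)
            for i in range(len(theta)):
                t = theta.copy()
                t[i] += eps
                grad[i] = (free_energy(t, energies, beta, q) - f0) / eps
            theta -= lr * grad
    return softmax(theta)

# Two-spin ferromagnetic Ising model, E = -s1*s2, over configurations
# (++, +-, -+, --); the two aligned states are degenerate ground states.
p = anneal(np.array([-1.0, 1.0, 1.0, -1.0]))
print(p)  # weight concentrates on the aligned configurations
```

The gradual ramp plays the same role as in the paper: at small $\beta$ the entropy term keeps the distribution spread out, and concentration onto low-energy configurations only happens as $\beta$ grows, which helps avoid local minima. The paper's contribution is that with an MPS ansatz both terms of $F_q$ contract exactly even over exponentially many configurations, whereas this sketch simply enumerates the state space.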