On the descriptive power of Neural Networks as constrained Tensor Networks with exponentially large bond dimension
Published in | arXiv.org
---|---
Format | Paper; Journal Article
Language | English
Published | Ithaca: Cornell University Library, arXiv.org, 24.09.2020
Summary | In many cases, neural networks can be mapped into tensor networks with an exponentially large bond dimension. Here, we compare different sub-classes of neural network states with their mapped tensor network counterparts for studying the ground state of short-range Hamiltonians. We show that when mapping a neural network, the resulting tensor network is highly constrained, and thus the neural network states do not, in general, deliver the naively expected drastic improvement over state-of-the-art tensor network methods. We explicitly show this result in two paradigmatic examples, the 1D ferromagnetic Ising model and the 2D antiferromagnetic Heisenberg model, addressing the lack of a detailed comparison of the expressiveness of these increasingly popular variational ansätze.
ISSN | 2331-8422
DOI | 10.48550/arxiv.1905.11351
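
As a minimal illustration of the abstract's central claim, the sketch below builds a small restricted Boltzmann machine (RBM) state exactly and computes its Schmidt rank across the middle cut of the chain; that rank equals the minimal bond dimension an MPS would need at that cut. This is not code from the paper: the system sizes `N` and `M`, the random parameters, and the choice of bipartition are illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
N, M = 8, 8                    # visible spins and hidden units (assumed sizes)
a = rng.normal(size=N)         # visible biases
b = rng.normal(size=M)         # hidden biases
W = rng.normal(size=(M, N))    # visible-hidden couplings

def rbm_amplitude(s):
    """Unnormalized RBM amplitude: exp(a.s) * prod_j 2 cosh(b_j + W_j . s)."""
    s = np.asarray(s, dtype=float)
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(b + W @ s))

# Build the full state vector over all 2^N spin configurations s_i = +-1.
configs = itertools.product([-1.0, 1.0], repeat=N)
psi = np.array([rbm_amplitude(s) for s in configs])
psi /= np.linalg.norm(psi)

# Schmidt decomposition across the middle cut: the number of nonzero
# singular values equals the minimal MPS bond dimension at that cut.
mat = psi.reshape(2 ** (N // 2), 2 ** (N // 2))
sv = np.linalg.svd(mat, compute_uv=False)
rank = int(np.sum(sv > 1e-12 * sv[0]))
print(f"Schmidt rank across the middle cut: {rank} of {2 ** (N // 2)} possible")
```

For generic random parameters the rank saturates its maximum, consistent with the summary's point that the mapped tensor network formally carries an exponentially large, yet highly constrained, bond dimension.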
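For reference, standard textbook forms of the two benchmark models named in the summary are given below; conventions (signs, couplings, and whether a transverse field is included) vary between references, so the paper's exact definitions may differ.

```latex
% 1D ferromagnetic Ising chain (h is an optional transverse field; h = 0
% recovers the classical ferromagnet) and the 2D antiferromagnetic
% Heisenberg model on a square lattice, in one common convention.
\begin{align}
  H_{\mathrm{Ising}} &= -J \sum_{i} \sigma^{z}_{i}\sigma^{z}_{i+1}
                        - h \sum_{i} \sigma^{x}_{i}, & J &> 0, \\
  H_{\mathrm{Heis}}  &= J \sum_{\langle i,j \rangle}
                        \vec{S}_{i} \cdot \vec{S}_{j}, & J &> 0.
\end{align}
```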