A Proof of the Bloch Theorem for Lattice Models
Published in: Journal of Statistical Physics, Vol. 177, No. 4, pp. 717–726
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.11.2019
Summary: The Bloch theorem is a powerful theorem stating that the expectation value of the U(1) current operator averaged over the entire space vanishes in large quantum systems. The theorem applies to the ground state and to thermal equilibrium at finite temperature, irrespective of the details of the Hamiltonian, as long as all terms in the Hamiltonian are finite-ranged. In this work we present a simple yet rigorous proof for general lattice models. For large but finite systems, we find that both the discussion and the conclusion are sensitive to the boundary condition one assumes: under the periodic boundary condition, one can only prove that the current expectation value is inversely proportional to the linear dimension of the system, whereas under the open boundary condition the current expectation value vanishes exactly, even before taking the thermodynamic limit. We also provide simple tight-binding models that clarify the limitations of the theorem in dimensions higher than one.
ISSN: 0022-4715, 1572-9613
DOI: 10.1007/s10955-019-02386-1
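
As a rough illustration of the 1/L behaviour under periodic boundary conditions mentioned in the summary, the following minimal Python sketch computes the space-averaged ground-state current of a spinless, non-interacting tight-binding ring threaded by an Aharonov–Bohm flux. The model and the parameters (flux phi, hopping t, half filling) are illustrative assumptions, not taken from the paper itself.

```python
# Minimal sketch (illustrative assumptions, not the paper's model):
# a spinless, non-interacting tight-binding ring of L sites threaded by a
# total flux phi, with periodic boundary conditions. The space-averaged
# ground-state current is nonzero but scales as 1/L.
import numpy as np

def averaged_current(L, phi=0.5, t=1.0, filling=0.5):
    """Ground-state expectation of the current averaged over the ring."""
    ks = 2.0 * np.pi * np.arange(L) / L         # allowed momenta under PBC
    theta = phi / L                             # Peierls phase per bond
    eps = -2.0 * t * np.cos(ks - theta)         # single-particle energies
    j_k = (2.0 * t / L) * np.sin(ks - theta)    # (1/L) * d(eps)/dk per mode
    occ = np.argsort(eps)[: int(round(filling * L))]  # fill the lowest modes
    return j_k[occ].sum()

for L in (16, 32, 64, 128, 256, 512):
    J = averaged_current(L)
    print(f"L = {L:4d}   <J> = {J:+.6f}   L * <J> = {L * J:+.4f}")
```

With these parameters, L * <J> approaches a constant, i.e. the averaged current decays as 1/L, in line with the bound stated for periodic boundary conditions. Cutting one bond to impose open boundaries instead makes every eigenstate of this free-fermion chain a real standing wave, so the current expectation vanishes identically, consistent with the contrast between the two boundary conditions described in the summary.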