Neural-network quantum state tomography
Published in: Nature Physics, Vol. 14, No. 5, pp. 447–450
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 1 May 2018
Summary: The experimental realization of increasingly complex synthetic quantum systems calls for the development of general theoretical methods to validate and fully exploit quantum resources. Quantum state tomography (QST) aims to reconstruct the full quantum state from simple measurements, and therefore provides a key tool to obtain reliable analytics [1–3]. However, exact brute-force approaches to QST place a high demand on computational resources, making them unfeasible for anything except small systems [4,5]. Here we show how machine learning techniques can be used to perform QST of highly entangled states with more than a hundred qubits, to a high degree of accuracy. We demonstrate that machine learning allows one to reconstruct traditionally challenging many-body quantities, such as the entanglement entropy, from simple, experimentally accessible measurements. This approach can benefit existing and future generations of devices ranging from quantum computers to ultracold-atom quantum simulators [6–8].
Unsupervised machine learning techniques can efficiently perform quantum state tomography of large, highly entangled states with high accuracy, and allow the reconstruction of many-body quantities from simple, experimentally accessible measurements.
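The reconstruction summarized above rests on training a generative neural network on measurement snapshots so that the model's distribution approximates the measurement statistics of the state. As a rough illustration of the idea, the following is a minimal sketch, assuming a restricted Boltzmann machine (RBM) ansatz for a positive wavefunction, psi(sigma) = sqrt(p(sigma)), trained with one-step contrastive divergence (CD-1) on single-basis snapshots. The toy data, network sizes, and hyperparameters are all illustrative and not taken from the paper, whose full scheme also handles complex wavefunctions via measurements in several bases.

```python
# Minimal sketch of neural-network QST with an RBM ansatz for a positive
# wavefunction: psi(sigma) = sqrt(p(sigma)), where p(sigma) is the RBM's
# marginal distribution over visible spins. Trained here with CD-1 on
# toy single-basis snapshots; all sizes/hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_visible, n_hidden = 4, 8                      # 4 qubits, 8 hidden units (toy sizes)
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b = np.zeros(n_visible)                         # visible biases
c = np.zeros(n_hidden)                          # hidden biases

def sample_hidden(v):
    """Sample hidden units (and return their activation probabilities) given visibles."""
    p = 1.0 / (1.0 + np.exp(-(v @ W + c)))
    return (rng.random(p.shape) < p).astype(float), p

def sample_visible(h):
    """Sample visible spins (and return their activation probabilities) given hiddens."""
    p = 1.0 / (1.0 + np.exp(-(h @ W.T + b)))
    return (rng.random(p.shape) < p).astype(float), p

# Toy "measurement data": snapshots split between all-zeros and all-ones,
# mimicking sigma^z measurements on a GHZ-like state.
data = np.repeat(np.array([[0.0] * n_visible, [1.0] * n_visible]), 500, axis=0)

lr = 0.05
for epoch in range(200):
    batch = data[rng.permutation(len(data))[:100]]
    # Positive phase: hidden activations with visibles clamped to the data.
    _, ph_data = sample_hidden(batch)
    # Negative phase: one Gibbs step (CD-1) to get model-side statistics.
    h, _ = sample_hidden(batch)
    v_model, _ = sample_visible(h)
    _, ph_model = sample_hidden(v_model)
    # CD approximation to the log-likelihood gradient.
    W += lr * (batch.T @ ph_data - v_model.T @ ph_model) / len(batch)
    b += lr * (batch - v_model).mean(axis=0)
    c += lr * (ph_data - ph_model).mean(axis=0)
```

After training, p(sigma) approximates the measurement distribution |psi(sigma)|^2, and quantities that are hard to access directly, such as the second Rényi entanglement entropy, can then be estimated by sampling the trained model rather than by brute-force reconstruction of the density matrix.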
ISSN: 1745-2473; 1745-2481
DOI: 10.1038/s41567-018-0048-5