Exploring Quantum Perceptron and Quantum Neural Network structures with a teacher-student scheme


Bibliographic Details
Published in: arXiv.org
Main Authors: Gratsea, Aikaterini; Huembeli, Patrick
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 25.11.2021

More Information
Summary: Near-term quantum devices can be used to build quantum machine learning models, such as quantum kernel methods and quantum neural networks (QNNs), to perform classification tasks. There have been many proposals for how to use variational quantum circuits as quantum perceptrons or as QNNs. The aim of this work is to systematically compare different QNN architectures and to evaluate their relative expressive power with a teacher-student scheme. Specifically, the teacher model generates datasets mapping random inputs to outputs, which the student models then have to learn. This way, we avoid training on arbitrary data sets and can compare the learning capacity of different models directly via the loss, the prediction map, the accuracy, and the relative entropy between the prediction maps. We focus particularly on a quantum perceptron model inspired by the recent work of Tacchino et al. \cite{Tacchino1} and compare it to the data re-uploading scheme originally introduced by Pérez-Salinas et al. \cite{data_re-uploading}. We discuss alterations of the perceptron model and the formation of deep QNNs to better understand the role of hidden units and non-linearities in these architectures.
ISSN:2331-8422