Realization of Long Short-Term Memory Networks on Quantum Circuits


Bibliographic Details
Published in: 2022 13th Asian Control Conference (ASCC), pp. 2360 - 2366
Main Authors: Hou, Xiaokai; Yang, Yingli; Wang, Xiaoting
Format: Conference Proceeding
Language: English
Published: ACA, 04.05.2022

Summary: As one of the well-known methods for solving natural language processing (NLP) problems, the long short-term memory (LSTM) neural network has been adapted for implementation on quantum computers. However, the existing method requires the number of qubits to equal the dimension of the word vectors, which may cause a surge in the qubit count. To address this problem, we propose a duplication-free quantum long short-term memory neural network (DQLSTM). Specifically, DQLSTM adopts amplitude encoding to store classical information, thereby reducing the required number of qubits. Our numerical results show that DQLSTM achieves effectiveness similar to its classical counterpart on a Chinese sentiment analysis task, and outperforms several previous proposals on the same task.
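The qubit saving claimed in the summary follows from a standard property of amplitude encoding: a d-dimensional vector is stored in the 2^n amplitudes of an n-qubit state, so n = ⌈log₂(d)⌉ qubits suffice, versus one qubit per dimension in the earlier approach. A minimal sketch of this counting argument (the function names are illustrative, not taken from the paper):

```python
import math

def qubits_one_per_dim(d):
    # Earlier quantum LSTM proposals: one qubit per word-vector dimension.
    return d

def qubits_amplitude(d):
    # Amplitude encoding stores a d-dimensional vector in the
    # 2^n amplitudes of an n-qubit state, so n = ceil(log2(d)).
    return max(1, math.ceil(math.log2(d)))

def amplitude_encode(vec):
    # Normalize a real vector so it is a valid quantum state
    # (squared amplitudes sum to 1), padding with zeros up to
    # the next power of two.
    n = qubits_amplitude(len(vec))
    padded = list(vec) + [0.0] * (2 ** n - len(vec))
    norm = math.sqrt(sum(x * x for x in padded))
    return [x / norm for x in padded]

# Example: a 300-dimensional word embedding (a common embedding size).
d = 300
print(qubits_one_per_dim(d))  # 300 qubits in the one-qubit-per-dimension scheme
print(qubits_amplitude(d))    # 9 qubits with amplitude encoding (2^9 = 512 >= 300)
```

This is only the state-preparation bookkeeping; the paper's actual DQLSTM circuit design is not reproduced here.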
ISSN:2770-8373
DOI:10.23919/ASCC56756.2022.9828335