A Sparse Connected Long Short-Term Memory With Sharing Weight for Time Series Prediction

Bibliographic Details
Published in: IEEE Access, Vol. 8, pp. 66856-66866
Main Authors: Xiong, Liyan; Ling, Xiangzheng; Huang, Xiaohui; Tang, Hong; Yuan, Weimin; Huang, Weichun
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020
Summary: The development of the mobile Internet and the success of deep learning in many applications have driven the need to deploy deep learning models on mobile devices with limited resources. Long Short-Term Memory (LSTM), as a special scheme in deep learning, can learn long-distance dependencies hidden in time series. However, the high computational complexity of LSTM-related structures and the large amount of resources required for training have become obstacles to their deployment on mobile devices. To reduce the resource requirements and computational costs of LSTMs, we use pruning strategies to preserve important connections during the training phase. After training, we further reduce the complexity of the LSTM network through a weight-sharing strategy. Based on these strategies, we propose a sparse connected LSTM with sharing weight (SCLSTM) model. Experimental results on real data sets show that SCLSTM with only 0.88% of the neural connections can obtain prediction capabilities comparable to a densely connected LSTM. Moreover, SCLSTM can alleviate overfitting to some extent. The experiments demonstrate that SCLSTM can outperform state-of-the-art algorithms on resource-limited mobile devices.
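The prune-then-share idea described in the summary can be sketched generically. The record does not give the authors' actual SCLSTM procedure, so the following is a minimal illustration under stated assumptions: magnitude-based pruning that keeps roughly 0.88% of the entries of one stand-in LSTM gate weight matrix, followed by weight sharing via a small uniform-bin codebook (the paper may use a different pruning criterion and clustering method).

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))  # stand-in for one LSTM gate weight matrix

# 1) Pruning: keep only the largest-magnitude connections.
keep_fraction = 0.0088  # ~0.88% of connections, the figure quoted in the abstract
k = int(np.ceil(keep_fraction * W.size))
threshold = np.sort(np.abs(W), axis=None)[-k]  # k-th largest |weight|
mask = np.abs(W) >= threshold
W_pruned = W * mask

# 2) Weight sharing: quantize surviving weights to a small codebook,
#    so many connections reuse the same stored value.
n_clusters = 4
nz = W_pruned[mask]
edges = np.linspace(nz.min(), nz.max(), n_clusters + 1)
idx = np.clip(np.digitize(nz, edges) - 1, 0, n_clusters - 1)
codebook = np.array([nz[idx == c].mean() if np.any(idx == c) else 0.0
                     for c in range(n_clusters)])
W_shared = np.zeros_like(W)
W_shared[mask] = codebook[idx]

print(mask.sum(), "connections kept of", W.size)
print("distinct shared weights:", np.unique(W_shared[mask]).size)
```

After these two steps, the matrix needs only the sparse index set plus a 4-entry codebook instead of 4096 dense floats, which is the kind of storage and compute reduction the summary attributes to SCLSTM.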
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.2984796