ABC_LSTM: Optimizing Parameters of Deep LSTM Using ABC Algorithm for Big Datasets


Bibliographic Details
Published in: International Journal of Engineering and Advanced Technology, Vol. 9, No. 5, pp. 221–226
Main Authors: Mittal, Shweta; Sangwan, Om Prakash
Format: Journal Article
Language: English
Published: 30.06.2020

Summary: Long Short-Term Memory (LSTM) is a variant of the Recurrent Neural Network (RNN) popularly used in various domains, particularly for sequence prediction tasks. In deep networks the number of hidden layers is high, so the time complexity of the network increases. Moreover, as dataset sizes grow, it becomes very difficult to tune these complex networks manually (a single network may take several days or weeks to run). Thus, to minimize the time required to run an algorithm and to improve accuracy, the task of tuning the network's parameters needs to be automated. To tune network parameters automatically, researchers have applied numerous metaheuristic approaches, such as Ant Colony Optimization, Genetic Algorithms, and Simulated Annealing, which provide near-optimal solutions. In the proposed ABC_LSTM algorithm, the traditional Artificial Bee Colony algorithm is implemented to optimize the number of hidden neurons of LSTM networks with two hidden layers. Based on the experimental results, it can be concluded that, up to a certain point, increasing the number of bees and iterations yields the solution with the lowest MAE, thereby improving the accuracy of the model.
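The abstract describes using the classic Artificial Bee Colony (ABC) loop (employed-bee, onlooker, and scout phases) to search over the hidden-neuron counts of a two-layer LSTM. A minimal sketch of that loop is below; it is an illustrative reconstruction, not the authors' code. In the actual method the objective would be the validation MAE of an LSTM trained with the candidate (h1, h2) neuron counts; here a cheap surrogate function, an assumed search range of 1–128 neurons per layer, and assumed colony parameters stand in so the sketch runs instantly.

```python
import random

LOW, HIGH = 1, 128  # assumed per-layer neuron range (not from the paper)

def objective(sol):
    """Surrogate for 'train a 2-layer LSTM with (h1, h2) neurons and
    return its validation MAE'. Minimum is at (64, 32) by construction."""
    h1, h2 = sol
    return (h1 - 64) ** 2 + (h2 - 32) ** 2

def random_solution(rng):
    return [rng.randint(LOW, HIGH), rng.randint(LOW, HIGH)]

def neighbour(sol, other, rng):
    """ABC neighbourhood move: v_ij = x_ij + phi * (x_ij - x_kj)
    for one randomly chosen dimension j, rounded to an integer count."""
    j = rng.randrange(len(sol))
    phi = rng.uniform(-1, 1)
    new = sol[:]
    new[j] = int(round(sol[j] + phi * (sol[j] - other[j])))
    new[j] = max(LOW, min(HIGH, new[j]))  # keep within the search range
    return new

def abc_optimize(n_bees=10, n_iters=50, limit=5, seed=0):
    rng = random.Random(seed)
    food = [random_solution(rng) for _ in range(n_bees)]   # food sources
    cost = [objective(s) for s in food]
    trials = [0] * n_bees                                  # stagnation counters
    best = min(zip(cost, food))
    for _ in range(n_iters):
        # Employed-bee phase: local search around every food source.
        for i in range(n_bees):
            k = rng.choice([x for x in range(n_bees) if x != i])
            cand = neighbour(food[i], food[k], rng)
            c = objective(cand)
            if c < cost[i]:
                food[i], cost[i], trials[i] = cand, c, 0
            else:
                trials[i] += 1
        # Onlooker phase: revisit sources with probability proportional
        # to fitness (lower cost -> higher fitness).
        fit = [1.0 / (1.0 + c) for c in cost]
        for _ in range(n_bees):
            i = rng.choices(range(n_bees), weights=fit)[0]
            k = rng.choice([x for x in range(n_bees) if x != i])
            cand = neighbour(food[i], food[k], rng)
            c = objective(cand)
            if c < cost[i]:
                food[i], cost[i], trials[i] = cand, c, 0
            else:
                trials[i] += 1
        # Scout phase: abandon sources that stagnated past the limit.
        for i in range(n_bees):
            if trials[i] > limit:
                food[i] = random_solution(rng)
                cost[i] = objective(food[i])
                trials[i] = 0
        best = min(best, min(zip(cost, food)))
    return best  # (best surrogate MAE, [h1, h2])
```

This also illustrates the abstract's observation: raising `n_bees` and `n_iters` lowers the best objective (MAE) found, but only up to a point, after which extra bees and iterations mainly add runtime.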
ISSN: 2249-8958
DOI: 10.35940/ijeat.D7649.069520