LSTM Hyperparameters optimization with Hparam parameters for Bitcoin Price Prediction

Bibliographic Details
Published in: Sakarya University Journal of Computer and Information Sciences, Vol. 6, No. 1, pp. 1-9
Main Authors: KERVANCI, I.sibel; AKAY, Fatih
Format: Journal Article
Language: English
Published: Sakarya University, 30.04.2023
Summary: Machine learning and deep learning algorithms produce very different results with different choices of their hyperparameters. These parameters require optimization because no single setting works well for all problems. In this paper, eight different Long Short-Term Memory (LSTM) hyperparameters (go-backward, epoch, batch size, dropout, activation function, optimizer, learning rate, and number of layers) were examined on daily and hourly Bitcoin datasets. The effect of each parameter on the daily dataset's results was evaluated and explained. These parameters were examined with the hparam properties of TensorBoard. As a result, examining all combinations of parameters with hparam produced the best test Mean Square Error (MSE) values: 0.000043633 for the hourly dataset and 0.00073843 for the daily dataset. Both datasets produced better results with the tanh activation function. Finally, when the results are interpreted, the daily dataset produces better results with a small learning rate and small dropout values, whereas the hourly dataset produces better results with a large learning rate and large dropout values.
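The summary states that all combinations of the eight listed hyperparameters were examined. As a rough sketch of what such an exhaustive sweep looks like, the snippet below enumerates the Cartesian product of candidate values; the value grids here are invented for illustration (the record does not give the actual search ranges), and the TensorBoard HParams logging and model training are omitted:

```python
from itertools import product

# Hypothetical value grids for the eight LSTM hyperparameters named in the
# paper; the real search ranges are not given in this record.
grid = {
    "go_backwards": [False, True],
    "epochs": [50, 100],
    "batch_size": [32, 64],
    "dropout": [0.0, 0.2],
    "activation": ["tanh", "relu"],
    "optimizer": ["adam", "rmsprop"],
    "learning_rate": [1e-3, 1e-2],
    "num_layers": [1, 2],
}

def all_combinations(grid):
    """Yield every hyperparameter combination as a dict (Cartesian product)."""
    keys = list(grid)
    for values in product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

combos = list(all_combinations(grid))
print(len(combos))  # 2^8 = 256 candidate runs for this toy grid
```

In a real sweep, each combination would be used to build and train one LSTM model, with its test MSE logged so the best configuration can be selected afterwards.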
ISSN: 2636-8129
DOI: 10.35377/saucis...1172027