LSTM Hyperparameters optimization with Hparam parameters for Bitcoin Price Prediction
Published in | Sakarya University Journal of Computer and Information Sciences, Vol. 6, No. 1, pp. 1–9 |
Main Authors | , |
Format | Journal Article |
Language | English |
Published | Sakarya University, 30.04.2023 |
Summary: | Machine learning and deep learning algorithms produce very different results under different hyperparameter settings, and these parameters require optimization because no single setting suits every problem. In this paper, eight hyperparameters of the Long Short-Term Memory (LSTM) network (go-backward, epochs, batch size, dropout, activation function, optimizer, learning rate, and number of layers) were examined on daily and hourly Bitcoin datasets. The effect of each parameter on the daily dataset was evaluated and explained, and the parameters were examined with the HParams feature of TensorBoard. Examining all combinations of parameters with HParams produced the best test Mean Square Error (MSE) values: 0.000043633 on the hourly dataset and 0.00073843 on the daily dataset. Both datasets produced better results with the tanh activation function. Finally, when the results are interpreted, the daily dataset produces better results with a small learning rate and small dropout values, whereas the hourly dataset produces better results with a large learning rate and large dropout values. |
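The exhaustive sweep the summary describes, trying every combination of the eight hyperparameters and keeping the setting with the lowest test MSE, can be sketched in plain Python. The candidate value grids and the scoring function below are illustrative assumptions, not the ones used in the paper; a real run would train an LSTM per setting and log each trial with TensorBoard's HParams plugin.

```python
import itertools

# Illustrative candidate values for the eight hyperparameters named in the
# abstract. The actual grids used in the paper are not given here (assumed).
grid = {
    "go_backwards": [False, True],
    "epochs": [50, 100],
    "batch_size": [32, 64],
    "dropout": [0.0, 0.2],
    "activation": ["tanh", "relu"],
    "optimizer": ["adam", "rmsprop"],
    "learning_rate": [1e-3, 1e-2],
    "num_layers": [1, 2],
}

def all_combinations(grid):
    """Yield every hyperparameter setting as a dict (an exhaustive HParams-style sweep)."""
    keys = list(grid)
    for values in itertools.product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

def grid_search(grid, evaluate):
    """Return the setting with the lowest test MSE reported by `evaluate`."""
    best_params, best_mse = None, float("inf")
    for params in all_combinations(grid):
        mse = evaluate(params)  # in a real run: build, train, and score an LSTM
        if mse < best_mse:
            best_params, best_mse = params, mse
    return best_params, best_mse

# Stand-in scorer (hypothetical): a real run would train a model per setting.
# It merely mimics the reported daily-dataset finding that tanh, a small
# learning rate, and small dropout do best.
def fake_mse(p):
    return (0.0 if p["activation"] == "tanh" else 0.1) + p["learning_rate"] + p["dropout"]

best, mse = grid_search(grid, fake_mse)  # 2**8 = 256 settings evaluated
```

With two candidate values per parameter the sweep already covers 256 runs, which is why the paper leans on TensorBoard's HParams dashboard to compare them rather than inspecting runs by hand.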
ISSN: | 2636-8129 |
DOI: | 10.35377/saucis...1172027 |