Compressed auto-encoder building block for deep learning network
Published in: 2016 3rd International Conference on Informative and Cybernetics for Computational Social Systems (ICCSS), pp. 131-136
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2016
Summary: Deep learning, one of the most important representation learning approaches in machine learning, has been widely used in many areas. A deep learning network is built by stacking building blocks such as the restricted Boltzmann machine (RBM), the auto-encoder, and the convolutional building block; stacking these blocks layer by layer can improve the network notably. In this paper, we propose a new deep learning building block inspired by the auto-encoder: the compressed auto-encoder, which has fewer layers and parameters than the auto-encoder, and we put forward a bidirectional gradient descent method to update its parameters. The experimental results show that it improves on the auto-encoder in reconstruction accuracy: its error keeps declining while the results of the RBM or the auto-encoder saturate. Some analysis of this behavior is also given in the paper.
DOI: 10.1109/ICCSS.2016.7586437
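For context, the building block the paper starts from can be illustrated with a standard one-hidden-layer auto-encoder trained by gradient descent on the reconstruction error. This is a minimal sketch of that baseline only; the paper's compressed auto-encoder and its bidirectional update are not described in this record, so all shapes, learning rate, and data below are illustrative assumptions.

```python
import numpy as np

# Minimal baseline auto-encoder sketch (standard formulation, NOT the
# paper's compressed variant, whose details are not given in this record).
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data (assumption): 64 samples of 8-dimensional inputs in [0, 1].
X = rng.random((64, 8))

n_in, n_hid = 8, 4                        # encoder compresses 8 -> 4
W1 = rng.normal(0, 0.1, (n_in, n_hid))    # encoder weights
b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.1, (n_hid, n_in))    # decoder weights
b2 = np.zeros(n_in)

lr = 0.5                                  # illustrative learning rate
errors = []
for _ in range(500):
    H = sigmoid(X @ W1 + b1)              # encode
    R = sigmoid(H @ W2 + b2)              # decode (reconstruction)
    errors.append(np.mean((R - X) ** 2))  # squared reconstruction error
    # Back-propagate the error through decoder and encoder.
    dR = 2 * (R - X) / X.size * R * (1 - R)
    dW2, db2 = H.T @ dR, dR.sum(axis=0)
    dH = dR @ W2.T * H * (1 - H)
    dW1, db1 = X.T @ dH, dH.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Reconstruction error declines as training proceeds.
```

The abstract's claim is that this kind of block eventually saturates, whereas the proposed compressed variant keeps reducing the reconstruction error.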