A Further Investigation on the Reliability of Extreme Learning Machines
Published in | 2014 IEEE International Conference on Data Mining Workshop pp. 1031 - 1037 |
---|---|
Main Authors | , , , , |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 01.12.2014 |
Subjects | |
Summary: | The research community has recently paid more attention to the Extreme Learning Machines (ELMs) algorithm in the Neural Network (NN) area. ELMs are much faster than traditional gradient-descent-based learning algorithms because their output weights are determined analytically once the input weights and hidden-layer biases have been chosen at random. However, since the input weights and biases are randomly assigned and never adjusted, the ELM model is unstable when experiments are repeated many times. Such instability makes ELMs less reliable than other computational intelligence models. In our investigation, we address this problem by using Random Production in the first layer of the ELM. In this way, we reduce the reliance on random weight assignment in ELMs by removing the bias in the hidden layer. Experiments on different data sets demonstrate that the proposed model has higher stability and reliability than the classical ELMs. |
---|---|
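The classical ELM training step summarized above (random input weights and hidden biases, analytic least-squares output weights) can be sketched as follows. This is a minimal illustration of the standard ELM, not the paper's Random Production variant; the function names and the choice of a `tanh` activation are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, T, n_hidden=50):
    """Classical ELM: randomly assign input weights W and hidden biases b,
    then solve the output weights beta analytically via the pseudoinverse."""
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights (not adjusted)
    b = rng.standard_normal(n_hidden)                # random hidden-layer biases (not adjusted)
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # analytic least-squares solution
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass through the fixed random layer and the learned output layer."""
    return np.tanh(X @ W + b) @ beta
```

Because `W` and `b` are drawn afresh on each run, repeated training gives different models and hence varying test accuracy, which is the instability the paper targets by removing the hidden-layer bias.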
ISSN: | 2375-9232 2375-9259 |
DOI: | 10.1109/ICDMW.2014.117 |