Merging Back-propagation and Hebbian Learning Rules for Robust Classifications
| Published in | Neural Networks, Vol. 9, No. 7, pp. 1213-1222 |
|---|---|
| Main Authors | , |
| Format | Journal Article |
| Language | English |
| Published | Oxford: Elsevier Ltd, 01.10.1996 |
Summary: By imposing saturation requirements on hidden-layer neural activations, a new learning algorithm is developed to improve the robustness of the classification performance of a multi-layer Perceptron. Derivatives of the sigmoid functions at the hidden layers are added to the standard output error with relative significance factors, and the total error is minimized by the steepest-descent method. The additional gradient-descent terms become Hebbian, so the new algorithm merges two popular learning algorithms, i.e., error back-propagation and Hebbian learning rules. Only slight modifications to the standard back-propagation algorithm are needed, and the additional computational requirements are negligible. The saturation requirement effectively reduces output sensitivity to the input, which results in improved robustness and better generalization for classifier networks. Also, distributed representations at the hidden layers are successfully suppressed to accomplish efficient utilization of hidden neurons. Computer simulations demonstrate much faster learning convergence as well as improved robustness for classifications and hetero-associations of binary patterns. Copyright © 1996 Elsevier Science Ltd
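In symbols (notation ours, reconstructed from this summary rather than taken from the paper's own equations), the combined cost could take the following form, where f is the sigmoid, net_j of layer h is the net input to hidden neuron j, and the lambda factors are the relative significance factors:

```latex
% Hedged reconstruction; notation is ours, not necessarily the paper's.
% E_o is the standard squared output error; the second sum penalizes
% unsaturated hidden activations via the sigmoid derivative
% f'(u) = f(u)(1 - f(u)), which peaks at mid-range and vanishes at saturation.
E \;=\; \underbrace{\tfrac{1}{2}\sum_{k}\bigl(t_k - y_k\bigr)^{2}}_{E_o}
\;+\; \sum_{h}\lambda_h \sum_{j} f'\!\bigl(\mathrm{net}^{(h)}_{j}\bigr)
```

Under steepest descent, the penalty contributes a term proportional to f''(net_j) x_i = f'(net_j)(1 - 2 a_j) x_i to each hidden weight update: a product of the pre-synaptic input x_i and a function of the post-synaptic activation a_j, which is one way the extra terms can come out Hebbian, as the summary states.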
ISSN: 0893-6080; 1879-2782
DOI: 10.1016/0893-6080(96)00042-1
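As an illustration only, below is a minimal NumPy sketch of one training step assembled from the summary's description: standard error back-propagation plus the saturation penalty on hidden activations, minimized together by steepest descent. Every concrete choice here (layer sizes, the significance factor `lmbda`, the learning rate) is our own assumption, not taken from the paper.

```python
# Sketch of back-propagation merged with a Hebbian-like saturation term,
# assembled from the abstract's description; constants and names are ours.
import numpy as np

rng = np.random.default_rng(0)

def sigma(x):
    """Logistic sigmoid; its derivative sigma(x)*(1 - sigma(x)) is the penalty."""
    return 1.0 / (1.0 + np.exp(-x))

# One hidden layer: x -> h = sigma(W1 x + b1) -> y = sigma(W2 h + b2)
n_in, n_hid, n_out = 8, 6, 4
W1 = rng.normal(0.0, 0.5, (n_hid, n_in)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.5, (n_out, n_hid)); b2 = np.zeros(n_out)

eta, lmbda = 0.5, 0.1  # learning rate; "relative significance factor" (assumed)

def train_step(x, t):
    global W1, b1, W2, b2
    # Forward pass
    h = sigma(W1 @ x + b1)
    y = sigma(W2 @ h + b2)

    # Standard output error E_o = 1/2 ||t - y||^2 and its back-propagated deltas
    delta2 = (y - t) * y * (1 - y)              # dE_o / dnet2
    delta1_bp = (W2.T @ delta2) * h * (1 - h)   # dE_o / dnet1

    # Saturation penalty E_s = sum_j h_j (1 - h_j)  (the sigmoid derivative).
    # Its gradient w.r.t. net1 is h (1 - h)(1 - 2h): the extra, Hebbian-like
    # term, a function of post-synaptic activity, multiplied below by the
    # pre-synaptic input x via the outer product.
    delta1_sat = lmbda * h * (1 - h) * (1 - 2 * h)

    delta1 = delta1_bp + delta1_sat

    # Steepest-descent updates
    W2 -= eta * np.outer(delta2, h); b2 -= eta * delta2
    W1 -= eta * np.outer(delta1, x); b1 -= eta * delta1
    return 0.5 * np.sum((t - y) ** 2)

# Usage: hetero-association of random binary patterns, as in the summary
X = rng.integers(0, 2, (20, n_in)).astype(float)
T = rng.integers(0, 2, (20, n_out)).astype(float)
for epoch in range(200):
    err = sum(train_step(x, t) for x, t in zip(X, T))
print(f"final summed squared error: {err:.4f}")
```

Note that the penalty's gradient reuses quantities already computed in the standard pass (h and 1 - h), which matches the summary's claim that the extra computational cost is negligible; the (1 - 2h) factor pushes each hidden activation toward 0 or 1, which is how the saturation requirement suppresses distributed hidden representations.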