On the overtraining phenomenon of backpropagation neural networks

Bibliographic Details
Published in: Mathematics and Computers in Simulation, Vol. 40, No. 5, pp. 507–521
Main Authors: Tzafestas, S.G.; Dalianis, P.J.; Anthopoulos, G.
Format: Journal Article
Language: English
Published: Elsevier B.V., 1996

Summary: The study of the capabilities of neural networks is an important subject for their consolidation. This paper examines the relationships between network size, training-set size, and generalization capability. The phenomenon of overtraining in backpropagation networks is discussed, and an extension to an existing algorithm is described. The extended algorithm provides a new energy function; its advantages, such as improved plasticity and performance, together with its dynamic properties, are explained. The algorithm is applied to some common problems (XOR, numeric character recognition, and function approximation), and simulation results are presented and discussed.
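The record above does not include the paper's extended algorithm or its new energy function, but the overtraining effect it discusses can be illustrated with an ordinary backpropagation network: training error keeps falling while error on held-out data eventually stops improving. The following is a minimal pure-Python sketch on a noisy function-approximation task; every setting here (network size, learning rate, data) is an illustrative assumption, not the paper's experimental setup.

```python
import math
import random

random.seed(0)

# Toy regression data: noisy sine samples, split into train and held-out sets.
xs = [i * 2 * math.pi / 39 for i in range(40)]
ys = [math.sin(x) + random.gauss(0, 0.2) for x in xs]
train = list(zip(xs[::2], ys[::2]))  # 20 training points
val = list(zip(xs[1::2], ys[1::2]))  # 20 held-out points

# 1 input -> H tanh hidden units -> 1 linear output.
H = 8
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [random.uniform(-1, 1) for _ in range(H)]
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

def mse(data):
    return sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data)

lr = 0.01
best_val, best_epoch = float("inf"), 0
for epoch in range(3000):
    for x, y in train:  # plain stochastic backprop on squared error
        out, h = forward(x)
        err = out - y
        for j in range(H):
            grad_h = err * w2[j] * (1 - h[j] ** 2)  # uses pre-update w2[j]
            w2[j] -= lr * err * h[j]
            w1[j] -= lr * grad_h * x
            b1[j] -= lr * grad_h
        b2 -= lr * err
    v = mse(val)
    if v < best_val:
        best_val, best_epoch = v, epoch  # early stopping would restore these weights

print(f"best held-out MSE {best_val:.4f} at epoch {best_epoch}")
```

If the held-out error bottoms out well before the final epoch, training past `best_epoch` is overtraining: the network is fitting noise in the training samples rather than the underlying function. Stopping at (or restoring the weights from) `best_epoch` is the standard early-stopping remedy.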
ISSN: 0378-4754, 1872-7166
DOI: 10.1016/0378-4754(95)00003-8