Reducing the number of neurons in radial basis function networks with dynamic decay adjustment

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 62, pp. 79–91
Main Author: Paetz, Jürgen
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.12.2004
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2003.12.004

Summary: Classification is a common task for supervised neural networks. A specific radial basis function network for classification is the so-called RBF network with dynamic decay adjustment (RBFN-DDA). Fast training and good classification performance are properties of this network. RBFN-DDA is a dynamically growing network, i.e. neurons are inserted during training. A drawback of RBFN-DDA is its greedy insertion behavior: too many superfluous neurons are inserted for noisy data, overlapping data, or outliers. We propose an online technique to reduce the number of neurons during training, achieved by deleting neurons after each training epoch. By applying the improved algorithm to benchmark data and current medical data, the number of neurons is reduced substantially (up to 93.9% fewer neurons). Thus, we obtain a network with lower complexity than the original RBFN-DDA.
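The summary sketches the mechanism at a high level: neurons (RBF prototypes) are inserted greedily when a sample is not yet covered, conflicting prototypes have their radii shrunk (the "dynamic decay adjustment"), and the proposed extension deletes rarely used neurons after each epoch. A minimal Python sketch of that idea follows; the thresholds `THETA_PLUS`, `THETA_MINUS`, and the coverage-based pruning rule `MIN_WEIGHT` are illustrative assumptions, not the paper's exact algorithm or parameter values.

```python
import math

# Illustrative hyperparameters (assumed for this sketch, not taken from the paper):
THETA_PLUS = 0.4    # activation needed for a same-class prototype to "cover" x
THETA_MINUS = 0.2   # activation above which a conflicting prototype must shrink
MIN_WEIGHT = 2.0    # prune prototypes covering fewer than 2 samples per epoch

def activation(proto, x):
    """Gaussian RBF activation of one prototype (hidden neuron) at input x."""
    d2 = sum((a - b) ** 2 for a, b in zip(proto["center"], x))
    return math.exp(-d2 / proto["sigma"] ** 2)

def train_epoch(protos, samples):
    """One DDA-style epoch: cover or insert, shrink conflicts, then prune."""
    for p in protos:
        p["weight"] = 0.0                      # DDA resets weights each epoch
    for x, label in samples:
        covering = [p for p in protos
                    if p["label"] == label and activation(p, x) >= THETA_PLUS]
        if covering:                           # sample already covered
            max(covering, key=lambda p: activation(p, x))["weight"] += 1.0
        else:                                  # greedy insertion of a new neuron
            protos.append({"center": x, "sigma": 1.0,
                           "label": label, "weight": 1.0})
        for p in protos:                       # dynamic decay adjustment:
            if p["label"] != label and activation(p, x) >= THETA_MINUS:
                d = math.dist(p["center"], x)
                # shrink sigma so the activation at x falls below THETA_MINUS
                p["sigma"] = min(p["sigma"],
                                 d / math.sqrt(-math.log(THETA_MINUS)))
    # the paper's proposed extension: delete rarely used neurons after the epoch
    protos[:] = [p for p in protos if p["weight"] >= MIN_WEIGHT]
    return protos
```

On two well-separated clusters plus a single isolated outlier, one epoch leaves one prototype per class: the outlier's neuron is inserted greedily during the pass but removed by the epoch-end pruning step, which is exactly the complexity reduction the abstract claims.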