Some n-bit parity problems are solvable by feedforward networks with less than n hidden units

Bibliographic Details
Published in: Proceedings of 1993 International Conference on Neural Networks (IJCNN-93-Nagoya, Japan), Vol. 1, pp. 305-308
Main Authors: Setiono, R.; Lucas Chi Kwong Hui
Format: Conference Proceeding
Language: English
Published: IEEE, 1993

Summary: Starting with two hidden units, we train a simple single-hidden-layer feedforward neural network to solve the n-bit parity problem. If the network fails to correctly classify all the input patterns, an additional hidden unit is added to the hidden layer and the network is retrained. This process is repeated until a network that correctly classifies all the input patterns has been constructed. Using a variant of the quasi-Newton method for training, we have been able to find networks with a single hidden layer containing fewer than n hidden units that solve the n-bit parity problem for some values of n. This demonstrates the power of combining the quasi-Newton method with the node-incremental approach.
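The procedure in the summary can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes tanh hidden units, a sigmoid output, a mean-squared-error loss, and uses SciPy's BFGS optimizer as a stand-in for the paper's quasi-Newton variant; all function names and parameters below are the sketch's own.

```python
# Sketch of the node-incremental scheme: start with two hidden units, train
# with a quasi-Newton method (here SciPy's BFGS), and add one hidden unit
# whenever training fails to classify every n-bit parity pattern correctly.
import numpy as np
from scipy.optimize import minimize

def parity_data(n):
    """All 2^n binary input patterns and their parity labels."""
    X = np.array([[(i >> b) & 1 for b in range(n)] for i in range(2 ** n)],
                 dtype=float)
    y = X.sum(axis=1) % 2
    return X, y

def unpack(w, n, h):
    """Split a flat parameter vector into layer weights and biases."""
    i = 0
    W1 = w[i:i + n * h].reshape(n, h); i += n * h
    b1 = w[i:i + h];                   i += h
    W2 = w[i:i + h];                   i += h
    b2 = w[i]
    return W1, b1, W2, b2

def forward(w, X, h):
    """Single hidden layer: tanh hidden units, sigmoid output."""
    W1, b1, W2, b2 = unpack(w, X.shape[1], h)
    hidden = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))

def loss(w, X, y, h):
    return np.mean((forward(w, X, h) - y) ** 2)

def train_incremental(n, max_hidden, restarts=8, seed=0):
    """Grow the hidden layer until every parity pattern is classified."""
    X, y = parity_data(n)
    rng = np.random.default_rng(seed)
    for h in range(2, max_hidden + 1):          # start with two hidden units
        for _ in range(restarts):               # random restarts per size
            w0 = rng.normal(scale=1.0, size=n * h + 2 * h + 1)
            res = minimize(loss, w0, args=(X, y, h), method="BFGS")
            if np.array_equal((forward(res.x, X, h) > 0.5).astype(float), y):
                return h, res.x                 # perfect classification
    return None, None                           # no network found
```

For small n this finishes quickly; the random restarts matter because parity is a notoriously hard training target and a single initialization often lands in a poor local minimum.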
ISBN: 0780314212, 9780780314214
DOI: 10.1109/IJCNN.1993.713918