Some n-bit parity problems are solvable by feedforward networks with less than n hidden units
Published in: Proceedings of 1993 International Conference on Neural Networks (IJCNN-93-Nagoya, Japan), Vol. 1, pp. 305-308
Publisher: IEEE, 1993
Format: Conference Proceeding
Language: English
Summary: Starting with two hidden units, we train a simple single-hidden-layer feedforward neural network to solve the n-bit parity problem. If the network fails to correctly classify all the input patterns, an additional hidden unit is added to the hidden layer and the network is retrained. This process is repeated until a network that correctly classifies all the input patterns has been constructed. Using a variant of the quasi-Newton methods for training, we have been able to find networks with a single hidden layer containing fewer than n hidden units that solve the n-bit parity problem for some values of n. This demonstrates the effectiveness of combining a quasi-Newton method with a node-incremental approach.
ISBN: 0780314212, 9780780314214
DOI: 10.1109/IJCNN.1993.713918
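
The node-incremental construction described in the summary can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' code: the function names are invented, and plain gradient descent stands in for the quasi-Newton training variant the paper actually uses.

```python
import itertools
import numpy as np

def parity_dataset(n):
    """All 2^n binary input patterns paired with their parity (1 = odd number of ones)."""
    X = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
    y = X.sum(axis=1) % 2
    return X, y

def train_mlp(X, y, hidden, epochs=2000, lr=0.5, seed=0):
    """Train a single-hidden-layer sigmoid MLP and return the fraction of
    patterns classified correctly.  Plain gradient descent is used here as a
    stand-in; the paper trains with a quasi-Newton variant instead."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W1 = rng.normal(scale=1.0, size=(n, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=1.0, size=hidden);      b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        H = sig(X @ W1 + b1)            # hidden activations, shape (m, hidden)
        out = sig(H @ W2 + b2)          # network outputs, shape (m,)
        d2 = out - y                    # output delta (sigmoid + cross-entropy)
        d1 = np.outer(d2, W2) * H * (1 - H)   # backpropagated hidden delta
        W2 -= lr * (H.T @ d2) / m; b2 -= lr * d2.mean()
        W1 -= lr * (X.T @ d1) / m; b1 -= lr * d1.mean(axis=0)
    preds = (sig(sig(X @ W1 + b1) @ W2 + b2) > 0.5).astype(float)
    return float((preds == y).mean())

def incremental_training(X, y, trainer, start_hidden=2, max_hidden=16):
    """Grow the hidden layer one unit at a time until every pattern is learned."""
    hidden = start_hidden
    while hidden <= max_hidden:
        if trainer(X, y, hidden) == 1.0:
            return hidden               # smallest hidden-layer size that succeeded
        hidden += 1
    return None                         # gave up before solving the problem
```

A call such as `incremental_training(*parity_dataset(4), train_mlp)` mirrors the paper's procedure: start at two hidden units, retrain after each failure, and report the first hidden-layer size that classifies all 2^n patterns. Whether a network with fewer than n units is actually found depends on the trainer; the paper attributes its sub-n solutions to the quasi-Newton method, which this gradient-descent stand-in does not reproduce.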