Training artificial neural networks using variable precision incremental communication

Bibliographic Details
Published in: Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94), Vol. 3, pp. 1409-1414
Main Authors: Ghorbani, A.A., Bhavsar, V.C.
Format: Conference Proceeding
Language: English
Published: IEEE, 1994

Summary: We have earlier proposed incremental inter-node communication to reduce the communication cost as well as the time of the learning process in artificial neural networks. In incremental communication, instead of communicating the full magnitude of an input (output) variable of a neuron, only the increment/decrement to the previous value of the variable, using reduced precision, is sent over a communication link. In this paper, a variable precision incremental communication scheme is proposed. Variable precision, which can be implemented in either hardware or software, can further reduce the complexity of intercommunication and speed up computation on massively parallel computers. The scheme is applied to multilayer feedforward networks, and simulation studies are carried out. The results of our simulations reveal that, regardless of the complexity of the problems used, the variable precision scheme exhibits stable convergence behavior and yields considerable savings in the number of bits used for communication.
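The core idea in the summary can be sketched in a few lines: the sender transmits only a reduced-precision increment over the previous value, and the receiver reconstructs the variable by accumulating those increments. The following Python sketch is purely illustrative, assuming a simple signed fixed-point quantizer; all names and parameters are hypothetical and not taken from the paper, whose actual scheme (including how precision is varied) may differ.

```python
def quantize(delta, bits, scale=1.0):
    """Quantize an increment to a signed fixed-point value using `bits` bits.

    `scale` is the assumed dynamic range of the increment; the step size
    shrinks as more bits are allocated (the "variable precision" knob).
    """
    levels = 2 ** (bits - 1) - 1          # largest representable magnitude
    step = scale / levels
    q = round(delta / step)
    q = max(-levels, min(levels, q))      # clamp to the representable range
    return q * step


class IncrementalLink:
    """A communication link that carries only quantized increments.

    The sender keeps a copy of what the receiver has reconstructed so far,
    so quantization error does not accumulate beyond one step.
    """

    def __init__(self, bits):
        self.bits = bits
        self.sent = 0.0       # sender's record of the last transmitted value
        self.received = 0.0   # receiver's reconstruction

    def send(self, value):
        inc = quantize(value - self.sent, self.bits)
        self.sent += inc      # sender tracks what the receiver now holds
        self.received += inc  # "transmit" only the increment
        return self.received


# Example: a slowly varying neuron output needs only a few bits per update.
link = IncrementalLink(bits=4)
for v in [0.5, 0.52, 0.55, 0.40]:
    link.send(v)
```

In a variable precision scheme along these lines, `bits` could be adjusted per link or per training phase, spending fewer bits when the variables change slowly.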
ISBN: 078031901X, 9780780319011
DOI: 10.1109/ICNN.1994.374492