Shift-add neural architecture
| Published in | ICECS'99. Proceedings of ICECS '99. 6th IEEE International Conference on Electronics, Circuits and Systems (Cat. No.99EX357), Vol. 1, pp. 411-414 |
| --- | --- |
| Main Authors | |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 1999 |
| Subjects | |
| Summary | This article focuses on the implementation of artificial neural networks in hardware. We give an overview of shift-add neural arithmetic, which provides a complete set of functions suitable for fast perceptron and RBF network implementations. The set consists of logarithm, exponent, multiplication, square, square root, sigmoid-like, and Gauss-like functions. All functions are linearly approximated to make them easy to implement. Furthermore, we show the gate-level implementation of all functions provided by the shift-add arithmetic; only adders and barrel shifters are needed to realize them. The functions are optimized for very short propagation delay (a few nanoseconds). The shift-add architecture was evaluated by both software simulation and an on-chip design, and results of the on-chip design are presented in the last section of this article. |
| ISBN | 0780356829; 9780780356825 |
| DOI | 10.1109/ICECS.1999.812310 |
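
As a rough illustration of the arithmetic described in the abstract, the sketch below models in software how shift-add function blocks of this kind are commonly built: a Mitchell-style linearly approximated base-2 logarithm and exponential realized with only an MSB search, shifts, and adds, with multiplication and square root obtained by adding or halving values in the log domain. The fixed-point format (8 fractional bits in the log domain), the names `lin_log2`/`lin_exp2`, and the demo values are assumptions for illustration only, not the exact gate-level design evaluated in the paper.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative sketch of shift-add (Mitchell-style) arithmetic.
 * Log-domain fixed point: upper bits = integer part, lower
 * LOG_FRAC bits = fractional part.  Inputs are assumed nonzero. */
#define LOG_FRAC 8

/* Linearly approximated log2: for x = 2^k * (1 + f), 0 <= f < 1,
 * use log2(x) ~= k + f.  Needs only a priority encoder (MSB search)
 * and a barrel shift. */
static uint32_t lin_log2(uint32_t x)
{
    int k = 31;
    while (k > 0 && !(x & (1u << k)))   /* find position of the MSB */
        k--;
    uint32_t frac = (k >= LOG_FRAC)
        ? (x >> (k - LOG_FRAC))         /* mantissa bits below the MSB */
        : (x << (LOG_FRAC - k));
    frac &= (1u << LOG_FRAC) - 1;
    return ((uint32_t)k << LOG_FRAC) | frac;
}

/* Linearly approximated exp2: inverse mapping, 2^(k + f) ~= (1 + f) << k,
 * again just a barrel shift. */
static uint32_t lin_exp2(uint32_t lx)
{
    uint32_t k = lx >> LOG_FRAC;
    uint32_t f = lx & ((1u << LOG_FRAC) - 1);
    uint32_t mant = (1u << LOG_FRAC) | f;   /* "1.f" in fixed point */
    return (k >= LOG_FRAC) ? (mant << (k - LOG_FRAC))
                           : (mant >> (LOG_FRAC - k));
}

/* Approximate multiply and square root via the log domain:
 * a*b     ~= exp2(log2 a + log2 b)   -- one adder
 * sqrt(x) ~= exp2(log2 x >> 1)       -- one shift */
static uint32_t shift_add_mul(uint32_t a, uint32_t b)
{
    return lin_exp2(lin_log2(a) + lin_log2(b));
}

static uint32_t shift_add_sqrt(uint32_t x)
{
    return lin_exp2(lin_log2(x) >> 1);
}

int main(void)
{
    printf("13 * 21   ~= %u (exact 273)\n", shift_add_mul(13, 21));
    printf("sqrt(100) ~= %u (exact 10)\n", shift_add_sqrt(100));
    return 0;
}
```

The same log/exp building blocks can in principle be composed into the sigmoid-like and Gauss-like activation functions mentioned in the abstract; those compositions, and the paper's specific linear segmentation and gate-level circuits, are omitted here.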