Logarithmic number system for deep learning
Published in: 2018 7th International Conference on Modern Circuits and Systems Technologies (MOCAST), pp. 1-4
Format: Conference Proceeding
Language: English
Publisher: IEEE, 01.05.2018
Summary: In this paper the Logarithmic Number System (LNS) is adopted to implement the Long Short-Term Memory (LSTM) cell, the basic component of a widely used type of deep learning network. First, piecewise approximations to the activation functions σ and tanh are proposed and evaluated in LNS. Second, LNS multipliers and adders are implemented for word lengths of 9, 10, and 11 bits. The circuits are implemented in a 90-nm 1.0 V CMOS standard-cell library, and quantitative results are reported. The results demonstrate that LNS is a good candidate for data representation and processing in deep learning networks, as area reductions of up to 36% are possible.
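The core LNS idea behind the summary can be sketched as follows: values are stored as a sign plus the base-2 logarithm of their magnitude, so multiplication reduces to addition of exponents, while addition requires evaluating the Gaussian logarithm log2(1 + 2^d) — the operation the paper's hardware approximates. This is a minimal floating-point sketch of the arithmetic, not the paper's fixed-point circuits; all function names are illustrative.

```python
import math

def to_lns(x):
    """Encode a nonzero real as (sign, log2 of magnitude)."""
    return (1 if x > 0 else -1, math.log2(abs(x)))

def from_lns(v):
    """Decode (sign, exponent) back to a real value."""
    sign, e = v
    return sign * 2.0 ** e

def lns_mul(a, b):
    # Multiplication in LNS: multiply the signs, add the logs.
    return (a[0] * b[0], a[1] + b[1])

def lns_add(a, b):
    # Addition of same-sign operands via the Gaussian logarithm:
    # log2(2^x + 2^y) = max(x, y) + log2(1 + 2^(-|x - y|)).
    # In LNS hardware this correction term is the part that gets
    # approximated (e.g. by piecewise functions or lookup tables).
    assert a[0] == b[0], "this sketch covers the same-sign case only"
    hi, lo = max(a[1], b[1]), min(a[1], b[1])
    return (a[0], hi + math.log2(1.0 + 2.0 ** (lo - hi)))
```

For example, multiplying 3 and 4 in LNS adds log2(3) + log2(4) = log2(12), so a single adder replaces a multiplier, which is the source of the area savings reported above.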
DOI: 10.1109/MOCAST.2018.8376572