Logarithmic number system for deep learning

Bibliographic Details
Published in: 2018 7th International Conference on Modern Circuits and Systems Technologies (MOCAST), pp. 1-4
Main Authors: Kouretas, I., Paliouras, V.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2018

Summary: In this paper the Logarithmic Number System (LNS) is adopted to implement Long Short-Term Memory (LSTM), the basic component of a type of deep learning network. Initially, piecewise approximations to the activation functions σ and tanh are proposed and evaluated in LNS. Secondly, LNS multipliers and adders are implemented for wordlengths of 9, 10, and 11 bits. The circuits are implemented in a 90-nm 1.0 V CMOS standard-cell library and quantitative results are reported. Results demonstrate that LNS is a good candidate for data representation and processing in deep learning networks, as area reduction of up to 36% is possible.
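The core idea behind the summarized paper can be illustrated with a minimal sketch (not the authors' implementation, and software rather than the paper's 90-nm hardware): in an LNS a value x is stored as (sign, log2|x|), so a multiplier reduces to a fixed-point adder, while addition needs a correction term log2(1 + 2^(-d)) that is typically realized with lookup tables or piecewise approximations, analogous to the piecewise σ/tanh approximations the paper proposes.

```python
import math

def to_lns(x):
    """Encode a nonzero real as (sign, log2|x|)."""
    return (1 if x >= 0 else -1, math.log2(abs(x)))

def from_lns(s, e):
    """Decode an LNS pair back to a real number."""
    return s * 2.0 ** e

def lns_mul(a, b):
    """Multiplication in LNS: multiply signs, add log-domain exponents."""
    return (a[0] * b[0], a[1] + b[1])

def lns_add(a, b):
    """Addition of two same-sign LNS values via the Gaussian logarithm:
    log2(2^hi + 2^lo) = hi + log2(1 + 2^(lo - hi)).
    Hardware designs approximate the log2(1 + 2^d) term with tables
    or piecewise-linear segments; here we compute it exactly."""
    assert a[0] == b[0], "same-sign case only in this sketch"
    hi, lo = max(a[1], b[1]), min(a[1], b[1])
    return (a[0], hi + math.log2(1.0 + 2.0 ** (lo - hi)))

x, y = to_lns(3.0), to_lns(4.0)
print(from_lns(*lns_mul(x, y)))  # ≈ 12.0
print(from_lns(*lns_add(x, y)))  # ≈ 7.0
```

The multiply path is why LNS can shrink circuit area for multiply-heavy workloads such as LSTM gates: the expensive fixed-point multiplier is replaced by an adder, and the cost moves into the (approximated) addition unit.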
DOI:10.1109/MOCAST.2018.8376572