LATIM: Loading-Aware Offline Training Method for Inverter-Based Memristive Neural Networks

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems II: Express Briefs, Vol. 68, No. 10, pp. 3346-3350
Main Authors: Vahdat, Shaghayegh; Kamal, Mehdi; Afzali-Kusha, Ali; Pedram, Massoud
Format: Journal Article
Language: English
Published: New York, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.10.2021
Summary: In this brief, we present a high-accuracy training method for inverter-based memristive neural networks (IM-NNs). The method, which relies on accurate modeling of the circuit element characteristics, is called LATIM (Loading-Aware offline Training method for Inverter-based Memristive NNs). In LATIM, an approximation method is proposed to estimate the effective load of the memristive crossbar (acting as the synapses), while two NNs are utilized to predict the voltage transfer characteristic (VTC) of the inverters (acting as the activation functions). The efficacy of the proposed method is compared with recent offline training methods for IM-NNs, PHAX and RIM. Simulation results reveal that LATIM predicts the output voltage of IM-NNs with, on average, 14× (6×) and 29× (4×) smaller error for the MNIST and Fashion MNIST datasets, respectively, compared to the PHAX (RIM) method. In addition, IM-NNs trained by LATIM consume, on average, 62% and 53% lower energy than those trained by PHAX and RIM, respectively, due to proper sizing of the inverters.
ISSN: 1549-7747, 1558-3791
DOI: 10.1109/TCSII.2021.3072289
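
As a rough illustration of the ideas described in the summary above (not the LATIM formulation itself), the sketch below shows a loading-aware forward pass for a single crossbar column: the output-node voltage depends on both the memristor conductances and an approximate load conductance at that node, and an inverter-like VTC then acts as the activation function. The load model, the analytic VTC stand-in, and all parameter values are assumptions chosen for this example; the paper derives its own load approximation and trains NNs to predict the actual inverter VTCs.

```python
# Minimal sketch, assuming a simple conductance-divider load model and a
# sigmoid-shaped stand-in for the inverter VTC. These are illustrative
# assumptions, not the LATIM models from the paper.
import numpy as np

def crossbar_column(v_in, g_syn, g_load):
    """Output-node voltage of one memristive crossbar column.

    By KCL at the output node: sum_i g_i * (v_i - v_out) = g_load * v_out,
    so v_out = sum(g_i * v_i) / (sum(g_i) + g_load). Ignoring the load
    (g_load = 0) would overestimate the node voltage, which is the kind of
    mismatch a loading-aware training method tries to avoid.
    """
    return np.dot(g_syn, v_in) / (np.sum(g_syn) + g_load)

def inverter_vtc(v, v_dd=1.0, gain=8.0):
    """Hypothetical inverter voltage transfer characteristic used as the
    activation; a real flow would characterize this from circuit simulation."""
    return v_dd / (1.0 + np.exp(gain * (v - v_dd / 2.0)))

v_in = np.array([0.2, 0.8, 0.5])          # input voltages (V), assumed
g_syn = np.array([50e-6, 120e-6, 80e-6])  # memristor conductances (S), assumed
g_load = 60e-6                            # approximate effective load (S), assumed

v_node = crossbar_column(v_in, g_syn, g_load)
print("column node voltage:", v_node)          # ~0.47 V with the load included
print("inverter output    :", inverter_vtc(v_node))
```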