Two-dimensional materials-based probabilistic synapses and reconfigurable neurons for measuring inference uncertainty using Bayesian neural networks

Bibliographic Details
Published in: Nature Communications, Vol. 13, No. 1, Article 6139 (10 pages)
Main Authors: Sebastian, Amritanand; Pendurthi, Rahul; Kozhakhmetov, Azimkhan; Trainor, Nicholas; Robinson, Joshua A.; Redwing, Joan M.; Das, Saptarshi
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 17.10.2022
Summary: Artificial neural networks have demonstrated superiority over traditional computing architectures in tasks such as pattern classification and learning. However, they do not measure uncertainty in their predictions, and hence they can make wrong predictions with high confidence, which can be detrimental for many mission-critical applications. In contrast, Bayesian neural networks (BNNs) naturally include such uncertainty in their model, as the weights are represented by probability distributions (e.g., Gaussian distributions). Here we introduce three-terminal memtransistors based on two-dimensional (2D) materials, which can emulate both probabilistic synapses and reconfigurable neurons. The cycle-to-cycle variation in the programming of the 2D memtransistor is exploited to achieve Gaussian random number generator-based synapses, whereas 2D memtransistor-based integrated circuits are used to obtain neurons with hyperbolic tangent and sigmoid activation functions. Finally, memtransistor-based synapses and neurons are combined in a crossbar array architecture to realize a BNN accelerator for a data classification task.

Designing efficient Bayesian neural networks remains a challenge. Here, the authors use the cycle-to-cycle variation in the programming of 2D memtransistors to achieve Gaussian random number generator-based synapses, and combine it with a complementary 2D memtransistor-based tanh activation function to implement a Bayesian neural network.
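To make the weights-as-distributions idea concrete, the following is a minimal software sketch (NumPy), not the authors' hardware or code: each synaptic weight is drawn from a Gaussian distribution on every forward pass, the hidden neurons use a tanh activation, and predictive uncertainty is estimated by repeating the stochastic forward pass. All layer sizes, means, standard deviations, and inputs below are illustrative assumptions.

# Minimal sketch (not the paper's implementation): a Bayesian neural network
# whose weights are sampled from Gaussian distributions on every forward pass,
# mimicking Gaussian random-number-generator synapses. Predictive uncertainty
# is estimated by repeating the stochastic forward pass many times.
import numpy as np

rng = np.random.default_rng(0)

def sample_weights(mu, sigma):
    # Each synaptic weight is a Gaussian random variable N(mu, sigma^2);
    # drawing a fresh sample models one programming cycle of a probabilistic synapse.
    return mu + sigma * rng.standard_normal(mu.shape)

def forward(x, params):
    # Hidden neurons: hyperbolic tangent activation; output: softmax over classes.
    (mu1, s1), (mu2, s2) = params
    h = np.tanh(x @ sample_weights(mu1, s1))
    logits = h @ sample_weights(mu2, s2)
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def predict_with_uncertainty(x, params, n_samples=100):
    # Monte Carlo over weight samples: the mean gives the prediction,
    # the spread across samples gives the inference uncertainty.
    probs = np.stack([forward(x, params) for _ in range(n_samples)])
    return probs.mean(axis=0), probs.std(axis=0)

# Toy example (assumed sizes): 4 input features, 8 hidden neurons, 3 classes.
params = [
    (rng.normal(0, 0.5, (4, 8)), 0.1 * np.ones((4, 8))),
    (rng.normal(0, 0.5, (8, 3)), 0.1 * np.ones((8, 3))),
]
x = rng.normal(size=(1, 4))
mean_prob, std_prob = predict_with_uncertainty(x, params)
print("prediction:", mean_prob.round(3), "uncertainty:", std_prob.round(3))

In the memtransistor hardware described in the paper, the role played here by sample_weights would be taken by the cycle-to-cycle programming variation of the 2D synapses, and the tanh call by the memtransistor neuron circuit; this sketch only emulates both in software.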
ISSN: 2041-1723
DOI: 10.1038/s41467-022-33699-7