POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons

Bibliographic Details
Published in: arXiv.org
Main Authors: Zuo-Wei Yeh, Chia-Hua Hsu, Alexander White, Chen-Fu Yeh, Wen-Chieh Wu, Cheng-Te Wang, Chung-Chuan Lo, Kea-Tiong Tang
Format: Paper, Journal Article
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 19.01.2022
Summary: The inner operations of the human brain as a biological processing system remain largely a mystery. Inspired by the function of the human brain and by analyses of simpler neural network systems in other species, such as Drosophila, neuromorphic computing systems have attracted considerable interest. Cellular-level connectomics research identifies a characteristic structure of biological neural networks, called a population, in which the neurons are not only recurrently fully connected within the network but each also receives an external stimulus and a self-connection. Because spike transmission within the network and the input data require only low data bandwidth, spiking neural networks lend themselves to low-latency, low-power designs. In this study, we propose a configurable population-based digital spiking neuromorphic processor in a 180 nm process technology with two configurable hierarchical populations. The neurons in the processor can also be configured as a novel model, the integer quadratic integrate-and-fire neuron, which holds its membrane potential as an unsigned 8-bit value. The processor can implement intelligent decision making for avoidance in real time. Moreover, the proposed approach enables the development of biomimetic neuromorphic systems and a range of low-power, low-latency inference processing applications.
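
The record does not give the processor's exact update rule, so the following is a minimal sketch of what a discrete-time integer quadratic integrate-and-fire neuron with an unsigned 8-bit membrane potential, as described in the summary, could look like. The specific constants (threshold, reset value, the right-shift scaling the quadratic term, the leak) and the function name iqif_step are illustrative assumptions, not details from the paper.

# Minimal sketch of an integer quadratic integrate-and-fire (IQIF) neuron.
# Assumptions (not from the paper): the update rule, threshold, reset value,
# quadratic-term shift, and leak below are illustrative only.

V_MAX = 255        # unsigned 8-bit membrane potential range [0, 255]
V_THRESHOLD = 200  # assumed firing threshold
V_RESET = 10       # assumed reset potential after a spike
QUAD_SHIFT = 6     # assumed scaling of the quadratic term (divide by 64)
LEAK = 2           # assumed constant leak per time step

def iqif_step(v, i_in):
    """One discrete time step; returns (new_potential, spiked)."""
    # Quadratic drive: the closer v is to threshold, the faster it grows,
    # approximated here with an integer (v * v) >> QUAD_SHIFT term.
    dv = (v * v >> QUAD_SHIFT) + i_in - LEAK
    v = max(0, min(V_MAX, v + dv))   # clamp to the unsigned 8-bit range
    if v >= V_THRESHOLD:
        return V_RESET, True         # spike and reset
    return v, False

# Usage: drive one neuron with a constant input current and collect spike times.
if __name__ == "__main__":
    v, spikes = 0, []
    for t in range(50):
        v, spiked = iqif_step(v, i_in=5)
        if spiked:
            spikes.append(t)
    print("spike times:", spikes)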
ISSN: 2331-8422
DOI: 10.48550/arxiv.2201.07490