ARTIFICIAL NEURAL NETWORK COMPUTATION ACCELERATION APPARATUS FOR DISTRIBUTED PROCESSING, ARTIFICIAL NEURAL NETWORK ACCELERATION SYSTEM USING SAME, AND ARTIFICIAL NEURAL NETWORK ACCELERATION METHOD THEREFOR

Bibliographic Details
Main Authors: LEE SANG-HEON; KIM BONG-JEONG; KIM JU-HYOK
Format: Patent
Language: Chinese; English
Published: 14.05.2021
Summary: An artificial neural network computation acceleration apparatus for distributed processing, according to the present invention, can comprise: an external main memory for storing input data and synapse weights for input neurons; an internal buffer memory for storing the synapse weights and input data required for each cycle constituting the artificial neural network computation; a DMA module for directly transmitting/receiving data to/from the external main memory and the internal buffer memory; a neural network computation device for repetitively processing, for each cycle constituting the artificial neural network computation, a series of sequential steps of reading the synapse weights and the input data stored in the internal buffer memory, performing an artificial neural network computation, and storing the computation result in the external main memory; and a CPU for controlling an operation of storing the input data and the synapse weights for the input neurons in the external main memory and the internal buffer memory...
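
The summary describes a repeating load-compute-store cycle: a DMA module moves one cycle's worth of weights and inputs from external main memory into an internal buffer, a computation device processes the buffered data, and the result is written back to main memory under CPU control. The following Python sketch illustrates that data flow under stated assumptions; all class and function names (ExternalMainMemory, InternalBufferMemory, dma_load, neural_compute, run_accelerator) are hypothetical illustrations and are not taken from the patent itself.

import numpy as np


class ExternalMainMemory:
    """Holds input data and synapse weights for all input neurons (hypothetical model)."""
    def __init__(self, inputs, weights):
        self.inputs = inputs      # shape: (num_cycles, num_inputs)
        self.weights = weights    # shape: (num_cycles, num_inputs)
        self.results = []         # computation results are stored back here


class InternalBufferMemory:
    """Holds only the weights and input data needed for the current cycle."""
    def __init__(self):
        self.inputs = None
        self.weights = None


def dma_load(main_mem, buffer_mem, cycle):
    """DMA role: transfer one cycle's data from main memory into the buffer."""
    buffer_mem.inputs = main_mem.inputs[cycle]
    buffer_mem.weights = main_mem.weights[cycle]


def neural_compute(buffer_mem):
    """Computation-device role: weighted sum over the buffered inputs."""
    return float(np.dot(buffer_mem.inputs, buffer_mem.weights))


def run_accelerator(main_mem, num_cycles):
    """CPU role: orchestrate the load -> compute -> store sequence per cycle."""
    buffer_mem = InternalBufferMemory()
    for cycle in range(num_cycles):
        dma_load(main_mem, buffer_mem, cycle)   # fill the internal buffer
        result = neural_compute(buffer_mem)     # compute on buffered data
        main_mem.results.append(result)         # store the result in main memory
    return main_mem.results


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mem = ExternalMainMemory(rng.random((4, 8)), rng.random((4, 8)))
    print(run_accelerator(mem, num_cycles=4))

The sketch uses a simple dot product as a stand-in for the accelerator's per-cycle computation; the point is the separation of roles (main memory, buffer, DMA, compute device, CPU control loop), which mirrors the structure listed in the summary.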
Bibliography: Application Number: CN201980065991