The Brain-Inspired Decoder for Natural Visual Image Reconstruction

Bibliographic Details
Published in: arXiv.org
Main Authors: Li, Wenyi; Zheng, Shengjie; Liao, Yufan; Hong, Rongqi; Chen, Weiliang; He, Chenggang; Li, Xiaojian
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 18.07.2022
More Information
Summary: Decoding images from brain activity has long been a challenge. Owing to the development of deep learning, tools are now available to address this problem. Image decoding aims to map neural spike trains to low-level visual features and a high-level semantic information space. Recently, a few studies have decoded images from spike trains; however, they pay little attention to the foundations of neuroscience, and few have incorporated the receptive field into visual image reconstruction. In this paper, we propose a deep learning neural network architecture with biological properties to reconstruct visual images from spike trains. To the best of our knowledge, this is the first method to integrate a receptive-field property matrix into the loss function. Our model is an end-to-end decoder from neural spike trains to images. We not only merge Gabor filters into the auto-encoder that generates images, but also propose a loss function with receptive field properties. We evaluated our decoder on two datasets containing neural spikes from macaque primary visual cortex and from salamander retinal ganglion cells (RGCs). Our results show that our method can effectively combine receptive field features to reconstruct images, providing a new approach to visual reconstruction based on neural information.
ISSN:2331-8422
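
The summary describes the architecture only at a high level: an end-to-end auto-encoder from spike trains to images, Gabor filters merged into the image-generating path, and a loss weighted by a receptive-field property matrix. The sketch below is a minimal, hypothetical PyTorch illustration of that kind of design, not the authors' code; every name (SpikeDecoder, gabor_kernel, rf_weighted_loss), the layer sizes, the Gabor parameterization, and the way the RF matrix enters the loss are assumptions made for illustration.

```python
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F


def gabor_kernel(size=7, sigma=2.0, theta=0.0, lam=4.0, psi=0.0):
    # 2-D Gabor kernel; this parameterization is assumed, not taken from the paper.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    g = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * xr / lam + psi)
    return torch.tensor(g, dtype=torch.float32)


class SpikeDecoder(nn.Module):
    """Hypothetical end-to-end decoder: spike counts per neuron -> image."""

    def __init__(self, n_neurons, img_size=32, n_filters=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(n_neurons, n_filters * (img_size // 4) ** 2),
            nn.ReLU(),
        )
        self.unflatten = nn.Unflatten(1, (n_filters, img_size // 4, img_size // 4))
        self.deconv = nn.Sequential(
            nn.ConvTranspose2d(n_filters, n_filters, 4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(n_filters, 1, 4, stride=2, padding=1),
            nn.Sigmoid(),
        )
        # Fixed bank of Gabor kernels at four orientations, applied to the
        # reconstruction so a loss can compare oriented-edge responses.
        bank = torch.stack([gabor_kernel(theta=t)
                            for t in np.linspace(0, np.pi, 4, endpoint=False)])
        self.register_buffer("gabor_bank", bank.unsqueeze(1))  # (4, 1, 7, 7)

    def forward(self, spikes):
        img = self.deconv(self.unflatten(self.fc(spikes)))      # (B, 1, H, W)
        gabor_resp = F.conv2d(img, self.gabor_bank, padding=3)  # (B, 4, H, W)
        return img, gabor_resp


def rf_weighted_loss(pred, target, rf_matrix):
    # Pixel-wise MSE weighted by a receptive-field property matrix
    # (broadcastable to the image shape). How such a matrix actually enters
    # the paper's loss is not specified in the summary; this is one guess.
    return ((pred - target) ** 2 * rf_matrix).mean()
```

A training step under these assumptions would combine rf_weighted_loss on the reconstructed image with an auxiliary term comparing the Gabor responses of the reconstruction and the ground-truth image, so that errors in regions emphasized by the receptive-field matrix and in oriented-edge structure are penalized more heavily.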