Parallel and Recurrent Cascade Models as a Unifying Force for Understanding Subcellular Computation
Published in | Neuroscience Vol. 489; pp. 200–215 |
---|---|
Main Authors | , , , , |
Format | Journal Article |
Language | English |
Published | United States: Elsevier Ltd, 01.05.2022 |
Summary:

- Biophysical models can capture dendritic computation, but they are too complicated for algorithmic-level models.
- Parallel, recurrent cascade (PRC) models instead treat an individual neuron as a deep, recurrent neural network.
- PRC models can capture sub-cellular dendritic computations, such as NMDA spikes or coincidence detection.
- PRC models can also be trained with gradient descent to solve machine learning tasks.
- PRC models provide the means to understand dendritic computation at the algorithmic level.
Neurons are very complicated computational devices, incorporating numerous non-linear processes, particularly in their dendrites. Biophysical models capture these processes directly by explicitly modelling physiological variables, such as ion channels, current flow, membrane capacitance, etc. However, another option for capturing the complexities of real neural computation is to use cascade models, which treat individual neurons as a cascade of linear and non-linear operations, akin to a multi-layer artificial neural network. Recent research has shown that cascade models can capture single-cell computation well, but there are still a number of sub-cellular, regenerative dendritic phenomena that they cannot capture, such as the interaction between sodium, calcium, and NMDA spikes in different compartments. Here, we propose that it is possible to capture these additional phenomena using parallel, recurrent cascade models, wherein an individual neuron is modelled as a cascade of parallel linear and non-linear operations that can be connected recurrently, akin to a multi-layer, recurrent, artificial neural network. Given their tractable mathematical structure, we show that neuron models expressed in terms of parallel recurrent cascades can themselves be integrated into multi-layered artificial neural networks and trained to perform complex tasks. We go on to discuss potential implications and uses of these models for artificial intelligence. Overall, we argue that parallel, recurrent cascade models provide an important, unifying tool for capturing single-cell computation and exploring the algorithmic implications of physiological phenomena.
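The abstract describes a PRC model as a set of parallel linear and non-linear subunits within one neuron, connected recurrently, like a small recurrent neural network. The following is a minimal illustrative sketch of that idea, not code from the paper; the class name `PRCUnit`, the weight shapes, and the choice of a sigmoid non-linearity with the last subunit standing in for the soma are all assumptions made for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class PRCUnit:
    """Toy parallel, recurrent cascade (PRC) neuron (illustrative only).

    Each subunit applies a linear filter to the synaptic inputs plus the
    previous time step's subunit outputs (the recurrent connections),
    followed by a pointwise non-linearity. The last subunit's output is
    taken as a stand-in for the somatic response.
    """

    def __init__(self, n_inputs, n_subunits, seed=0):
        rng = np.random.default_rng(seed)
        # Feed-forward weights: inputs -> subunits (the "parallel" pathways).
        self.W_in = rng.normal(0.0, 0.5, (n_subunits, n_inputs))
        # Recurrent weights: subunit outputs at t-1 -> subunits at t.
        self.W_rec = rng.normal(0.0, 0.5, (n_subunits, n_subunits))
        self.b = np.zeros(n_subunits)

    def run(self, x_seq):
        """x_seq: (T, n_inputs) input sequence; returns (T,) somatic output."""
        h = np.zeros(self.W_rec.shape[0])
        out = []
        for x in x_seq:
            # Linear stage (parallel filters + recurrence), then non-linearity.
            h = sigmoid(self.W_in @ x + self.W_rec @ h + self.b)
            out.append(h[-1])  # treat the last subunit as the soma
        return np.array(out)
```

Because every operation is differentiable, a unit like this could in principle be trained with gradient descent and stacked into a larger network, which is the property the abstract highlights.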
ISSN: 0306-4522, 1873-7544
DOI: 10.1016/j.neuroscience.2021.07.026