Meta-learning synaptic plasticity and memory addressing for continual familiarity detection


Bibliographic Details
Published in: Neuron (Cambridge, Mass.), Vol. 110, no. 3, pp. 544-557.e8
Main Authors: Tyulmankov, Danil; Yang, Guangyu Robert; Abbott, L.F.
Format: Journal Article
Language: English
Published: United States: Elsevier Inc., 02.02.2022

Summary: Over the course of a lifetime, we process a continual stream of information. Extracted from this stream, memories must be efficiently encoded and stored in an addressable manner for retrieval. To explore potential mechanisms, we consider a familiarity detection task in which a subject reports whether an image has been previously encountered. We design a feedforward network endowed with synaptic plasticity and an addressing matrix, meta-learned to optimize familiarity detection over long intervals. We find that anti-Hebbian plasticity leads to better performance than Hebbian plasticity and replicates experimental results such as repetition suppression. A combinatorial addressing function emerges, selecting a unique neuron as an index into the synaptic memory matrix for storage or retrieval. Unlike previous models, this network operates continuously and generalizes to intervals it has not been trained on. Our work suggests a biologically plausible mechanism for continual learning and demonstrates an effective application of machine learning for neuroscience discovery.

•Meta-learning is used to discover network architectures and plasticity rules
•Anti-Hebbian plasticity emerges as the mechanism for encoding familiarity
•Strong feedforward synapses emerge as an addressing function for storage and retrieval
•Experimental features such as repetition suppression are reproduced

Tyulmankov et al. use meta-learning to build neural network models for continual familiarity detection. They show that anti-Hebbian plasticity is the preferred mechanism for optimizing memory capacity and propose strong feedforward weights as an explicit addressing mechanism for selecting memory locations during storage and retrieval.
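To make the summarized mechanism concrete, below is a minimal NumPy sketch of such a familiarity detector: a fixed weight matrix plays the role of the addressing function, a plastic matrix stores memory traces, and a negative learning rate makes the update anti-Hebbian, so a repeated input evokes a suppressed response. All dimensions, initializations, and hyperparameter values (lam, eta, the readout) are illustrative assumptions for this sketch, not the paper's meta-learned architecture or parameters.

```python
import numpy as np

class FamiliarityNet:
    """Sketch of a feedforward familiarity detector: fixed addressing
    weights W plus a plastic synaptic matrix A with anti-Hebbian updates."""

    def __init__(self, d=25, n=100, lam=0.95, eta=-0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n, d)) / np.sqrt(d)  # fixed weights (meta-learned in the paper; random here)
        self.A = np.zeros((n, d))      # plastic synaptic memory matrix
        self.lam = lam                 # decay of stored traces
        self.eta = eta                 # eta < 0 => anti-Hebbian update
        self.w_out = -np.ones(n) / n   # readout: suppressed activity signals "familiar"
        self.b_out = 0.3               # familiar if fewer than ~30% of hidden units fire

    def step(self, x):
        """Present one input vector; report familiarity, then store the item."""
        h = ((self.W + self.A) @ x > 0).astype(float)  # binary hidden layer
        familiar = self.w_out @ h + self.b_out > 0
        # Decay old traces, then depress synapses between co-active
        # pre/post units, producing repetition suppression on repeats.
        self.A = self.lam * self.A + self.eta * np.outer(h, x)
        return familiar

net = FamiliarityNet()
x = (np.random.default_rng(1).random(25) > 0.5).astype(float)
print(net.step(x))  # first presentation: ~half the units fire -> False (novel)
print(net.step(x))  # repeat: driven units are suppressed -> True (familiar)
```

With eta negative, storing an item weakens exactly the synapses that drove the response, so a second presentation falls below threshold; a positive (Hebbian) eta would push responses upward instead, the regime the summary reports as less effective.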
Author Contributions: Conceptualization, DT, GRY, LFA; Methodology, DT, GRY, LFA; Software, DT; Investigation, DT; Writing – Original Draft, DT; Writing – Review & Editing, DT, GRY, LFA; Visualization, DT; Supervision, GRY, LFA; Funding Acquisition, LFA
ISSN: 0896-6273
EISSN: 1097-4199
DOI: 10.1016/j.neuron.2021.11.009