Synergy, Redundancy, and Independence in Population Codes, Revisited


Bibliographic Details
Published in: The Journal of Neuroscience, Vol. 25, No. 21, pp. 5195-5206
Main Authors: Latham, Peter E.; Nirenberg, Sheila
Format: Journal Article
Language: English
Published: United States: Society for Neuroscience, 25.05.2005

Summary: Decoding the activity of a population of neurons is a fundamental problem in neuroscience. A key aspect of this problem is determining whether correlations in the activity, i.e., noise correlations, are important. If they are important, then the decoding problem is high dimensional: decoding algorithms must take the correlational structure in the activity into account. If they are not important, or if they play a minor role, then the decoding problem can be reduced to lower dimension and thus made more tractable. The issue of whether correlations are important has been a subject of heated debate, and the debate centers on the validity of the measures used to address it. Here, we evaluate three of the most commonly used ones: synergy, ΔI_shuffled, and ΔI. We show that synergy and ΔI_shuffled are confounded measures: they can be zero when correlations are clearly important for decoding and positive when they are not. In contrast, ΔI is not confounded. It is zero only when correlations are not important for decoding and positive only when they are; that is, it is zero only when one can decode exactly as well using a decoder that ignores correlations as one can using a decoder that does not, and it is positive only when one cannot decode as well. Finally, we show that ΔI has an information-theoretic interpretation: it is an upper bound on the information lost when correlations are ignored.
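The three measures compared in the summary can be computed directly for small discrete ensembles. Below is a minimal sketch (not the authors' code) of how synergy, ΔI_shuffled, and ΔI might be evaluated on a toy two-neuron, two-stimulus ensemble, assuming the standard discrete definitions: synergy = I(r1,r2;s) − I(r1;s) − I(r2;s); ΔI_shuffled = I − I_shuffled, where I_shuffled treats the neurons as conditionally independent given the stimulus; and ΔI as the average Kullback–Leibler divergence between the true posterior p(s|r) and the posterior p_ind(s|r) of a decoder that ignores correlations. All variable names and the toy distribution are illustrative, not from the paper.

```python
import numpy as np

def mutual_info(p_joint):
    """I(X;Y) in bits for a joint distribution p_joint[x, y]."""
    px = p_joint.sum(axis=1, keepdims=True)
    py = p_joint.sum(axis=0, keepdims=True)
    m = p_joint > 0
    return float((p_joint[m] * np.log2(p_joint[m] / (px * py)[m])).sum())

# Toy ensemble: two equiprobable stimuli, two binary neurons.
# For s=0 the neurons fire together; for s=1 they fire apart,
# so all stimulus information is carried by the correlation.
p_s = np.array([0.5, 0.5])
p_r_given_s = np.zeros((2, 2, 2))            # axes: (r1, r2, s)
p_r_given_s[0, 0, 0] = p_r_given_s[1, 1, 0] = 0.5   # correlated, s=0
p_r_given_s[0, 1, 1] = p_r_given_s[1, 0, 1] = 0.5   # anticorrelated, s=1

p_rrs = p_r_given_s * p_s                    # joint p(r1, r2, s)
p_rs_full = p_rrs.reshape(4, 2)              # flatten the response pair

# Synergy: joint information minus the single-neuron informations.
I_full = mutual_info(p_rs_full)
I_1 = mutual_info(p_rrs.sum(axis=1))         # I(r1; s)
I_2 = mutual_info(p_rrs.sum(axis=0))         # I(r2; s)
synergy = I_full - I_1 - I_2

# Shuffled (conditionally independent) model: p_ind(r1,r2|s) = p(r1|s) p(r2|s).
p_r1_s = p_r_given_s.sum(axis=1)             # p(r1 | s)
p_r2_s = p_r_given_s.sum(axis=0)             # p(r2 | s)
p_ind_r_given_s = p_r1_s[:, None, :] * p_r2_s[None, :, :]
I_shuffled = mutual_info((p_ind_r_given_s * p_s).reshape(4, 2))
dI_shuffled = I_full - I_shuffled

# Delta I: average KL divergence between the true posterior p(s|r)
# and the posterior of the correlation-ignoring model, p_ind(s|r).
p_r = p_rs_full.sum(axis=1, keepdims=True)               # p(r)
post_true = p_rs_full / p_r                              # p(s|r)
unnorm = p_ind_r_given_s.reshape(4, 2) * p_s             # p_ind(r|s) p(s)
post_ind = unnorm / unnorm.sum(axis=1, keepdims=True)    # p_ind(s|r)
m = p_rs_full > 0
dI = float((p_rs_full[m] * np.log2(post_true[m] / post_ind[m])).sum())

print(synergy, dI_shuffled, dI)              # here all three equal 1 bit
```

In this toy ensemble the single-neuron marginals carry no stimulus information, so all three measures coincide at the full 1 bit; the point of the paper is that in general they need not agree, and only ΔI tracks the decoding cost of ignoring correlations.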
ISSN: 0270-6474
ISSN: 1529-2401
DOI: 10.1523/JNEUROSCI.5319-04.2005