Brief technical note on linearizing recurrent neural networks (RNNs) before vs after the pointwise nonlinearity


Bibliographic Details
Published in: arXiv.org
Main Authors: Pagan, Marino; Valente, Adrian; Ostojic, Srdjan; Brody, Carlos D.
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 07.09.2023
More Information
Summary: Linearization of the dynamics of recurrent neural networks (RNNs) is often used to study their properties. The same RNN dynamics can be written in terms of the "activations" (the net inputs to each unit, before its pointwise nonlinearity) or in terms of the "activities" (the output of each unit, after its pointwise nonlinearity); the two corresponding linearizations differ from each other. This brief and informal technical note describes the relationship between the two linearizations and between the left and right eigenvectors of their dynamics matrices, and shows that some context-dependent effects are readily apparent under linearization of the activity dynamics but not of the activation dynamics.
ISSN: 2331-8422
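
As a rough illustration of the relationship the summary describes, the sketch below adopts standard rate-model conventions, which may differ from the paper's exact notation. Writing the activation dynamics as dx/dt = -x + W phi(x) + u and the activity dynamics as dr/dt = -r + phi(W r + u), the two Jacobians are J_act = -I + W D and J_rate = -I + D W, where D = diag(phi'(x)). Because W D and D W are similar matrices whenever D is invertible, the two linearizations share eigenvalues, while a right eigenvector v of J_act maps to the right eigenvector D v of J_rate (and left eigenvectors map through D^-1), so the eigenvector geometry of the two linearizations differs. All names below are illustrative assumptions, not code from the paper.

# Minimal numerical sketch (illustrative; not the authors' code or conventions):
# check that the activation and activity linearizations share a spectrum and
# that right eigenvectors are related by the gain matrix D = diag(phi'(x)).
import numpy as np

rng = np.random.default_rng(0)
n = 6
W = rng.standard_normal((n, n)) / np.sqrt(n)  # recurrent weight matrix
x = rng.standard_normal(n)                    # arbitrary linearization point
D = np.diag(1.0 - np.tanh(x) ** 2)            # phi'(x) for phi = tanh

J_act = -np.eye(n) + W @ D   # Jacobian of activation dynamics dx/dt = -x + W phi(x) + u
J_rate = -np.eye(n) + D @ W  # Jacobian of activity dynamics   dr/dt = -r + phi(W r + u)

# W D and D W are similar (D is invertible here), so the spectra coincide.
assert np.allclose(np.sort_complex(np.linalg.eigvals(J_act)),
                   np.sort_complex(np.linalg.eigvals(J_rate)))

# A right eigenvector v of J_act maps to the right eigenvector D v of J_rate.
lam, V = np.linalg.eig(J_act)
v = V[:, 0]
assert np.allclose(J_rate @ (D @ v), lam[0] * (D @ v))
print("shared spectrum and eigenvector mapping verified")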