Dynamic Capacity Estimation in Hopfield Networks
| Main Authors | |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | 14.09.2017 |
| Subjects | |
| Online Access | Get full text |
Summary: Understanding the memory capacity of neural networks remains a challenging problem in implementing artificial intelligence systems. In this paper, we address the notion of capacity with respect to Hopfield networks and propose a dynamic approach to monitoring a network's capacity. We define capacity as the maximum number of stored patterns that can be retrieved when the network is probed by those stored patterns. Prior work in this area has presented static expressions dependent only on the neuron count $N$, forcing network designers to assume worst-case input characteristics for bias and correlation when setting the capacity of the network. Instead, our model operates alongside the learning Hopfield network and arrives at a capacity estimate based on the patterns that were actually stored. By continuously updating the crosstalk associated with the stored patterns, our model guards the network against overwriting its memory traces and exceeding its capacity. We simulate our model using artificially generated random patterns, which can be set to a desired bias and correlation, and observe capacity estimates that are between 93% and 97% accurate. As a result, our model doubles the memory efficiency of Hopfield networks relative to the static, worst-case capacity estimate while minimizing the risk of lost patterns.
DOI: 10.48550/arxiv.1709.05340
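To make the abstract's crosstalk check concrete, below is a minimal sketch assuming the standard $\pm 1$ Hopfield conventions (Hebbian outer-product weights with zero diagonal, synchronous sign-rule recall). The function names, the incremental storage loop, and the $-1$ stability threshold are illustrative assumptions, not the authors' published method. For context, classic static analyses place the capacity for unbiased, uncorrelated random patterns near $0.138N$; biased or correlated inputs lower it further, which is the worst-case figure the paper's dynamic estimate is meant to replace.

```python
import numpy as np

def hebbian_weights(patterns: np.ndarray) -> np.ndarray:
    """W = (1/N) * sum_mu xi^mu (xi^mu)^T with zero diagonal (Hebbian rule)."""
    _, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def probe(W: np.ndarray, state: np.ndarray, steps: int = 20) -> np.ndarray:
    """Synchronous recall: iterate s <- sign(W s) until a fixed point."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1                    # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

def crosstalk(patterns: np.ndarray, nu: int) -> np.ndarray:
    """Crosstalk on pattern nu: C_i = (1/N) sum_{mu != nu} xi_i^mu (xi^mu . xi^nu)."""
    _, N = patterns.shape
    overlaps = patterns @ patterns[nu] / N   # overlap of every stored pattern with xi^nu
    overlaps[nu] = 0.0                       # drop the signal term (mu = nu)
    return overlaps @ patterns

def all_patterns_stable(patterns: np.ndarray) -> bool:
    """Under the usual signal-plus-crosstalk decomposition, bit i of pattern nu
    survives one update iff xi_i^nu * C_i^nu > -1, i.e. the unit signal term
    is not flipped by accumulated crosstalk."""
    return all(
        np.all(patterns[nu] * crosstalk(patterns, nu) > -1.0)
        for nu in range(len(patterns))
    )

def store_until_capacity(stream, N: int) -> np.ndarray:
    """Accept patterns one at a time, re-running the crosstalk check after each,
    and stop before a stored pattern would cease to be a fixed point."""
    stored = np.empty((0, N))
    for xi in stream:
        candidate = np.vstack([stored, xi])
        if not all_patterns_stable(candidate):
            break                            # storing xi would overwrite memory traces
        stored = candidate
    return stored

# Usage with unbiased, uncorrelated random +/-1 patterns:
rng = np.random.default_rng(0)
stream = (rng.choice([-1.0, 1.0], size=256) for _ in range(120))
stored = store_until_capacity(stream, N=256)
print(f"stored {len(stored)} patterns before the crosstalk check tripped")
```

Probing each row of `stored` with `probe(hebbian_weights(stored), row)` should then return the row unchanged, matching the abstract's definition of capacity as retrievability when probed by the stored patterns themselves.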