Self-organizing incremental associative memory model under capacity constraint


Bibliographic Details
Published in: Jisuanji Kexue yu Tansuo / Journal of Computer Science and Frontiers, Vol. 10, No. 1, pp. 130-141
Main Authors: Sun, Tao; Xie, Zhenping; Wang, Shitong; Liu, Yuan
Format: Journal Article
Language: Chinese
Published: 01.01.2016

Summary: Owing to the advantages of self-organizing neural networks, such as parallelism, fault tolerance and self-learning, they have been widely used in many fields. However, in traditional associative memory neural networks, the number of network nodes grows without bound as more and more samples are learned incrementally, which inevitably leads to unaffordable computation and storage overhead. To solve this problem, this paper proposes a self-organizing incremental associative memory model under a capacity constraint. By limiting the number of network nodes and introducing a self-competition strategy between them, the new model can incrementally learn large-scale samples and attain equivalent associative memory performance at a lower computing cost. The reasonableness of the model is proved by theoretical analysis. Moreover, the experimental results demonstrate that the new model can effectively control computing consumption and improve the efficiency of incrementally learning new samples…
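The general idea described in the summary — capping the node count and letting nodes compete so that the weakest is discarded when capacity is exceeded — can be illustrated with a minimal sketch. This is an assumption-laden toy, not the authors' published algorithm: the class name, the Euclidean match threshold, and the win-count competition rule are all invented here for illustration.

```python
import numpy as np

class CapacityConstrainedMemory:
    """Toy capacity-constrained incremental associative memory.

    Illustrative sketch only: all names and update rules are
    assumptions, not the model proposed in the paper.
    """

    def __init__(self, max_nodes=100, match_threshold=0.5):
        self.max_nodes = max_nodes            # hard cap on node count
        self.match_threshold = match_threshold
        self.nodes = []                       # prototype vectors
        self.wins = []                        # per-node win counts

    def learn(self, x):
        """Incrementally absorb one sample."""
        x = np.asarray(x, dtype=float)
        if self.nodes:
            dists = [np.linalg.norm(x - n) for n in self.nodes]
            best = int(np.argmin(dists))
            if dists[best] < self.match_threshold:
                # Winning node absorbs the sample: running-mean update
                self.wins[best] += 1
                self.nodes[best] += (x - self.nodes[best]) / self.wins[best]
                return
        # No close match: insert a new node, then enforce the capacity
        # constraint by dropping the node that has won least often
        self.nodes.append(x.copy())
        self.wins.append(1)
        if len(self.nodes) > self.max_nodes:
            loser = int(np.argmin(self.wins))
            del self.nodes[loser]
            del self.wins[loser]

    def recall(self, x):
        """Return the stored prototype nearest to the cue x."""
        x = np.asarray(x, dtype=float)
        return min(self.nodes, key=lambda n: np.linalg.norm(x - n))
```

Because the node count never exceeds `max_nodes`, both per-sample learning cost and memory use stay bounded regardless of how many samples are streamed in, which is the computational property the summary emphasizes.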
ISSN: 1673-9418
DOI: 10.3778/j.issn.1673-9418.1505007