STRUCTURED SPARSE MEMORY HIERARCHY FOR DEEP LEARNING
| Main Authors | |
|---|---|
| Format | Patent |
| Language | English |
| Published | 21.03.2024 |
| Summary | A memory system and a method are disclosed for training a neural network model. A decompressor unit decompresses an activation tensor to a first predetermined sparsity density based on the activation tensor being compressed, and decompresses a weight tensor to a second predetermined sparsity density based on the weight tensor being compressed. A buffer unit receives the activation tensor at the first predetermined sparsity density and the weight tensor at the second predetermined sparsity density. A neural processing unit receives the activation tensor and the weight tensor from the buffer unit and computes a result for the activation tensor and the weight tensor based on the first predetermined sparsity density of the activation tensor and the second predetermined sparsity density of the weight tensor. |
| Bibliography | Application Number: US202217988739 |
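
The abstract describes a decompress-buffer-compute data path for structured-sparse tensors. As a rough illustration only, and not the patented implementation, the following Python sketch assumes one common structured-sparsity scheme, 2:4 block sparsity, in which a compressed tensor is stored as nonzero values plus per-block indices and is expanded back to its predetermined sparsity density (here 50%) before a plain matrix multiply stands in for the neural processing unit. All function and variable names are hypothetical.

```python
import numpy as np

def compress_2to4(dense: np.ndarray):
    """Hypothetical compressor: keep the 2 largest-magnitude values in every
    group of 4 along the last axis (2:4 structured sparsity).
    Returns (values, indices) as an illustrative storage format."""
    rows, cols = dense.shape
    assert cols % 4 == 0, "columns must be a multiple of the block size 4"
    groups = dense.reshape(rows, cols // 4, 4)
    # Indices of the two largest-magnitude entries per group of 4.
    idx = np.argsort(-np.abs(groups), axis=-1)[..., :2]
    idx.sort(axis=-1)
    vals = np.take_along_axis(groups, idx, axis=-1)
    return vals, idx

def decompress_2to4(vals: np.ndarray, idx: np.ndarray) -> np.ndarray:
    """Sketch of the decompressor unit: expand the compressed values back to a
    tensor at the predetermined 50% sparsity density (2 nonzeros per 4 elements)."""
    rows, n_groups, _ = vals.shape
    out = np.zeros((rows, n_groups, 4), dtype=vals.dtype)
    np.put_along_axis(out, idx, vals, axis=-1)
    return out.reshape(rows, n_groups * 4)

# Hypothetical data path: decompress -> buffer -> compute.
rng = np.random.default_rng(0)
activation = rng.standard_normal((8, 16)).astype(np.float32)
weight = rng.standard_normal((16, 8)).astype(np.float32)

act_vals, act_idx = compress_2to4(activation)       # compressed activation tensor
wgt_vals, wgt_idx = compress_2to4(weight.T)         # compressed weight tensor (row-wise blocks)

buffer = {                                           # stand-in for the buffer unit
    "activation": decompress_2to4(act_vals, act_idx),   # first predetermined sparsity density
    "weight": decompress_2to4(wgt_vals, wgt_idx).T,     # second predetermined sparsity density
}

# Stand-in for the neural processing unit: multiply the two sparse operands.
result = buffer["activation"] @ buffer["weight"]
print(result.shape)  # (8, 8)
```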