Methods and Apparatuses for Bottleneck Stages in Neural-Network Processing


Bibliographic Details
Main Authors: Berkeman, Anders; Karlsson, Sven
Format: Patent
Language: English
Published: 21.12.2023

Summary: Methods and apparatuses herein improve bottleneck-layer processing in neural networks. Example advantages include reducing the number of accesses to external memory, allowing processing to run in parallel across successive bottleneck layers based on the use of partial convolutional results, and balancing the amount of "local" memory used for storing convolutional results against the computational overhead of recomputing partial results. One aspect of the methods and apparatuses involves co-locating arithmetic and logical operators with temporary storage in the same data path, yielding both higher performance and greater energy efficiency in the implementation of bottleneck layers for neural-network processing.
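The memory-versus-recompute trade-off described in the summary can be illustrated with a minimal sketch: two successive convolution layers are processed tile by tile, so each tile's partial layer-1 results are consumed immediately by layer 2 instead of the full intermediate feature map being written to external memory; the cost is recomputing a few "halo" elements per tile. The 1-D setting, function names, and tile size below are illustrative assumptions, not the patent's actual implementation.

```python
def conv1d(x, k):
    """'Valid' 1-D convolution (cross-correlation) for illustration."""
    n = len(x) - len(k) + 1
    return [sum(x[i + j] * k[j] for j in range(len(k))) for i in range(n)]

def fused_tiled(x, k1, k2, tile=8):
    """Process two successive conv layers tile by tile.

    Only a small local buffer of partial layer-1 results ever exists;
    halo elements at tile borders are recomputed, trading arithmetic
    for reduced intermediate storage.
    """
    halo = len(k2) - 1                     # extra layer-1 outputs each tile needs
    n_out = len(x) - len(k1) - len(k2) + 2 # final output length
    out = []
    for start in range(0, n_out, tile):
        stop = min(start + tile, n_out)
        # input slice covering this tile's layer-1 outputs plus halo
        x_slice = x[start : stop + halo + len(k1) - 1]
        partial = conv1d(x_slice, k1)      # partial layer-1 result (local only)
        out.extend(conv1d(partial, k2))    # consumed immediately by layer 2
    return out
```

A quick way to check the sketch is to compare the tiled result against running both layers on the full feature map; the two paths perform the same dot products and agree elementwise.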
Bibliography: Application Number: US202018031065