AdaptHD: Adaptive Efficient Training for Brain-Inspired Hyperdimensional Computing
| Published in | 2019 IEEE Biomedical Circuits and Systems Conference (BioCAS), pp. 1-4 |
|---|---|
| Main Authors | , , , , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.10.2019 |
| Summary | Brain-inspired Hyperdimensional (HD) computing is a promising solution for energy-efficient classification. HD emulates cognitive tasks by computing with very long vectors instead of the numeric values used in contemporary processors. However, existing HD computing algorithms offer little control over the training iterations, which often results in slow training or divergence. In this work, we propose AdaptHD, an adaptive learning approach based on HD computing that addresses these training issues. AdaptHD introduces the notion of a learning rate to HD computing and proposes two approaches for adaptive training: iteration-dependent and data-dependent. In the iteration-dependent approach, AdaptHD uses a large learning rate to speed up training during the first iterations and then adaptively reduces it depending on the slope of the error rate. In the data-dependent approach, AdaptHD adjusts the learning rate for each data point depending on how far off its misclassification was. Our evaluations on a wide range of classification applications show that AdaptHD achieves a 6.9× speedup and a 6.3× energy efficiency improvement during training compared with the state-of-the-art HD computing algorithm. (An illustrative sketch of these adaptive update rules appears below the table.) |
| DOI | 10.1109/BIOCAS.2019.8918974 |
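
The abstract describes AdaptHD's two adaptive-training ideas only at a high level. The sketch below is a toy NumPy illustration of those ideas, not the authors' implementation: the dimensionality, the synthetic data, the error-slope threshold, and the similarity-gap scaling are all assumptions made for readability, and the iteration-dependent and data-dependent updates are combined in one loop for brevity.

```python
import numpy as np

# Toy sketch of adaptive HD retraining in the spirit of AdaptHD.
# All sizes and update rules here are illustrative assumptions.
rng = np.random.default_rng(0)
D, NUM_CLASSES, N = 4000, 4, 400                  # assumed hypervector length, classes, samples

# Synthetic "encoded" samples: each class clusters around a random bipolar hypervector.
prototypes = rng.choice([-1.0, 1.0], size=(NUM_CLASSES, D))
labels = rng.integers(0, NUM_CLASSES, size=N)
samples = prototypes[labels] + rng.normal(0.0, 2.0, size=(N, D))

class_hvs = np.zeros((NUM_CLASSES, D))            # class hypervectors


def predict(hv):
    # Cosine similarity of an encoded sample against every class hypervector.
    sims = class_hvs @ hv / (np.linalg.norm(class_hvs, axis=1) * np.linalg.norm(hv) + 1e-12)
    return int(np.argmax(sims)), sims


# Initial single-pass training: accumulate each sample into its class hypervector.
for hv, y in zip(samples, labels):
    class_hvs[y] += hv

# Iteration-dependent idea: start with a large learning rate and shrink it
# once the error rate stops falling quickly (assumed shrink rule).
lr, prev_err = 1.0, 1.0
for epoch in range(20):
    errors = 0
    for hv, y in zip(samples, labels):
        pred, sims = predict(hv)
        if pred != y:
            errors += 1
            # Data-dependent idea: scale the update by how far off the
            # misclassification was (similarity gap), on top of the global rate.
            gap = sims[pred] - sims[y]
            class_hvs[y] += lr * gap * hv
            class_hvs[pred] -= lr * gap * hv
    err = errors / N
    if prev_err - err < 0.01:                     # shallow error slope -> smaller steps
        lr = max(0.5 * lr, 0.1)
    prev_err = err
    print(f"epoch {epoch}: error {err:.3f}, lr {lr:.2f}")
```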