Effective Prevention of Semantic Drift in Continual Deep Learning
Published in: Intelligent Data Engineering and Automated Learning – IDEAL 2022, pp. 456–464
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing
Series: Lecture Notes in Computer Science
Summary: Lifelong (continual) machine learning models learn incrementally, accumulating knowledge across a sequence of tasks so that later tasks are learned better and faster. Such models are used in intelligent systems that must interact with humans or other dynamic environments. Dynamically expandable networks are continual deep learning models whose architecture grows with the sequence of tasks; retaining knowledge from previous tasks yields high performance on newer ones. Existing models use Minkowski distance measures to separate the nodes of the current network, which results in higher catastrophic forgetting: these measures are unreliable on high-dimensional sparse vectors and therefore give sub-optimal performance. We propose ang-DEN, a dynamically expanding continual learning architecture that uses an angular distance metric. It addresses semantic drift through better separation of nodes, achieving 97% average accuracy, an improvement of 1.3% across all tasks, on MNIST variant datasets.
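The abstract contrasts Minkowski distances with an angular metric for separating node representations. The chapter's own implementation is not shown here; the following is a minimal sketch of the two metrics it compares, illustrating why a magnitude-sensitive Minkowski distance can mislead on sparse high-dimensional vectors while an angular distance does not (the vector setup is an illustrative assumption, not data from the chapter):

```python
import numpy as np

def minkowski_distance(u, v, p=2):
    """Minkowski distance; p=2 gives the Euclidean metric used by prior DEN variants."""
    return np.sum(np.abs(u - v) ** p) ** (1.0 / p)

def angular_distance(u, v, eps=1e-12):
    """Angular distance: arccos of cosine similarity, scaled to [0, 1].
    It depends only on direction, so it is stable under the magnitude
    differences typical of sparse, high-dimensional weight vectors."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + eps)
    return np.arccos(np.clip(cos, -1.0, 1.0)) / np.pi

# Two sparse high-dimensional vectors pointing in the same direction
# but with different magnitudes (e.g. the same node before and after
# per-task fine-tuning rescales its weights).
rng = np.random.default_rng(0)
u = np.zeros(1000)
u[:5] = rng.random(5)
v = 3.0 * u  # same direction, three times the norm

print(minkowski_distance(u, v))  # large: grows with the magnitude gap
print(angular_distance(u, v))    # near zero: the directions are identical
```

Under this framing, separating nodes by angular distance keeps two functionally identical nodes close even when their magnitudes diverge, which is the property the abstract credits for reduced semantic drift.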
Bibliography: K. Saadi is an independent researcher. This work was supported by the Fatima Al-Fihri predoctoral fellowship program (https://fatimafellowship.com/).
ISBN: 9783031217524; 3031217527
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-031-21753-1_44