BOWLL: A Deceptively Simple Open World Lifelong Learner
| Field | Value |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | 07.02.2024 |
Summary: The quest to improve scalar performance numbers on predetermined benchmarks seems to be deeply engraved in deep learning. However, the real world is seldom carefully curated and applications are seldom limited to excelling on test sets. A practical system is generally required to recognize novel concepts, refrain from actively including uninformative data, and retain previously acquired knowledge throughout its lifetime. Despite these key elements being rigorously researched individually, the study of their conjunction, open world lifelong learning, is only a recent trend. To accelerate this multifaceted field's exploration, we introduce its first monolithic and much-needed baseline. Leveraging the ubiquitous use of batch normalization across deep neural networks, we propose a deceptively simple yet highly effective way to repurpose standard models for open world lifelong learning. Through extensive empirical evaluation, we highlight why our approach should serve as a future standard for models that are able to effectively maintain their knowledge, selectively focus on informative data, and accelerate future learning.
DOI: 10.48550/arxiv.2402.04814
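
The abstract only hints at the mechanism, stating that batch normalization is repurposed for open world lifelong learning. As a purely illustrative sketch (not the paper's actual criterion, which is not given in this record), one way such an approach could flag novel inputs is by scoring how far a batch's activation statistics drift from a BatchNorm layer's running estimates:

```python
# Hypothetical sketch: score "novelty" of inputs by comparing their channel
# statistics to a BatchNorm layer's running estimates. The deviation measure
# below (channel-wise z-score magnitude) is an illustrative assumption, not
# the method described in the paper.
import torch
import torch.nn as nn


def bn_novelty_score(features: torch.Tensor, bn: nn.BatchNorm2d) -> torch.Tensor:
    """Per-sample deviation of channel means from BN running statistics.

    features: activations of shape (N, C, H, W) taken just before `bn`.
    Returns one scalar score per sample; larger means more unfamiliar.
    """
    # Channel-wise mean over spatial dimensions for each sample: (N, C)
    sample_means = features.mean(dim=(2, 3))
    running_mean = bn.running_mean                      # (C,)
    running_std = torch.sqrt(bn.running_var + bn.eps)   # (C,)
    # Normalized deviation, averaged over channels -> one score per sample.
    z = (sample_means - running_mean) / running_std
    return z.abs().mean(dim=1)


# Usage (assumed names): flag inputs whose score exceeds a threshold tuned on
# in-distribution data as candidate novel concepts.
# novel_mask = bn_novelty_score(feats, model.layer1[0].bn1) > 2.0
```

Under this assumption, novelty detection comes almost for free from statistics the network already tracks; selecting informative samples and retaining prior knowledge, also named in the abstract, would require additional machinery beyond this sketch.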