BOWLL: A Deceptively Simple Open World Lifelong Learner

Bibliographic Details
Main Authors: Kamath, Roshni; Mitchell, Rupert; Paul, Subarnaduti; Kersting, Kristian; Mundt, Martin
Format: Journal Article
Language: English
Published: 07.02.2024

Summary: The quest to improve scalar performance numbers on predetermined benchmarks seems to be deeply engraved in deep learning. However, the real world is seldom carefully curated and applications are seldom limited to excelling on test sets. A practical system is generally required to recognize novel concepts, refrain from actively including uninformative data, and retain previously acquired knowledge throughout its lifetime. Despite these key elements being rigorously researched individually, the study of their conjunction, open world lifelong learning, is only a recent trend. To accelerate this multifaceted field's exploration, we introduce its first monolithic and much-needed baseline. Leveraging the ubiquitous use of batch normalization across deep neural networks, we propose a deceptively simple yet highly effective way to repurpose standard models for open world lifelong learning. Through extensive empirical evaluation, we highlight why our approach should serve as a future standard for models that are able to effectively maintain their knowledge, selectively focus on informative data, and accelerate future learning.
DOI: 10.48550/arxiv.2402.04814
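
Note: the summary states that BOWLL repurposes batch normalization to recognize novel concepts, but it does not describe the mechanism. The sketch below is purely illustrative and is not the authors' algorithm; it only shows one plausible way a standard model's BatchNorm running statistics could be compared against a new batch's activation statistics to produce a novelty score. The function name bn_novelty_score, the scoring formula, and the use of a pretrained ResNet-18 are all hypothetical choices made for this example.

import torch
import torch.nn as nn
import torchvision.models as models

@torch.no_grad()
def bn_novelty_score(model: nn.Module, batch: torch.Tensor) -> float:
    # Hypothetical sketch: measure how far the per-channel statistics of the
    # activations entering each BatchNorm layer deviate from that layer's
    # stored running statistics. A larger score suggests the batch may
    # contain concepts the model has not seen before.
    deviations = []

    def hook(module, inputs, output):
        x = inputs[0]
        # Per-channel mean/variance of the activations entering this BN layer.
        batch_mean = x.mean(dim=(0, 2, 3))
        batch_var = x.var(dim=(0, 2, 3), unbiased=False)
        # Distance to the statistics accumulated during earlier training.
        d_mean = (batch_mean - module.running_mean).abs().mean()
        d_var = (batch_var - module.running_var).abs().mean()
        deviations.append((d_mean + d_var).item())

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.BatchNorm2d)]

    model.eval()  # use running statistics; do not update them
    model(batch)

    for h in handles:
        h.remove()

    # Average deviation across all BatchNorm layers.
    return sum(deviations) / max(len(deviations), 1)


# Usage sketch: score a random batch with an ImageNet-pretrained ResNet-18.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
score = bn_novelty_score(model, torch.randn(16, 3, 224, 224))
print(f"novelty score: {score:.4f}")

In an open world lifelong learning loop, such a score could in principle be thresholded to flag candidate novel data or to skip uninformative batches; how BOWLL actually combines novelty detection, data selection, and knowledge retention is detailed in the paper itself.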