LocalNorm: Robust Image Classification Through Dynamically Regularized Normalization

Bibliographic Details
Published in: Artificial Neural Networks and Machine Learning - ICANN 2021, Vol. 12894, pp. 240-252
Main Authors: Yin, Bojian; Scholte, H. Steven; Bohté, Sander
Format: Book Chapter
Language: English
Published: Springer International Publishing AG, Switzerland, 2021
SeriesLecture Notes in Computer Science
Summary: While modern convolutional neural networks achieve outstanding accuracy on many image classification tasks, they are, once trained, much more sensitive to image degradation than humans. Much of this sensitivity is caused by the resulting shift in data distribution. As we show, dynamically recalculating summary statistics for normalization over batches at test-time improves network robustness, but at the expense of accuracy. Here, we describe a variant of Batch Normalization, LocalNorm, that regularizes the normalization layer in the spirit of Dropout during training, while dynamically adapting to the local image intensity and contrast at test-time. We show that the resulting deep neural networks are much more resistant to noise-induced image degradation, while achieving the same or slightly better accuracy on non-degraded classical benchmarks; at test-time, calculating summary statistics over a single image suffices. In computational terms, LocalNorm adds negligible training cost and little or no cost at inference time, and can be applied to pre-trained networks in a straightforward manner.
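
The abstract describes LocalNorm only at a high level. A minimal PyTorch sketch of the idea, based on that description rather than the paper's code, might look as follows; the module name LocalNorm2d, the num_groups parameter, the batch-splitting scheme, and the test-time recomputation of statistics from the incoming input are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class LocalNorm2d(nn.Module):
    """Sketch of a LocalNorm-style layer (illustrative, not the paper's code).

    Training: the batch is split into `num_groups` groups and each group is
    normalized with its own mean/variance, a Dropout-like source of noise
    that regularizes the normalization layer.
    Inference: statistics are recomputed from the current input, so even a
    single image adapts normalization to its own intensity and contrast.
    """

    def __init__(self, num_channels: int, num_groups: int = 4, eps: float = 1e-5):
        super().__init__()
        self.num_groups = num_groups
        self.eps = eps
        # Learnable per-channel affine parameters, as in BatchNorm.
        self.weight = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, num_channels, 1, 1))

    def _normalize(self, x: torch.Tensor) -> torch.Tensor:
        # Per-channel statistics over the batch and spatial dimensions.
        mean = x.mean(dim=(0, 2, 3), keepdim=True)
        var = x.var(dim=(0, 2, 3), keepdim=True, unbiased=False)
        return (x - mean) / torch.sqrt(var + self.eps)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training and x.size(0) >= self.num_groups:
            # Split the batch into groups and normalize each independently.
            chunks = [self._normalize(c) for c in x.chunk(self.num_groups, dim=0)]
            out = torch.cat(chunks, dim=0)
        else:
            # Test time: statistics come from the input itself, so a single
            # image yields normalization matched to its degradation level.
            out = self._normalize(x)
        return self.weight * out + self.bias
```

As a usage example, one could swap nn.BatchNorm2d(64) for LocalNorm2d(64, num_groups=4) in an existing model; since the layer keeps BatchNorm-style affine parameters, such a swap is also conceivable for a pre-trained network, consistent with the abstract's remark that LocalNorm can be applied to pre-trained networks in a straightforward manner.
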
ISBN: 9783030863791, 3030863794
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-030-86380-7_20