Color Channel Compensation (3C): A Fundamental Pre-Processing Step for Image Enhancement

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 29, pp. 2653-2665
Main Authors: Ancuti, Codruta O., Ancuti, Cosmin, De Vleeschouwer, Christophe, Sbert, Mateu
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2020
Summary: This article introduces a novel solution to improve image enhancement in terms of color appearance. Our approach, called Color Channel Compensation (3C), overcomes artifacts resulting from the severely non-uniform color spectrum distribution encountered in images captured under hazy night-time conditions, underwater, or under non-uniform artificial illumination. Our solution is founded on the observation that, under such adverse conditions, the information contained in at least one color channel is close to completely lost, making the traditional enhancing techniques subject to noise and color shifting. In those cases, our pre-processing method proposes to reconstruct the lost channel based on the opponent color channel. Our algorithm subtracts a local mean from each opponent color pixel. Thereby, it partly recovers the lost color from the two colors (red-green or blue-yellow) involved in the opponent color channel. The proposed approach, whilst simple, is shown to consistently improve the outcome of conventional restoration methods. To prove the utility of our 3C operator, we provide an extensive qualitative and quantitative evaluation for white balancing, image dehazing, and underwater enhancement applications.
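
The summary describes the core operation at a high level: the lost channel is partly reconstructed by transferring local variation from its opponent color channel. The Python sketch below illustrates one possible reading of that description; the function name compensate_lost_channel, the use of a Gaussian local mean, the sigma value, and the fixed red-green pairing are illustrative assumptions rather than the authors' exact formulation in the paper.

# Minimal sketch of opponent-channel compensation in the spirit of the
# summary above. Window size, channel pairing, and clipping are
# illustrative assumptions, not the published 3C formulation.
import numpy as np
from scipy.ndimage import gaussian_filter

def compensate_lost_channel(img, lost="r", opponent="g", sigma=20.0):
    """Partially reconstruct a degraded color channel from its opponent.

    img      : float array in [0, 1] with shape (H, W, 3), RGB order.
    lost     : channel assumed to carry almost no information.
    opponent : opponent color in the red-green / blue-yellow pairing.
    sigma    : scale of the local (Gaussian) mean used for compensation.
    """
    idx = {"r": 0, "g": 1, "b": 2}
    lost_ch = img[..., idx[lost]]
    opp_ch = img[..., idx[opponent]]

    # Subtract a local mean from each opponent-channel pixel, keeping only
    # its local variation (the "subtracts a local mean from each opponent
    # color pixel" step described in the summary).
    opp_detail = opp_ch - gaussian_filter(opp_ch, sigma=sigma)

    # Transfer that variation to the lost channel and clip to a valid range.
    out = img.copy()
    out[..., idx[lost]] = np.clip(lost_ch + opp_detail, 0.0, 1.0)
    return out

For the blue-yellow pairing mentioned in the summary, a yellow channel would first have to be derived from the red and green channels; that step is omitted here for brevity.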
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2019.2951304