Subpixel rendering for the PenTile display based on the human visual system

Bibliographic Details
Published in: IEEE Transactions on Consumer Electronics, Vol. 63, No. 4, pp. 401-409
Main Authors: Chae, Sung-Ho; Yoo, Cheol-Hwan; Sun, Jee-Young; Kang, Mun-Cheon; Ko, Sung-Jea
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.11.2017

Summary: Recently, with the increased demand for low-power displays in portable devices, the RGBG PenTile display has become widely used. However, unlike the traditional RGB-stripe display with its three color channels, each pixel of the RGBG PenTile display comprises only two color channels, causing color-leakage distortion. To cope with this problem, most conventional methods employ a preprocessing filter for subpixel rendering; however, these filters cannot remove the color leakage completely, and they also introduce blurring artifacts. In this paper, a novel approach to obtaining the preprocessing filter is presented. We formulate the filter design as a minimum mean square error (MMSE) problem and derive an optimal preprocessing filter based on the human visual system (HVS) as follows: first, two perceived images, indicating how a human perceives the images shown on the RGB and RGBG displays, are generated; then the difference between the two perceived images is minimized to derive the optimal filter. In addition, to prevent blurring artifacts, the proposed filter is applied only to previously detected color-leakage regions. Experimental results demonstrate that the proposed method outperforms conventional methods in terms of both subjective and objective image quality.
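The MMSE filter-design idea in the abstract can be illustrated with a minimal 1-D sketch: model the "perceived image" as a low-pass-filtered version of the displayed signal, model PenTile display of a half-resolution channel by zeroing odd samples, and solve a least-squares problem for the preprocessing filter that minimizes the perceived difference. Everything here is an assumption for illustration, not the paper's actual formulation: the Gaussian kernel stands in for the HVS response, and the subsampling model is a deliberately crude caricature of RGBG rendering.

```python
import numpy as np

def gaussian_kernel(sigma=1.0, radius=4):
    """Stand-in for the HVS low-pass response (assumed, not from the paper)."""
    t = np.arange(-radius, radius + 1)
    k = np.exp(-t**2 / (2 * sigma**2))
    return k / k.sum()

hvs = gaussian_kernel()

def perceive(sig):
    """Perceived image: displayed signal filtered by the HVS model."""
    return np.convolve(sig, hvs, mode="same")

def subsample(sig):
    """Crude PenTile model: a half-resolution channel keeps only even samples
    (doubled to roughly preserve mean intensity)."""
    out = np.zeros_like(sig)
    out[::2] = sig[::2] * 2
    return out

rng = np.random.default_rng(0)
N, taps = 256, 9
x = rng.standard_normal(N)

# Perceived image on the reference RGB display (full resolution).
target = perceive(x)

# Because filtering, subsampling, and the HVS model are all linear, the
# perceived PenTile output of a filtered signal is a linear combination of
# columns perceive(subsample(shift(x))). Stack them and solve least squares
# for the MMSE preprocessing filter taps.
A = np.zeros((N, taps))
for i in range(taps):
    shifted = np.roll(x, i - taps // 2)
    A[:, i] = perceive(subsample(shifted))
h, *_ = np.linalg.lstsq(A, target, rcond=None)
```

By construction, the least-squares solution perceives no worse than displaying the unfiltered signal (the identity filter is one of the candidates), which is the sense in which the derived filter is "optimal" under this toy model.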
ISSN: 0098-3063, 1558-4127
DOI: 10.1109/TCE.2017.015103