Multiple scale neural architecture for enhancing regions in the colour image segmentation process
Published in | Expert Systems, Vol. 28, No. 1, pp. 70–96
Format | Journal Article |
Language | English |
Published | Oxford, UK: Blackwell Publishing Ltd, 01.02.2011
Summary | A dynamic multi-scale neural model for enhancing regions and extracting contours in the colour image segmentation process is proposed. This model combines colour and textural information to coherently enhance images through the operation of two main components: the colour opponent system (COS) and the chromatic segmentation system (CSS). First, the COS module transforms the RGB chromatic input signals into long- (L), middle- (M) and short- (S) wavelength cone activations and luminance signals, and then generates the luminance (black–white), L–M and S–(L+M) opponent channels. The CSS module incorporates contour extraction, double opponency mechanisms and diffusion processes to yield coherently enhanced regions in colour image segmentation. This enhancement makes region labelling more efficient by reducing the uncertainty of the images' region allocation. The CSS module is based on the boundary contour system/feature contour system (BCS/FCS), extended to process colour stimuli so as to obtain a general-purpose architecture for image segmentation with later applications in computer vision and object recognition. Simulations show the good visual results obtained and the robustness of the architecture when processing images with different levels of noise. A benchmark on the Berkeley Segmentation Dataset is included to quantify and compare the results. (Illustrative sketches of the two processing stages follow this record.)
Bibliography | ArticleID EXSY543; istex E23295A2A5AD8A3868698EB06F514E6F9DF79D6E; ark:/67375/WNG-HG2H4XT8-0
ISSN | 0266-4720, 1468-0394
DOI | 10.1111/j.1468-0394.2010.00543.x
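
The COS front end described in the summary amounts to a linear colour-space change followed by simple channel arithmetic. Below is a minimal NumPy sketch of that stage. The RGB-to-LMS matrix (the Hunt–Pointer–Estévez-derived matrix from the colour-transfer literature) and the use of L+M as the luminance signal are assumptions; the record does not give the paper's exact transform.

```python
import numpy as np

# Assumed RGB -> LMS transform (Hunt-Pointer-Estevez derived);
# the paper's exact matrix is not given in this record.
RGB_TO_LMS = np.array([
    [0.3811, 0.5783, 0.0402],
    [0.1967, 0.7244, 0.0782],
    [0.0241, 0.1288, 0.8444],
])

def colour_opponent_channels(rgb):
    """Map an RGB image of shape (H, W, 3), values in [0, 1], to the
    three opponent channels named in the abstract: luminance
    (black-white), L-M (red-green) and S-(L+M) (blue-yellow)."""
    lms = rgb @ RGB_TO_LMS.T                 # per-pixel cone activations
    L, M, S = lms[..., 0], lms[..., 1], lms[..., 2]
    luminance = L + M                        # achromatic channel (assumed L+M)
    red_green = L - M                        # L-M opponent channel
    blue_yellow = S - (L + M)                # S-(L+M) opponent channel
    return luminance, red_green, blue_yellow
```

The transform is purely per-pixel, so the same function serves any image size; multi-scale processing in the model then operates on these channels rather than on raw RGB.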
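The CSS stage couples contour extraction with diffusion that fills in feature activity inside the regions those contours bound, in the spirit of the BCS/FCS filling-in the abstract cites. The sketch below shows a boundary-gated diffusion of one opponent channel; the gating form, 4-neighbour stencil and parameter values are illustrative assumptions, not the paper's equations.

```python
import numpy as np

def boundary_gated_diffusion(channel, boundaries, iters=100, rate=0.2):
    """Illustrative FCS-style filling-in. `channel` is one opponent
    channel of shape (H, W); `boundaries` is a contour-strength map in
    [0, 1] of the same shape. Activity diffuses within regions but is
    blocked where the boundary signal is strong."""
    f = channel.astype(float).copy()
    perm = 1.0 - boundaries              # permeability: low on contours
    for _ in range(iters):
        # 4-connected neighbour differences, image edges replicated.
        up    = np.pad(f, ((1, 0), (0, 0)), mode="edge")[:-1, :] - f
        down  = np.pad(f, ((0, 1), (0, 0)), mode="edge")[1:, :] - f
        left  = np.pad(f, ((0, 0), (1, 0)), mode="edge")[:, :-1] - f
        right = np.pad(f, ((0, 0), (0, 1)), mode="edge")[:, 1:] - f
        # Explicit diffusion step; rate * 4 < 1 keeps it stable.
        f += rate * perm * (up + down + left + right)
    return f
```

With `boundaries` taken from any contour detector, repeated steps drive each enclosed region toward a uniform value, which is the "coherent region enhancement" effect that makes the subsequent labelling less ambiguous.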