Combining intensity, edge and shape information for 2D and 3D segmentation of cell nuclei in tissue sections
Published in: Journal of Microscopy (Oxford), Vol. 215, No. 1, pp. 67-76
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Science Ltd, 01.07.2004
Summary: We present a region‐based segmentation method in which seeds representing both object and background pixels are created by combining morphological filtering of both the original image and the gradient magnitude of the image. The seeds are then used as starting points for watershed segmentation of the gradient magnitude image. The fully automatic seeding is done in a generous fashion, so that at least one seed will be set in each foreground object. If more than one seed is placed in a single object, the watershed segmentation will lead to an initial over‐segmentation, i.e. a boundary is created where there is no strong edge. Thus, the result of the initial segmentation is further refined by merging based on the gradient magnitude along the boundary separating neighbouring objects. This step also makes it easy to remove objects with poor contrast. As a final step, clusters of nuclei are separated, based on the shape of the cluster. The number of input parameters to the full segmentation procedure is only five. These parameters can be set manually using a test image and thereafter be used on a large number of images created under similar imaging conditions. This automated system was verified by comparison with manual counts from the same image fields. About 90% correct segmentation was achieved for two‐ as well as three‐dimensional images.
ISSN: 0022-2720, 1365-2818
DOI: 10.1111/j.0022-2720.2004.01338.x
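
The summary above describes a seeded watershed on the gradient magnitude image, followed by merging of neighbouring regions whose shared boundary has a weak edge. The sketch below illustrates that kind of 2D pipeline with scikit-image. The seeding rule (extended maxima/minima of the smoothed image), the function `segment_nuclei`, and all parameter values are assumptions made for illustration; they are not the authors' implementation.

```python
# Minimal sketch of a seed-and-merge watershed pipeline, assuming bright nuclei
# on a darker background. Seeding via h-maxima/h-minima and all parameter values
# are illustrative choices, not the method published in the paper.
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, morphology, segmentation
from skimage import graph  # in older scikit-image releases: skimage.future.graph


def segment_nuclei(image, h_fg=0.10, h_bg=0.05, sigma=2.0, merge_thresh=0.05):
    """Seeded watershed on the gradient magnitude, then merging of regions
    separated by a boundary with weak mean gradient."""
    img = image.astype(float)
    img = (img - img.min()) / (img.max() - img.min() + 1e-12)

    smoothed = filters.gaussian(img, sigma=sigma)
    gradient = filters.sobel(smoothed)  # edge-strength landscape for the watershed

    # Generous seeding: extended maxima of the smoothed image mark candidate
    # nuclei (possibly several seeds per nucleus); extended minima mark background.
    fg_seeds = morphology.h_maxima(smoothed, h_fg)
    bg_seeds = morphology.h_minima(smoothed, h_bg)
    markers, _ = ndi.label(fg_seeds | bg_seeds)

    # Watershed of the gradient magnitude grown from the seeds; over-segmentation
    # is expected wherever an object received more than one seed.
    labels = segmentation.watershed(gradient, markers)

    # Region adjacency graph weighted by the mean gradient magnitude along each
    # shared boundary; merge neighbours whose boundary weight is below threshold.
    rag = graph.rag_boundary(labels, gradient)
    merged = graph.cut_threshold(labels, rag, merge_thresh)
    return merged
```

The remaining steps described in the summary (discarding objects with poor contrast and splitting clusters of nuclei based on shape) are not reproduced here; a distance-transform-based watershed split is one common way to handle the latter.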