Fast Cell Segmentation Using Scalable Sparse Manifold Learning and Affine Transform-Approximated Active Contour


Bibliographic Details
Published in: Medical Image Computing and Computer-Assisted Intervention – MICCAI 2015, Vol. 9351, pp. 332-339
Main Authors: Xing, Fuyong; Yang, Lin
Format: Book Chapter / Journal Article
Language: English
Published: Cham: Springer International Publishing, 01.10.2015
Series: Lecture Notes in Computer Science

More Information
Summary: Efficient and effective cell segmentation of neuroendocrine tumor (NET) in whole slide scanned images is a difficult task due to the large number of cells; weak or misleading cell boundaries present additional challenges. In this paper, we propose a fast, high-throughput cell segmentation algorithm that combines top-down shape models with bottom-up image appearance information. A scalable sparse manifold learning method is proposed to model multiple subpopulations of cell shape priors. After shape clustering on the manifold, a novel affine transform-approximated active contour model is derived to deform contours without solving a large number of computationally expensive Euler-Lagrange equations, dramatically reducing the computational time. To the best of our knowledge, this is the first report of a high-throughput cell segmentation algorithm for whole slide scanned pathology specimens that uses manifold learning to accelerate active contour models. The proposed approach is tested on 12 NET images, and comparative experiments with state-of-the-art methods demonstrate its superior performance in terms of both efficiency and effectiveness.
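
The key speed idea in the abstract is that, once cell shape priors are clustered on the learned manifold, each cell contour can be placed by a single affine warp of a cluster prior rather than by iterative Euler-Lagrange contour evolution. A minimal NumPy sketch of that affine placement step (not the authors' implementation; it assumes the prior contour and a roughly detected boundary are resampled to the same number of corresponding points) illustrates the idea:

# Sketch only: approximate contour deformation by a single least-squares
# affine transform mapping a clustered shape prior onto a roughly detected
# cell boundary. Point correspondences are assumed to come from resampling
# both contours to N points; this is not the paper's exact formulation.
import numpy as np

def fit_affine(prior_pts, target_pts):
    """Least-squares affine (A, t) such that A @ p + t ~= q for each point pair."""
    n = prior_pts.shape[0]
    # Design matrix for the parameter vector [a11 a12 a21 a22 t1 t2].
    X = np.zeros((2 * n, 6))
    X[0::2, 0:2] = prior_pts   # x-coordinate equations
    X[0::2, 4] = 1.0
    X[1::2, 2:4] = prior_pts   # y-coordinate equations
    X[1::2, 5] = 1.0
    y = target_pts.reshape(-1)
    params, *_ = np.linalg.lstsq(X, y, rcond=None)
    A = params[:4].reshape(2, 2)
    t = params[4:]
    return A, t

def warp_prior(prior_pts, target_pts):
    """Place a shape prior onto a cell with one affine warp (no contour iterations)."""
    A, t = fit_affine(prior_pts, target_pts)
    return prior_pts @ A.T + t

# Toy example: a unit-circle prior warped onto a noisy ellipse-like boundary.
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
prior = np.stack([np.cos(theta), np.sin(theta)], axis=1)
rough = np.stack([3 * np.cos(theta) + 10, 2 * np.sin(theta) + 5], axis=1)
rough += 0.05 * np.random.randn(*rough.shape)
segmented_contour = warp_prior(prior, rough)

Because each cell requires only one small linear least-squares solve, this kind of transform-based placement scales to the very large cell counts found in whole slide images far better than per-cell iterative contour evolution.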
ISBN: 9783319245737, 3319245732
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-319-24574-4_40