A supervised U-Net based color image semantic segmentation for detection & classification of human intestinal parasites

Bibliographic Details
Published in: e-Prime Vol. 2; p. 100069
Main Authors: Libouga, Ideal Oscar; Bitjoka, Laurent; Gwet, David Libouga Li; Boukar, Ousman; Nlôga, Alexandre Michel Njan
Format: Journal Article
Language: English
Published: Elsevier Ltd, 2022

Summary: Intestinal parasites are among the main public health problems worldwide, especially in underprivileged communities where overcrowding, poor environmental sanitation, and lack of access to clean and safe water are prevalent. Currently, approximately 4 billion people worldwide are infected by intestinal parasites. Diseases caused by such infections can induce physical and mental disorders, and even death in children and immunodeficient individuals. Microscopy remains the gold standard for the diagnosis of human intestinal parasites. However, this manual approach, which requires well-trained and experienced parasitologists, is time-consuming, tedious, and subjective. To address this, machine learning tools are increasingly used to assist medical professionals in the diagnosis of intestinal parasites, and deep learning models are currently the best-performing among them. In our work, we built a data set of 320 microscopic color images in which we detect 4 types of intestinal parasites: Ascaris lumbricoides, Schistosoma mansoni, Trichuris trichiura, and pinworm (Enterobius vermicularis). Inspired by the structure of the U-Net, a deep learning model is developed for segmentation of the 4 types of intestinal parasites from the background (including fecal impurities) of the image. The proposed model has proven to be robust to underfitting and overfitting. The results show an overall detection performance of 99.8% accuracy.
ISSN: 2772-6711
DOI: 10.1016/j.prime.2022.100069
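For readers unfamiliar with the architecture the summary refers to, the data flow of a U-Net (contracting path, bottleneck, expanding path with skip connections, per-pixel class map) can be sketched in plain NumPy. Everything below is illustrative: the channel sizes, the random 1x1 projections standing in for learned 3x3 convolution blocks, and the choice of 5 output classes (4 parasite types plus background) are assumptions, not details taken from the paper.

```python
import numpy as np

def conv_block(x, out_ch, rng):
    # Stand-in for two learned 3x3 conv + ReLU layers: a random 1x1
    # channel projection. Only the channel arithmetic is faithful here.
    w = rng.standard_normal((x.shape[0], out_ch)) * 0.1
    return np.maximum(np.einsum("chw,co->ohw", x, w), 0.0)

def down(x):
    # 2x2 max pooling: halves the spatial resolution.
    c, h, w = x.shape
    return x.reshape(c, h // 2, 2, w // 2, 2).max(axis=(2, 4))

def up(x):
    # Nearest-neighbour 2x upsampling: doubles the spatial resolution.
    return x.repeat(2, axis=1).repeat(2, axis=2)

def unet_forward(img, n_classes=5, rng=None):
    # img: (channels, height, width) color image.
    if rng is None:
        rng = np.random.default_rng(0)
    e1 = conv_block(img, 16, rng)        # encoder level 1
    e2 = conv_block(down(e1), 32, rng)   # encoder level 2
    b = conv_block(down(e2), 64, rng)    # bottleneck
    # Decoder: upsample and concatenate the matching encoder feature
    # map along the channel axis (the U-Net skip connection).
    d2 = conv_block(np.concatenate([up(b), e2]), 32, rng)
    d1 = conv_block(np.concatenate([up(d2), e1]), 16, rng)
    logits = conv_block(d1, n_classes, rng)  # per-pixel class scores
    return logits.argmax(axis=0)             # per-pixel class labels

img = np.random.default_rng(1).random((3, 64, 64))  # stand-in RGB image
seg = unet_forward(img)
print(seg.shape)  # (64, 64): one class label per pixel
```

The skip connections are what distinguish a U-Net from a plain encoder-decoder: they reinject high-resolution encoder features into the decoder so that small structures (such as parasite eggs against fecal impurities) are not lost to downsampling.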