Automated Zebrafish Spine Scoring System Based on Instance Segmentation


Bibliographic Details
Published in: IEEE Access, Vol. 13, pp. 18814-18826
Main Authors: Chen, Wen-Hsin; Kuo, Tien-Ying; Wei, Yu-Jen; Ho, Cheng-Jung; Lin, Ming-der; Chen, Huan; Lin, Wen-Ying
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025

Summary: In studying new medicines for osteoporosis, researchers use zebrafish as animal subjects to test drugs and observe the growth of the vertebrae in their spines to confirm the efficacy of new medicines. However, the current method for evaluating efficacy is time-consuming and labor-intensive, requiring manual observation. Taking advantage of advancements in deep learning technology, we propose an automatic method for detecting and recognizing zebrafish vertebrae in images captured by image sensors to solve this problem. Our method uses Mask R-CNN as the instance segmentation backbone, enhanced with a mask enhancement module and a small-object preprocessing approach to strengthen its detection abilities. Compared to the original Mask R-CNN architecture, our method improved the mean average precision (mAP) score for vertebra bounding box and mask detection by 7.1% to 97.7% and by 1.2% to 96.6%, respectively. Additionally, we developed a system using these detection algorithms to automatically calculate spinal vertebra growth scores, providing a valuable tool for researchers to assess drug efficacy.
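The scoring stage summarized above can be illustrated with a minimal, hypothetical sketch: given per-vertebra detections (confidence and segmented mask area) produced by an instance-segmentation model such as Mask R-CNN, a growth score could be derived from the number of confidently detected vertebrae and their mean mask area. The class names, threshold, and scoring formula below are illustrative assumptions for exposition, not the authors' actual scoring method.

```python
from dataclasses import dataclass

@dataclass
class VertebraDetection:
    confidence: float  # detector confidence in [0, 1]
    mask_area: int     # number of segmented pixels in the vertebra mask

def growth_score(detections, min_confidence=0.5):
    """Toy spine growth score: count confident vertebrae, weighted by mean mask area."""
    kept = [d for d in detections if d.confidence >= min_confidence]
    if not kept:
        return 0.0
    mean_area = sum(d.mask_area for d in kept) / len(kept)
    # Illustrative formula: vertebra count scaled by mean ossified area (kilopixels).
    return len(kept) * mean_area / 1000.0

detections = [
    VertebraDetection(0.95, 1200),
    VertebraDetection(0.88, 1100),
    VertebraDetection(0.40, 300),  # low-confidence detection, filtered out
]
print(growth_score(detections))  # 2 vertebrae, mean area 1150 -> 2.3
```

In practice the detections would come from the segmentation model's output heads (boxes, masks, scores); the point of the sketch is only that once vertebrae are segmented as individual instances, a per-fish score reduces to simple aggregation over the detections.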
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2025.3532680