Multimodal Local Representation Learning for Multi-Task Blastocyst Assessment
Published in: 2024 IEEE International Symposium on Biomedical Imaging (ISBI), pp. 1-5
Main Authors:
Format: Conference Proceeding
Language: English
Published: IEEE, 27.05.2024
Summary: Blastocyst assessment is a critical step influencing the live birth rate in in vitro fertilization (IVF) treatment. We propose a pioneering multimodal local representation learning framework that leverages both visual and textual information to provide a comprehensive, automatic assessment of blastocyst quality. The model reframes blastocyst assessment as a multi-task image-text retrieval problem, grading the two main blastocyst components: the inner cell mass (ICM) and the trophectoderm (TE). By learning local representations, our approach captures fine-grained similarity between text descriptions and image patches, improving both the accuracy and the interpretability of the assessment model. The experimental results are promising, with accuracies of 89.1% for ICM and 91.6% for TE. Furthermore, the proposed local representation learning framework may extend to other multi-task biomedical imaging applications.
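The local image-text matching described in the summary can be illustrated with a small sketch. This is not the authors' implementation; it assumes pre-computed, L2-normalised patch and token embeddings (the function names `local_alignment_score` and `retrieve_grade` are hypothetical) and shows the common token-to-patch max-pooling scheme for fine-grained retrieval:

```python
import numpy as np

def local_alignment_score(patch_emb: np.ndarray, token_emb: np.ndarray) -> float:
    """Fine-grained image-text similarity (illustrative sketch only).

    patch_emb: (P, d) L2-normalised image-patch embeddings
    token_emb: (T, d) L2-normalised text-token embeddings

    Each text token is matched to its most similar image patch
    (cosine similarity), and the per-token matches are averaged.
    """
    sim = token_emb @ patch_emb.T          # (T, P) cosine similarity matrix
    return float(sim.max(axis=1).mean())   # token-wise max over patches, then mean

def retrieve_grade(patch_emb: np.ndarray, candidate_token_embs: list) -> int:
    """Return the index of the candidate description (e.g. the text for an
    ICM or TE grade) whose tokens align best with the image patches."""
    scores = [local_alignment_score(patch_emb, t) for t in candidate_token_embs]
    return int(np.argmax(scores))
```

Framing grading as retrieval over a fixed set of grade descriptions is what lets the same machinery serve both the ICM and TE tasks; only the candidate texts change.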
ISSN: 1945-8452
DOI: 10.1109/ISBI56570.2024.10635863