First Trimester Video Saliency Prediction Using cLSTMU-Net with Stochastic Augmentation


Bibliographic Details
Published in: 2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI), Vol. 2022, pp. 1-4
Main Authors: Savochkina, Elizaveta; Lee, Lok Hin; Zhao, He; Drukker, Lior; Papageorghiou, Aris T.; Noble, J. Alison
Format: Conference Proceeding; Journal Article
Language: English
Published: United States, IEEE, 01.01.2022

Summary: In this paper we develop a multi-modal video analysis algorithm to predict where a sonographer should look next. Our approach uses video and expert knowledge, defined by gaze tracking data, which is acquired during routine first-trimester fetal ultrasound scanning. Specifically, we propose a spatio-temporal convolutional LSTMU-Net neural network (cLSTMU-Net) for video saliency prediction with stochastic augmentation. The architecture design consists of a U-Net based encoder-decoder network and a cLSTM to take into account temporal information. We compare the performance of the cLSTMU-Net alongside spatial-only architectures for the task of predicting gaze in first trimester ultrasound videos. Our study dataset consists of 115 clinically acquired first trimester US videos and a total of 45,666 video frames. We adopt a Random Augmentation strategy (RA) from a stochastic augmentation policy search to improve model performance and reduce over-fitting. The proposed cLSTMU-Net using a video clip of 6 frames outperforms the baseline approach on all saliency metrics: KLD, SIM, NSS and CC (2.08, 0.28, 4.53 and 0.42 versus 2.16, 0.27, 4.34 and 0.39).
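The four saliency metrics reported in the summary (KLD, SIM, NSS and CC) have standard definitions in the saliency-prediction literature. The sketch below shows how they are commonly computed with NumPy; it is not the authors' evaluation code, and the `EPS` constant and function names are illustrative assumptions.

```python
import numpy as np

EPS = 1e-8  # small constant to avoid division by zero / log(0)

def kld(pred, gt):
    """Kullback-Leibler divergence between saliency distributions (lower is better)."""
    p = pred / (pred.sum() + EPS)
    g = gt / (gt.sum() + EPS)
    return float(np.sum(g * np.log(g / (p + EPS) + EPS)))

def sim(pred, gt):
    """Similarity (histogram intersection) of the two distributions (higher is better)."""
    p = pred / (pred.sum() + EPS)
    g = gt / (gt.sum() + EPS)
    return float(np.minimum(p, g).sum())

def cc(pred, gt):
    """Pearson correlation coefficient between the two maps (higher is better)."""
    p = (pred - pred.mean()) / (pred.std() + EPS)
    g = (gt - gt.mean()) / (gt.std() + EPS)
    return float((p * g).mean())

def nss(pred, fixations):
    """Normalized Scanpath Saliency: mean z-scored prediction at binary
    fixation locations (higher is better)."""
    p = (pred - pred.mean()) / (pred.std() + EPS)
    return float(p[fixations > 0].mean())
```

KLD and SIM compare the prediction and ground-truth gaze maps as probability distributions, CC compares them as z-scored images, and NSS evaluates the prediction only at the discrete fixation points, which is why the paper reports all four together.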
ISSN: 1945-7928; 1945-8452
DOI: 10.1109/ISBI52829.2022.9761585