MVU-Net: a multi-view U-Net architecture for weakly supervised vortex detection

Bibliographic Details
Published in: Engineering Applications of Computational Fluid Mechanics, Vol. 16, No. 1, pp. 1567–1586
Main Authors: Deng, Liang; Chen, Jianqiang; Wang, Yueqing; Chen, Xinhai; Wang, Fang; Liu, Jie
Format: Journal Article
Language: English
Published: Hong Kong: Taylor & Francis, 31.12.2022

Summary: Vortex detection plays a fundamental role in turbulence research and engineering problems. However, due to the lack of a mathematically rigorous vortex definition, as well as the absence of any vortex-oriented database, both traditional and machine learning detection methods achieve only limited performance. In this paper, we develop a deep learning model for vortex detection using a weak supervision approach. To avoid a vast amount of manual labeling work, we employ an automatic clustering approach that encodes vortex-like behavior as the basis for programmatically generating large-scale, highly reliable training labels. Moreover, to speed up the clustering method, a multi-view U-Net (MVU-Net) model is proposed to approximate the clustering results using the knowledge distillation technique. A multi-view learning strategy is further applied to integrate the information across multiple variables. In addition, we propose a physics-informed loss function, which enables our model to explicitly consider the characteristics of flow fields. The results on eight real-world scientific simulation applications show that the proposed MVU-Net model significantly outperforms other state-of-the-art methods in both efficiency and accuracy.
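The summary names three ingredients: per-variable "views" of the flow field fused in a U-Net-style network, clustering-generated pseudo-labels used as a distillation target, and a physics-informed penalty. The sketch below is a minimal, hypothetical PyTorch rendering of those ideas, not the authors' implementation: the class and function names (MultiViewUNet, weak_supervision_loss), the choice of three views, the layer sizes, and the thresholded-vorticity physics term are all assumptions made for illustration.

```
# Hypothetical sketch of a multi-view U-Net with a distillation + physics-informed
# loss. Architecture details and the exact loss are assumptions, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class MultiViewUNet(nn.Module):
    """One small encoder per flow-field view (e.g. u, v, vorticity), fused decoder."""
    def __init__(self, n_views=3, base=16):
        super().__init__()
        self.encoders = nn.ModuleList([conv_block(1, base) for _ in range(n_views)])
        self.down = nn.MaxPool2d(2)
        self.bottleneck = conv_block(base * n_views, base * 2)
        self.up = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.decoder = conv_block(base + base * n_views, base)
        self.head = nn.Conv2d(base, 1, 1)          # per-pixel vortex logit

    def forward(self, views):                       # views: (B, n_views, H, W)
        feats = [enc(views[:, i:i+1]) for i, enc in enumerate(self.encoders)]
        skip = torch.cat(feats, dim=1)              # fuse the views at full resolution
        x = self.bottleneck(self.down(skip))
        x = self.decoder(torch.cat([self.up(x), skip], dim=1))
        return self.head(x)

def weak_supervision_loss(logits, cluster_labels, vorticity, lam=0.1):
    """BCE against clustering pseudo-labels (the distillation target) plus an assumed
    physics term discouraging vortex predictions where vorticity is near zero."""
    bce = F.binary_cross_entropy_with_logits(logits, cluster_labels)
    phys = (torch.sigmoid(logits) * (vorticity.abs() < 1e-3).float()).mean()
    return bce + lam * phys

# Usage sketch: pseudo-labels would come from an offline clustering pass over the flow field.
model = MultiViewUNet(n_views=3)
views = torch.randn(2, 3, 64, 64)                   # e.g. u, v, vorticity channels
pseudo = (torch.rand(2, 1, 64, 64) > 0.5).float()   # clustering-generated labels
loss = weak_supervision_loss(model(views), pseudo, views[:, 2:3])
loss.backward()
```

The full-resolution fusion plus skip connection mirrors the standard U-Net pattern; the paper's actual multi-view fusion scheme and physics-informed loss may differ in form and detail.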
ISSN: 1994-2060, 1997-003X
DOI: 10.1080/19942060.2022.2104930