Deep learning for high-resolution seismic imaging

Bibliographic Details
Published in: Scientific Reports, Vol. 14, no. 1, p. 10319
Main Authors: Ma, Liyun; Han, Liguo; Feng, Qiang
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 06.05.2024

Summary: Seismic imaging techniques play a crucial role in interpreting subsurface geological structures by analyzing the propagation and reflection of seismic waves. However, traditional methods face challenges in achieving high resolution due to theoretical constraints and computational costs. Leveraging recent advancements in deep learning, this study introduces a neural network framework that integrates Transformer and Convolutional Neural Network (CNN) architectures, enhanced through Adaptive Spatial Feature Fusion (ASFF), to achieve high-resolution seismic imaging. Our approach directly maps seismic data to reflection models, eliminating the need for post-processing low-resolution results. Through extensive numerical experiments, we demonstrate the outstanding ability of this method to accurately infer subsurface structures. Evaluation metrics including Root Mean Square Error (RMSE), Correlation Coefficient (CC), and Structural Similarity Index (SSIM) emphasize the model's capacity to faithfully reconstruct subsurface features. Furthermore, noise injection experiments showcase the reliability of this efficient seismic imaging method, further underscoring the potential of deep learning in seismic imaging.
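For context on the reported evaluation metrics: this record does not include the authors' code, but RMSE, CC, and SSIM are standard measures of reconstruction fidelity. Below is a minimal sketch of how they are conventionally computed with NumPy and scikit-image; the helper name evaluate_reconstruction and the synthetic 128x128 models are illustrative assumptions, not taken from the paper.

    # Minimal sketch (not the authors' code): the three metrics named in the
    # abstract, computed with NumPy and scikit-image. The helper name
    # evaluate_reconstruction and the 2-D array shapes are assumptions.
    import numpy as np
    from skimage.metrics import structural_similarity

    def evaluate_reconstruction(pred, ref):
        """Score a predicted reflectivity model against a reference model."""
        pred = np.asarray(pred, dtype=np.float64)
        ref = np.asarray(ref, dtype=np.float64)

        # Root Mean Square Error: average pointwise deviation.
        rmse = np.sqrt(np.mean((pred - ref) ** 2))

        # Pearson Correlation Coefficient between the flattened models.
        cc = np.corrcoef(pred.ravel(), ref.ravel())[0, 1]

        # Structural Similarity Index over the 2-D sections; data_range is
        # required for floating-point inputs.
        ssim = structural_similarity(pred, ref,
                                     data_range=ref.max() - ref.min())

        return {"RMSE": rmse, "CC": cc, "SSIM": ssim}

    # Example on synthetic data: a reference model plus mild Gaussian noise.
    ref = np.random.default_rng(0).random((128, 128))
    pred = ref + 0.05 * np.random.default_rng(1).standard_normal((128, 128))
    print(evaluate_reconstruction(pred, ref))

By definition, a perfect reconstruction scores RMSE 0, CC 1, and SSIM 1; the abstract uses these three scores to quantify how faithfully the network's output matches the reference reflection model.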
ISSN: 2045-2322
DOI: 10.1038/s41598-024-61251-8