2D to 3D Reconstruction of Heterogeneous Porous Media via Deep Generative Adversarial Networks (GANs)
Published in | Journal of Geophysical Research: Machine Learning and Computation, Vol. 1, No. 3
---|---
Main Authors |
Format | Journal Article
Language | English
Published | 01.09.2024
Subjects |
Summary
Accurately characterizing rock microstructures in three dimensions (3D) is crucial for modeling various physical phenomena and estimating rock properties. Despite advancements in 3D imaging, limitations arise from the trade-off between sample size and resolution, particularly in heterogeneous rocks with multi-scale features where both high resolution and a large field of view are essential. These challenges have prompted interest in accurate 3D reconstructions from high-resolution two-dimensional (2D) images using advanced generative models such as generative adversarial networks (GANs). In this study, using scanning electron microscopy and optical microscopy, we acquired 2D images from three orthogonal sections of a Berea sandstone sample. These images were used to train a modified SliceGAN model, a GAN variant, for 3D reconstruction. Unlike previous 2D-to-3D reconstruction studies, which typically incorporated 3D images in their training, our approach relies exclusively on 2D images. We propose a systematic workflow that produces 3D reconstructions closely mirroring the original 2D inputs and a 3D X-ray tomography in terms of structural and morphological characteristics. Additionally, we highlight the importance of input data size and of training the model with representative images, which enables us to generate diverse reconstructions with transport properties that align with previous studies on Berea sandstone. This underscores the potential of 2D-to-3D reconstruction as an effective alternative to the multiple X-ray tomographies that are otherwise integral for assessing variability in heterogeneous rocks.
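The summary describes the core architectural idea behind SliceGAN-style 2D-to-3D reconstruction: a 3D generator is trained adversarially against 2D discriminators that only ever see slices, so purely 2D micrographs can supervise a volumetric output. The sketch below illustrates that idea in PyTorch; it is a simplified stand-in, not the authors' modified model, and the layer sizes, slicing helper, and loss form are our own assumptions.

```python
import torch
import torch.nn as nn

class Generator3D(nn.Module):
    """Maps a latent volume to a single-channel 3D image via transposed 3D convolutions."""
    def __init__(self, latent_channels=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose3d(latent_channels, 64, 4, stride=2, padding=1),
            nn.BatchNorm3d(64), nn.ReLU(inplace=True),
            nn.ConvTranspose3d(64, 32, 4, stride=2, padding=1),
            nn.BatchNorm3d(32), nn.ReLU(inplace=True),
            nn.ConvTranspose3d(32, 1, 4, stride=2, padding=1),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

class Discriminator2D(nn.Module):
    """Scores 2D slices; one instance per orthogonal section direction."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 1, 4),
        )

    def forward(self, x):
        return self.net(x).mean(dim=(1, 2, 3))  # one scalar score per slice

def slices_along(volume, axis):
    """Unstack a generated volume of shape (B, 1, D, H, W) into a batch of 2D slices along one axis."""
    perm = {0: (0, 2, 1, 3, 4), 1: (0, 3, 1, 2, 4), 2: (0, 4, 1, 2, 3)}[axis]
    v = volume.permute(*perm).contiguous()
    return v.view(-1, 1, v.shape[-2], v.shape[-1])

# Schematic generator update: the fake volume must fool all three
# orientation-specific discriminators at once. Real SEM/optical slices
# from the corresponding sections would drive the discriminator updates.
G = Generator3D()
D = {axis: Discriminator2D() for axis in range(3)}

z = torch.randn(2, 32, 8, 8, 8)        # latent volume; three upsamplings -> 64^3 output
fake_volume = G(z)
g_loss = torch.zeros(())
for axis in range(3):
    g_loss = g_loss - D[axis](slices_along(fake_volume, axis)).mean()  # WGAN-style generator loss
g_loss.backward()
```

In the published SliceGAN formulation that this kind of workflow builds on, the transposed-convolution kernel size, stride, and padding are additionally constrained so that every region of a generated slice carries comparable information density; the specifics of the modification used in this study are given in the full text.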
Plain Language Summary
Describing rock microstructures in 3D is crucial for modeling rock properties, such as permeability, and physical transport processes, such as fluid flow through a rock. One method to estimate such properties is digital rock physics, which involves first imaging and digitizing the microstructures and then numerically simulating different physical processes. Capturing both fine-scale features and the overall variability within a sample requires detailed images over large sample areas. However, 3D imaging techniques like X-ray tomography face a trade-off between sample size and image resolution. 2D imaging techniques, like electron and optical microscopy, offer a solution, providing large fields of view and high-resolution images. New ways of creating realistic 3D rock volumes have recently emerged using deep-learning-based generative models such as generative adversarial networks (GANs). We employ a GAN variant trained purely on 2D images to rapidly generate realistic 3D volumes of Berea sandstone. Our analysis shows that the adjusted model can generate diverse 3D reconstructions with properties consistent with established knowledge of Berea sandstone. Our findings highlight the usefulness of true 2D-to-3D rock reconstruction as a rapid and reliable means of generating large and diverse sample pools for assessing complex rock properties.
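As a concrete illustration of the digital rock physics comparisons alluded to above, the snippet below computes two of the simplest descriptors one can evaluate on a segmented (binary) reconstruction: porosity and a directional two-point probability function. This is only an illustrative sketch; the function names and the synthetic stand-in volume are ours, and the study's structural, morphological, and transport evaluations (e.g., permeability) are more involved.

```python
import numpy as np

def porosity(volume):
    """Porosity of a binary volume where pore voxels are 1 and solid voxels are 0."""
    return float(volume.mean())

def two_point_probability(volume, max_lag, axis=0):
    """Directional two-point probability S2(r): the chance that two voxels separated
    by r along `axis` are both pore. S2(0) equals the porosity."""
    v = np.asarray(volume, dtype=float)
    s2 = []
    for r in range(max_lag + 1):
        a = np.take(v, range(0, v.shape[axis] - r), axis=axis)
        b = np.take(v, range(r, v.shape[axis]), axis=axis)
        s2.append(float((a * b).mean()))
    return np.array(s2)

# Synthetic 64^3 binary volume standing in for a segmented reconstruction
# (uncorrelated noise at roughly Berea-like ~20% porosity, purely for demonstration).
rng = np.random.default_rng(0)
vol = (rng.random((64, 64, 64)) < 0.2).astype(np.uint8)

print(porosity(vol))                    # ~0.2
print(two_point_probability(vol, 10))   # starts at the porosity; for this uncorrelated
                                        # volume it drops to ~porosity**2 for r >= 1,
                                        # whereas real rock decays gradually with r
```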
Key Points
Rapid and reliable 3D reconstructions of porous media using generative adversarial networks
A new workflow is proposed for 3D image reconstruction of porous media using purely 2D electron and optical microscopy images
Our model can generate realistic 3D images with high variability to assess the uncertainty of heterogeneous rocks
ISSN: 2993-5210
DOI: 10.1029/2024JH000178