Gibbs sampling on large lattice with GMRF

Bibliographic Details
Published in: Computers & Geosciences, Vol. 111, pp. 190-199
Main Authors: Marcotte, Denis; Allard, Denis
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.02.2018
Summary: Gibbs sampling is routinely used to sample truncated Gaussian distributions. These distributions arise naturally when latent Gaussian fields are associated with category fields obtained by discrete simulation methods such as multipoint, sequential indicator and object-based simulation. The latent Gaussians are often used in data assimilation and history-matching algorithms. When Gibbs sampling is applied on a large lattice, the computing cost can become prohibitive. The usual practice of using local neighborhoods is unsatisfactory, as it can diverge and does not reproduce the desired covariance exactly. A better approach is to use Gaussian Markov Random Fields (GMRF), which make it possible to compute the conditional distribution at any point without computing and inverting the full covariance matrix. Because the GMRF is locally defined, all points that do not share neighbors (coding sets) can be updated simultaneously. We propose a new simultaneous Gibbs updating strategy on coding sets that can be computed efficiently by convolution and applied with an acceptance/rejection method in the truncated case. We study empirically the speed of convergence and the effects of the choice of boundary conditions, of the correlation range and of the GMRF smoothness. We show that convergence is slower in the Gaussian case on the torus than for the finite case studied in the literature. However, in the truncated Gaussian case, we show that short-scale correlation is quickly restored and that the conditioning categories at each lattice point imprint the long-scale correlation. Hence our approach makes it practical to apply Gibbs sampling on large 2D or 3D lattices with the desired GMRF covariance.
•GMRF template for the 3D case is derived.
•Based on coding sets, exact and vastly parallel Gibbs sampling is implemented.
•Convergence rate of Gibbs sampler is studied.
•New method allows quick simulation of truncated Gaussian by acceptance.
•Provide latent Gaussians to represent large discrete fields for history matching.
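The summary describes simultaneous Gibbs updates on coding sets of a lattice GMRF, with neighbor sums computed by convolution and an acceptance/rejection step in the truncated case. The sketch below is an illustrative NumPy version of that idea, not the authors' implementation: it assumes a first-order precision template (diagonal d, off-diagonal -a on the 4 nearest neighbors, positive definite when d > 4a) with torus boundary conditions, so the two colors of a checkerboard form valid coding sets; the function names, the parameter values and the binary sign constraint are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def neighbor_sum(x):
    # 4-neighbor sum with torus (periodic) boundaries; the np.roll
    # shifts implement the neighbor-sum convolution on the torus.
    return (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
            np.roll(x, 1, 1) + np.roll(x, -1, 1))

def checkerboard(shape):
    n, m = shape
    return np.add.outer(np.arange(n), np.arange(m)) % 2

def chromatic_gibbs_sweep(x, a=0.24, d=1.0, rng=rng):
    """One Gibbs sweep over the two coding sets (checkerboard colors).

    Conditional law at each site: N((a/d) * sum of the 4 neighbors, 1/d).
    Same-color sites share no neighbors, so each color is updated at once.
    """
    color = checkerboard(x.shape)
    for c in (0, 1):
        mask = color == c
        mean = (a / d) * neighbor_sum(x)          # fixed: neighbors are other color
        draw = mean + rng.standard_normal(x.shape) / np.sqrt(d)
        x[mask] = draw[mask]
    return x

def truncated_sweep(x, cats, a=0.24, d=1.0, rng=rng, max_tries=100):
    """Truncated variant: enforce x > 0 where cats == 1, x <= 0 elsewhere,
    by acceptance/rejection of the simultaneous draws on each coding set."""
    color = checkerboard(x.shape)
    for c in (0, 1):
        mask = color == c
        mean = (a / d) * neighbor_sum(x)
        accepted = ~mask                          # only this color needs new draws
        for _ in range(max_tries):
            if accepted.all():
                break
            draw = mean + rng.standard_normal(x.shape) / np.sqrt(d)
            ok = mask & ~accepted & ((draw > 0) == (cats == 1))
            x[ok] = draw[ok]                      # keep only sign-consistent draws
            accepted |= ok
    return x
```

For the higher-order precision templates discussed in the paper, the 4-neighbor sum would be replaced by a wider convolution stencil and the checkerboard by more colors, so that sites in one coding set still share no neighbors.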
ISSN: 0098-3004, 1873-7803
DOI: 10.1016/j.cageo.2017.11.012