Summary: We have developed a novel method to derive scale information from quasi-stationary images, based on a rotation-guided multi-scale analysis of features derived from Gray-Level Co-occurrence Matrices (GLCM). Unlike other methods for multi-scale texture characterization, our method does not require rotation-invariant textural features; instead, it uses orientation information derived from the image to constrain the algorithm. The method computes GLCM textural features on a “stencil” that follows the local orientation field, and compares features obtained from a sliding window that scans the whole image with those of a user-selected reference pattern, computing a similarity measure between the two. By applying different affine transforms to the stencil used for sampling the reference pattern, we measure the similarity between regions of the image and dilated versions of the reference pattern, and hence perform a multi-resolution analysis of the image. For a given region of an image, the method finds the most likely scale; it can therefore estimate the stationarity of the image in terms of scale, which has important applications to multipoint geostatistics (MPGS). We tested the method on the Brodatz textures database. In summary, our multi-scale, rotation-guided algorithm derives scale information from quasi-stationary images; it extends Gray-Level Co-occurrence Matrices with variable-size, oriented, image-sampling “stencils”, relies on similarity measures between reference patterns and the full image, and has been applied successfully to MPGS.
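To make the core pipeline concrete, the following is a minimal, hypothetical sketch of the GLCM-feature comparison the abstract describes: a co-occurrence matrix is built for quantized gray levels, Haralick-style features (contrast, homogeneity) are derived from it, and a sliding window's features are compared against a reference pattern by Euclidean distance. This is not the paper's implementation — in particular it uses a single fixed horizontal pixel offset rather than the orientation-following, affine-transformed stencil of the actual method, and the function names (`glcm`, `features`, `similarity_map`) are illustrative only.

```python
import numpy as np

def glcm(patch, dx=1, dy=0, levels=8):
    """Normalized co-occurrence counts for pixel pairs separated by (dy, dx).
    `patch` is assumed to be already quantized to integers in [0, levels)."""
    h, w = patch.shape
    M = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            M[patch[y, x], patch[y + dy, x + dx]] += 1
    s = M.sum()
    return M / s if s else M

def features(patch):
    """Haralick-style contrast and homogeneity derived from the GLCM."""
    P = glcm(patch)
    i, j = np.indices(P.shape)
    contrast = np.sum(P * (i - j) ** 2)
    homogeneity = np.sum(P / (1.0 + np.abs(i - j)))
    return np.array([contrast, homogeneity])

def similarity_map(image, ref, win):
    """Distance between each sliding window's features and the reference's.
    Lower values mean the window texture is more similar to the reference."""
    f_ref = features(ref)
    h, w = image.shape
    out = np.full((h - win + 1, w - win + 1), np.inf)
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            window = image[y:y + win, x:x + win]
            out[y, x] = np.linalg.norm(features(window) - f_ref)
    return out
```

In the full method, repeating this comparison with dilated (affine-transformed) stencils over the reference pattern yields one similarity map per candidate scale, and the scale minimizing the distance at a given region is taken as the most likely one.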
ISSN: 0167-8655 (print); 1872-7344 (electronic)
DOI: 10.1016/j.patrec.2007.12.008