Remote sensing image fusion using the curvelet transform
| Published in | Information Fusion, Vol. 8, No. 2, pp. 143–156 |
|---|---|
| Main Authors | Filippo Nencini, Andrea Garzelli, Stefano Baronti, Luciano Alparone |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 01.04.2007 |
Summary: This paper presents an image fusion method suitable for pan-sharpening of multispectral (MS) bands, based on nonseparable multiresolution analysis (MRA). The low-resolution MS bands are resampled to the fine scale of the panchromatic (Pan) image and sharpened by injecting highpass directional details extracted from the high-resolution Pan image by means of the curvelet transform (CT). CT is a nonseparable MRA whose basis functions are directional edges with progressively increasing resolution. The advantage of CT with respect to conventional separable MRA, either decimated or not, is twofold. Firstly, directional detail coefficients matching image edges may be preliminarily soft-thresholded to achieve a noise reduction that is better than that obtained in the separable wavelet domain. Secondly, modeling of the relationships between high-resolution detail coefficients of the MS bands and of the Pan image is more fitting, being accomplished in the directional multiresolution domain. Experiments are carried out on very-high-resolution MS + Pan images acquired by the QuickBird and Ikonos satellite systems. Fusion simulations on spatially degraded data, whose original MS bands are available for reference, show that the proposed curvelet-based fusion method performs slightly better than the state of the art. Fusion tests at the full scale reveal that the proposed method achieves accurate and reliable Pan-sharpening, little affected by local inaccuracies even in the presence of complex and detailed urban landscapes.
ISSN: 1566-2535, 1872-6305
DOI: 10.1016/j.inffus.2006.02.001
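
The resample-and-inject scheme described in the summary can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: a Gaussian lowpass stands in for the curvelet decomposition, the soft-thresholding step only mirrors the noise-reduction idea mentioned in the abstract, and the function names, injection gain, and filter width are hypothetical choices made here for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom


def soft_threshold(coeffs, t):
    """Soft-threshold detail coefficients to suppress low-amplitude noise."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)


def mra_pansharpen(ms_lowres, pan, ratio=4, sigma=1.0, gain=1.0, noise_t=0.0):
    """Generic MRA detail-injection pan-sharpening (simplified sketch).

    ms_lowres : (bands, h, w) low-resolution multispectral cube
    pan       : (ratio*h, ratio*w) high-resolution panchromatic image

    A single Gaussian lowpass step replaces the curvelet transform of the
    paper; only the resample-and-inject structure is shown.
    """
    # Resample each MS band to the fine scale of the Pan image (bicubic).
    ms_up = np.stack([zoom(band, ratio, order=3) for band in ms_lowres])

    # Highpass spatial detail of Pan: the difference between Pan and its
    # lowpass approximation at the resolution of the original MS bands.
    pan_low = gaussian_filter(pan, sigma=sigma * ratio)
    detail = pan - pan_low

    # Optional noise reduction on the detail plane before injection.
    if noise_t > 0:
        detail = soft_threshold(detail, noise_t)

    # Inject the (thresholded) Pan detail into every upsampled MS band.
    return ms_up + gain * detail
```

For 1:4 QuickBird or Ikonos data, `ratio` would be 4 and `noise_t` would be chosen from an estimate of the sensor noise. In the paper, thresholding and injection modeling are performed per directional curvelet subband rather than on a single highpass plane, which is what gives the method its edge-selective behavior.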