Comparison and improvement of wavelet-based image fusion

Cited by: 41
Authors
Hong, G. [1 ]
Zhang, Y. [1 ]
Affiliation
[1] Univ New Brunswick, Dept Geodesy & Geomat Engn, Fredericton, NB E3B 5A3, Canada
DOI
10.1080/01431160701313826
Chinese Library Classification:
TP7 [Remote sensing technology];
Discipline classification codes
081102 ; 0816 ; 081602 ; 083002 ; 1404 ;
Abstract
The wavelets used in image fusion can be categorized into three general classes: orthogonal, biorthogonal, and non-orthogonal. Although these wavelets share some common properties, each has unique image decomposition and reconstruction characteristics that lead to different fusion results. This paper compares image-fusion methods that use wavelets from these three classes and theoretically analyses the factors that lead to the different results. When a wavelet transform alone is used for image fusion, the result is normally not good. However, if a wavelet transform is integrated with a traditional fusion method, such as an IHS or a PCA transform, better fusion results may be achieved. This paper therefore also discusses methods to improve wavelet-based fusion by integrating an IHS or a PCA transform. Because the substitution in the IHS or PCA transform is limited to a single component, using the wavelet transform only to improve or modify that component, and the IHS or PCA transform to fuse the image, makes the fusion process simpler and faster; it can also better preserve colour information. IKONOS and QuickBird image data are used to evaluate seven wavelet-based fusion methods: orthogonal wavelet fusion with decimation, orthogonal wavelet fusion without decimation, biorthogonal wavelet fusion with decimation, biorthogonal wavelet fusion without decimation, wavelet fusion based on the 'à trous' algorithm, integrated wavelet and IHS transformation, and integrated wavelet and PCA transformation. The fusion results are compared graphically, visually, and statistically, and show that the wavelet-integrated methods can improve the fusion result, reduce ringing or aliasing effects to some extent, and make the whole image smoother.
Comparisons of the final results also show that they are affected by the type of wavelet (orthogonal, biorthogonal, or non-orthogonal), by decimation or undecimation, and by the number of wavelet-decomposition levels.
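The integration scheme the abstract describes — injecting spatial detail extracted from the panchromatic band by an undecimated 'à trous' wavelet decomposition into the intensity component of an IHS transform — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the linear (additive) IHS intensity, the B3-spline kernel, and the injection rule are standard choices assumed here.

```python
import numpy as np

def atrous_detail(pan, levels=1):
    """Undecimated 'a trous' wavelet decomposition of a 2-D band.
    Returns the sum of the detail (high-frequency) planes, using the
    common B3-spline kernel [1, 4, 6, 4, 1] / 16 applied separably,
    with the kernel taps spread apart ('trous' = holes) at each level."""
    kernel = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    approx = pan.astype(float)
    detail = np.zeros_like(approx)
    for level in range(levels):
        step = 2 ** level                       # tap spacing doubles per level
        smooth = approx
        for axis in (0, 1):                     # separable row/column passes
            pad = [(2 * step, 2 * step) if a == axis else (0, 0) for a in (0, 1)]
            padded = np.pad(smooth, pad, mode='reflect')
            acc = np.zeros_like(smooth)
            for k, w in enumerate(kernel):
                sl = [slice(None), slice(None)]
                sl[axis] = slice(k * step, k * step + smooth.shape[axis])
                acc += w * padded[tuple(sl)]
            smooth = acc
        detail += approx - smooth               # detail plane at this level
        approx = smooth
    return detail

def ihs_wavelet_fuse(ms, pan, levels=1):
    """Fuse a multispectral image (H, W, 3) with a co-registered pan band
    (H, W): inject the pan wavelet details into the linear-IHS intensity
    I = (R + G + B) / 3. The inverse linear IHS adds the same correction
    to every band, which is what keeps the colour differences stable."""
    ms = ms.astype(float)
    delta = atrous_detail(pan, levels)          # spatial detail from pan
    return ms + delta[..., None]                # inverse linear IHS
```

Because the substitution touches only the intensity component, each multispectral band receives an identical additive correction, so between-band differences (and hence hue) are unchanged — the colour-preservation property the abstract attributes to the integrated methods.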
Pages: 673-691
Number of pages: 19