Comparing Pan-Sharpening Algorithms to Assess an Agricultural Area: A Mississippi Case Study

Abstract

Numerous satellites collect imagery of the Earth’s surface daily, providing information to the public and private sectors. The fusion (pan-sharpening) of high-resolution panchromatic satellite imagery with lower-resolution multispectral satellite imagery has shown promise for monitoring natural resources and farming areas. It results in new imagery with more detail than the original multispectral or panchromatic images. In agricultural areas in Mississippi, landscapes can range from complex mixtures of vegetation and built-up areas to dense vegetative regions. More information is needed on pan-sharpened imagery for assessing landscapes in rural areas of Mississippi. WorldView 3 satellite imagery of landscapes commonly found in rural areas of Mississippi was subjected to 17 pan-sharpening algorithms. The pan-sharpened images were compared qualitatively and quantitatively with three quality indices: 1) Erreur Relative Globale Adimensionnelle de Synthèse; 2) Universal Image Quality Index; 3) Bias. The à trous wavelet transform with injection model 3 and the hyperspherical color space fusion methods were ranked among the best for maintaining image integrity in the qualitative and quantitative analyses. The optimized high-pass filter method was often ranked last by the quality indices. The smoothing filter-based intensity modulation algorithm and the Gaussian modulation transfer function match filtered with high-pass modulation injection model added artifacts to the images. Pan-sharpened satellite imagery has great potential to enhance the survey of Mississippi’s agricultural areas. The key to success is selecting an image fusion process that increases spatial content without compromising image integrity.

Share and Cite:

Fletcher, R. (2023) Comparing Pan-Sharpening Algorithms to Assess an Agricultural Area: A Mississippi Case Study. Agricultural Sciences, 14, 1206-1221. doi: 10.4236/as.2023.149081.

1. Introduction

Satellite systems have become a pillar for acquiring imagery of the Earth’s surface. High, medium, and coarse spatial resolution satellite imaging systems exist, and they all play a role in monitoring the planet. Modern-day satellite systems often carry payloads that collect data at different spatial and spectral resolutions. The high spatial resolution panchromatic imagery obtained by these systems can be merged with the lower spatial resolution multispectral images to produce a new (pan-sharpened) product containing the spatial resolution of the panchromatic image and the spectral resolution of the multispectral image [1] [2], resulting in new imagery with more detail than the original multispectral or panchromatic images [1] [2] [3].

Currently, pan-sharpening algorithms are available in commercial and open-source software. The main goal of image fusion is to merge the imagery without affecting its data quality [1] [4]. Improper fusion of panchromatic and multispectral imagery often results in color distortions and blurriness in the newly merged imagery [5] [6]. Several literature reviews have been published on image fusion [7] [8] and on which algorithms may perform best in certain circumstances [3] [9]. Pan-sharpened satellite imagery has been used in environmental sciences [10] [11], urban planning [12], agriculture [13] [14], and military and surveillance [15] applications. Its successes and failures often hinge on the algorithm’s ability to handle complex landscapes.

Fusion methods have been divided into component substitution (CS) and multiresolution analysis (MRA) based techniques. The component substitution methods transform the spectral content of the multispectral imagery and then substitute the panchromatic band or some other component for one of the transformed components. The imagery is then back-transformed to the normal image space. Common image fusion approaches classified as CS techniques are the intensity-hue-saturation [16] [17] [18], principal component analysis [14], and Gram-Schmidt [19] [20] [21] spectral sharpening methods. Adaptive component substitution methods have also been developed for fusing imagery [22] [23].
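The CS workflow described above can be sketched in a few lines of numpy. This is an illustration only, not any of the implementations evaluated in this study; the mean-based intensity component and the histogram-matching step are simplifying assumptions (a fast-IHS-style shortcut that injects the pan-minus-intensity difference rather than explicitly back-transforming):

```python
import numpy as np

def cs_pansharpen(ms, pan):
    """Toy component-substitution fusion (fast IHS style).

    ms  : (bands, H, W) multispectral bands, already up-sampled
          to the panchromatic grid
    pan : (H, W) panchromatic band
    """
    intensity = ms.mean(axis=0)                 # forward transform (intensity)
    # Match the pan band's mean/std to the intensity component so the
    # substitution does not shift the histogram.
    pan_matched = (pan - pan.mean()) / (pan.std() + 1e-12)
    pan_matched = pan_matched * intensity.std() + intensity.mean()
    # Substituting pan for intensity and back-transforming is equivalent
    # to adding the difference to every band (the fast IHS shortcut).
    return ms + (pan_matched - intensity)
```

A useful sanity check: when the panchromatic band equals the intensity component, the injected difference is zero and the multispectral bands pass through unchanged.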

The MRA procedures inject the spatial detail derived from the decomposition of the panchromatic image into the up-sampled multispectral image. The à trous wavelet transform [24], undecimated or decimated wavelet transforms [25], the Laplacian pyramid [18], the contourlet transform [10] [11], high-pass filtration, and the curvelet transform [26] are common MRA approaches. MRA fusion techniques often maintain the spectral integrity of the image at the cost of spatial distortions. Also, hybrid methods integrate component substitution and multiresolution analysis [27] [28]. New procedures such as hyperspherical color sharpening [29] and improvements to the contourlet transform [30] have been established to improve the pan-sharpening of satellite imagery.
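The MRA injection idea can likewise be sketched with a plain high-pass filter (the HPF family above). Again this is a hedged illustration: a simple box blur stands in for the wavelet or pyramid decomposition, and the injection is purely additive:

```python
import numpy as np

def box_blur(img, size=5):
    """Mean filter via explicit shifts (stands in for the low-pass
    stage of a wavelet/pyramid decomposition)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def mra_pansharpen(ms, pan, size=5):
    """Toy MRA fusion: additively inject the panchromatic band's
    high-pass spatial detail into every multispectral band."""
    detail = pan - box_blur(pan, size)          # high-frequency detail
    return ms + detail[None, :, :]              # additive injection
```

A flat panchromatic band carries no high-frequency detail, so the multispectral bands are returned unchanged; that is the spectral-preservation property this family of methods trades spatial accuracy for.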

Agricultural regions in rural areas can range from landscapes dominated by vegetation to mixtures of farmland and buildings. High, medium, and coarse-resolution satellite imagery, even pan-sharpened imagery, has been used to study rural landscapes [31]. Nevertheless, more detail is needed on the performance of pan-sharpening algorithms because satellite imagery is often the primary image source available to people in rural areas. Furthermore, information is lacking in Mississippi on the value this imagery has for assessing landscapes in rural areas. This study aimed to determine the effect of landscape variability in agricultural areas on high-resolution pan-sharpened imagery. The study compared commonly available pan-sharpening algorithms used to fuse high spatial resolution panchromatic satellite imagery with coarser spatial resolution multispectral imagery. The area of interest contained agricultural plots, woody areas, roadways, and buildings commonly found in rural areas of Mississippi.

2. Materials and Methods

WorldView 3 (Maxar Technologies) satellite imagery was acquired of an agricultural area near Stoneville, Mississippi, USA, on June 14, 2022 (Figure 1). The image acquisition was part of an ongoing research project conducted by United States Department of Agriculture, Agricultural Research Service scientists to monitor crop growth and productivity in the region. The vendor atmospherically and radiometrically corrected the imagery. The satellite nine-band image bundle characteristics were as follows: panchromatic—0.3 m spatial

Figure 1. Map of the United States, Mississippi, and the closest city to the study site. Inset—image of the study area.

resolution, multispectral—1.2 m spatial resolution, radiometric resolution—11 bits, spectral resolution (nm)—450 - 800 (visible to near-infrared, panchromatic band), 400 - 450 (coastal), 450 - 510 (blue), 510 - 580 (green), 585 - 625 (yellow), 630 - 690 (red), 705 - 745 (red edge), 770 - 895 (near-infrared 1), and 860 - 1040 (near-infrared 2).

A subset of the panchromatic (3328 rows by 3328 columns) and multispectral imagery (832 rows by 832 columns) was extracted for further study. The study area contained soybean (Glycine max L.), sorghum (Sorghum bicolor L.), and corn (Zea mays L.) plots, natural vegetation, a few houses, and roads. The image represented a common but complex scene often encountered in rural areas of Mississippi, USA.

The imagery of the area of interest was processed in two separate experiments [5] [7] [26]: a full-resolution experiment (Experiment 1) and a reduced-resolution experiment (Experiment 2). Experiment 1 involved pan-sharpening the 1.2 m resolution multispectral imagery to 0.3 m spatial resolution. Then the pan-sharpened imagery was down-sampled to 1.2 m. For quantitative analysis, the quality of the down-sampled pan-sharpened imagery was compared to that of the original multispectral imagery. For Experiment 2, the panchromatic and multispectral subsets were subjected to a low-pass filter. Then the panchromatic and multispectral image subsets were resampled to 1.2 m and 4.8 m spatial resolution, respectively. The resampled multispectral imagery was pan-sharpened to 1.2 m spatial resolution. Finally, its image quality was compared to the original multispectral imagery with a spatial resolution of 1.2 m. Resampling of the imagery was completed with QGIS software (3.22.8-Białowieża [32]).
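The reduced-resolution protocol above can be summarized as a short numpy sketch. This is not the QGIS/PanFusion processing chain: a simple block average stands in for the low-pass filter plus resampling, and `pansharpen` in the skeleton below is a placeholder for any of the 17 methods:

```python
import numpy as np

def degrade(img, factor=4):
    """Block-average by `factor` (a stand-in for low-pass filtering
    and resampling; works on (H, W) or (bands, H, W) arrays)."""
    h = img.shape[-2] // factor * factor
    w = img.shape[-1] // factor * factor
    img = img[..., :h, :w]                      # crop to a multiple of factor
    shape = img.shape[:-2] + (h // factor, factor, w // factor, factor)
    return img.reshape(shape).mean(axis=(-3, -1))

# Experiment 2 skeleton: degrade pan (0.3 m -> 1.2 m) and MS
# (1.2 m -> 4.8 m), fuse, then score against the original MS, e.g.:
#   pan_low = degrade(pan, 4)
#   ms_low  = degrade(ms, 4)
#   fused   = pansharpen(ms_low, pan_low)   # placeholder fusion step
#   score   = quality_index(ms, fused)      # ERGAS / UIQI / Bias
```

The appeal of this protocol is that the original multispectral image serves as the reference, so the quality indices have a ground truth to compare against.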

The imagery was fused with the PanFusion software (version 2.4). The software and its instructions are freely available to the public (https://www.pansharp.com/applications/). It contained 18 pan-sharpening algorithms for fusing multispectral and panchromatic imagery. For this study, 17 fusion methods were evaluated: band-dependent spatial detail (BDSD [2] [3]), à trous wavelet transform (ATWT [33]), additive wavelet intensity method (AWLP [3] [34]), smoothing filter-based intensity modulation (SFIM [35]), generalized Laplacian pyramids with modulation transfer function context-based decision injection scheme (MTF_GLP_CBD [36] [37]), Gaussian modulation transfer function match filtered with high-pass modulation injection model (MTF_GLP_HPM [9]), modulation transfer function generalized Laplacian pyramid (MTF_GLP [38]), high-pass filter (HPF [39]), Gram-Schmidt (GS [3] [40]), hyperspherical color space (HCS [41]), à trous wavelet transform with injection model 3 (ATWT_M3 [42]), à trous wavelet transform with injection model 2 (ATWT_M2 [42]), local mean matching (LMM [43] [44]), Brovey [45], intensity hue saturation (IHS [46]), optimized high-pass filter (HPFC [47]), and local mean and variance matching (LMVM [43] [44]). The GS adaptive algorithm was not evaluated because of a screen error message that could not be corrected. The following input parameters were used in PanFusion to create the pan-sharpened imagery: the panchromatic image, the multispectral images (8 images), interpolation of multispectral images—nearest neighbor, resolution ratio—4 (i.e., the pixel resolution ratio between the panchromatic and multispectral images), sensor type—generic, and output datatype—auto.

Three quality indices were calculated with the ImAnalysis (version 1.55) software, a companion to the PanFusion software, to test the quality of the fusion methods. The software measures the following eight quality indices: Bias, CC (Correlation Coefficient), DIV (Difference in Variance), Entropy Difference, ERGAS (Erreur Relative Globale Adimensionnelle de Synthèse), UIQI (Universal Image Quality Index), RASE (Relative Average Spectral Error), and RMSE (Root Mean Squared Error). Researchers have commonly used the ERGAS, Bias, and UIQI indices to assess the image quality of pan-sharpened imagery [23] [48] [49]; thus, they were the three metrics assessed in this study.

ERGAS (a French acronym for relative dimensionless global error in synthesis) is a normalized global spectral index that indicates the distortion of the pan-sharpened image compared to the reference multiband image. The smaller the value, the better [48]; low values represent a high similarity between the fused multispectral and original imagery.
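For reference, the standard ERGAS formula can be written as a short numpy function. This is a sketch of the textbook definition, not the ImAnalysis implementation; `ratio` is the pan-to-MS pixel-size ratio (0.3 m / 1.2 m = 0.25 for this imagery):

```python
import numpy as np

def ergas(reference, fused, ratio=0.25):
    """ERGAS = 100 * ratio * sqrt(mean_k((RMSE_k / mean_k)^2)).

    reference, fused : (bands, H, W) arrays. Lower values mean the
    fused image is more similar to the reference.
    """
    terms = []
    for ref_b, fus_b in zip(reference, fused):
        rmse = np.sqrt(np.mean((ref_b - fus_b) ** 2))
        terms.append((rmse / ref_b.mean()) ** 2)
    return 100.0 * ratio * np.sqrt(np.mean(terms))
```

An identical image pair scores 0, and a constant offset of 10% of the band mean scores 100 × 0.25 × 0.1 = 2.5, which gives a feel for the scale of the values reported in Tables 1 and 2.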

Bias measures the mean difference between the reference and pan-sharpened images [49] [50] . Values close to zero represent minor differences between the former and the latter.
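A minimal sketch of the bias metric follows. The exact ImAnalysis convention is not documented in this article, so both the raw difference of means and a version normalized by the reference mean (a common convention) are shown, labeled as an assumption:

```python
import numpy as np

def bias(reference, fused, relative=True):
    """Mean difference between reference and fused images.
    Values near zero indicate the histograms were not shifted.
    relative=True normalizes by the reference mean (one common
    convention; the ImAnalysis convention may differ)."""
    diff = reference.mean() - fused.mean()
    return diff / reference.mean() if relative else diff
```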

The UIQI index [51] combines three comparisons between the reference and pan-sharpened images: the correlation between them, the closeness of their mean luminance, and the similarity of their contrast. Its value ranges from −1 to 1, with 1 representing the best fidelity to the reference [4] [48].
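The UIQI formula itself is compact; a global numpy version is shown below. Real implementations, presumably including ImAnalysis, compute it over sliding windows and average the results, so this global form is only meant to expose the three factors:

```python
import numpy as np

def uiqi(x, y):
    """Global Universal Image Quality Index (Wang & Bovik).

    Q = (cov/(sx*sy)) * (2*mx*my/(mx^2+my^2)) * (2*sx*sy/(sx^2+sy^2)),
    i.e., correlation * luminance closeness * contrast closeness.
    Range [-1, 1]; 1 means identical to the reference.
    """
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 4.0 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))
```

An image compared with itself scores exactly 1; scaling the second image penalizes both the luminance and contrast factors even though the correlation stays perfect.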

According to the instructions for the ImAnalysis software, the user must select a search neighborhood for the image processing and the ratio between the multispectral and panchromatic image pixels. For the search window, the software instructions recommended a value between seven and nine, noting that an odd number worked best; thus, a search window of seven was selected for this study. The ratio was 1 to 4, representing the panchromatic image pixel size compared to the multispectral image pixel size. After processing the data, the ImAnalysis software stored the results in a Microsoft Excel (.xls) file.

The original multispectral imagery was compared with all images derived with the fusion algorithms for the qualitative imagery assessment. For display purposes in the results section, the traditional color composite and the false-color (color infrared) composite were used for comparison. Those two color composites are commonly viewed for the qualitative assessment of vegetated areas. The traditional color composite was created with the red, green, and blue bands; the color infrared composite was derived from the near-infrared 1, red, and green bands. The composite imagery was created with the QGIS software.

3. Results

3.1. Qualitative Results

A qualitative assessment of the conventional color and color infrared composite images indicated that the appearances of the pan-sharpened imagery derived from Experiments 1 and 2 were similar; therefore, the products of Experiment 1 were used as the representative images of both experiments. Furthermore, similarities were observed among the fusion processes; thus, a single representative image was presented for fusion methods that produced similar results.

Figures 2-5 show the panchromatic image, the original traditional color and color infrared composites, and the pan-sharpened multispectral images. The pan-sharpening algorithms improved the spatial image resolution (Figures 2-5), but the spatial enhancement was not the same for all algorithms tested. The most extreme case of spatial enhancement was observed for the HPFC method, whereas the ATWT_M3 fusion technique exhibited the least spatial enhancement (Figures 2-5). Other fusion methods with spatial results similar to HPFC were ATWT, GS, HPF, IHS, MTF_GLP_CBD, MTF_GLP_HPM, and MTF_GLP. Similar spatial detail was observed between ATWT_M3 and LMVM and among ATWT_M2, LMM, HCS, Brovey, BDSD, SFIM, and AWLP.

Figure 2. (A) Panchromatic image (0.3 m, spatial resolution), (B) traditional color raw imagery (1.2 m, spatial resolution), and traditional color fused image (0.3 m, spatial resolution); (C) à trous wavelet transform with the injection model 3; (D) local mean and variance matching. The images include experimental plots, natural vegetation, bare soil, water, roadways, and buildings. Imagery acquired by Maxar Technologies.

Figure 3. Traditional color fused image (0.3 m, spatial resolution): (A) optimized high-pass filter; (B) intensity hue saturation; (C) Gram-Schmidt; (D) smoothing filter-based intensity modulation. The images include experimental plots, natural vegetation, bare soil, water, roadways, and buildings. Imagery acquired by Maxar Technologies.

Figure 4. (A) Color infrared raw imagery (1.2 m, spatial resolution) and color infrared fused image (0.3 m, spatial resolution); (B) à trous wavelet transform with injection model 3; (C) local mean and variance matching; (D) optimized high-pass filter. The images include experimental plots, natural vegetation, bare soil, water, roadways, and buildings. Imagery acquired by Maxar Technologies.

Figure 5. Color infrared fused image (0.3 m, spatial resolution): (A) intensity hue saturation; (B) Gram-Schmidt; (C) smoothing filter-based intensity modulation. The images include experimental plots, natural vegetation, bare soil, water, roadways, and buildings. Imagery acquired by Maxar Technologies.

The fusion process resulted in spectral distortion in some images, such as the IHS, Gram-Schmidt, and HPFC products (Figures 2-5). The spectral distortion errors included changes to the color of vegetation and the other land cover features. The only other fusion method that resulted in changes to the image color was Brovey (not shown). Spectral distortions were not evident in the ATWT_M3 and LMVM (Figure 2 and Figure 4) pan-sharpened imagery. The ATWT, AWLP, BDSD, HCS, HPF, LMM, MTF_GLP_CBD, and MTF_GLP fused images (not shown) appeared similar to ATWT_M3 and LMVM.

The SFIM fusion method added artifacts to the imagery (Figure 3 and Figure 5). The artifacts were more apparent in areas containing shadows and in transition zones from one cover type to another. They were more noticeable on the traditional color composite image than on the color infrared composite image, indicating that the bands (red, green, and blue) used to develop the traditional color composite were more affected by the SFIM fusion process. The MTF_GLP_HPM fused imagery (not shown) contained artifacts like those observed on the SFIM pan-sharpened product.

3.2. Quantitative Results

Table 1 summarizes the quality index results for Experiment 1. For all fusion methods, the bias values were relatively low and close to zero, indicating the

Table 1. Quality index results of pan-sharpened experiment (experiment 1) of the agricultural landscape imagery. The multispectral image (1.2 m spatial resolution) was pan-sharpened (0.3 m spatial resolution) and then resampled to the resolution of the original multispectral image (1.2 m spatial resolution). The index values are an average of the eight spectral bands of the Worldview 3 satellite imagery of the study area.

ATWT_M2: à trous wavelet transform with injection model 2, ATWT_M3: à trous wavelet transform with injection model 3, ATWT: à trous wavelet transform, AWLP: additive wavelet intensity method, BDSD: band-dependent spatial detail, GS: Gram-Schmidt, HCS: hyperspherical color space, HPF: high-pass filter, HPFC: optimized high-pass filter, IHS: intensity hue saturation, LMM: local mean matching, LMVM: local mean and variance matching, MTF_GLP_CBD: generalized Laplacian pyramids with modulation transfer function context-based decision injection scheme, MTF_GLP_HPM: Gaussian modulation transfer function match filtered with high-pass modulation injection model, MTF_GLP: modulation transfer function generalized Laplacian pyramid, SFIM: smoothing filter-based intensity modulation. Quality indices: ERGAS: Erreur Relative Globale Adimensionnelle de Synthèse, UIQI: universal image quality index. *: best score per index, **: 2nd best score per index, ***: worst ranking per index.

histogram was not shifted drastically between the original multispectral imagery and the pan-sharpened multispectral imagery. The top-ranked fusion algorithms for bias were ATWT_M2, ATWT_M3, ATWT, AWLP, and HPF. The second-best-rated algorithms were HCS, LMM, LMVM, and MTF_GLP_CBD. The worst-ranked fusion algorithm for bias was the HPFC method. A word of caution regarding these results: values with more than three zeroes after the decimal were rounded to zero, producing the ties observed in the quality index tables.

The highest-ranked fusion method according to ERGAS was HCS, with ATWT_M3 a close second. HPFC was ranked the worst, followed by IHS, with ERGAS values of 8.73 and 7.71, respectively. The top, second, and worst-ranked fusion methods for UIQI were ATWT_M3, HCS, and HPFC, respectively.

Experiment 2 results are shown in Table 2. The bias values were also low, with the top-ranked fusion algorithms being HCS and LMM. ATWT_M2, ATWT_M3,

Table 2. Quality index results of the pan-sharpened experiment (Experiment 2) of the agricultural landscape imagery. The multispectral and panchromatic imagery were degraded; the multispectral image was pan-sharpened and compared to the original. The index values are an average of the eight spectral bands of the Worldview 3 satellite imagery of the study area.

ATWT_M2: à trous wavelet transform with injection model 2, ATWT_M3: à trous wavelet transform with injection model 3, ATWT: à trous wavelet transform, AWLP: additive wavelet intensity method, BDSD: band-dependent spatial detail, GS: Gram-Schmidt, HCS: hyperspherical color space, HPF: high-pass filter, HPFC: optimized high-pass filter, IHS: intensity hue saturation, LMM: local mean matching, LMVM: local mean and variance matching, MTF_GLP_CBD: generalized Laplacian pyramids with modulation transfer function context-based decision injection scheme, MTF_GLP_HPM: Gaussian modulation transfer function match filtered with high-pass modulation injection model, MTF_GLP: modulation transfer function generalized Laplacian pyramid, SFIM: smoothing filter-based intensity modulation. ERGAS: Erreur Relative Globale Adimensionnelle de Synthèse, UIQI: universal image quality index. *: best score per index, **: 2nd best score per index, ***: worst ranking per index.

LMVM, and MTF_GLP were ranked second. HPFC performed the worst. LMM, LMVM, and MTF_GLP_HPM achieved the best, second-best, and worst ERGAS scores, respectively. Other fusion algorithms posting ERGAS values close to eight were HPFC, IHS, and SFIM. Four fusion methods were tied for the best UIQI score: ATWT_M3, Brovey, HCS, and LMM. ATWT_M2 and GS were ranked second, and the HPFC fusion method was ranked last.

4. Discussion

WorldView 3 pan-sharpened satellite imagery was evaluated as a tool for assessing a common but complex agricultural landscape found in rural areas of Mississippi. The image fusion resulted in new imagery with more detail than the original multispectral or panchromatic images. Overall, ATWT_M3 and HCS were often ranked among the best fusion algorithms in this study (Table 1 and Table 2). It was easily seen on the synthesized traditional color and color infrared composite imagery of ATWT_M3 (Figure 2 and Figure 4) that its image integrity was almost identical to the original imagery. Its colors did not appear washed out like those of the IHS (Figure 3 and Figure 5), GS (Figure 3 and Figure 5), and Brovey (not shown) fusion methods and, to a lesser extent, the HPFC method (Figure 3 and Figure 4). The SFIM (Figure 3 and Figure 5) and MTF_GLP_HPM pan-sharpened images were ruled out for further visual assessment because of the artifacts they added to the images.

Reference [5] compared eleven different pan-sharpening approaches on hyperspectral data and reported that the Bayesian sparse method provided the best performance. The results were consistent across three landscapes: a mixed urban/rural scene, a rural area with different crop types, and a rural area with small villages. Furthermore, their study included four pan-sharpening algorithms equivalent to ones used in the current research: SFIM, MTF_GLP, MTF_GLP_HPM, and GS. They used ERGAS as one of the quality indices to evaluate image quality. For the rural landscape, their order from best to worst was MTF_GLP_HPM, MTF_GLP, SFIM, and GS. Their findings were based on a reduced-resolution dataset analysis, equivalent to Experiment 2 in this study, in which the rankings were MTF_GLP, GS, SFIM, and MTF_GLP_HPM.

Reference [6], in their study of twenty-one pan-sharpening techniques applied to visible and shortwave infrared Sentinel-2 satellite imagery of diverse land cover types, ranked MTF_GLP_CBD best. Sixteen of their pan-sharpening algorithms were equivalent to ones evaluated in the current study. Generally, SFIM, HPF, MTF_GLP_HPM, MTF_GLP_CBD, and MTF_GLP were ranked in their top ten for both the reduced-resolution and the full-resolution pan-sharpened datasets. In contrast, this study’s consistently top-ranked pan-sharpening algorithms were ATWT_M3, HCS, ATWT_M2, LMVM, and LMM. Reference [6] also reported artifacts in pan-sharpened imagery created by the SFIM method.

Reference [48] observed in their evaluation of pan-sharpened WorldView 2 satellite imagery that different algorithms were ranked best when applied to the degraded and to the original data sets. The findings of this study concurred with those results. Overall, the bias values appeared to be the least affected by the resolution of the data, followed by ERGAS. UIQI was affected the most; its overall range from the worst to the best pan-sharpening method was reduced when using the original versus the degraded data sets.

Finally, other researchers have also reported spatial and spectral distortions in satellite imagery subjected to pan-sharpening [3] [6] [23] [48]. The differences between the present study and the studies discussed above could be attributed to differences in the landscapes and in the sensor systems used to acquire the imagery for the pan-sharpening process.

5. Conclusion

WorldView 3 satellite imagery subjected to pan-sharpening algorithms can be a great asset for assessing rural agricultural landscapes in Mississippi. The researcher or user is often perplexed about which sharpening algorithm to use. The beauty of the software used in this study is that the practitioner has 18 tools at their disposal and thus can determine the best tools from that suite to solve a given problem. Future research initiatives should focus on whether the quality index values are better when subsets of satellite scenes of rural Mississippi are analyzed rather than whole satellite scenes, and on how pan-sharpened images affect the classification accuracy of the algorithms used to derive maps.

Acknowledgements

Funding was received from the United States Department of Agriculture. Statements made in this article are the author’s opinion and do not represent the opinion of the Agricultural Research Service. Maxar Technologies acquired the satellite imagery used in the study.

Conflicts of Interest

The author declares no conflicts of interest regarding the publication of this paper.

References

[1] Wald, L. (2000) Quality of High Resolution Synthesized Images: Is There a Simple Criterion? In: Ranchin, T. and Wald, L., Eds., Third Conference “Fusion of Earth Data: Merging Point Measurements, Raster Maps and Remotely Sensed Images”, SEE/URISCA, Sophia Antipolis, 99-103.
https://hal.archives-ouvertes.fr/hal-00395027
[2] Vivone, G. (2019) Robust Band-Dependent Spatial-Detail Approaches for Panchromatic Sharpening. IEEE Transactions on Geoscience and Remote Sensing, 57, 6421-6433.
https://doi.org/10.1109/TGRS.2019.2906073
[3] Vivone, G., Alparone, L., Chanussot, J., Dalla Mura, M., Garzelli, A., Licciardi, G.A., Restaino, R. and Wald, L. (2015) A Critical Comparison among Pansharpening Algorithms. IEEE Transactions on Geoscience and Remote Sensing, 53, 2565-2586.
https://doi.org/10.1109/TGRS.2014.2361734
[4] Alparone, L., Aiazzi, B., Baronti, S., Garzelli, A., Nencini, F. and Selva, M. (2008) Multispectral and Panchromatic Data Fusion Assessment without Reference. Photogrammetric Engineering & Remote Sensing, 74, 193-200.
https://doi.org/10.14358/PERS.74.2.193
[5] Loncan, L., Almeida, L.B., Bioucas-Dias, J.M., Briottet, X., Chanussot, J., Dobigeon, N., Fabre, S., Liao, W., Licciardi, G.A., Simões, M., Tourneret, J.-Y., Veganzones, M.A., Vivone, G., Wei, Q. and Yokoya, N. (2015, April 17) Hyperspectral Pansharpening: A Review.
http://arxiv.org/abs/1504.04531
[6] Vaiopoulos, A. and Karantzalos, K. (2016) Pansharpening on the Narrow VNIR and SWIR Spectral Bands of Sentinel-2. ISPRS International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XLI-B7, 723-730.
https://doi.org/10.5194/isprsarchives-XLI-B7-723-2016
[7] Thomas, C., Ranchin, T., Wald, L. and Chanussot, J. (2008) Synthesis of Multispectral Images to High Spatial Resolution: A Critical Review of Fusion Methods Based on Remote Sensing Physics. IEEE Transactions on Geoscience and Remote Sensing, 46, 1301-1312.
https://doi.org/10.1109/TGRS.2007.912448
[8] Garzelli, A. (2016) A Review of Image Fusion Algorithms Based on the Super-Resolution Paradigm. Remote Sensing, 8, 797.
https://doi.org/10.3390/rs8100797
[9] Kahraman, S. and Ertürk, A. (2017) A Comprehensive Review of Pansharpening Algorithms for Göktürk-2 Satellite Images. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, IV-4/W4, 263-270.
https://doi.org/10.5194/isprs-annals-IV-4-W4-263-2017
[10] Yang, X.H. and Jiao, L.C. (2008) Fusion Algorithm for Remote Sensing Images Based on Nonsubsampled Contourlet Transform. Acta Automatica Sinica, 34, 274-281.
https://doi.org/10.3724/SP.J.1004.2008.00274
[11] Saeedi, J. and Faez, K. (2011) A New Pan-Sharpening Method Using Multiobjective Particle Swarm Optimization and the Shiftable Contourlet Transform. ISPRS Journal of Photogrammetry and Remote Sensing, 66, 365-381.
https://doi.org/10.1016/j.isprsjprs.2011.01.006
[12] Amarsaikhan, D., Blotevogel, H.H., Van Genderen, J.L., Ganzorig, M., Gantuya, R. and Nergui, B. (2010) Fusing High-Resolution SAR and Optical Imagery for Improved Urban Land Cover Study and Classification. International Journal of Image and Data Fusion, 1, 83-97.
https://doi.org/10.1080/19479830903562041
[13] Gilbertson, J.K., Kemp, J. and van Niekerk, A. (2017) Effect of Pan-Sharpening Multi-Temporal Landsat 8 Imagery for Crop Type Differentiation Using Different Classification Techniques. Computers and Electronics in Agriculture, 134, 151-159.
https://doi.org/10.1016/j.compag.2016.12.006
[14] Xu, Y., Smith, S.E., Grunwald, S., Abd-Elrahman, A. and Wani, S.P. (2018) Effects of Image Pansharpening on Soil Total Nitrogen Prediction Models in South India. Geoderma, 320, 52-66.
https://doi.org/10.1016/j.geoderma.2018.01.017
[15] Fleming, S., Jordan, T., Madden, M., Usery, E.L. and Welch, R. (2009) GIS Applications for Military Operations in Coastal Zones. ISPRS Journal of Photogrammetry and Remote Sensing, 64, 213-222.
https://doi.org/10.1016/j.isprsjprs.2008.10.004
[16] Carper, W.J., Lillesand, T.M. and Kiefer, R.W. (1990) The Use of Intensity-Hue-Saturation Transformations for Merging SPOT Panchromatic and Multispectral Image Data. Photogrammetric Engineering and Remote Sensing, 56, 459-467.
[17] Al-Wassai, F.A. and Kalyankar, N.V. (2011) The IHS Transformations Based Image Fusion. International Journal of Advanced Research in Computer Science, 2, 70-77.
[18] Zhou, X., Liu, J., Liu, S., Cao, L., Zhou, Q. and Huang, H. (2014) A GIHS-Based Spectral Preservation Fusion Method for Remote Sensing Images Using Edge Restored Spectral Modulation. ISPRS Journal of Photogrammetry and Remote Sensing, 88, 16-27.
https://doi.org/10.1016/j.isprsjprs.2013.11.011
[19] Welch, R. and Ehlers, M. (1987) Merging Multiresolution SPOT HRV and Landsat TM Data. Photogrammetric Engineering and Remote Sensing, 53, 301-303.
[20] Aiazzi, B., Baronti, S., Selva, M. and Alparone, L. (2007) MS + Pan Image Fusion by an Enhanced Gram-Schmidt Spectral Sharpening. In: Bochenek, Z., Ed., New Developments and Challenges in Remote Sensing, Millpress, Rotterdam, 113-120.
[21] Maurer, T. (2013) How to Pan-Sharpen Images Using the Gram-Schmidt Pan-Sharpen Method—A Recipe. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XL-1/W1, 239-244.
https://doi.org/10.5194/isprsarchives-XL-1-W1-239-2013
[22] Aiazzi, B., Baronti, S. and Selva, M. (2007) Improving Component Substitution Pansharpening through Multivariate Regression of MS + Pan Data. IEEE Transactions on Geoscience and Remote Sensing, 45, 3230-3239.
https://doi.org/10.1109/TGRS.2007.901007
[23] Li, X., Chen, H., Zhou, J. and Wang, Y. (2020) Improving Component Substitution Pan-Sharpening through Refinement of the Injection Detail. Photogrammetric Engineering & Remote Sensing, 86, 317-325.
https://doi.org/10.14358/PERS.86.5.317
[24] Teggi, S., Cecchi, R. and Serafini, F. (2003) TM and IRS-1C-PAN Data Fusion Using Multiresolution Decomposition Methods Based on the “a Tròus” Algorithm. International Journal of Remote Sensing, 24, 1287-1301.
https://doi.org/10.1080/01431160210144561
[25] Amolins, K., Zhang, Y. and Dare, P. (2007) Wavelet Based Image Fusion Techniques—An Introduction, Review and Comparison. ISPRS Journal of Photogrammetry and Remote Sensing, 62, 249-263.
https://doi.org/10.1016/j.isprsjprs.2007.05.009
[26] Dong, L., Yang, Q., Wu, H., Xiao, H. and Xu, M. (2015) High Quality Multispectral and Panchromatic Image Fusion Technologies Based on Curvelet Transform. Neurocomputing, 159, 268-274.
https://doi.org/10.1016/j.neucom.2015.01.050
[27] Gonzalez-Audicana, M., Saleta, J.L., Catalan, R.G. and Garcia, R. (2004) Fusion of Multispectral and Panchromatic Images Using Improved IHS and PCA Mergers Based on Wavelet Decomposition. IEEE Transactions on Geoscience and Remote Sensing, 42, 1291-1299.
https://doi.org/10.1109/TGRS.2004.825593
[28] Chen, F., Qin, F., Peng, G. and Chen, S. (2012) Fusion of Remote Sensing Images Using Improved ICA Mergers Based on Wavelet Decomposition. Procedia Engineering, 29, 2938-2943.
https://doi.org/10.1016/j.proeng.2012.01.418
[29] Padwick, C., Deskevich, M., Pacifici, F. and Smallwood, S. (2010) WorldView-2 Pan-Sharpening. ASPRS 2010 Annual Conference, San Diego, 26-30 April 2010, 14.
[30] Chikr El-Mezouar, M., Kpalma, K., Taleb, N. and Ronsin, J. (2014) A Pan-Sharpening Based on the Non-Subsampled Contourlet Transform: Application to Worldview-2 Imagery. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 7, 1806-1815.
https://doi.org/10.1109/JSTARS.2014.2306332
[31] Lillo-Saavedra, M., Gonzalo-Martín, C., García-Pedrero, A. and Lagos, O. (2016) Scale-Aware Pansharpening Algorithm for Agricultural Fragmented Landscapes. Remote Sensing, 8, 870.
https://doi.org/10.3390/rs8100870
[32] QGIS Development Team (2021) QGIS Geographic Information System. Open Source Geospatial Foundation Project.
http://qgis.osgeo.org
[33] Zhang, X. and Li, D. (2001) À Trous Wavelet Decomposition Applied to Image Edge Detection. Annals of GIS, 7, 119-123.
https://doi.org/10.1080/10824000109480563
[34] Otazu, X., Gonzalez-Audicana, M., Fors, O. and Nunez, J. (2005) Introduction of Sensor Spectral Response into Image Fusion Methods. Application to Wavelet-Based Methods. IEEE Transactions on Geoscience and Remote Sensing, 43, 2376-2385.
https://doi.org/10.1109/TGRS.2005.856106
[35] Liu, J.G. (2000) Smoothing Filter-Based Intensity Modulation: A Spectral Preserve Image Fusion Technique for Improving Spatial Details. International Journal of Remote Sensing, 21, 3461-3472.
https://doi.org/10.1080/014311600750037499
[36] Addesso, P., Restaino, R. and Vivone, G. (2021) An Improved Version of the Generalized Laplacian Pyramid Algorithm for Pansharpening. Remote Sensing, 13, 3386-3405.
https://doi.org/10.3390/rs13173386
[37] Aiazzi, B., Alparone, L., Baronti, S., Garzelli, A. and Selva, M. (2006) MTF-Tailored Multiscale Fusion of High-Resolution MS and Pan Imagery. Photogrammetric Engineering & Remote Sensing, 72, 591-596.
https://doi.org/10.14358/PERS.72.5.591
[38] Benzenati, T., Kessentini, Y., Kallel, A. and Hallabia, H. (2019) Generalized Laplacian Pyramid Pan-Sharpening Gain Injection Prediction Based on CNN. IEEE Geoscience and Remote Sensing Letters, 99, 1-5.
[39] Chavez, P.S. (1988) Comparison of the Spectral Information Content of Landsat Thematic Mapper and SPOT for Three Different Sites in the Phoenix, Arizona Region. Photogrammetric Engineering and Remote Sensing, 54, 1699-1708.
[40] Laben, C.A. and Brower, B.V. (2000) Process for Enhancing the Spatial Resolution of Multispectral Imagery Using Pan-Sharpening. US Patent No. 6011875.
https://patents.google.com/patent/US6011875A/en
[41] Jat, M., Garg, P. and Dahiya, S. (2013) A Comparative Study of Various Pixel Based Image Fusion Techniques as Applied to an Urban Environment. International Journal of Image and Data Fusion, 4, 197-213.
https://doi.org/10.1080/19479832.2013.778335
[42] Ranchin, T. and Wald, L. (2000) Fusion of High Spatial and Spectral Resolution Images: The ARSIS Concept and Its Implementation. Photogrammetric Engineering and Remote Sensing, 66, 49-61.
[43] de Béthune, S., Muller, F. and Donnay, J.-P. (1998) Fusion of Multispectral and Panchromatic Images by Local Mean and Variance Matching Filtering Techniques. Proceedings of the 2nd International Conference—Fusion of Earth Data Merging Point Measurements, Raster Maps and Remotely Sensed Images, Antipolis, 28-30 January 1998, 31-36.
[44] de Béthune, S., Muller, F. and Binard, M. (1997) Adaptive Intensity Matching Filters: A New Tool for Multiresolution Data Fusion. Proceedings of the AGARD Conference 595, Multi-Sensor Systems and Data Fusion for Telecommunications, Remote Sensing and Radar, Lisbon, 29 September-2 October 1997, 28:1-28:15.
[45] Gillespie, A.R., Kahle, A.B. and Walker, R.E. (1987) Color Enhancement of Highly Correlated Images. II. Channel Ratio and “Chromaticity” Transformation Techniques. Remote Sensing of Environment, 22, 343-365.
https://doi.org/10.1016/0034-4257(87)90088-5
[46] Choi, M. (2006) A New Intensity-Hue-Saturation Fusion Approach to Image Fusion with a Tradeoff Parameter. IEEE Transactions on Geoscience and Remote Sensing, 44, 1672-1682.
https://doi.org/10.1109/TGRS.2006.869923
[47] Chaudhary, S.K., Kumar, D. and Jain, M.K. (2016) Performance Analysis of Hyperspherical Colour Sharpening Method for IRS Satellite Images. The Imaging Science Journal, 64, 305-312.
https://doi.org/10.1080/13682199.2016.1190898
[48] Nikolakopoulos, K. and Oikonomidis, D. (2015) Quality Assessment of Ten Fusion Techniques Applied on Worldview-2. European Journal of Remote Sensing, 48, 141-167.
https://doi.org/10.5721/EuJRS20154809
[49] Jawak, S.D. and Luis, A.J. (2013) A Comprehensive Evaluation of PAN-Sharpening Algorithms Coupled with Resampling Methods for Image Synthesis of Very High Resolution Remotely Sensed Satellite Data. Advances in Remote Sensing, 2, 332-344.
https://doi.org/10.4236/ars.2013.24036
[50] Johnson, B. (2014) Effects of Pansharpening on Vegetation Indices. ISPRS International Journal of Geo-Information, 3, 507-522.
https://doi.org/10.3390/ijgi3020507
[51] Wang, Z. and Bovik, A.C. (2002) A Universal Image Quality Index. IEEE Signal Processing Letters, 9, 81-84.
https://doi.org/10.1109/97.995823

Copyright © 2024 by authors and Scientific Research Publishing Inc. This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.