
Taiwan J For Sci 30(3): 201-9, 2015

Research note

Applying Image Fusion to Integrate Radar Images and SPOT Multi-spectral Satellite Images

for Forest Type Classification

Chia-Hao Chang,1) Yi-Ta Hsieh,1) Shou-Tsung Wu,2) Chaur-Tzuhn Chen,1) Jan-Chang Chen1,3)

【Summary】

Forest type mapping requires considerable manpower and resources. Therefore, using remote sensing techniques to reduce resource requirements is common in forest inventories. Remote sensing includes active radar and passive optical sensors, which provide different kinds of information. Conventional remote sensing methods for forest type classification mainly use optical images, but these are affected by weather and day/night conditions, with the consequence that results are not always accurate. Synthetic aperture radar (SAR) relies on microwave radiation and is not affected by the above restrictions. Some studies have combined SAR and optical images to increase the accuracy of forest type classification. In this study, we used ALOS PALSAR L-band images with a wavelength of about 0.23 m. We used a combination of spectral features and roughness information to improve the classification accuracy. For analysis and processing, radar images and multispectral satellite images were combined using the intensity, hue, and saturation (IHS) and wavelet transformation (WT) methods. We used the IHS images, WT images, and SPOT images to classify the forest types by the maximum likelihood method. Results showed that the overall accuracy was 83.86% with a kappa value of 0.81 for IHS, and the overall accuracy was 72.86% with a kappa value of 0.68 for WT. These results were better than those based on the SPOT images alone, for which the overall accuracy was 65.71% and the kappa value was 0.60. We found that combining SAR and optical images improved the accuracy by approximately 18 percentage points, thereby improving forest type classification.

Key words: synthetic aperture radar, multi-spectral image, image fusion, forest type classification.

Chang CH, Hsieh YT, Wu ST, Chen CT, Chen JC. 2015. Applying image fusion to integrate radar images and SPOT multi-spectral satellite images for forest type classification. Taiwan J For Sci 30(3):201-9.

1) Department of Forestry, National Pingtung Univ. of Science and Technology, 1 Xuefu Rd., Neipu Township, Pingtung 91201, Taiwan.

2) Department of Tourism Management, Shih Chien Univ., 200 Univ. Rd., Neimen Township, Kaohsiung 84550, Taiwan.

3) Corresponding author, e-mail: [email protected].

Received October 2014, Accepted April 2015.


Research note

Applying Image Fusion to Integrate Radar Images and SPOT Multi-spectral Images for Forest Type Classification

Chia-Hao Chang,1) Yi-Ta Hsieh,1) Shou-Tsung Wu,2) Chaur-Tzuhn Chen,1) Jan-Chang Chen1,3)

Abstract

Forest type classification often requires a large investment of manpower and material resources, and remote sensing techniques can be used to reduce survey costs. Remote sensing sensors are mainly divided into active radar sensors and passive optical sensors, and the two kinds of images contain different information. Conventional remote sensing approaches to land cover classification rely mostly on optical images, but they are often limited by weather and day/night conditions and cannot always yield accurate results, whereas synthetic aperture radar (SAR) mainly uses microwaves and is not subject to these limitations, giving it advantages for further development and for classification applications; many studies have indicated that fusing optical and SAR images for land cover classification can provide more-accurate information. This study used the ALOS/PALSAR L band and improved the classification accuracy through spectral characteristics and surface roughness information. Radar images and multispectral satellite images were fused with the IHS transformation fusion method and the wavelet transformation fusion method, and forest types were then classified with the maximum likelihood method (MLC). The IHS fusion classification reached an accuracy of 83.86% with a kappa value of 0.8152, and the wavelet fusion classification reached 72.86% with a kappa value of 0.6889, both higher than the 65.71% accuracy and 0.6052 kappa value of the original SPOT image classification. The image classification accuracy was thus improved by about 18 percentage points, demonstrating the value of fusing radar images and multispectral satellite images.

Key words: synthetic aperture radar, multi-spectral image, image fusion, forest type classification.

Chang CH, Hsieh YT, Wu ST, Chen CT, Chen JC. 2015. Applying image fusion to integrate radar images and SPOT multi-spectral satellite images for forest type classification. Taiwan J For Sci 30(3):201-9.

INTRODUCTION

Forest type mapping requires considerable manpower and material resources. Therefore, using remote sensing techniques to reduce resource requirements is common in forest inventories. Remote sensing includes active radar and passive optical sensors, which provide different information. Conventional remote sensing methods for land cover classification mainly use optical images, but these are affected by weather and day/night conditions, negatively impacting their accuracy. Classification of land cover is one of the primary objectives of the analysis of remote sensing data. Single-band synthetic aperture radar (SAR) data do not provide highly accurate land cover classification (Herold et al. 2004, Torma et al. 2004). SAR uses microwave radiation, so it is not affected by the restrictions mentioned above. Some studies have combined SAR and optical images to improve the accuracy of land cover classification (Podest and Saatchi 2002). This study combined the advantages of active and passive sensors to obtain more-detailed and more-accurate information on surface features, thereby enhancing the applicability of radar images and multispectral satellite image data. Moderate-resolution multispectral satellite imagery can be fused with higher-resolution radar imagery to improve the interpretability of the fused images (Vyjayanthi 2008).

Some studies have used intensity, hue, and saturation (IHS) and wavelet transformation (WT) methods to combine SAR and multispectral images. Such a process can yield spectral information better than that of the original images while retaining SAR surface features with greater clarity than in the original images (Hong et al. 2009, Lamyaa and Salwa 2010). Combining spectral characteristics with surface roughness information can therefore improve the accuracy of land use type classification. Consequently, this study investigated the impact of fusing radar images and multispectral image data on forest land cover classification.
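As an illustration of the IHS-substitution idea described above, the following is a minimal sketch in Python (NumPy); it is not the exact procedure used in this study. The array names, the [0, 1] scaling, and the simple mean/standard-deviation matching of the SAR band are illustrative assumptions, and the sketch uses the common "fast IHS" formulation, in which substituting the intensity component is equivalent to adding the intensity difference to each band.

```python
import numpy as np

def ihs_fusion(spot_bands, sar, eps=1e-6):
    """Fast IHS-substitution fusion of a 3-band optical composite with a SAR band.

    spot_bands: float array, shape (3, rows, cols); co-registered SPOT bands
                (e.g., NIR, R, G) scaled to [0, 1].
    sar:        float array, shape (rows, cols); co-registered, speckle-filtered
                SAR backscatter scaled to [0, 1].
    """
    # Intensity component of the triangular IHS model: I = (b1 + b2 + b3) / 3.
    intensity = spot_bands.mean(axis=0)

    # Match the SAR band to the intensity component (mean/std matching) so the
    # substitution does not shift the overall brightness of the fused image.
    sar_matched = (sar - sar.mean()) / (sar.std() + eps) * intensity.std() + intensity.mean()

    # Substituting the intensity and inverting the transform is equivalent to
    # adding the intensity difference to every band; hue and saturation are kept.
    fused = spot_bands + (sar_matched - intensity)[None, :, :]
    return np.clip(fused, 0.0, 1.0)
```

The fused bands keep the spectral (hue and saturation) content of the SPOT composite, while the spatial detail and surface-roughness contrast of the radar backscatter enter through the substituted intensity.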

The study area was located in the Da-Chia Stream working circle in central Taiwan. The terrain consists of valley and mountain topography (Fig. 1).

The most common forest types in this area are Chamaecyparis formosensis (red cypress) and Cryptomeria japonica (sugi) plantations, and natural forests of mixed hardwoods.

We used JAXA ALOS/PALSAR image data in combination with SPOT multispectral images for the analysis. Aerial photographs were used to validate the land cover composition, and the image acquisition times and characteristics are shown in Table 1. A land-use map was also utilized in this study as an auxiliary tool to help interpret the aerial photos. The land-use map was part of the Third Forest Resources and Land Use Inventory in Taiwan (by the Forestry Bureau, Taipei, Taiwan). However, this map was compiled in 1989~1993, and some changes in land cover could have occurred since then.

Fig. 1. Study area.

Table 1. Image acquisition dates
Sensor               ALOS/PALSAR        SPOT-4
Sensor type          Active, radar      Passive, optical
Acquisition date     9 Nov. 2011        2 Nov. 2010
Wavelength           0.23 m (L band)    0.50~0.59, 0.61~0.68, 0.78~0.89 µm (G, R, NIR)
Spatial resolution   20 m               20 m
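Pixel-by-pixel fusion assumes that the two images share the same 20-m grid (Table 1). The study's actual preprocessing (speckle filtering, terrain correction, georeferencing) is not reproduced here; the following is only a minimal resampling sketch, assuming the rasterio library and placeholder file paths.

```python
import numpy as np
import rasterio
from rasterio.warp import reproject, Resampling

def load_coregistered(sar_path, spot_path):
    """Read the SAR band and resample it onto the SPOT 20-m grid so that the
    two images can be fused pixel by pixel. File paths are placeholders."""
    with rasterio.open(spot_path) as spot:
        spot_bands = spot.read().astype("float32")        # (bands, rows, cols)
        dst_transform, dst_crs = spot.transform, spot.crs
        rows, cols = spot.height, spot.width

    sar_on_spot_grid = np.zeros((rows, cols), dtype="float32")
    with rasterio.open(sar_path) as sar:
        reproject(
            source=rasterio.band(sar, 1),
            destination=sar_on_spot_grid,
            dst_transform=dst_transform,
            dst_crs=dst_crs,
            resampling=Resampling.bilinear,  # onto the 20-m SPOT grid (Table 1)
        )
    return spot_bands, sar_on_spot_grid
```

The SPOT grid is used as the target so that the SAR backscatter can be combined directly with the multispectral bands in the fusion steps that follow.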


To enhance the applicability of data from radar images and multispectral satellite images, in this study we used the ALOS PALSAR L-band image and a combination of spectral features and roughness information to improve the classification accuracy. A flow chart of the methodology followed in the present study is given in Fig. 2.

Fig. 2. Research flow.
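As a complement to the IHS sketch above, the wavelet-transformation fusion step of the flow can be sketched as follows. The wavelet family (db2), the decomposition level, and the maximum-absolute-value selection rule for the detail coefficients are illustrative assumptions (the study does not specify them), and the PyWavelets package is assumed.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_fusion(optical_band, sar, wavelet="db2", level=2):
    """Wavelet-transform fusion of one optical band with a co-registered SAR band.

    Keeps the approximation (low-frequency, spectral) coefficients of the optical
    band and injects detail (high-frequency, texture/roughness) coefficients from
    the SAR band. Repeat per optical band to fuse a multispectral image.
    """
    opt_coeffs = pywt.wavedec2(optical_band, wavelet, level=level)
    sar_coeffs = pywt.wavedec2(sar, wavelet, level=level)

    # coeffs[0] is the approximation; coeffs[1:] are (cH, cV, cD) detail tuples.
    fused_coeffs = [opt_coeffs[0]]
    for opt_detail, sar_detail in zip(opt_coeffs[1:], sar_coeffs[1:]):
        # Simple selection rule: keep whichever detail coefficient has the
        # larger absolute value.
        fused_coeffs.append(tuple(
            np.where(np.abs(o) >= np.abs(s), o, s)
            for o, s in zip(opt_detail, sar_detail)
        ))

    fused = pywt.waverec2(fused_coeffs, wavelet)
    # waverec2 can pad by one row/column; crop back to the input size.
    return fused[:optical_band.shape[0], :optical_band.shape[1]]
```

The approximation coefficients preserve the optical band's spectral information, while the injected SAR detail coefficients carry the surface-roughness texture on which the subsequent classification relies.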

For the classification of the original SPOT images, the overall accuracy was 65.71%, and the overall kappa value was 0.6052 (Table 2, Fig. 3). Unvegetated areas formed the most accurately classified class, with an accuracy of 96% and a kappa of about 0.60 (Table 2). The pine and Fargesia forest types were next, with commission errors of 13 and 14% and omission errors of 28 and 43%, respectively (Table 2). The juniper forest type was the least accurate, with a commission error of 59%, and that of the spruce forest type was about 55% (Table 2). The highest omission errors were for the hemlock and fir forest types, at about 57 and 48%, respectively (Table 2).

IHS fusion image classification had an overall accuracy of 83.86% and an overall kappa value of 0.8152 (Table 3, Fig. 4). The Fargesia forest type, fruit tree forest type, and unvegetated areas were the highest classes of all, with accuracies of 100% (Table 3). In terms of commission errors, that of the fir forest type was the highest at about 38%, followed by the juniper forest type at about 33% (Table 3). In terms of omission errors, the pine forest type was the highest at about 43%, followed by the hemlock forest type at about 35% (Table 3). The remaining types had good user's and producer's accuracies.




Fig. 3. Forest type classification from SPOT image.

Table 2. SPOT classification error matrix
Class                     Fir  Hemlock  Juniper   Pine  Spruce  Fargesia  Fruit trees  Unvegetated areas  Total  User's accuracy (%)  Commission error (%)
Fir                        12        5        1      3       2         0            0                  0     23                52.17                 47.83
Hemlock                     6       15        0      4       1         0            0                  0     26                57.69                 42.31
Juniper                     1        3       11      5       7         0            0                  0     27                40.74                 59.26
Pine                        1        4        0     33       0         0            0                  0     38                86.84                 13.16
Spruce                      3        6        6      1      13         0            0                  0     29                44.83                 55.17
Fargesia                    0        0        0      0       0        12            0                  2     14                85.71                 14.29
Fruit trees                 0        2        0      0       0         9           22                  0     33                66.67                 33.33
Unvegetated areas           0        0        0      0       0         0            0                 20     20               100                     0
Total                      23       35       18     46      23        21           22                 22    210
Producer's accuracy (%) 52.17    42.86    61.11  71.74   56.52     57.14          100              90.91
Omission error (%)      47.83    57.14    38.89  28.26   43.48     42.86            0               9.09

Overall accuracy = 65.71%, kappa = 0.6052.
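For reference, the overall accuracy and kappa reported under Table 2 can be recomputed from the error matrix with the standard formulas. The short sketch below (NumPy) uses the Table 2 counts and reproduces approximately 65.71% and 0.6052.

```python
import numpy as np

def accuracy_and_kappa(confusion):
    """Overall accuracy and kappa from an error (confusion) matrix whose rows
    are classified classes and whose columns are reference classes."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    observed = np.trace(confusion) / total                       # overall accuracy
    expected = (confusion.sum(axis=1) * confusion.sum(axis=0)).sum() / total**2
    return observed, (observed - expected) / (1.0 - expected)

# Rows of Table 2 (SPOT classification): Fir, Hemlock, Juniper, Pine,
# Spruce, Fargesia, Fruit trees, Unvegetated areas.
table2 = [
    [12,  5,  1,  3,  2,  0,  0,  0],
    [ 6, 15,  0,  4,  1,  0,  0,  0],
    [ 1,  3, 11,  5,  7,  0,  0,  0],
    [ 1,  4,  0, 33,  0,  0,  0,  0],
    [ 3,  6,  6,  1, 13,  0,  0,  0],
    [ 0,  0,  0,  0,  0, 12,  0,  2],
    [ 0,  2,  0,  0,  0,  9, 22,  0],
    [ 0,  0,  0,  0,  0,  0,  0, 20],
]
oa, kappa = accuracy_and_kappa(table2)
print(f"overall accuracy = {oa:.2%}, kappa = {kappa:.4f}")  # ~65.71%, ~0.6052
```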


Fig. 4. Forest type classification from intensity, hue, and saturation fusion image.

Table 3. Intensity, hue, and saturation classification error matrix
Class                     Fir  Hemlock  Juniper   Pine  Spruce  Fargesia  Fruit trees  Unvegetated areas  Total  User's accuracy (%)  Commission error (%)
Fir                        18        0        0     11       0         0            0                  0     29                62.07                 37.93
Hemlock                     0       24        0      6       0         0            0                  0     30                80                    20
Juniper                     0        9       20      1       0         0            0                  0     30                66.67                 33.33
Pine                        0        0        0     30       0         0            0                  0     30               100                     0
Spruce                      0        4        0      5      21         0            0                  0     30                70                    30
Fargesia                    0        0        0      0       0        24            0                  0     24               100                     0
Fruit trees                 0        0        0      0       0         0           22                  0     22               100                     0
Unvegetated areas           0        0        0      0       0         0            0                 28     28               100                     0
Total                      18       37       20     53      21        24           22                 28    223
Producer's accuracy (%)   100    64.86      100   56.6     100       100          100                100
Omission error (%)          0    35.14        0   43.4       0         0            0                  0

Overall accuracy = 83.86%, kappa = 0.8152.


WT fusion image classification had an overall accuracy of 72.86% and an overall kappa value of 0.6889 (Table 4, Fig. 5). Unvegetated areas were the highest class of all, with an accuracy of 100% (Table 4). As to the classes with lower accuracies, in terms of commission errors, FL ranked behind SL with an accuracy of 86% (Table 4), and CL was the least accurate at 47%; presumably the spectral ranges were too close, so that GL and FL were confused with the CL class. Omission errors for the spruce, hemlock, pine, fir, and juniper forest types were about 67, 47, 35, 23, and 23%, respectively (Table 4). The remaining land cover types had good user's and producer's accuracies (Table 4).


Fig. 5. Forest type classification from wavelet transformation fusion image.


Table 4. Wavelet transformation classification error matrix
Class                     Fir  Hemlock  Juniper   Pine  Spruce  Fargesia  Fruit trees  Unvegetated areas  Total  User's accuracy (%)  Commission error (%)
Fir                        23        8        0      7       1         1            0                  0     40                57.5                  42.5
Hemlock                     4       16        0      6       1         0            0                  0     27                59.26                 40.74
Juniper                     0        3       10      2      14         0            0                  0     29                34.48                 65.52
Pine                        0        0        0     28       0         0            0                  0     28               100                     0
Spruce                      2        3        3      0       8         0            0                  0     16                50                    50
Fargesia                    0        0        0      0       0        19            0                  0     19               100                     0
Fruit trees                 1        0        0      0       0         1           24                  0     26                92.31                  7.69
Unvegetated areas           0        0        0      0       0         0            0                 25     25               100                     0
Total                      30       30       13     43      24        21           24                 25    210
Producer's accuracy (%) 76.67    53.33    76.92  65.12   33.33     90.48          100                100
Omission error (%)      23.33    46.67    23.08  34.88   66.67      9.52            0                  0

Overall accuracy = 72.86%, kappa = 0.6889.



A comparison of the overall accuracies showed that the IHS fusion image classification performed the best (83.86%) (Table 3), followed by the WT fusion image classification (72.86%) (Table 4), while the SPOT image classification (65.71%) performed the worst (Table 2). The results indicate that the fused images improved the classification accuracy over the original images, and that IHS fusion improved the classification accuracy the most. These results are similar to those of a previous study, which contended that IHS image fusion is helpful in improving classification accuracy (Jeya et al. 2014). SAR image textures can improve the mapping accuracy (Dekker 2003, Gaia et al. 2013); therefore, integrating optical and SAR sensors always produces significantly higher accuracies than those obtained from single optical sensors (Blaes et al. 2005, Corbane et al. 2008, Zhua et al. 2012).

In this study, we used ALOS PALSAR L-band images with a wavelength of about 0.23 m. Based on the combination of spectral features and roughness information, we were able to improve the classification accuracy. For analysis and processing, radar images and multispectral satellite images were combined by the IHS and WT methods. We used the IHS images, WT images, and SPOT images to classify the land cover, and the images were classified by the maximum likelihood method. IHS fusion image classification performed the best, followed by WT fusion image classification, while SPOT image classification performed the worst. The results indicate that fused images can improve the classification accuracy over optical image classification. Consequently, the methods discussed here improved the land classification accuracy by approximately 18 percentage points, demonstrating that combining SAR and optical images is beneficial for land cover classification.
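The maximum likelihood classification applied to the SPOT, IHS, and WT images is not spelled out in the text; under the usual assumption of multivariate Gaussian class distributions with equal priors, a minimal sketch looks like the following. The function and variable names are illustrative, and in practice the training samples would come from the reference polygons interpreted on the aerial photographs.

```python
import numpy as np

def train_mlc(samples_per_class):
    """Fit per-class mean vectors and covariance matrices from training pixels.
    samples_per_class: dict mapping class name -> array of shape (n_pixels, n_bands)."""
    return {
        name: (x.mean(axis=0), np.cov(x, rowvar=False))
        for name, x in samples_per_class.items()
    }

def classify_mlc(pixels, stats):
    """Assign each pixel (shape (n_pixels, n_bands)) to the class with the
    largest Gaussian log-likelihood (equal priors assumed)."""
    names = list(stats)
    scores = []
    for name in names:
        mean, cov = stats[name]
        inv, (_, logdet) = np.linalg.inv(cov), np.linalg.slogdet(cov)
        diff = pixels - mean
        # Log-likelihood up to a constant: -0.5 * (log|C| + d' C^-1 d).
        mahal = np.einsum("ij,jk,ik->i", diff, inv, diff)
        scores.append(-0.5 * (logdet + mahal))
    return np.array(names)[np.argmax(np.stack(scores, axis=1), axis=1)]

# Usage sketch: stats = train_mlc(training_samples); labels = classify_mlc(image_pixels, stats)
```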

ACKNOWLEDGEMENT

This research was supported by the Ministry of Science and Technology, Taiwan, R.O.C., under grant nos. NSC 102-2119-M-020-001 and 103-2633-M-020-001.

LITERATURE CITED

Herold N, Haack B, Solomon E. 2004. An evaluation of radar texture for land use/cover extraction in varied landscapes. Int J Appl Earth Obs 5(2):113-28.

Hong G, Zhang Y, Mercer B. 2009. A wavelet and IHS integration method to fuse high resolution SAR with moderate resolution multispectral images. Photogramm Eng Rem S 75(10):1213-23.

Jeya KN, Purushothaman BM, Suresh BS. 2014. An empirical investigation on thematic accuracy of landuse/landcover classification using fused images. International Journal of Computer Science and Information Technologies 5(2):1155-61.

Lamyaa GET, Salwa FE. 2010. Investigation of fusion of SAR and Landsat data for shoreline super resolution mapping: the northeastern Mediterranean Sea coast in Egypt. Appl Geomat 2:177-86.

Podest E, Saatchi S. 2002. Application of multiscale texture in classifying JERS-1 radar data over tropical vegetation. Int J Remote Sens 23(7):1487-506.

Torma M, Lumme J, Patrikainen N, Luojus K. 2004. Fusion of low resolution optical and high resolution SAR data for land cover classification. In: International Geoscience and Remote Sensing Symposium, Alaska, USA, Vol. 4. p 2680-3.


Vyjayanthi N. 2008. Land cover classification using multi-source data fusion of ENVISAT-ASAR and IRS P6 LISS-III satellite data - a case study over tropical moist deciduous forested regions of Karnataka, India. International Society for Photogrammetry and Remote Sensing, Beijing, China, Vol. XXXVII-B6b. p 329-34.
