Geography, Environment, Sustainability (Jul 2024)
Burned area detection using convolutional neural network based on spatial information of synthetic aperture radar data in Indonesia
Abstract
Forest and land fires are recurrent disasters in Indonesia whose impacts extend to neighbouring countries. Burned areas can be mapped using remote sensing, and synthetic aperture radar (SAR) data are advantageous because the sensor penetrates clouds and smoke. However, the analysis of SAR imagery differs from that of optical imagery, relying on properties such as backscatter intensity, texture, and polarimetric features. This research proposes a method to detect burned areas from features extracted from Sentinel-1 data, classified with a Convolutional Neural Network (CNN). To find the best input features, several classification schemes were tested: intensity and polarimetric features with the Boxcar speckle filter applied, and Gray Level Co-occurrence Matrix (GLCM) texture features without the Boxcar speckle filter. The significance of the window-size parameter was also investigated for each scheme. The highest overall accuracy, 84%, was achieved by the CNN using GLCM texture features without the Boxcar speckle filter at a window size of 17×17 pixels, when tested on parts of Pulang Pisau Regency and Kapuas Regency, Central Kalimantan, in 2019. The total burned area was 76,098.6 ha. Using GLCM texture features without the Boxcar speckle filter as classification input performs better than using speckle-filtered intensity and polarimetric features. Combining intensity and polarimetric features with the Boxcar speckle filter applied yields better classification performance than using either feature set separately. Furthermore, the choice of window size also contributes to improving model performance.
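To make the workflow concrete, the sketch below illustrates the kind of pipeline the abstract describes: GLCM texture features computed over sliding windows of a Sentinel-1 backscatter band, followed by a small CNN that classifies 17×17-pixel patches as burned or unburned. This is a minimal sketch, not the authors' implementation; the library choices (scikit-image, PyTorch), the selected GLCM properties, the quantization step, and the network layout are assumptions made for illustration.

```python
# Minimal sketch: GLCM texture features + CNN patch classifier for burned-area
# mapping from a Sentinel-1 backscatter band. Library choices (scikit-image,
# PyTorch), GLCM properties, and network layout are illustrative assumptions,
# not the paper's exact configuration.
import numpy as np
import torch
import torch.nn as nn
from skimage.feature import graycomatrix, graycoprops

GLCM_PROPS = ("contrast", "homogeneity", "energy", "correlation")
WIN = 17  # window size reported as best in the abstract


def glcm_features(window_u8, levels=32):
    """Compute mean GLCM texture properties for one quantized image window."""
    glcm = graycomatrix(window_u8, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return np.array([graycoprops(glcm, p).mean() for p in GLCM_PROPS])


def texture_stack(band_db, levels=32):
    """Slide a WIN x WIN window over a backscatter band (dB) and build a
    per-pixel stack of GLCM features (border pixels are left as zeros)."""
    lo, hi = np.percentile(band_db, (2, 98))
    q = np.clip((band_db - lo) / (hi - lo), 0, 1)
    q = (q * (levels - 1)).astype(np.uint8)          # quantize to GLCM levels
    h, w = q.shape
    r = WIN // 2
    feats = np.zeros((len(GLCM_PROPS), h, w), dtype=np.float32)
    for i in range(r, h - r):
        for j in range(r, w - r):
            feats[:, i, j] = glcm_features(q[i - r:i + r + 1,
                                             j - r:j + r + 1], levels)
    return feats


class PatchCNN(nn.Module):
    """Small CNN labelling a 17x17 multi-feature patch as burned/unburned."""
    def __init__(self, in_ch=len(GLCM_PROPS), n_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.net(x)


if __name__ == "__main__":
    # Dummy backscatter band standing in for a Sentinel-1 VH (or VV) scene.
    band = np.random.uniform(-25, -5, size=(64, 64)).astype(np.float32)
    feats = texture_stack(band)
    patch = torch.from_numpy(np.ascontiguousarray(feats[:, :WIN, :WIN]))
    logits = PatchCNN()(patch.unsqueeze(0))           # 1 x C x 17 x 17
    print("burned probability:", torch.softmax(logits, dim=1)[0, 1].item())
```

In a full workflow the patch classifier would be trained on labelled burned/unburned samples and applied pixel-wise across the scene; the sketch only shows the feature-extraction and classification structure implied by the abstract.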
Keywords