Remote Sensing (Sep 2020)
Mapping Tea Plantations from VHR Images Using OBIA and Convolutional Neural Networks
Abstract
Tea is an important economic crop that is widely cultivated in many countries, particularly in China. Accurately mapping tea plantations is crucial to the operation, management, and supervision of the tea industry. We propose an object-based convolutional neural network (CNN) to extract tea plantations from very high resolution (VHR) remote sensing images. Image segmentation was performed to obtain image objects, while a fine-tuned CNN model was used to extract deep image features. We conducted feature selection based on the Gini index to reduce the dimensionality of the deep features, and the selected features were then used to classify tea objects via a random forest. The proposed method was first applied to Google Earth images and then transferred to GF-2 satellite images. We compared the proposed classification with existing methods: object-based classification using random forest, Mask R-CNN, and object-based CNN without fine-tuning. The results show that the proposed method achieved higher classification accuracy than the other methods and produced smaller over- and under-classification geometric errors than Mask R-CNN in terms of shape integrity and boundary consistency. The proposed approach, trained on Google Earth images, achieved comparable results when transferred to the classification of tea objects from GF-2 images. We conclude that the proposed method is effective for mapping tea plantations from VHR remote sensing images even with limited training samples and has great potential for mapping tea plantations over large areas. A schematic sketch of the feature selection and classification stage is given below.
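The following is a minimal, hedged sketch of the classification stage described above, assuming per-object deep features have already been extracted by a fine-tuned CNN; the array names (deep_features, labels) and the number of retained features are illustrative assumptions, not values from the paper. It uses scikit-learn's random forest, whose default impurity criterion is the Gini index, to rank and select features before the final object classification.

```python
# Sketch: Gini-based feature selection on CNN deep features, followed by
# random forest classification of image objects. Data here are synthetic
# placeholders standing in for per-object deep features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
deep_features = rng.normal(size=(500, 1024))   # placeholder: one 1024-d CNN feature vector per image object
labels = rng.integers(0, 2, size=500)          # placeholder: 1 = tea plantation, 0 = other land cover

X_train, X_test, y_train, y_test = train_test_split(
    deep_features, labels, test_size=0.3, random_state=0)

# Rank deep features by Gini importance (mean decrease in impurity).
ranker = RandomForestClassifier(n_estimators=200, random_state=0)
ranker.fit(X_train, y_train)
top_idx = np.argsort(ranker.feature_importances_)[::-1][:100]  # keep the 100 highest-ranked features (illustrative)

# Classify image objects with a random forest trained on the selected features only.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train[:, top_idx], y_train)
print("Object classification accuracy:", clf.score(X_test[:, top_idx], y_test))
```

In practice, the feature matrix would hold one CNN-derived vector per segmented image object rather than random data, and the selected features would feed the object-level tea/non-tea classification.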
Keywords