The Crop Journal (Oct 2022)

Temporal sequence Object-based CNN (TS-OCNN) for crop classification from fine resolution remote sensing image time-series

  • Huapeng Li,
  • Yajun Tian,
  • Ce Zhang,
  • Shuqing Zhang,
  • Peter M. Atkinson

Journal volume & issue
Vol. 10, no. 5
pp. 1507 – 1516

Abstract


Accurate crop distribution mapping is required for crop yield prediction and field management. Due to rapid progress in remote sensing technology, fine spatial resolution (FSR) remotely sensed imagery now offers great opportunities for mapping crop types in great detail. However, within-class variance can hamper attempts to discriminate crop classes at fine resolutions. Multi-temporal FSR remotely sensed imagery provides a means of increasing crop classification accuracy, although current methods do not exploit the available information fully. In this research, a novel Temporal Sequence Object-based Convolutional Neural Network (TS-OCNN) was proposed to classify agricultural crop types from FSR image time-series. An object-based CNN (OCNN) model was adopted in the TS-OCNN to classify images at the object level (i.e., segmented objects or crop parcels), thus maintaining the precise boundary information of crop parcels. The combined image time-series was first used as the input to the OCNN model to produce an ‘original’ or baseline classification. Then the single-date images were fed automatically into the deep learning model scene-by-scene, in order of image acquisition date, to increase the crop classification accuracy successively. In this way, the joint information in the FSR multi-temporal observations and the unique individual information from the single-date images were exploited comprehensively for crop classification. The effectiveness of the proposed approach was investigated using multi-temporal SAR and optical imagery, respectively, over two heterogeneous agricultural areas.
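The training sequence described above — a baseline classification from the stacked time-series, followed by scene-by-scene refinement in acquisition order — can be sketched as follows. This is a minimal illustrative outline only: the model, image representation, and update functions are simplified placeholders, not the authors' implementation.

```python
from dataclasses import dataclass, field

@dataclass
class SceneImage:
    date: str                       # acquisition date, e.g. "2022-05-01"
    data: list = field(default_factory=list)  # object/parcel features (placeholder)

def train_baseline(stacked_series):
    # Baseline OCNN trained on the combined (stacked) image time-series.
    # Here the "model" is just a record of what it was trained on.
    return {"trained_on": ["stacked"]}

def refine(model, image):
    # Feed one single-date scene to the model to refine the classification.
    model["trained_on"].append(image.date)
    return model

def ts_ocnn(stacked_series, single_date_images):
    model = train_baseline(stacked_series)
    # Single-date scenes are presented in order of image acquisition date.
    for image in sorted(single_date_images, key=lambda im: im.date):
        model = refine(model, image)
    return model

scenes = [SceneImage("2022-07-01"), SceneImage("2022-05-01")]
result = ts_ocnn(stacked_series=[], single_date_images=scenes)
print(result["trained_on"])  # ['stacked', '2022-05-01', '2022-07-01']
```

The key design point the sketch captures is the ordering: the joint multi-temporal information establishes the baseline, after which each single-date image contributes its unique individual information in sequence.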
The experimental results demonstrated that the newly proposed TS-OCNN approach consistently increased crop classification accuracy, achieving the greatest accuracies (82.68% and 87.40%) in comparison with state-of-the-art benchmark methods, including the object-based CNN (OCNN) (81.63% and 85.88%), object-based image analysis (OBIA) (78.21% and 84.83%), and a standard pixel-wise CNN (79.18% and 82.90%). The proposed approach is the first known attempt to explore simultaneously the joint information from image time-series and the unique information from single-date images for crop classification within a deep learning framework. The TS-OCNN, therefore, represents a new approach for agricultural landscape classification from multi-temporal FSR imagery. Moreover, it is readily generalizable to other landscapes (e.g., forest landscapes), with wide application prospects.

Keywords