Remote Sensing (Oct 2021)

Top-Down Pyramid Fusion Network for High-Resolution Remote Sensing Semantic Segmentation

  • Yuhang Gu,
  • Jie Hao,
  • Bing Chen,
  • Hai Deng

DOI
https://doi.org/10.3390/rs13204159
Journal volume & issue
Vol. 13, no. 20
p. 4159

Abstract


In recent years, high-resolution remote sensing semantic segmentation based on data fusion has gradually become a research focus in land classification, an indispensable task for smart cities. However, existing feature fusion methods with bottom-up structures achieve only limited fusion quality, while the various auxiliary fusion modules proposed to compensate significantly increase model complexity and make training prohibitively expensive. In this paper, we propose a new lightweight model, the top-down pyramid fusion network (TdPFNet), consisting of a multi-source feature extractor, a top-down pyramid fusion module, and a decoder. It deeply fuses features from different sources in a top-down structure, using high-level semantic knowledge to guide the fusion of low-level texture information. Digital surface model (DSM) data and OpenStreetMap (OSM) data are used as auxiliary inputs to the Potsdam dataset for evaluating the proposed model. Experimental results show that the proposed network not only notably improves segmentation accuracy but also reduces the complexity of the multi-source semantic segmentation model.
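The core idea of the abstract — upsampling high-level semantic features and merging them into progressively finer, texture-rich levels — can be sketched in a few lines of NumPy. This is a minimal, hypothetical illustration of generic top-down pyramid fusion (in the spirit of feature pyramid networks), not the paper's actual TdPFNet layers; the function names, shapes, and the simple additive merge are all assumptions for illustration.

```python
import numpy as np

def upsample2x(f):
    """Nearest-neighbour 2x upsampling of a (C, H, W) feature map."""
    return np.kron(f, np.ones((1, 2, 2)))

def top_down_fuse(pyramid):
    """Fuse a coarse-to-fine list of (C, H, W) feature maps top-down.

    Each coarser (more semantic) level is upsampled and added to the next
    finer (more textural) level, so high-level context guides low-level
    detail. The additive merge is a placeholder for learned fusion layers.
    """
    fused = pyramid[0]            # coarsest, most semantic level
    outputs = [fused]
    for finer in pyramid[1:]:
        fused = finer + upsample2x(fused)
        outputs.append(fused)
    return outputs                # coarse-to-fine fused pyramid

# Toy multi-scale features, standing in for outputs of a multi-source
# (image / DSM / OSM) feature extractor at strides 4x, 2x, 1x.
pyr = [np.ones((4, 8 * 2**i, 8 * 2**i)) for i in range(3)]
fused = top_down_fuse(pyr)
```

In a real network each merge would involve learned convolutions and per-source weighting rather than a plain sum, but the top-down data flow — coarse semantics propagated into fine resolutions — is the same.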

Keywords