IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2023)
Neural Network Fusion Processing and Inverse Mapping to Combine Multisensor Satellite Data and Analyze the Prominent Features
Abstract
In the last decade, the growing number of active and passive Earth observation satellites has provided us with ever more remote sensing data. This has led to increased interest in the fusion of data from different satellites, since some satellites have properties complementary to others. Fusion techniques can improve estimation in areas of interest by exploiting this complementary information and inferring unknown parameters. They also have the potential to provide detailed high-resolution classification maps. We therefore propose a neural network that combines and analyzes data obtained from synthetic aperture radar (SAR) and optical sensors to provide high-resolution classification maps. The neural network employs a novel activation function to construct a neural network explainability method termed inverse mapping for prominent feature analysis. By applying inverse mapping to the data fusion neural network, we can understand which input features are the prominent contributors to which classification outputs. Inverse mapping realizes a backward signal flow based on teacher-signal backpropagation dynamics, which is consistent with the network's forward processing. It performs the contribution analysis of the data pixel by pixel and class by class. In this article, we focus on earthquake damage detection using SAR and optical sensor data of the 2018 Sulawesi earthquake in Indonesia. The fusion-based results show increased classification accuracy compared with the results of the individual sensors. Moreover, we observe that inverse mapping provides reasonable explanations in a consistent manner. It also indicates, for particular classes, contributions from features other than the straightforward counterparts, namely, the pre- and post-seismic features.
Keywords