IET Image Processing (Sep 2020)

SAR multi‐target interactive motion recognition based on convolutional neural networks

  • Ruo‐Hong Huan,
  • Luo‐Qi Ge,
  • Peng Yang,
  • Chao‐Jie Xie,
  • Kai‐Kai Chi,
  • Ke‐Ji Mao,
  • Yun Pan

DOI: https://doi.org/10.1049/iet-ipr.2019.0861
Journal volume & issue: Vol. 14, No. 11, pp. 2567–2578

Abstract

Synthetic aperture radar (SAR) multi‐target interactive motion recognition classifies the type of interactive motion and generates descriptions of the interactive motions at the semantic level by considering the relevance of multi‐target motions. A method for SAR multi‐target interactive motion recognition is proposed, which includes moving target detection, target type recognition, interactive motion feature extraction, and multi‐target interactive motion type recognition. For target type recognition, wavelet thresholding denoising combined with a convolutional neural network (CNN) is proposed: the method performs wavelet thresholding denoising on SAR target images and then uses an eight‐layer CNN named EilNet to recognise the target type. After target type recognition, a multi‐target interactive motion type recognition method is proposed, in which a motion feature matrix is constructed and a four‐layer CNN named FolNet is designed to perform interactive motion type recognition. A motion simulation dataset based on the MSTAR dataset is built, which includes four kinds of interactive motions performed by two moving targets. The experimental results show that the recognition performances of the authors' Wavelet + EilNet method for target type recognition and of FolNet for multi‐target interactive motion type recognition are both better than those of other methods. Thus, the proposed method is effective for SAR multi‐target interactive motion recognition.
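The abstract outlines a two‐stage pipeline: wavelet thresholding denoising followed by a small CNN for target type recognition, then a second CNN over a motion feature matrix for interactive motion type recognition. The sketch below illustrates that general pattern only; it is not the paper's implementation. The layer configurations of EilNet and FolNet, the thresholding rule, and the construction of the motion feature matrix are not specified in the abstract, so the wavelet choice, threshold estimate, and the SmallCNN stand‐in architecture here are assumptions, written with PyWavelets and PyTorch.

```python
# Hypothetical sketch of the pipeline described in the abstract:
# (1) wavelet soft-thresholding denoising of a SAR chip, (2) a small CNN classifier.
# Layer counts, wavelet, and threshold are illustrative, not the paper's EilNet/FolNet.
import numpy as np
import pywt
import torch
import torch.nn as nn

def wavelet_threshold_denoise(img: np.ndarray, wavelet: str = "db4",
                              level: int = 2) -> np.ndarray:
    """Soft-threshold the detail coefficients of a 2-D wavelet decomposition."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Noise level estimated from the finest-scale diagonal detail band (assumed rule).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(img.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(band, thr, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

class SmallCNN(nn.Module):
    """Illustrative stand-in for an EilNet/FolNet-style classifier (not the paper's design)."""
    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes)
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Usage: denoise a SAR chip, then classify the target type.
chip = np.random.rand(128, 128).astype(np.float32)     # placeholder SAR image chip
clean = wavelet_threshold_denoise(chip)
target_net = SmallCNN(in_channels=1, num_classes=10)    # e.g. MSTAR target classes
logits = target_net(torch.from_numpy(clean).float()[None, None])

# The second stage would feed a motion feature matrix (built from per-frame target
# positions and types) into another small CNN to classify the interaction type.
motion_net = SmallCNN(in_channels=1, num_classes=4)     # four interactive motion types
```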

Keywords